Reputation: 11895
According to the docs:
For Step type, choose Spark application.
But in Amazon EMR -> Clusters -> mycluster -> Steps -> Add step -> Step type, the only options are:
Upvotes: 3
Views: 7575
Reputation: 862
There are two ways to add a Spark step to an EMR cluster:
- In the console, add a Custom JAR step that uses command-runner.jar, and pass the spark-submit command as the step's arguments:
spark-submit --class org.apache.spark.examples.SparkPi /usr/lib/spark/examples/jars/spark-examples.jar 10
- Using the AWS CLI to do the same:
aws emr add-steps --cluster-id j-xxxxxxxx --steps Name="add emr step to run spark",Jar="command-runner.jar",Args=[spark-submit,--class,org.apache.spark.examples.SparkPi,/usr/lib/spark/examples/jars/spark-examples.jar,10]
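After submitting, you can watch the step move through PENDING, RUNNING, and COMPLETED. A minimal sketch using the AWS CLI's `list-steps` command (the cluster ID below reuses the placeholder from the command above; `jq` is assumed to be available for filtering the JSON output):

```shell
# List the most recent steps on the cluster and their states.
# --cluster-id is the placeholder ID from the add-steps command above.
aws emr list-steps --cluster-id j-xxxxxxxx \
  | jq -r '.Steps[] | "\(.Name)\t\(.Status.State)"'
```

A failed step's stdout/stderr logs land under the cluster's S3 log URI (if logging was enabled at cluster creation), which is usually the first place to look when a spark-submit step fails.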
Upvotes: 3
Reputation: 1362
You can use command-runner.jar for your use case: choose Custom JAR as the step type from the options you have, set the JAR location to command-runner.jar, and pass your spark-submit command as the arguments.
You can read more about command-runner.jar here: command-runner-usage
Upvotes: 1
Reputation: 11895
I don't have a Spark application option because I created a Core Hadoop cluster.
When I created the cluster, under Software configuration, I should have chosen Spark; then I would have had the Spark application option under Step type.
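For a new cluster, the same choice can be made from the AWS CLI by listing Spark in `--applications` when calling `create-cluster`. A minimal sketch, assuming default EMR roles exist in the account; the release label, instance type, and count are placeholder values to adjust for your setup:

```shell
# Create an EMR cluster with Spark installed, so "Spark application"
# appears as a step type in the console.
# Release label, instance type/count, and key name are placeholders.
aws emr create-cluster \
  --name "my-spark-cluster" \
  --release-label emr-6.15.0 \
  --applications Name=Spark Name=Hadoop \
  --instance-type m5.xlarge \
  --instance-count 3 \
  --use-default-roles
```

If the default roles have not been created yet, `aws emr create-default-roles` sets them up once per account.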
Upvotes: 1