8749f2e87a
### What changes were proposed in this pull request?

The PR checks whether the `--py-files` value is empty and uses it only if it is not.

### Why are the changes needed?

There is a bug in the Mesos cluster mode REST Submission API: it passes the `--py-files` option even when the user has not set any value for the conf `spark.submit.pyFiles`.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

* Submitting an application to a Mesos cluster:

```
curl -X POST http://localhost:7077/v1/submissions/create \
  --header "Content-Type:application/json" \
  --data '{
    "action": "CreateSubmissionRequest",
    "appResource": "file:///opt/spark-3.0.0-bin-3.2.0/examples/jars/spark-examples_2.12-3.0.0.jar",
    "clientSparkVersion": "3.0.0",
    "appArgs": ["30"],
    "environmentVariables": {},
    "mainClass": "org.apache.spark.examples.SparkPi",
    "sparkProperties": {
      "spark.jars": "file:///opt/spark-3.0.0-bin-3.2.0/examples/jars/spark-examples_2.12-3.0.0.jar",
      "spark.driver.supervise": "false",
      "spark.executor.memory": "512m",
      "spark.driver.memory": "512m",
      "spark.submit.deployMode": "cluster",
      "spark.app.name": "SparkPi",
      "spark.master": "mesos://localhost:5050"
    }
  }'
```

* The dispatcher should pick the correct main class and run the job successfully.

Closes #29499 from farhan5900/SPARK-32675.

Authored-by: farhan5900 <farhan5900@gmail.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
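The essence of the fix can be sketched as follows. This is a minimal illustration, not the actual patch: `buildPyFilesOptions` is a hypothetical helper, and the real change lives in Spark's Mesos REST submission handling, where the driver command is assembled from the submitted `sparkProperties`.

```scala
object PyFilesCheck {
  // Hypothetical helper mirroring the fix: emit the --py-files option
  // only when spark.submit.pyFiles is present and non-empty.
  def buildPyFilesOptions(sparkProperties: Map[String, String]): Seq[String] = {
    val pyFiles = sparkProperties.getOrElse("spark.submit.pyFiles", "")
    if (pyFiles.nonEmpty) Seq("--py-files", pyFiles) else Seq.empty
  }

  def main(args: Array[String]): Unit = {
    // No conf set: before the fix this produced a bare `--py-files` with no
    // value, which shifted the remaining arguments; now nothing is emitted.
    println(buildPyFilesOptions(Map.empty))
    // Conf set: the option is passed through unchanged.
    println(buildPyFilesOptions(Map("spark.submit.pyFiles" -> "deps.py")))
  }
}
```

Guarding the option this way explains the observed symptom: an empty `--py-files` consumed the next token on the command line, so the wrong main class was picked up.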