spark-instrumented-optimizer/bin
Marcelo Vanzin e4561e1c55 [SPARK-25897][K8S] Hook up k8s integration tests to sbt build.
The integration tests can now be run in sbt if the right profile
is enabled, using the "test" task under the respective project.
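For reference, a minimal invocation would look roughly like the following
(the profile and project names match the Spark build, but treat the exact
form as a sketch rather than the canonical command):

    build/sbt -Pkubernetes -Pkubernetes-integration-tests \
      'kubernetes-integration-tests/test'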

This avoids having to fall back to maven to run the tests, which
invalidates all your compiled output when you go back to sbt, making
development way slower than it should be.

There's also a task to run the tests directly without refreshing
the docker images, which is helpful if you've just made a change to
the submission code that should not affect the code in the images.
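Assuming the task key introduced for this is named runIts (a detail of
project/SparkBuild.scala, so the exact name is an assumption here),
skipping the image rebuild would look something like:

    build/sbt -Pkubernetes -Pkubernetes-integration-tests \
      'kubernetes-integration-tests/runIts'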

The sbt tasks are not very customizable yet; there are a few minor
things you can set in the sbt shell itself, but otherwise the tests
are hardcoded to run on minikube.

I also had to make some slight adjustments to the IT code itself,
mostly to remove assumptions about the existing harness.

Tested on sbt and maven.

Closes #22909 from vanzin/SPARK-25897.

Authored-by: Marcelo Vanzin <vanzin@cloudera.com>
Signed-off-by: Marcelo Vanzin <vanzin@cloudera.com>
2018-11-07 13:19:31 -08:00
beeline [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
beeline.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
docker-image-tool.sh [SPARK-25897][K8S] Hook up k8s integration tests to sbt build. 2018-11-07 13:19:31 -08:00
find-spark-home [MINOR] Fix a bunch of typos 2018-01-02 07:10:19 +09:00
find-spark-home.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00
load-spark-env.cmd [SPARK-22466][SPARK SUBMIT] export SPARK_CONF_DIR while conf is default 2017-11-09 14:33:08 +09:00
load-spark-env.sh [SPARK-22466][SPARK SUBMIT] export SPARK_CONF_DIR while conf is default 2017-11-09 14:33:08 +09:00
pyspark [SPARK-25891][PYTHON] Upgrade to Py4J 0.10.8.1 2018-10-31 09:55:03 -07:00
pyspark.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
pyspark2.cmd [SPARK-25891][PYTHON] Upgrade to Py4J 0.10.8.1 2018-10-31 09:55:03 -07:00
run-example [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
run-example.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00
spark-class [SPARK-20546][DEPLOY] spark-class gets syntax error in posix mode 2017-05-05 11:36:51 +01:00
spark-class.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
spark-class2.cmd [SPARK-22495] Fix setup of SPARK_HOME variable on Windows 2017-11-23 12:47:38 +09:00
spark-shell [SPARK-25906][SHELL] Documents '-I' option (from Scala REPL) in spark-shell 2018-11-06 10:39:58 +08:00
spark-shell.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
spark-shell2.cmd [SPARK-25906][SHELL] Documents '-I' option (from Scala REPL) in spark-shell 2018-11-06 10:39:58 +08:00
spark-sql [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
spark-sql.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00
spark-sql2.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00
spark-submit [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
spark-submit.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
spark-submit2.cmd [SPARK-11518][DEPLOY, WINDOWS] Handle spaces in Windows command scripts 2016-02-10 09:54:22 +00:00
sparkR [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
sparkR.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
sparkR2.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00