spark-instrumented-optimizer/bin
Reynold Xin 61b427d4b1 [SPARK-5193][SQL] Remove Spark SQL Java-specific API.
After the following patches, the main (Scala) API is now directly usable by Java users:

https://github.com/apache/spark/pull/4056
https://github.com/apache/spark/pull/4054
https://github.com/apache/spark/pull/4049
https://github.com/apache/spark/pull/4030
https://github.com/apache/spark/pull/3965
https://github.com/apache/spark/pull/3958
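
To illustrate what "directly usable from Java" means here, below is a minimal sketch of a Java program constructing the regular Scala-side SQLContext (no separate JavaSQLContext) and issuing a query. It assumes the Spark 1.2/1.3-era SQLContext API; the class name, app name, and JSON path are placeholders for illustration only.

```java
// Minimal sketch: Java code calling the unified (Scala) Spark SQL API directly.
// The input path and app name are placeholders.
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SQLContext;

public class JavaSqlExample {
  public static void main(String[] args) {
    SparkConf conf = new SparkConf().setAppName("JavaSqlExample").setMaster("local[*]");
    JavaSparkContext jsc = new JavaSparkContext(conf);

    // The same SQLContext class that Scala users call, built from the underlying SparkContext.
    SQLContext sqlContext = new SQLContext(jsc.sc());

    // Load JSON, register a temporary table, and run a SQL query.
    sqlContext.jsonFile("examples/src/main/resources/people.json")
              .registerTempTable("people");
    long adults = sqlContext.sql("SELECT * FROM people WHERE age >= 21").count();
    System.out.println("adults = " + adults);

    jsc.stop();
  }
}
```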

Author: Reynold Xin <rxin@databricks.com>

Closes #4065 from rxin/sql-java-api and squashes the following commits:

b1fd860 [Reynold Xin] Fix Mima
6d86578 [Reynold Xin] Ok one more attempt in fixing Python...
e8f1455 [Reynold Xin] Fix Python again...
3e53f91 [Reynold Xin] Fixed Python.
83735da [Reynold Xin] Fix BigDecimal test.
e9f1de3 [Reynold Xin] Use scala BigDecimal.
500d2c4 [Reynold Xin] Fix Decimal.
ba3bfa2 [Reynold Xin] Updated javadoc for RowFactory.
c4ae1c5 [Reynold Xin] [SPARK-5193][SQL] Remove Spark SQL Java-specific API.
2015-01-16 21:09:06 -08:00
beeline SPARK-3337 Paranoid quoting in shell to allow install dirs with spaces within. 2014-09-08 10:24:15 -07:00
beeline.cmd [SPARK-4683][SQL] Add a beeline.cmd to run on Windows 2014-12-04 10:21:03 -08:00
compute-classpath.cmd [SPARK-4048] Enhance and extend hadoop-provided profile. 2015-01-08 17:15:13 -08:00
compute-classpath.sh [SPARK-4048] Enhance and extend hadoop-provided profile. 2015-01-08 17:15:13 -08:00
load-spark-env.sh Support cross building for Scala 2.11 2014-11-11 21:36:48 -08:00
pyspark [SPARK-4415] [PySpark] JVM should exit after Python exits 2014-11-14 20:14:33 -08:00
pyspark.cmd moved user scripts to bin folder 2013-09-23 12:46:48 +08:00
pyspark2.cmd [SPARK-4415] [PySpark] JVM should exit after Python exits 2014-11-14 20:14:33 -08:00
run-example Support cross building for Scala 2.11 2014-11-11 21:36:48 -08:00
run-example.cmd moved user scripts to bin folder 2013-09-23 12:46:48 +08:00
run-example2.cmd [SPARK-3775] Unsuitable error message in spark-shell.cmd 2014-10-03 13:09:48 -07:00
spark-class [SPARK-5193][SQL] Remove Spark SQL Java-specific API. 2015-01-16 21:09:06 -08:00
spark-class.cmd sbin/spark-class* -> bin/spark-class* 2014-01-03 15:08:01 +05:30
spark-class2.cmd [SPARK-3775] Unsuitable error message in spark-shell.cmd 2014-10-03 13:09:48 -07:00
spark-shell [SPARK-4161] Spark shell class path is not correctly set if "spark.driver.extraClassPath" is set in defaults.conf 2014-12-10 12:26:42 -08:00
spark-shell.cmd [SPARK-3943] Some scripts bin\*.cmd pollute environment variables in Windows 2014-10-14 18:50:14 -07:00
spark-shell2.cmd [SPARK-3060] spark-shell.cmd doesn't accept application options in Windows OS 2014-12-19 19:22:42 -08:00
spark-sql [SPARK-4623] Add some error information if using spark-sql in yarn-cluster mode 2014-11-30 16:19:41 -08:00
spark-submit [SPARK-4990][Deploy] To find the default properties file, search SPARK_CONF_DIR first 2015-01-09 17:10:02 -08:00
spark-submit.cmd [SPARK-3943] Some scripts bin\*.cmd pollute environment variables in Windows 2014-10-14 18:50:14 -07:00
spark-submit2.cmd [SPARK-4990][Deploy] To find the default properties file, search SPARK_CONF_DIR first 2015-01-09 17:10:02 -08:00
utils.sh [SPARK-3774] Fix typo in comment in bin/utils.sh 2014-10-03 13:12:37 -07:00
windows-utils.cmd [SPARK-3060] spark-shell.cmd doesn't accept application options in Windows OS 2014-12-19 19:22:42 -08:00