spark-instrumented-optimizer/dev
| Name | Last commit | Date |
| ---- | ----------- | ---- |
| audit-release | [Release] Bring audit scripts up-to-date | 2014-11-12 16:35:39 -08:00 |
| create-release | [HOTFIX] Fixing two issues with the release script. | 2014-12-04 12:11:41 -08:00 |
| change-version-to-2.10.sh | Support cross building for Scala 2.11 | 2014-11-11 21:36:48 -08:00 |
| change-version-to-2.11.sh | Support cross building for Scala 2.11 | 2014-11-11 21:36:48 -08:00 |
| check-license | SPARK-3745 - fix check-license to properly download and check jar | 2014-09-30 13:11:25 -07:00 |
| github_jira_sync.py | SPARK-2596 HOTFIX: Deal with non-existent JIRAs. | 2014-07-19 20:06:28 -07:00 |
| lint-python | SPARK-3337 Paranoid quoting in shell to allow install dirs with spaces within. | 2014-09-08 10:24:15 -07:00 |
| lint-scala | [SPARK-2627] [PySpark] have the build enforce PEP 8 automatically | 2014-08-06 12:58:24 -07:00 |
| merge_spark_pr.py | SPARK-4507: PR merge script should support closing multiple JIRA tickets | 2014-11-29 23:12:10 -05:00 |
| mima | [SPARK-3433][BUILD] Fix for Mima false-positives with @DeveloperAPI and @Experimental annotations. | 2014-09-15 21:14:00 -07:00 |
| README.md | Merge pull request #565 from pwendell/dev-scripts. Closes #565. | 2014-02-08 23:13:34 -08:00 |
| run-tests | Support cross building for Scala 2.11 | 2014-11-11 21:36:48 -08:00 |
| run-tests-codes.sh | [SPARK-3479] [Build] Report failed test category | 2014-10-06 14:19:06 -07:00 |
| run-tests-jenkins | [SPARK-4000][Build] Uploads HiveCompatibilitySuite logs | 2014-11-10 16:17:52 -08:00 |
| scalastyle | SPARK-4338. [YARN] Ditch yarn-alpha. | 2014-12-09 11:02:43 -08:00 |

Spark Developer Scripts

This directory contains scripts useful to developers when packaging, testing, or committing to Spark.
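For example, the lint and test scripts in the listing above are run directly from the repository root. This is a minimal sketch assuming a standard checkout; exact options and output depend on your branch:

```sh
# Check Scala and Python code style before opening a pull request.
./dev/lint-scala
./dev/lint-python

# Run the full build and test cycle; dev/run-tests-jenkins wraps this
# same entry point for CI runs.
./dev/run-tests
```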

Many of these scripts require Apache credentials to work correctly.
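As one illustration, merge_spark_pr.py authenticates against the Apache JIRA so it can close the tickets associated with a merged pull request. The environment-variable names below are an assumption based on the script from this era; verify them against the script in your checkout:

```sh
# Assumed variable names: the merge script reads committer JIRA
# credentials from the environment in order to resolve SPARK-* tickets
# after merging.
export JIRA_USERNAME=your-apache-jira-username
export JIRA_PASSWORD=your-apache-jira-password
./dev/merge_spark_pr.py
```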