spark-instrumented-optimizer/dev
Nicholas Chammas c429126066 [Build] Diff from branch point
Sometimes Jenkins posts [spurious reports of new classes being added](https://github.com/apache/spark/pull/2339#issuecomment-56570170). I believe this stems from diffing the patch against `master` directly, as opposed to against `master...` (Git's three-dot notation), which diffs from the merge base, i.e. the commit the PR was branched from.

This patch fixes that behavior.
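
For context, a minimal illustration of the two diff forms (not necessarily the exact invocation used by `run-tests-jenkins`; `HEAD` stands in for the PR branch tip):

```sh
# Two-dot form: compares the tips of master and the PR branch directly,
# so commits that landed on master after the branch point show up as
# spurious differences attributed to the PR.
git diff master..HEAD

# Three-dot form: diffs the PR branch against the merge base, i.e. the
# commit the PR was branched from, so only the PR's own changes appear.
git diff master...HEAD
```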

Author: Nicholas Chammas <nicholas.chammas@gmail.com>

Closes #2512 from nchammas/diff-only-commits-ahead and squashes the following commits:

c065599 [Nicholas Chammas] comment typo fix
a453c67 [Nicholas Chammas] diff from branch point
2014-09-24 11:33:58 -07:00
| Path | Last commit | Date |
|------|-------------|------|
| audit-release | [SPARK-2784][SQL] Deprecate hql() method in favor of a config option, 'spark.sql.dialect' | 2014-08-03 12:28:29 -07:00 |
| create-release | BUILD: Adding back CDH4 as per user requests | 2014-08-29 22:24:35 -07:00 |
| check-license | SPARK-3337 Paranoid quoting in shell to allow install dirs with spaces within. | 2014-09-08 10:24:15 -07:00 |
| github_jira_sync.py | SPARK-2596 HOTFIX: Deal with non-existent JIRAs. | 2014-07-19 20:06:28 -07:00 |
| lint-python | SPARK-3337 Paranoid quoting in shell to allow install dirs with spaces within. | 2014-09-08 10:24:15 -07:00 |
| lint-scala | [SPARK-2627] [PySpark] have the build enforce PEP 8 automatically | 2014-08-06 12:58:24 -07:00 |
| merge_spark_pr.py | [SPARK-3425] do not set MaxPermSize for OpenJDK 1.8 | 2014-09-15 10:57:59 -07:00 |
| mima | [SPARK-3433][BUILD] Fix for Mima false-positives with @DeveloperAPI and @Experimental annotations. | 2014-09-15 21:14:00 -07:00 |
| README.md | Merge pull request #565 from pwendell/dev-scripts. Closes #565. | 2014-02-08 23:13:34 -08:00 |
| run-tests | [Build] Fix passing of args to sbt | 2014-09-19 15:44:47 -07:00 |
| run-tests-jenkins | [Build] Diff from branch point | 2014-09-24 11:33:58 -07:00 |
| scalastyle | SPARK-3337 Paranoid quoting in shell to allow install dirs with spaces within. | 2014-09-08 10:24:15 -07:00 |
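
Several entries above reference SPARK-3337 (paranoid quoting in shell to allow install dirs with spaces). A minimal sketch of the quoting pattern such a fix applies; the `FWDIR` variable name is illustrative, not necessarily what these scripts use:

```sh
#!/usr/bin/env bash

# Resolve this script's directory. Quoting "$0", the dirname output,
# and the command substitution keeps a path like "/opt/my spark/dev"
# intact instead of letting it split on whitespace.
FWDIR="$(cd "$(dirname "$0")" && pwd)"

# Unquoted expansion would word-split the path:
#   cd $FWDIR        # breaks if the path contains spaces
# Quoted expansion passes it as a single argument:
cd "$FWDIR"
```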

Spark Developer Scripts

This directory contains scripts useful to developers when packaging, testing, or committing to Spark.

Many of these scripts require Apache credentials to work correctly.