Commit 69c3f441a9 (Nicholas Chammas): [SPARK-3479] [Build] Report failed test category
This PR allows SparkQA (i.e. Jenkins) to report in its posts to GitHub what category of test failed, if one can be determined.

The failure categories are:
* general failure
* RAT checks failed
* Scala style checks failed
* Python style checks failed
* Build failed
* Spark unit tests failed
* PySpark unit tests failed
* MiMa checks failed
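
To illustrate the mechanism, here is a minimal Python sketch of the idea: each test block exits with its own status code, so the wrapper that posts to GitHub can translate that code back into one of the categories above. The actual scripts (dev/run-tests, dev/run-tests-codes.sh, dev/run-tests-jenkins) are shell scripts; the block names and numeric codes below are illustrative assumptions, not the project's real values.

```python
import subprocess
import sys

# Hypothetical exit codes, one per failure category; the real values live in
# dev/run-tests-codes.sh and may differ.
FAILURE_CODES = {
    "RAT checks": 11,
    "Scala style checks": 12,
    "Python style checks": 13,
    "Build": 14,
    "Spark unit tests": 15,
    "PySpark unit tests": 16,
    "MiMa checks": 17,
}
GENERAL_FAILURE = 10


def run_block(name, command):
    """Run one test block; on failure, exit with that block's distinct code."""
    if subprocess.call(command) != 0:
        sys.exit(FAILURE_CODES.get(name, GENERAL_FAILURE))


if __name__ == "__main__":
    run_block("RAT checks", ["./dev/check-license"])
    run_block("Scala style checks", ["./dev/lint-scala"])
    run_block("Python style checks", ["./dev/lint-python"])
    # ... build, Spark unit tests, PySpark unit tests, MiMa checks, etc.
```

On the reporting side, the Jenkins wrapper would look the exit code up in the same table and include the matching category in the comment it posts to the pull request.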

This PR also fixes the diffing logic used to determine if a patch introduces new classes.
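
A check of that kind works by scanning only the lines the patch adds. Below is a hedged Python sketch of such a diff scan; the regex, the diff range, and the function name are assumptions for illustration, not the exact logic in dev/run-tests-jenkins.

```python
import re
import subprocess

# Hypothetical helper: list classes or traits that a patch introduces, by
# looking only at "+" lines in the diff (illustrative, not the exact check
# used by dev/run-tests-jenkins).
CLASS_RE = re.compile(r"^\s*(?:public\s+)?(?:case\s+)?(?:class|trait)\s+(\w+)")


def new_public_classes(base="master", head="HEAD"):
    diff = subprocess.check_output(["git", "diff", f"{base}...{head}"], text=True)
    added = (line[1:] for line in diff.splitlines()
             if line.startswith("+") and not line.startswith("+++"))
    return [m.group(1) for m in map(CLASS_RE.match, added) if m]


if __name__ == "__main__":
    for cls in new_public_classes():
        print(f"New class introduced by this patch: {cls}")
```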

Author: Nicholas Chammas <nicholas.chammas@gmail.com>

Closes #2606 from nchammas/report-failed-test-category and squashes the following commits:

d67df03 [Nicholas Chammas] report what test category failed
Date: 2014-10-06 14:19:06 -07:00
Directory contents, with the last commit that touched each entry:

* audit-release: [SPARK-2784][SQL] Deprecate hql() method in favor of a config option, 'spark.sql.dialect' (2014-08-03 12:28:29 -07:00)
* create-release: BUILD: Adding back CDH4 as per user requests (2014-08-29 22:24:35 -07:00)
* check-license: SPARK-3745 - fix check-license to properly download and check jar (2014-09-30 13:11:25 -07:00)
* github_jira_sync.py: SPARK-2596 HOTFIX: Deal with non-existent JIRAs. (2014-07-19 20:06:28 -07:00)
* lint-python: SPARK-3337 Paranoid quoting in shell to allow install dirs with spaces within. (2014-09-08 10:24:15 -07:00)
* lint-scala: [SPARK-2627] [PySpark] have the build enforce PEP 8 automatically (2014-08-06 12:58:24 -07:00)
* merge_spark_pr.py: HOTFIX: Fix unicode error in merge script. (2014-10-05 13:22:40 -07:00)
* mima: [SPARK-3433][BUILD] Fix for Mima false-positives with @DeveloperAPI and @Experimental annotations. (2014-09-15 21:14:00 -07:00)
* README.md: Merge pull request #565 from pwendell/dev-scripts. Closes #565. (2014-02-08 23:13:34 -08:00)
* run-tests: [SPARK-3479] [Build] Report failed test category (2014-10-06 14:19:06 -07:00)
* run-tests-codes.sh: [SPARK-3479] [Build] Report failed test category (2014-10-06 14:19:06 -07:00)
* run-tests-jenkins: [SPARK-3479] [Build] Report failed test category (2014-10-06 14:19:06 -07:00)
* scalastyle: SPARK-3337 Paranoid quoting in shell to allow install dirs with spaces within. (2014-09-08 10:24:15 -07:00)

Spark Developer Scripts

This directory contains scripts useful to developers when packaging, testing, or committing to Spark.

Many of these scripts require Apache credentials to work correctly.