spark-instrumented-optimizer/dev

Latest commit: 534b231417 [SPARK-4000][Build] Uploads HiveCompatibilitySuite logs (Cheng Lian)

This is a follow-up to #2845. In addition to the unit-tests.log files, this change also uploads the failure output files generated by `HiveCompatibilitySuite` to the Jenkins master. These files can be very helpful for debugging Hive compatibility test failures.

/cc pwendell marmbrus

Author: Cheng Lian <lian@databricks.com>

Closes #2993 from liancheng/upload-hive-compat-logs and squashes the following commits:

8e6247f [Cheng Lian] Uploads HiveCompatibilitySuite logs
Committed: 2014-11-10 16:17:52 -08:00
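
The change itself lives in run-tests-jenkins (see the listing below). As a rough sketch of what such a log-archiving step can look like in a Jenkins wrapper script, assuming a hypothetical *.failed naming pattern for HiveCompatibilitySuite output and a hypothetical archive directory on the master (these are illustrative assumptions, not Spark's actual implementation):

    # Hypothetical log-archiving step for a Jenkins build; the paths,
    # globs, and destination are illustrative assumptions.
    archive_test_logs() {
      local dest="/home/jenkins/test-logs/${BUILD_NUMBER:-local}"
      mkdir -p "$dest"
      # unit-tests.log files were already being collected before this change
      find . -name "unit-tests.log" -exec cp --parents {} "$dest" \;
      # SPARK-4000: also pick up HiveCompatibilitySuite failure output files
      find . -path "*hive*" -name "*.failed" -exec cp --parents {} "$dest" \;
    }
    archive_test_logs
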
Name                 Date                        Last commit
audit-release        2014-08-03 12:28:29 -07:00  [SPARK-2784][SQL] Deprecate hql() method in favor of a config option, 'spark.sql.dialect'
check-license        2014-09-30 13:11:25 -07:00  SPARK-3745 - fix check-license to properly download and check jar
create-release       2014-08-29 22:24:35 -07:00  BUILD: Adding back CDH4 as per user requests
github_jira_sync.py  2014-07-19 20:06:28 -07:00  SPARK-2596 HOTFIX: Deal with non-existent JIRAs.
lint-python          2014-09-08 10:24:15 -07:00  SPARK-3337 Paranoid quoting in shell to allow install dirs with spaces within.
lint-scala           2014-08-06 12:58:24 -07:00  [SPARK-2627] [PySpark] have the build enforce PEP 8 automatically
merge_spark_pr.py    2014-10-05 13:22:40 -07:00  HOTFIX: Fix unicode error in merge script.
mima                 2014-09-15 21:14:00 -07:00  [SPARK-3433][BUILD] Fix for Mima false-positives with @DeveloperAPI and @Experimental annotations.
README.md            2014-02-08 23:13:34 -08:00  Merge pull request #565 from pwendell/dev-scripts. Closes #565.
run-tests            2014-11-03 22:29:48 -08:00  [SPARK-3573][MLLIB] Make MLlib's Vector compatible with SQL's SchemaRDD
run-tests-codes.sh   2014-10-06 14:19:06 -07:00  [SPARK-3479] [Build] Report failed test category
run-tests-jenkins    2014-11-10 16:17:52 -08:00  [SPARK-4000][Build] Uploads HiveCompatibilitySuite logs
scalastyle           2014-10-26 16:24:50 -07:00  [SPARK-3997][Build] scalastyle should output the error location

Spark Developer Scripts

This directory contains scripts that are useful to developers when packaging, testing, or committing changes to Spark.

Many of these scripts require Apache credentials to work correctly.
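
As a quick orientation, typical invocations from the repository root might look like the following; the exact flags and behavior vary across Spark versions, so treat these as illustrative:

    # Run the full test suite locally (the same entry point Jenkins drives)
    ./dev/run-tests

    # Run the style checkers individually
    ./dev/lint-python
    ./dev/lint-scala

    # Committers: interactively merge a pull request (needs Apache credentials)
    ./dev/merge_spark_pr.py
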