spark-instrumented-optimizer/dev
hyukjinkwon ceaf77ae43 [SPARK-18692][BUILD][DOCS] Test Java 8 unidoc build on Jenkins
## What changes were proposed in this pull request?

This PR proposes to run Spark unidoc as part of the Jenkins tests to check the Javadoc 8 build, since the Javadoc 8 build is easily re-broken.

There are a couple of problems with this:

- It adds a little extra time to the test run. In my case, it took about 1.5 minutes more (`Elapsed :[94.8746569157]`). How this was measured is described in "How was this patch tested?" below.

- > One problem that I noticed was that Unidoc appeared to be processing test sources: if we can find a way to exclude those from being processed in the first place then that might significantly speed things up.

  (see joshrosen's [comment](https://issues.apache.org/jira/browse/SPARK-18692?focusedCommentId=15947627&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-15947627))

To make this automated build pass, this PR also fixes the existing Javadoc breaks, including the ones introduced by test code as described above.

These fixes are similar to instances fixed previously. Please refer to https://github.com/apache/spark/pull/15999 and https://github.com/apache/spark/pull/16013.

Note that this only fixes **errors**, not **warnings**. Please see my observation in https://github.com/apache/spark/pull/17389#issuecomment-288438704 on the spurious errors caused by warnings.
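For reference, the new unidoc step in `dev/run-tests.py` has roughly the shape sketched below. `exec_sbt`, the profiles/goals variables, and the log message mirror the diff shown under "How was this patch tested?"; the function name and the `extra_profiles` parameter are illustrative assumptions, not the exact patch:

```python
# Illustrative sketch of a unidoc build step in dev/run-tests.py. Names other
# than exec_sbt and the log message (which appear in the diff below) are
# assumptions, not the exact patch.
def build_spark_unidoc_sbt(extra_profiles):
    sbt_goals = ["unidoc"]
    profiles_and_goals = extra_profiles + sbt_goals

    print("[info] Building Spark unidoc (w/Hive 1.2.1) using SBT with these arguments: ",
          " ".join(profiles_and_goals))

    # exec_sbt (the existing helper in dev/run-tests.py) aborts the Jenkins run
    # on a non-zero exit, so any Javadoc 8 error fails the build rather than
    # slipping in silently.
    exec_sbt(profiles_and_goals)
```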

## How was this patch tested?

Manually via `jekyll build` to check the documentation build, and also by running `./dev/run-tests`.

The extra build time was measured by manually adding `time.time()` calls as below:

```diff
     profiles_and_goals = build_profiles + sbt_goals

     print("[info] Building Spark unidoc (w/Hive 1.2.1) using SBT with these arguments: ",
           " ".join(profiles_and_goals))

+    import time
+    st = time.time()
     exec_sbt(profiles_and_goals)
+    print("Elapsed :[%s]" % str(time.time() - st))
```
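As an aside, if such timing instrumentation were kept around, the same measurement could be factored into a small context manager instead of scattering `time.time()` calls. A minimal sketch (the `timed` helper is hypothetical, not part of `dev/run-tests.py`):

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label):
    """Print the wall-clock time spent inside the wrapped block (hypothetical helper)."""
    start = time.time()
    try:
        yield
    finally:
        print("Elapsed %s:[%s]" % (label, time.time() - start))

# Usage, mirroring the diff above:
#   with timed("unidoc"):
#       exec_sbt(profiles_and_goals)
```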

Running `./dev/run-tests` with the instrumentation above applied produces:

```
...
========================================================================
Building Unidoc API Documentation
========================================================================
...
[info] Main Java API documentation successful.
...
Elapsed :[94.8746569157]
...
```

Author: hyukjinkwon <gurwls223@gmail.com>

Closes #17477 from HyukjinKwon/SPARK-18692.
2017-04-12 12:38:48 +01:00
| Name | Latest commit | Date |
|------|---------------|------|
| create-release | [SPARK-20102] Fix nightly packaging and RC packaging scripts w/ two minor build fixes | 2017-03-27 10:23:28 -07:00 |
| deps | [SPARK-19464][CORE][YARN][TEST-HADOOP2.6] Remove support for Hadoop 2.5 and earlier | 2017-02-08 12:20:07 +00:00 |
| sparktestsupport | [SPARK-19505][PYTHON] AttributeError on Exception.message in Python3 | 2017-04-11 12:18:31 -07:00 |
| tests | [SPARK-10359] Enumerate dependencies in a file and diff against it for new pull requests | 2015-12-30 12:47:42 -08:00 |
| .gitignore | [SPARK-6219] Reuse pep8.py | 2015-04-18 16:46:28 -07:00 |
| .rat-excludes | [SPARK-19517][SS] KafkaSource fails to initialize partition offsets | 2017-02-17 11:44:18 -08:00 |
| appveyor-guide.md | [SPARK-17200][PROJECT INFRA][BUILD][SPARKR] Automate building and testing on Windows (currently SparkR only) | 2016-09-08 08:26:59 -07:00 |
| appveyor-install-dependencies.ps1 | [SPARK-19550][BUILD][CORE][WIP] Remove Java 7 support | 2017-02-16 12:32:45 +00:00 |
| change-scala-version.sh | [SPARK-9250] Make change-scala-version more helpful w.r.t. valid Scala versions | 2015-07-24 17:09:33 +01:00 |
| change-version-to-2.10.sh | [SPARK-9304] [BUILD] Improve backwards compatibility of SPARK-8401 | 2015-07-25 11:05:08 +01:00 |
| change-version-to-2.11.sh | [SPARK-9304] [BUILD] Improve backwards compatibility of SPARK-8401 | 2015-07-25 11:05:08 +01:00 |
| check-license | [SPARK-13596][BUILD] Move misc top-level build files into appropriate subdirs | 2016-03-07 14:48:02 -08:00 |
| checkstyle-suppressions.xml | [MINOR] Fix Java Lint errors introduced by #13286 and #13280 | 2016-06-08 14:51:00 +01:00 |
| checkstyle.xml | [SPARK-18073][DOCS][WIP] Migrate wiki to spark.apache.org web site | 2016-11-23 11:25:47 +00:00 |
| github_jira_sync.py | [SPARK-19002][BUILD][PYTHON] Check pep8 against all Python scripts | 2017-01-02 15:23:19 +00:00 |
| lint-java | [SPARK-16967] move mesos to module | 2016-08-26 12:25:22 -07:00 |
| lint-python | [SPARK-19002][BUILD][PYTHON] Check pep8 against all Python scripts | 2017-01-02 15:23:19 +00:00 |
| lint-r | [SPARK-10328] [SPARKR] Fix generic for na.omit | 2015-08-28 00:37:50 -07:00 |
| lint-r.R | [SPARK-14074][SPARKR] Specify commit sha1 ID when using install_github to install intr package. | 2016-03-23 07:57:03 -07:00 |
| lint-scala | [SPARK-2627] [PySpark] have the build enforce PEP 8 automatically | 2014-08-06 12:58:24 -07:00 |
| make-distribution.sh | [SPARK-20123][BUILD] SPARK_HOME variable might have spaces in it(e.g. $SPARK… | 2017-04-02 15:31:13 +01:00 |
| merge_spark_pr.py | [SPARK-19002][BUILD][PYTHON] Check pep8 against all Python scripts | 2017-01-02 15:23:19 +00:00 |
| mima | [SPARK-19550][HOTFIX][BUILD] Use JAVA_HOME/bin/java if JAVA_HOME is set in dev/mima | 2017-02-16 18:43:38 +00:00 |
| pip-sanity-check.py | [SPARK-19064][PYSPARK] Fix pip installing of sub components | 2017-01-25 14:43:39 -08:00 |
| README.md | Merge pull request #565 from pwendell/dev-scripts. Closes #565. | 2014-02-08 23:13:34 -08:00 |
| requirements.txt | [SPARK-19064][PYSPARK] Fix pip installing of sub components | 2017-01-25 14:43:39 -08:00 |
| run-pip-tests | [SPARK-19955][PYSPARK] Jenkins Python Conda based test. | 2017-03-29 11:41:17 -07:00 |
| run-tests | [SPARK-5161] Parallelize Python test execution | 2015-06-29 21:32:40 -07:00 |
| run-tests-jenkins | [SPARK-19955][PYSPARK] Jenkins Python Conda based test. | 2017-03-29 11:41:17 -07:00 |
| run-tests-jenkins.py | [SPARK-19464][BUILD][HOTFIX] run-tests should use hadoop2.6 | 2017-02-08 21:28:04 +00:00 |
| run-tests.py | [SPARK-18692][BUILD][DOCS] Test Java 8 unidoc build on Jenkins | 2017-04-12 12:38:48 +01:00 |
| scalastyle | [SPARK-16967] move mesos to module | 2016-08-26 12:25:22 -07:00 |
| test-dependencies.sh | [SPARK-19550][BUILD][CORE][WIP] Remove Java 7 support | 2017-02-16 12:32:45 +00:00 |
| tox.ini | [SPARK-13596][BUILD] Move misc top-level build files into appropriate subdirs | 2016-03-07 14:48:02 -08:00 |

# Spark Developer Scripts

This directory contains scripts useful to developers when packaging, testing, or committing to Spark.

Many of these scripts require Apache credentials to work correctly.