spark-instrumented-optimizer/docs
Sean Owen 15462e1a8f [SPARK-28004][UI] Update jquery to 3.4.1
## What changes were proposed in this pull request?

We're using an old-ish jQuery, 1.12.4, and should probably update for Spark 3 to keep up in general, but also to keep up with CVEs. In fact, we know of at least one CVE resolved only in 3.4.0+ (https://nvd.nist.gov/vuln/detail/CVE-2019-11358). They may not affect Spark, but, if the update isn't painful, it's maybe worthwhile in order to make future 3.x updates easier.

jQuery 1 -> 2 doesn't sound like a breaking change, as 2.0 is supposed to maintain compatibility with 1.9+ (https://blog.jquery.com/2013/04/18/jquery-2-0-released/)

2 -> 3 has breaking changes: https://jquery.com/upgrade-guide/3.0/. It's hard to evaluate each one, but the most likely area for problems is in ajax(). However, our usage of jQuery (and plugins) is pretty simple.

Update jquery to 3.4.1; update jquery blockUI and mustache to latest

## How was this patch tested?

Manual testing of the docs build (except R docs), the worker/master UI, and the Spark application UI.
Note: this doesn't really guarantee it works, as our tests can't exercise JavaScript, and this is merely anecdotal testing, although I clicked about every link I could find. There's a risk this breaks a minor part of the UI; in the main, it does seem to work fine.

Closes #24843 from srowen/SPARK-28004.

Authored-by: Sean Owen <sean.owen@databricks.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
2019-06-14 22:19:20 -07:00
_data [SPARK-26215][SQL] Define reserved/non-reserved keywords based on the ANSI SQL standard 2019-02-23 08:38:47 +09:00
_includes [SPARK-24499][SQL][DOC] Split the page of sql-programming-guide.html to multiple separate pages 2018-10-18 11:59:06 -07:00
_layouts [SPARK-28004][UI] Update jquery to 3.4.1 2019-06-14 22:19:20 -07:00
_plugins [SPARK-25956] Make Scala 2.12 as default Scala version in Spark 3.0 2018-11-14 16:22:23 -08:00
css [MINOR][DOC] Fix some typos and grammar issues 2018-04-06 13:37:08 +08:00
img [SPARK-22648][K8S] Spark on Kubernetes - Documentation 2017-12-21 17:21:11 -08:00
js [SPARK-28004][UI] Update jquery to 3.4.1 2019-06-14 22:19:20 -07:00
_config.yml [SPARK-26266][BUILD] Update to Scala 2.12.8 2018-12-08 05:59:53 -06:00
api.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
building-spark.md [MINOR][DOCS] Tighten up some key links to the project and download pages to use HTTPS 2019-05-21 10:56:42 -07:00
cloud-integration.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
cluster-overview.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
configuration.md [SPARK-27760][CORE] Spark resources - change user resource config from .count to .amount 2019-06-06 14:16:05 -05:00
contributing-to-spark.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
graphx-programming-guide.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
hadoop-provided.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
hardware-provisioning.md [MINOR][DOCS] Fix some broken links in docs 2019-04-13 22:27:25 +09:00
index.md [SPARK-27942][DOCS][PYTHON] Note that Python 2.7 is deprecated in Spark documentation 2019-06-04 07:59:25 -07:00
job-scheduling.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
ml-advanced.md [MINOR][DOCS] Fix some broken links in docs 2019-04-13 22:27:25 +09:00
ml-ann.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
ml-classification-regression.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
ml-clustering.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
ml-collaborative-filtering.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
ml-datasource.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
ml-decision-tree.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
ml-ensembles.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
ml-features.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
ml-frequent-pattern-mining.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
ml-guide.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
ml-linear-methods.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
ml-migration-guides.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
ml-pipeline.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
ml-statistics.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
ml-survival-regression.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
ml-tuning.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
mllib-classification-regression.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
mllib-clustering.md [MINOR][DOCS] Fix some broken links in docs 2019-04-13 22:27:25 +09:00
mllib-collaborative-filtering.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
mllib-data-types.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
mllib-decision-tree.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
mllib-dimensionality-reduction.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
mllib-ensembles.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
mllib-evaluation-metrics.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
mllib-feature-extraction.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
mllib-frequent-pattern-mining.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
mllib-guide.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
mllib-isotonic-regression.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
mllib-linear-methods.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
mllib-migration-guides.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
mllib-naive-bayes.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
mllib-optimization.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
mllib-pmml-model-export.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
mllib-statistics.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
monitoring.md [SPARK-27773][FOLLOWUP][DOC] Add numCaughtExceptions metric to monitoring doc 2019-06-04 08:40:32 -07:00
programming-guide.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
quick-start.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
rdd-programming-guide.md [SPARK-27942][DOCS][PYTHON] Note that Python 2.7 is deprecated in Spark documentation 2019-06-04 07:59:25 -07:00
README.md [SPARK-27794][R][DOCS] Use https URL for CRAN repo 2019-05-22 14:28:21 -07:00
running-on-kubernetes.md [SPARK-27362][K8S] Resource Scheduling support for k8s 2019-05-31 15:26:14 -05:00
running-on-mesos.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
running-on-yarn.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
security.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
spark-standalone.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
sparkr.md [SPARK-27834][SQL][R][PYTHON] Make separate PySpark/SparkR vectorization configurations 2019-06-03 10:01:37 +09:00
sql-data-sources-avro.md [MINOR][DOC] Avro data source documentation change 2019-06-04 16:17:53 -07:00
sql-data-sources-binaryFile.md [SPARK-27627][SQL] Make option "pathGlobFilter" as a general option for all file sources 2019-05-09 08:41:43 +09:00
sql-data-sources-hive-tables.md [SPARK-27970][SQL] Support Hive 3.0 metastore 2019-06-07 15:24:07 -07:00
sql-data-sources-jdbc.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
sql-data-sources-json.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
sql-data-sources-load-save-functions.md [SPARK-27627][SQL] Make option "pathGlobFilter" as a general option for all file sources 2019-05-09 08:41:43 +09:00
sql-data-sources-orc.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
sql-data-sources-parquet.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
sql-data-sources-troubleshooting.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
sql-data-sources.md [SPARK-27472] add user guide for binary file data source 2019-04-29 08:58:56 -07:00
sql-distributed-sql-engine.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
sql-getting-started.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
sql-keywords.md [SPARK-27949][SQL] Support SUBSTRING(str FROM n1 [FOR n2]) syntax 2019-06-10 09:05:10 -07:00
sql-migration-guide-hive-compatibility.md [SPARK-27970][SQL] Support Hive 3.0 metastore 2019-06-07 15:24:07 -07:00
sql-migration-guide-upgrade.md [SPARK-21136][SQL] Disallow FROM-only statements and show better warnings for Hive-style single-from statements 2019-06-11 18:30:56 -07:00
sql-migration-guide.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
sql-performance-tuning.md [SPARK-27225][SQL] Implement join strategy hints 2019-04-12 00:14:37 +08:00
sql-programming-guide.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
sql-pyspark-pandas-with-arrow.md [SPARK-27834][SQL][R][PYTHON] Make separate PySpark/SparkR vectorization configurations 2019-06-03 10:01:37 +09:00
sql-reference.md [SPARK-27414][SQL] make it clear that date type is timezone independent 2019-04-10 16:39:28 +08:00
storage-openstack-swift.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
streaming-custom-receivers.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
streaming-kafka-0-10-integration.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
streaming-kafka-integration.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
streaming-kinesis-integration.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
streaming-programming-guide.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
structured-streaming-kafka-integration.md [SPARK-27687][SS] Rename Kafka consumer cache capacity conf and document caching 2019-05-15 10:42:09 -07:00
structured-streaming-programming-guide.md [MINOR][DOC] ForeachBatch doc fix. 2019-05-25 00:03:59 +09:00
submitting-applications.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00
tuning.md [SPARK-26918][DOCS] All .md should have ASF license header 2019-03-30 19:49:45 -05:00

license
Licensed to the Apache Software Foundation (ASF) under one or more contributor license agreements. See the NOTICE file distributed with this work for additional information regarding copyright ownership. The ASF licenses this file to You under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

Welcome to the Spark documentation!

This readme will walk you through navigating and building the Spark documentation, which is included here with the Spark source code. You can also find documentation specific to release versions of Spark at https://spark.apache.org/documentation.html.

Read on to learn more about viewing documentation in plain text (i.e., markdown) or building the documentation yourself. Why build it yourself? So that you have the docs that correspond to whichever version of Spark you currently have checked out of revision control.

Prerequisites

The Spark documentation build uses a number of tools to build HTML docs and API docs in Scala, Java, Python, R and SQL.

You need to have Ruby and Python installed. Also install the following libraries:

$ sudo gem install jekyll jekyll-redirect-from pygments.rb
$ sudo pip install Pygments
# Following is needed only for generating API docs
$ sudo pip install sphinx pypandoc mkdocs
$ sudo Rscript -e 'install.packages(c("knitr", "devtools", "rmarkdown"), repos="https://cloud.r-project.org/")'
$ sudo Rscript -e 'devtools::install_version("roxygen2", version = "5.0.1", repos="https://cloud.r-project.org/")'
$ sudo Rscript -e 'devtools::install_version("testthat", version = "1.0.2", repos="https://cloud.r-project.org/")'

Note: If you are on a system with both Ruby 1.9 and Ruby 2.0 you may need to replace gem with gem2.0.

Note: Other versions of roxygen2 might work for generating the SparkR documentation, but the RoxygenNote field in $SPARK_HOME/R/pkg/DESCRIPTION is 5.0.1, and that field gets updated if the installed version does not match.
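
If you are unsure which roxygen2 version is installed, a quick check like the following can be used (a small sketch using base R's packageVersion):

$ Rscript -e 'packageVersion("roxygen2")'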

Generating the Documentation HTML

We include the Spark documentation as part of the source (as opposed to using a hosted wiki, such as the github wiki, as the definitive documentation) to enable the documentation to evolve along with the source code and be captured by revision control (currently git). This way the code automatically includes the version of the documentation that is relevant regardless of which version or release you have checked out or downloaded.

In this directory you will find text files formatted using Markdown, with an ".md" suffix. You can read those text files directly if you want. Start with index.md.

Execute jekyll build from the docs/ directory to compile the site. Compiling the site with Jekyll will create a directory called _site containing index.html as well as the rest of the compiled files.

$ cd docs
$ jekyll build

You can modify the default Jekyll build as follows:

# Skip generating API docs (which takes a while)
$ SKIP_API=1 jekyll build

# Serve content locally on port 4000
$ jekyll serve --watch

# Build the site with extra features used on the live page
$ PRODUCTION=1 jekyll build

API Docs (Scaladoc, Javadoc, Sphinx, roxygen2, MkDocs)

You can build just the Spark scaladoc and javadoc by running build/sbt unidoc from the $SPARK_HOME directory.
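
For example (a minimal sketch, run from the repository root):

$ cd $SPARK_HOME
$ build/sbt unidoc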

Similarly, you can build just the PySpark docs by running make html from the $SPARK_HOME/python/docs directory. Documentation is only generated for classes that are listed as public in __init__.py. The SparkR docs can be built by running $SPARK_HOME/R/create-docs.sh, and the SQL docs can be built by running $SPARK_HOME/sql/create-docs.sh after building Spark first.
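
A rough sketch of the individual commands, assuming $SPARK_HOME points at the source root and Spark has already been built where required:

# PySpark docs (Sphinx)
$ cd $SPARK_HOME/python/docs
$ make html

# SparkR docs (roxygen2)
$ $SPARK_HOME/R/create-docs.sh

# SQL docs (MkDocs); requires Spark to be built first
$ $SPARK_HOME/sql/create-docs.sh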

When you run jekyll build in the docs directory, it will also copy over the scaladoc and javadoc for the various Spark subprojects into the docs directory (and then also into the _site directory). We use a Jekyll plugin to run build/sbt unidoc before building the site, so if you haven't run it (recently) this step may take some time, as it generates all of the scaladoc and javadoc using Unidoc. The Jekyll plugin also generates the PySpark docs using Sphinx, the SparkR docs using roxygen2, and the SQL docs using MkDocs.

NOTE: To skip the step of building and copying over the Scala, Java, Python, R and SQL API docs, run SKIP_API=1 jekyll build. In addition, SKIP_SCALADOC=1, SKIP_PYTHONDOC=1, SKIP_RDOC=1 and SKIP_SQLDOC=1 can be used to skip the corresponding individual step; SKIP_SCALADOC skips both the Scala and Java docs.
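
For example, to generate only the Scala/Java API docs and skip the Python, R and SQL doc steps, the flags above can be combined (a sketch):

$ SKIP_PYTHONDOC=1 SKIP_RDOC=1 SKIP_SQLDOC=1 jekyll build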