spark-instrumented-optimizer/docs/_config.yml
DB Tsai ad853c5678
[SPARK-25956] Make Scala 2.12 as default Scala version in Spark 3.0
## What changes were proposed in this pull request?

This PR makes Scala 2.12 the default Scala version in Spark, with Scala 2.11 remaining available as the alternative. This implies that Scala 2.12 will be used by our CI builds, including pull request builds.

We'll update Jenkins to include a new compile-only job for Scala 2.11 to ensure the code can still be compiled with Scala 2.11.
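
For reference, a minimal sketch of how a Scala 2.11 compile could be exercised locally, assuming the existing `dev/change-scala-version.sh` helper and a `scala-2.11` Maven profile (the exact profile name alongside this change is an assumption):

```
# Point the poms at the alternative Scala version (assumed helper script).
./dev/change-scala-version.sh 2.11
# Compile only, skipping tests, with the assumed scala-2.11 profile enabled.
./build/mvn -Pscala-2.11 -DskipTests compile
```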

## How was this patch tested?

Existing tests.

Closes #22967 from dbtsai/scala2.12.

Authored-by: DB Tsai <d_tsai@apple.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
2018-11-14 16:22:23 -08:00

highlighter: pygments
markdown: kramdown
gems:
  - jekyll-redirect-from

# For some reason kramdown seems to behave differently on different
# OS/packages wrt encoding. So we hard code this config.
kramdown:
  entity_output: numeric

include:
  - _static
  - _modules

# These allow the documentation to be updated with newer releases
# of Spark, Scala, and Mesos.
SPARK_VERSION: 3.0.0-SNAPSHOT
SPARK_VERSION_SHORT: 3.0.0
SCALA_BINARY_VERSION: "2.12"
SCALA_VERSION: "2.12.7"
MESOS_VERSION: 1.0.0
SPARK_ISSUE_TRACKER_URL: https://issues.apache.org/jira/browse/SPARK
SPARK_GITHUB_URL: https://github.com/apache/spark
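# Note: Jekyll exposes the keys above to the docs as Liquid variables, so a
# markdown page can reference e.g. {{site.SCALA_BINARY_VERSION}}, which
# renders as "2.12". (Illustrative usage; the docs pages that consume these
# variables are not shown here.)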