spark-instrumented-optimizer/docs/_plugins/copy_api_dirs.rb
Davies Liu 7af3818c6b [SPARK-6806] [SPARKR] [DOCS] Fill in SparkR examples in programming guide
sqlCtx -> sqlContext

You can check the docs by running:

```
$ cd docs
$ SKIP_SCALADOC=1 jekyll serve
```
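
The SKIP_SCALADOC flag works because the plugin below gates each documentation build on an environment variable. A minimal sketch of that gating, with the flag names taken from the script itself and `puts` placeholders standing in for the real build steps:

```
# Sketch only: SKIP_API=1 skips all API doc generation, SKIP_SCALADOC=1 skips
# just the slow Scala/Java unidoc build.
if not (ENV['SKIP_API'] == '1')
  if not (ENV['SKIP_SCALADOC'] == '1')
    puts "would build Scala/Java unidoc here"
  end
  puts "would build Python (Sphinx) and R (roxygen) docs here"
end
```
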
cc shivaram

Author: Davies Liu <davies@databricks.com>

Closes #5442 from davies/r_docs and squashes the following commits:

7a12ec6 [Davies Liu] remove rdd in R docs
8496b26 [Davies Liu] remove the docs related to RDD
e23b9d6 [Davies Liu] delete R docs for RDD API
222e4ff [Davies Liu] Merge branch 'master' into r_docs
89684ce [Davies Liu] Merge branch 'r_docs' of github.com:davies/spark into r_docs
f0a10e1 [Davies Liu] address comments from @shivaram
f61de71 [Davies Liu] Update pairRDD.R
3ef7cf3 [Davies Liu] use + instead of function(a,b) a+b
2f10a77 [Davies Liu] address comments from @cafreeman
9c2a062 [Davies Liu] mention R api together with Python API
23f751a [Davies Liu] Fill in SparkR examples in programming guide
2015-05-23 00:01:40 -07:00


#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
require 'fileutils'
include FileUtils
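
# SKIP_API=1 skips this plugin entirely; SKIP_SCALADOC=1 skips only the slow
# Scala/Java unidoc build (e.g. `SKIP_SCALADOC=1 jekyll serve`).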
if not (ENV['SKIP_API'] == '1')
  if not (ENV['SKIP_SCALADOC'] == '1')
    # Build Scaladoc for Java/Scala
    puts "Moving to project root and building API docs."
    curr_dir = pwd
    cd("..")

    puts "Running 'build/sbt -Pkinesis-asl compile unidoc' from " + pwd + "; this may take a few minutes..."
    puts `build/sbt -Pkinesis-asl compile unidoc`

    puts "Moving back into docs dir."
    cd("docs")

    # Copy over the unified ScalaDoc for all projects to api/scala.
    # This directory will be copied over to _site when `jekyll` command is run.
    source = "../target/scala-2.10/unidoc"
    dest = "api/scala"

    puts "Making directory " + dest
    mkdir_p dest
    # From the rubydoc: cp_r('src', 'dest') makes dest/src; copying 'src/.' puts the contents of src directly into dest.
puts "cp -r " + source + "/. " + dest
cp_r(source + "/.", dest)
# Append custom JavaScript
js = File.readlines("./js/api-docs.js")
js_file = dest + "/lib/template.js"
File.open(js_file, 'a') { |f| f.write("\n" + js.join()) }
# Append custom CSS
css = File.readlines("./css/api-docs.css")
css_file = dest + "/lib/template.css"
File.open(css_file, 'a') { |f| f.write("\n" + css.join()) }
# Copy over the unified JavaDoc for all projects to api/java.
source = "../target/javaunidoc"
dest = "api/java"
puts "Making directory " + dest
mkdir_p dest
puts "cp -r " + source + "/. " + dest
cp_r(source + "/.", dest)
end

  # Build Sphinx docs for Python
  puts "Moving to python/docs directory and building sphinx."
  cd("../python/docs")
  puts `make html`

  puts "Moving back into home dir."
  cd("../../")

  puts "Making directory api/python"
  mkdir_p "docs/api/python"

  puts "cp -r python/docs/_build/html/. docs/api/python"
  cp_r("python/docs/_build/html/.", "docs/api/python")

  # Build SparkR API docs
  puts "Moving to R directory and building roxygen docs."
  cd("R")
  puts `./create-docs.sh`

  puts "Moving back into home dir."
  cd("../")

  puts "Making directory api/R"
  mkdir_p "docs/api/R"

  puts "cp -r R/pkg/html/. docs/api/R"
  cp_r("R/pkg/html/.", "docs/api/R")
end
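
The same mkdir_p / cp_r("src/.", dest) sequence appears four times above (Scala, Java, Python, R). A hypothetical helper that captures the pattern, shown only as a sketch and not part of copy_api_dirs.rb:

```
require 'fileutils'
include FileUtils

# Hypothetical helper (not in the plugin): copy the *contents* of a generated
# doc tree into its api/ destination, mirroring the inline pattern above.
def copy_doc_tree(source, dest)
  puts "Making directory " + dest
  mkdir_p dest
  # 'source/.' copies the contents of source into dest rather than dest/source.
  puts "cp -r " + source + "/. " + dest
  cp_r(source + "/.", dest)
end

# Example: copy_doc_tree("../target/javaunidoc", "api/java")
```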