c636b87dc2
sqlCtx -> sqlContext
You can preview the docs locally by running:
```
$ cd docs
$ SKIP_SCALADOC=1 jekyll serve
```
cc shivaram
Author: Davies Liu <davies@databricks.com>
Closes #5442 from davies/r_docs and squashes the following commits:
7a12ec6 [Davies Liu] remove rdd in R docs
8496b26 [Davies Liu] remove the docs related to RDD
e23b9d6 [Davies Liu] delete R docs for RDD API
222e4ff [Davies Liu] Merge branch 'master' into r_docs
89684ce [Davies Liu] Merge branch 'r_docs' of github.com:davies/spark into r_docs
f0a10e1 [Davies Liu] address comments from @shivaram
f61de71 [Davies Liu] Update pairRDD.R
3ef7cf3 [Davies Liu] use + instead of function(a,b) a+b
2f10a77 [Davies Liu] address comments from @cafreeman
9c2a062 [Davies Liu] mention R api together with Python API
23f751a [Davies Liu] Fill in SparkR examples in programming guide
(cherry picked from commit 7af3818c6b)
Signed-off-by: Shivaram Venkataraman <shivaram@cs.berkeley.edu>
34 lines
1.3 KiB
R
```r
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

.First <- function() {
  home <- Sys.getenv("SPARK_HOME")
  .libPaths(c(file.path(home, "R", "lib"), .libPaths()))
  Sys.setenv(NOAWT = 1)

  # Make sure SparkR package is the last loaded one
  old <- getOption("defaultPackages")
  options(defaultPackages = c(old, "SparkR"))

  sc <- SparkR::sparkR.init(Sys.getenv("MASTER", unset = ""))
  assign("sc", sc, envir = .GlobalEnv)
  sqlContext <- SparkR::sparkRSQL.init(sc)
  assign("sqlContext", sqlContext, envir = .GlobalEnv)
  cat("\n Welcome to SparkR!")
  cat("\n Spark context is available as sc, SQL context is available as sqlContext\n")
}
```
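Because R evaluates `.First()` at startup, the shell profile above pre-binds `sc` and `sqlContext` in the global environment before the user sees a prompt. A minimal sketch of a session that relies on those bindings, assuming a running Spark installation and the SparkR shell launched with this profile (the `faithful` data set is just an illustrative choice):

```r
# sc and sqlContext are already bound by .First(); no init calls needed.
# Build a Spark DataFrame from a local R data.frame and query it.
df <- SparkR::createDataFrame(sqlContext, faithful)
head(SparkR::select(df, "eruptions"))
```

Note that the renamed `sqlContext` (formerly `sqlCtx`, per the commit title) is the handle these SQL-side calls go through.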