Updated Spark Streaming Programming Guide
Here is the updated version of the Spark Streaming Programming Guide. This is still a work in progress, but the major changes are in place, so feedback is most welcome.
In general, I have tried to make the guide easier to understand, even if the reader does not know much about Spark. The updated website is hosted here -
http://www.eecs.berkeley.edu/~tdas/spark_docs/streaming-programming-guide.html
The major changes are:
- Overview illustrates the use cases of Spark Streaming - various input sources and various output sources
- An example right after the overview to quickly give an idea of what a Spark Streaming program looks like
- Made the Java API and examples first-class citizens like Scala by using tabs to show both Scala and Java examples (similar to the AMPCamp tutorial's code tabs)
- Highlighted the DStream operations updateStateByKey and transform because of their powerful nature
- Updated driver node failure recovery text to highlight automatic recovery in Spark standalone mode
- Added information about linking to and using external input sources like Kafka and Flume
- In general, reorganized the sections to better separate the basic sections from the more advanced ones like Tuning and Recovery.
Todos:
- Add links to the docs of external sources like Kafka, Flume, etc.
- Illustrate window operations with a figure as well as an example.
Author: Tathagata Das <tathagata.das1565@gmail.com>
== Merge branch commits ==
commit 18ff10556570b39d672beeb0a32075215cfcc944
Author: Tathagata Das <tathagata.das1565@gmail.com>
Date: Tue Jan 28 21:49:30 2014 -0800
Fixed a lot of broken links.
commit 34a5a6008dac2e107624c7ff0db0824ee5bae45f
Author: Tathagata Das <tathagata.das1565@gmail.com>
Date: Tue Jan 28 18:02:28 2014 -0800
Updated github url to use SPARK_GITHUB_URL variable.
commit f338a60ae8069e0a382d2cb170227e5757cc0b7a
Author: Tathagata Das <tathagata.das1565@gmail.com>
Date: Mon Jan 27 22:42:42 2014 -0800
More updates based on Patrick and Harvey's comments.
commit 89a81ff25726bf6d26163e0dd938290a79582c0f
Author: Tathagata Das <tathagata.das1565@gmail.com>
Date: Mon Jan 27 13:08:34 2014 -0800
Updated docs based on Patrick's PR comments.
commit d5b6196b532b5746e019b959a79ea0cc013a8fc3
Author: Tathagata Das <tathagata.das1565@gmail.com>
Date: Sun Jan 26 20:15:58 2014 -0800
Added spark.streaming.unpersist config and info on StreamingListener interface.
commit e3dcb46ab83d7071f611d9b5008ba6bc16c9f951
Author: Tathagata Das <tathagata.das1565@gmail.com>
Date: Sun Jan 26 18:41:12 2014 -0800
Fixed docs on StreamingContext.getOrCreate.
commit 6c29524639463f11eec721e4d17a9d7159f2944b
Author: Tathagata Das <tathagata.das1565@gmail.com>
Date: Thu Jan 23 18:49:39 2014 -0800
Added example and figure for window operations, and links to Kafka and Flume API docs.
commit f06b964a51bb3b21cde2ff8bdea7d9785f6ce3a9
Author: Tathagata Das <tathagata.das1565@gmail.com>
Date: Wed Jan 22 22:49:12 2014 -0800
Fixed missing endhighlight tag in the MLlib guide.
commit 036a7d46187ea3f2a0fb8349ef78f10d6c0b43a9
Merge: eab351d a1cd185
Author: Tathagata Das <tathagata.das1565@gmail.com>
Date: Wed Jan 22 22:17:42 2014 -0800
Merge remote-tracking branch 'apache/master' into docs-update
commit eab351d05c0baef1d4b549e1581310087158d78d
Author: Tathagata Das <tathagata.das1565@gmail.com>
Date: Wed Jan 22 22:17:15 2014 -0800
Update Spark Streaming Programming Guide.
68 lines
2.2 KiB
Ruby
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

require 'fileutils'
include FileUtils

if not (ENV['SKIP_API'] == '1' or ENV['SKIP_SCALADOC'] == '1')
  # Build Scaladoc for Java/Scala
  core_projects = ["core", "examples", "repl", "bagel", "graphx", "streaming", "mllib"]
  external_projects = ["flume", "kafka", "mqtt", "twitter", "zeromq"]

  projects = core_projects + external_projects.map { |project_name| "external/" + project_name }

  puts "Moving to project root and building scaladoc."
  curr_dir = pwd
  cd("..")

  puts "Running sbt/sbt doc from " + pwd + "; this may take a few minutes..."
  puts `sbt/sbt doc`

  puts "Moving back into docs dir."
  cd("docs")

  # Copy over the scaladoc from each project into the docs directory.
  # This directory will be copied over to _site when `jekyll` command is run.
  projects.each do |project_name|
    source = "../" + project_name + "/target/scala-2.10/api"
    dest = "api/" + project_name

    puts "making directory " + dest
    mkdir_p dest

    # From the rubydoc: cp_r('src', 'dest') makes src/dest, but this doesn't.
    puts "cp -r " + source + "/. " + dest
    cp_r(source + "/.", dest)
  end

  # Build Epydoc for Python
  puts "Moving to python directory and building epydoc."
  cd("../python")
  puts `epydoc --config epydoc.conf`

  puts "Moving back into docs dir."
  cd("../docs")

  puts "making directory api/pyspark"
  mkdir_p "api/pyspark"

  puts "cp -r ../python/docs/. api/pyspark"
  cp_r("../python/docs/.", "api/pyspark")

  cd("..")
end
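The script's comment about cp_r is worth a closer look: `cp_r("src", "dest")` would create `dest/src`, while appending `/.` to the source copies the *contents* of `src` directly into `dest`, which is what the flat `api/<project>` layout needs. A minimal, self-contained sketch of that idiom (the paths here are hypothetical stand-ins, not the script's real ones):

```ruby
require 'fileutils'
require 'tmpdir'

Dir.mktmpdir do |tmp|
  # Hypothetical stand-ins for e.g. ../core/target/scala-2.10/api and api/core.
  src  = File.join(tmp, "target_api")
  dest = File.join(tmp, "api", "core")

  FileUtils.mkdir_p(src)
  File.write(File.join(src, "index.html"), "<html></html>")

  FileUtils.mkdir_p(dest)
  # Trailing "/." makes cp_r copy src's contents into dest,
  # rather than nesting a target_api/ directory inside it.
  FileUtils.cp_r(File.join(src, "."), dest)

  puts File.exist?(File.join(dest, "index.html"))
end
```

Without the trailing `/.`, the generated scaladoc would land at `api/core/api/index.html` instead of `api/core/index.html`.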