
---
layout: global
title: Accessing OpenStack Swift from Spark
---

Spark's support for Hadoop InputFormat allows it to process data in OpenStack Swift using the same URI formats as in Hadoop. You can specify a path in Swift as input through a URI of the form `swift://container.PROVIDER/path`. You will also need to set your Swift security credentials, through `core-site.xml` or via `SparkContext.hadoopConfiguration`. The current Swift driver requires Swift to use the Keystone authentication method.
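For illustration, the URI scheme above can be sketched as follows; the container, provider, and path names are hypothetical:

```python
# Illustrative sketch of the swift:// URI scheme; container, provider,
# and object path below are hypothetical values.
container = "my-container"   # Swift container name (hypothetical)
provider = "SparkTest"       # PROVIDER name chosen in core-site.xml (hypothetical)
path = "data/logs.txt"       # object path inside the container (hypothetical)

uri = f"swift://{container}.{provider}/{path}"
print(uri)  # swift://my-container.SparkTest/data/logs.txt

# With a live SparkContext `sc` (pyspark), the path could then be read as input:
# rdd = sc.textFile(uri)
```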

# Configuring Swift for Better Data Locality

Although not mandatory, it is recommended to configure the proxy server of Swift with `list_endpoints` to have better data locality. More information is available here.

# Dependencies

The Spark application should include the `hadoop-openstack` dependency. For example, for Maven support, add the following to the `pom.xml` file:

{% highlight xml %}
...
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-openstack</artifactId>
    <version>2.3.0</version>
</dependency>
...
{% endhighlight %}

# Configuration Parameters

Create `core-site.xml` and place it inside Spark's `conf` directory. There are two main categories of parameters that need to be configured: the declaration of the Swift driver and the parameters required by Keystone.

Configuring Hadoop to use the Swift file system is achieved via:

| Property Name | Value |
|---------------|-------|
| `fs.swift.impl` | `org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystem` |

Additional parameters are required by Keystone (v2.0) and should be provided to the Swift driver. These parameters are used to perform authentication in Keystone in order to access Swift. The following table contains the list of Keystone parameters; `PROVIDER` can be any name.

| Property Name | Meaning | Required |
|---------------|---------|----------|
| `fs.swift.service.PROVIDER.auth.url` | Keystone Authentication URL | Mandatory |
| `fs.swift.service.PROVIDER.auth.endpoint.prefix` | Keystone endpoints prefix | Optional |
| `fs.swift.service.PROVIDER.tenant` | Tenant | Mandatory |
| `fs.swift.service.PROVIDER.username` | Username | Mandatory |
| `fs.swift.service.PROVIDER.password` | Password | Mandatory |
| `fs.swift.service.PROVIDER.http.port` | HTTP port | Mandatory |
| `fs.swift.service.PROVIDER.region` | Keystone region | Mandatory |
| `fs.swift.service.PROVIDER.public` | Indicates whether all URLs are public | Mandatory |
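The naming pattern in the table can be sketched with a small helper (this function is illustrative and not part of Hadoop or Spark):

```python
# Illustrative helper that builds the full Swift/Keystone property names
# for a given PROVIDER, mirroring the table above.
def keystone_properties(provider):
    keys = ["auth.url", "auth.endpoint.prefix", "tenant", "username",
            "password", "http.port", "region", "public"]
    return [f"fs.swift.service.{provider}.{k}" for k in keys]

print(keystone_properties("SparkTest")[0])  # fs.swift.service.SparkTest.auth.url
```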

For example, assume `PROVIDER=SparkTest` and Keystone contains user `tester` with password `testing` defined for tenant `test`. Then `core-site.xml` should include:

{% highlight xml %}
<configuration>
  <property>
    <name>fs.swift.impl</name>
    <value>org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystem</value>
  </property>
  <property>
    <name>fs.swift.service.SparkTest.auth.url</name>
    <value>http://127.0.0.1:5000/v2.0/tokens</value>
  </property>
  <property>
    <name>fs.swift.service.SparkTest.auth.endpoint.prefix</name>
    <value>endpoints</value>
  </property>
  <property>
    <name>fs.swift.service.SparkTest.http.port</name>
    <value>8080</value>
  </property>
  <property>
    <name>fs.swift.service.SparkTest.region</name>
    <value>RegionOne</value>
  </property>
  <property>
    <name>fs.swift.service.SparkTest.public</name>
    <value>true</value>
  </property>
  <property>
    <name>fs.swift.service.SparkTest.tenant</name>
    <value>test</value>
  </property>
  <property>
    <name>fs.swift.service.SparkTest.username</name>
    <value>tester</value>
  </property>
  <property>
    <name>fs.swift.service.SparkTest.password</name>
    <value>testing</value>
  </property>
</configuration>
{% endhighlight %}
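A file in this format can also be generated programmatically. The following is a minimal sketch (the helper is hypothetical, not part of Hadoop) that renders a dict of properties in the `core-site.xml` layout:

```python
import xml.etree.ElementTree as ET

def core_site_xml(properties):
    """Render a dict of Hadoop properties in core-site.xml format.

    Illustrative helper only; not part of Hadoop or Spark."""
    conf = ET.Element("configuration")
    for name, value in properties.items():
        prop = ET.SubElement(conf, "property")
        ET.SubElement(prop, "name").text = name
        ET.SubElement(prop, "value").text = str(value)
    return ET.tostring(conf, encoding="unicode")

snippet = core_site_xml({
    "fs.swift.impl": "org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystem",
    "fs.swift.service.SparkTest.auth.url": "http://127.0.0.1:5000/v2.0/tokens",
})
```

The returned string can then be written to `core-site.xml` in Spark's `conf` directory.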

Notice that `fs.swift.service.PROVIDER.tenant`, `fs.swift.service.PROVIDER.username`, and `fs.swift.service.PROVIDER.password` contain sensitive information, so keeping them in `core-site.xml` is not always a good approach. We suggest keeping those parameters in `core-site.xml` only for testing purposes, e.g. when running Spark via `spark-shell`. For job submissions they should be provided via `sparkContext.hadoopConfiguration`.
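One way to keep credentials out of `core-site.xml` is to read them from the environment at submission time. A minimal sketch, assuming hypothetical environment variable names (the pyspark application of the values is shown in comments, via the internal `_jsc` accessor):

```python
import os

# Hypothetical environment variable names; pull secrets from the environment
# instead of committing them to core-site.xml. The fallback defaults reuse
# the example credentials from the text above.
credentials = {
    "fs.swift.service.SparkTest.username": os.environ.get("SWIFT_USERNAME", "tester"),
    "fs.swift.service.SparkTest.password": os.environ.get("SWIFT_PASSWORD", "testing"),
    "fs.swift.service.SparkTest.tenant":   os.environ.get("SWIFT_TENANT", "test"),
}

# With a live SparkContext `sc` (pyspark), these can be applied at runtime
# through the underlying Hadoop configuration:
# for name, value in credentials.items():
#     sc._jsc.hadoopConfiguration().set(name, value)
```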