[SPARK-8282] [SPARKR] Make number of threads used in RBackend configurable

Read number of threads for RBackend from configuration.

[SPARK-8282] #comment Linking with JIRA

Author: Hossein <hossein@databricks.com>

Closes #6730 from falaki/SPARK-8282 and squashes the following commits:

33b3d98 [Hossein] Documented new config parameter
70f2a9c [Hossein] Fixing import
ec44225 [Hossein] Read number of threads for RBackend from configuration
Hossein authored 2015-06-10 13:18:48 -07:00, committed by Andrew Or
parent 38112905bc
commit 30ebf1a233
2 changed files with 15 additions and 2 deletions

core/src/main/scala/org/apache/spark/api/r/RBackend.scala

@@ -29,7 +29,7 @@ import io.netty.channel.socket.nio.NioServerSocketChannel
 import io.netty.handler.codec.LengthFieldBasedFrameDecoder
 import io.netty.handler.codec.bytes.{ByteArrayDecoder, ByteArrayEncoder}
 
-import org.apache.spark.Logging
+import org.apache.spark.{Logging, SparkConf}
 
 /**
  * Netty-based backend server that is used to communicate between R and Java.
@@ -41,7 +41,8 @@ private[spark] class RBackend {
   private[this] var bossGroup: EventLoopGroup = null
 
   def init(): Int = {
-    bossGroup = new NioEventLoopGroup(2)
+    val conf = new SparkConf()
+    bossGroup = new NioEventLoopGroup(conf.getInt("spark.r.numRBackendThreads", 2))
     val workerGroup = bossGroup
     val handler = new RBackendHandler(this)

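Since `RBackend.init()` builds its own `SparkConf`, the new setting reaches it through JVM system properties, which is how values from `spark-defaults.conf` or `--conf` are propagated to the backend JVM. Below is a minimal sketch of that lookup, assuming a Spark build on the classpath; the object name and the 8-thread value are purely illustrative and not part of the commit:

```scala
import org.apache.spark.SparkConf

object RBackendThreadsSketch {
  def main(args: Array[String]): Unit = {
    // `new SparkConf()` picks up every JVM system property whose name starts
    // with "spark.", so a value supplied via --conf or spark-defaults.conf is
    // visible here without being passed in explicitly.
    System.setProperty("spark.r.numRBackendThreads", "8") // illustrative value
    val conf = new SparkConf()

    // Mirrors the lookup added to RBackend.init(): fall back to the previous
    // hard-coded pool size of 2 when the property is not set.
    println(conf.getInt("spark.r.numRBackendThreads", 2)) // prints 8
  }
}
```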
docs/configuration.md

@@ -1495,6 +1495,18 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
+
+#### SparkR
+<table class="table">
+<tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
+<tr>
+  <td><code>spark.r.numRBackendThreads</code></td>
+  <td>2</td>
+  <td>
+    Number of threads used by RBackend to handle RPC calls from SparkR package.
+  </td>
+</tr>
+</table>
 
 #### Cluster Managers
 Each cluster manager in Spark has additional configuration options. Configurations
 can be found on the pages for each mode:
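The thread count documented above is ultimately the size of the Netty event loop group that serves SparkR's RPC calls. A minimal sketch of that wiring follows; it is not the actual RBackend code, and the server bootstrap details are omitted:

```scala
import io.netty.channel.nio.NioEventLoopGroup
import org.apache.spark.SparkConf

// NioEventLoopGroup(n) creates n I/O threads; per the diff above, RBackend
// uses the same group as both boss and worker group, so this single number
// sizes the whole backend thread pool.
val conf = new SparkConf()
val threads = conf.getInt("spark.r.numRBackendThreads", 2)
val group = new NioEventLoopGroup(threads)

// ... a ServerBootstrap would register the server channel on `group` here ...

group.shutdownGracefully()
```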