[SPARK-14881] [PYTHON] [SPARKR] pyspark and sparkR shell default log level should match spark-shell/Scala

## What changes were proposed in this pull request?

Change the default log level to WARN for the pyspark and sparkR shells, matching the spark-shell default, for a much cleaner interactive environment.
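For context, `initializeLogIfNecessary(true)` comes from Spark's `Logging` trait; the `true` flag marks the caller as an interpreter/shell process, which makes Spark lower the root log level to WARN. A simplified sketch of that behavior (an assumption-laden approximation — the real trait has more bookkeeping and respects a user-provided log4j configuration):

```scala
import org.apache.log4j.{Level, LogManager}

// Simplified sketch of the WARN-by-default shell behavior in Spark's Logging
// trait. Assumption: the real implementation also checks whether the user
// supplied their own log4j.properties before overriding anything.
trait Logging {
  protected def initializeLogIfNecessary(isInterpreter: Boolean): Unit = {
    if (isInterpreter) {
      // In a shell/REPL, quiet the noisy INFO startup output down to WARN.
      val rootLogger = LogManager.getRootLogger
      if (rootLogger.getEffectiveLevel == Level.INFO) {
        rootLogger.setLevel(Level.WARN)
      }
    }
  }
}
```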

## How was this patch tested?

Manually running the pyspark and sparkR shells.

Author: felixcheung <felixcheung_m@hotmail.com>

Closes #12648 from felixcheung/pylogging.
Authored by felixcheung on 2016-04-24 22:51:18 -07:00; committed by Davies Liu
parent 6ab4d9e0c7
commit c752b6c5ec
2 changed files with 5 additions and 0 deletions

core/src/main/scala/org/apache/spark/api/python/PythonGatewayServer.scala

@@ -32,6 +32,8 @@ import org.apache.spark.util.Utils
  * This process is launched (via SparkSubmit) by the PySpark driver (see java_gateway.py).
  */
 private[spark] object PythonGatewayServer extends Logging {
+  initializeLogIfNecessary(true)
+
   def main(args: Array[String]): Unit = Utils.tryOrExit {
     // Start a GatewayServer on an ephemeral port
     val gatewayServer: GatewayServer = new GatewayServer(null, 0)
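Note that the new call sits in the object body rather than inside `main`, so it runs during object initialization, before any of the gateway setup. A tiny standalone example of that ordering (hypothetical names):

```scala
// Statements in a Scala object body run once, when the object is first
// accessed -- here, before main's own body executes.
object InitOrderDemo {
  println("object body runs first") // analogous to initializeLogIfNecessary(true)

  def main(args: Array[String]): Unit = {
    println("main runs second")
  }
}
```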

core/src/main/scala/org/apache/spark/api/r/RBackend.scala

@@ -94,6 +94,8 @@ private[spark] class RBackend {
 }
 
 private[spark] object RBackend extends Logging {
+  initializeLogIfNecessary(true)
+
   def main(args: Array[String]): Unit = {
     if (args.length < 1) {
       // scalastyle:off println
@@ -101,6 +103,7 @@ private[spark] object RBackend extends Logging {
       // scalastyle:on println
       System.exit(-1)
     }
+
     val sparkRBackend = new RBackend()
     try {
       // bind to random port
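With these two calls in place, both shells start at WARN, matching spark-shell. The default remains easy to override per session from the shell's SparkContext (shown here in Scala; pyspark exposes the same method on its SparkContext):

```scala
// Restore verbose output for the current session only.
sc.setLogLevel("INFO") // accepts ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN
```

A persistent override still belongs in conf/log4j.properties, which takes precedence over this shell default.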