From 05c6b8acdc24bc6f982a63dfb5cca21cc9993312 Mon Sep 17 00:00:00 2001
From: Kousuke Saruta
Date: Tue, 29 Jun 2021 21:25:31 +0000
Subject: [PATCH] [SPARK-35921][BUILD] ${spark.yarn.isHadoopProvided} in config.properties is not edited if built with SBT

### What changes were proposed in this pull request?

This PR changes `SparkBuild.scala` so that building with SBT edits `config.properties` in the `yarn` sub-module, just as building with Maven does.

### Why are the changes needed?

The `yarn` sub-module contains a `config.properties` file:
```
spark.yarn.isHadoopProvided = ${spark.yarn.isHadoopProvided}
```
The `${spark.yarn.isHadoopProvided}` part is replaced with `true` or `false` at build time, depending on whether Hadoop is provided (specified by `-Phadoop-provided`). The edited `config.properties` is loaded at runtime to control how the Hadoop-related classpath is populated. This substitution works when building with Maven but not with SBT. If we build with SBT and deploy apps on YARN, the following warning appears and the classpath is not populated correctly:
```
21/06/29 10:51:20 WARN config.package: Can not load the default value of `spark.yarn.isHadoopProvided` from `org/apache/spark/deploy/yarn/config.properties` with error, java.lang.IllegalArgumentException: For input string: "${spark.yarn.isHadoopProvided}". Using `false` as a default value.
```

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Built with SBT, extracted `config.properties` from the build artifact, and confirmed that `${spark.yarn.isHadoopProvided}` was correctly replaced with `true` or `false`:
```
cat org/apache/spark/deploy/yarn/config.properties
spark.yarn.isHadoopProvided = false # In case build with -Pyarn and without -Phadoop-provided

spark.yarn.isHadoopProvided = true # In case build with -Pyarn and -Phadoop-provided
```
I also confirmed that the warning message shown above no longer appears.

Closes #33121 from sarutak/sbt-yarn-config-properties.

Authored-by: Kousuke Saruta
Signed-off-by: DB Tsai
---
 project/SparkBuild.scala | 21 ++++++++++++++++++++-
 1 file changed, 20 insertions(+), 1 deletion(-)

diff --git a/project/SparkBuild.scala b/project/SparkBuild.scala
index de3871833b..b1d66686c9 100644
--- a/project/SparkBuild.scala
+++ b/project/SparkBuild.scala
@@ -802,11 +802,30 @@ object Hive {
 }
 
 object YARN {
+  val genConfigProperties = TaskKey[Unit]("gen-config-properties",
+    "Generate config.properties which contains a setting whether Hadoop is provided or not")
+  val propFileName = "config.properties"
+  val hadoopProvidedProp = "spark.yarn.isHadoopProvided"
+
   lazy val settings = Seq(
     excludeDependencies --= Seq(
       ExclusionRule(organization = "com.sun.jersey"),
       ExclusionRule("javax.servlet", "javax.servlet-api"),
-      ExclusionRule("javax.ws.rs", "jsr311-api"))
+      ExclusionRule("javax.ws.rs", "jsr311-api")),
+    Compile / unmanagedResources :=
+      (Compile / unmanagedResources).value.filter(!_.getName.endsWith(s"$propFileName")),
+    genConfigProperties := {
+      val file = (Compile / classDirectory).value / s"org/apache/spark/deploy/yarn/$propFileName"
+      val isHadoopProvided = SbtPomKeys.effectivePom.value.getProperties.get(hadoopProvidedProp)
+      IO.write(file, s"$hadoopProvidedProp = $isHadoopProvided")
+    },
+    Compile / copyResources := (Def.taskDyn {
+      val c = (Compile / copyResources).value
+      Def.task {
+        (Compile / genConfigProperties).value
+        c
+      }
+    }).value
   )
 }
 
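
For context on the warning quoted in the description: the raw placeholder fails because the value read from `config.properties` is parsed as a Boolean, and `"${spark.yarn.isHadoopProvided}"` is not a valid Boolean literal. Below is a minimal, self-contained sketch of that failure mode; the object and method names are illustrative assumptions, not Spark's actual loader in `config.package`.

```scala
import java.util.Properties

// Illustrative sketch (not Spark's real code): read the generated
// config.properties from the classpath and parse the flag as a Boolean.
object HadoopProvidedCheck {
  // Resource path used by the generated file, per the patch above.
  private val resource = "org/apache/spark/deploy/yarn/config.properties"

  def isHadoopProvided(): Boolean = {
    val in = Thread.currentThread().getContextClassLoader.getResourceAsStream(resource)
    if (in == null) {
      false // resource missing: fall back to the default
    } else {
      try {
        val props = new Properties()
        props.load(in)
        // "true"/"false" parse fine; an unsubstituted "${spark.yarn.isHadoopProvided}"
        // throws IllegalArgumentException ("For input string: ..."), which is the
        // failure the WARN message in the description reports.
        props.getProperty("spark.yarn.isHadoopProvided", "false").trim.toBoolean
      } catch {
        case _: IllegalArgumentException => false
      } finally {
        in.close()
      }
    }
  }

  def main(args: Array[String]): Unit = {
    println(s"spark.yarn.isHadoopProvided = ${isHadoopProvided()}")
  }
}
```

On the SBT side, the patch first drops the raw template from `Compile / unmanagedResources` so the unfiltered `config.properties` never reaches the output, then wraps `Compile / copyResources` in `Def.taskDyn` so that `genConfigProperties` writes the substituted file into the class directory after the regular resources have been copied.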