f9efdeea8c
### What changes were proposed in this pull request?

Move `spark.yarn.isHadoopProvided` to the Spark parent pom, so that under `resource-managers/yarn` we can make `hadoop-3.2` the default profile.

### Why are the changes needed?

Currently under `resource-managers/yarn` there are three Maven profiles: `hadoop-provided`, `hadoop-2.7`, and `hadoop-3.2`, of which `hadoop-3.2` is activated by default (via `activeByDefault`). The activation, however, does not work when another profile is explicitly activated. Specifically, if users build Spark with `hadoop-provided`, Maven fails because it cannot find the Hadoop 3.2 related dependencies, which are defined in the `hadoop-3.2` profile section. To fix the issue, this PR moves the `hadoop-provided` section to the parent pom. Currently it is only used to define the property `spark.yarn.isHadoopProvided`, so it should not matter where it is defined.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Tested by running:

```
build/mvn clean package -DskipTests -B -Pmesos -Pyarn -Pkubernetes -Pscala-2.12 -Phadoop-provided
```

which failed before this PR but succeeds with it. Also checked active profiles with:

```
build/mvn -Pyarn -Phadoop-provided help:active-profiles
```

which shows that `hadoop-3.2` is now active for the `spark-yarn` module.

Closes #34110 from sunchao/SPARK-36835-followup2.

Authored-by: Chao Sun <sunchao@apple.com>
Signed-off-by: Gengliang Wang <gengliang@apache.org>
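For context, the Maven behavior behind this issue can be sketched with a minimal, hypothetical pom fragment (not the actual Spark pom): a profile marked `activeByDefault` is deactivated as soon as any other profile in the *same* pom is activated on the command line, e.g. via `-P`.

```xml
<!-- Hypothetical sketch of the pre-fix resource-managers/yarn layout. -->
<profiles>
  <!-- Active by default... -->
  <profile>
    <id>hadoop-3.2</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <dependencies>
      <!-- Hadoop 3.2 dependencies the module needs to compile -->
    </dependencies>
  </profile>
  <!-- ...but building with -Phadoop-provided deactivates hadoop-3.2,
       because activeByDefault only applies when no profile declared in
       this pom is explicitly activated. The build then fails to resolve
       the Hadoop 3.2 dependencies above. -->
  <profile>
    <id>hadoop-provided</id>
    <properties>
      <spark.yarn.isHadoopProvided>true</spark.yarn.isHadoopProvided>
    </properties>
  </profile>
</profiles>
```

Moving the `hadoop-provided` profile to the parent pom sidesteps this rule: once it is no longer declared alongside `hadoop-3.2`, activating it does not suppress the default activation in the YARN module.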