[SPARK-36547][BUILD] Downgrade scala-maven-plugin to 4.3.0
### What changes were proposed in this pull request?

When preparing Spark 3.2.0 RC1, I hit the same issue as https://github.com/apache/spark/pull/31031:

```
[INFO] Compiling 21 Scala sources and 3 Java sources to /opt/spark-rm/output/spark-3.1.0-bin-hadoop2.7/resource-managers/yarn/target/scala-2.12/test-classes ...
[ERROR] ## Exception when compiling 24 sources to /opt/spark-rm/output/spark-3.1.0-bin-hadoop2.7/resource-managers/yarn/target/scala-2.12/test-classes
java.lang.SecurityException: class "javax.servlet.SessionCookieConfig"'s signer information does not match signer information of other classes in the same package
java.lang.ClassLoader.checkCerts(ClassLoader.java:891)
java.lang.ClassLoader.preDefineClass(ClassLoader.java:661)
```

This PR applies the same fix again by downgrading scala-maven-plugin to 4.3.0.

### Why are the changes needed?

To unblock the release process.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Build test.

Closes #33791 from gengliangwang/downgrade.

Authored-by: Gengliang Wang <gengliang@apache.org>
Signed-off-by: Gengliang Wang <gengliang@apache.org>
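For context on the error above: the JVM raises this `SecurityException` in `ClassLoader.checkCerts` when a class being defined has signer certificates that differ from those already recorded for other classes in the same package, which happens when a signed jar and an unsigned (or differently signed) jar both contribute classes to one package on the compile classpath. A minimal sketch (not part of this PR; the class name is illustrative) showing how signer information can be inspected via `Class#getSigners`:

```java
// Minimal sketch: inspect the signer information that ClassLoader.checkCerts
// compares when classes from different jars land in the same package.
public class SignerInfo {
    public static void main(String[] args) {
        Class<?> cls = String.class;          // any loaded class works here
        Object[] signers = cls.getSigners();  // null when the class is unsigned
        System.out.println(signers == null
                ? cls.getName() + " is unsigned"
                : cls.getName() + " has " + signers.length + " signer(s)");
        // Bootstrap classes like java.lang.String report no signers, so this
        // prints: java.lang.String is unsigned
    }
}
```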
This commit is contained in: parent a0b24019ed, commit f0775d215e
pom.xml (3 changed lines):

```diff
@@ -2574,7 +2574,8 @@
       <plugin>
         <groupId>net.alchim31.maven</groupId>
         <artifactId>scala-maven-plugin</artifactId>
-        <version>4.5.3</version>
+        <!-- SPARK-36547: Please don't upgrade the version below, otherwise there will be an error on building Hadoop 2.7 package -->
+        <version>4.3.0</version>
         <executions>
           <execution>
             <id>eclipse-add-source</id>
```