spark-instrumented-optimizer/appveyor.yml
HyukjinKwon 92cabf6306 [SPARK-28759][BUILD] Upgrade scala-maven-plugin to 4.2.0 and fix build profile on AppVeyor
### What changes were proposed in this pull request?

This PR proposes to upgrade scala-maven-plugin from 3.4.4 to 4.2.0.

The earlier upgrade to 4.1.1 was reverted due to an unexpected build failure on AppVeyor.

The root cause seems to be an issue specific to AppVeyor: loading the system library 'kernel32.dll' fails.

```
Suppressed: java.lang.NoClassDefFoundError: Could not initialize class com.sun.jna.platform.win32.Kernel32
        at sbt.internal.io.WinMilli$.getHandle(Milli.scala:264)
        at sbt.internal.io.WinMilli$.getModifiedTimeNative(Milli.scala:289)
        at sbt.internal.io.WinMilli$.getModifiedTimeNative(Milli.scala:260)
        at sbt.internal.io.MilliNative.getModifiedTime(Milli.scala:61)
        at sbt.internal.io.Milli$.getModifiedTime(Milli.scala:360)
        at sbt.io.IO$.$anonfun$getModifiedTimeOrZero$1(IO.scala:1373)
        at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
        at sbt.internal.io.Retry$.liftedTree2$1(Retry.scala:38)
        at sbt.internal.io.Retry$.impl$1(Retry.scala:38)
        at sbt.internal.io.Retry$.apply(Retry.scala:52)
        at sbt.internal.io.Retry$.apply(Retry.scala:24)
        at sbt.io.IO$.getModifiedTimeOrZero(IO.scala:1373)
        at sbt.internal.inc.caching.ClasspathCache$.fromCacheOrHash$1(ClasspathCache.scala:44)
        at sbt.internal.inc.caching.ClasspathCache$.$anonfun$hashClasspath$1(ClasspathCache.scala:53)
        at scala.collection.parallel.mutable.ParArray$Map.leaf(ParArray.scala:659)
        at scala.collection.parallel.Task.$anonfun$tryLeaf$1(Tasks.scala:53)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at scala.util.control.Breaks$$anon$1.catchBreak(Breaks.scala:67)
        at scala.collection.parallel.Task.tryLeaf(Tasks.scala:56)
        at scala.collection.parallel.Task.tryLeaf$(Tasks.scala:50)
        at scala.collection.parallel.mutable.ParArray$Map.tryLeaf(ParArray.scala:650)
        at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask.internal(Tasks.scala:170)
        ... 25 more
```

Setting `-Djna.nosys=true` makes JNA load the native library bundled in its jar instead of the system one.

With this flag, the build works fine.
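For reference, the flag is appended directly to the Maven invocation in the `build_script` step of the YAML below; the command AppVeyor runs looks like this:

```
rem Build command from appveyor.yml's build_script step (see below).
rem -Djna.nosys=true tells JNA, which is pulled in by the sbt/zinc internals shown
rem in the stack trace above, to extract and load the native library bundled in
rem its jar instead of a system-installed copy, avoiding the Kernel32 failure.
mvn -DskipTests -Psparkr -Phive -Djna.nosys=true package
```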

### Why are the changes needed?

It picks up bug fixes in the plugin and restores the CI build on AppVeyor.

### Does this PR introduce any user-facing change?

No.

### How was this patch tested?

It was tested at https://github.com/apache/spark/pull/25497

Closes #25633 from HyukjinKwon/SPARK-28759.

Authored-by: HyukjinKwon <gurwls223@apache.org>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
2019-08-30 09:39:15 -07:00

# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
version: "{build}-{branch}"
shallow_clone: true
platform: x64
configuration: Debug
branches:
  only:
    - master
only_commits:
  files:
    - appveyor.yml
    - dev/appveyor-install-dependencies.ps1
    - R/
    - sql/core/src/main/scala/org/apache/spark/sql/api/r/
    - core/src/main/scala/org/apache/spark/api/r/
    - mllib/src/main/scala/org/apache/spark/ml/r/
    - core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala
    - bin/*.cmd
cache:
  - C:\Users\appveyor\.m2
install:
  # Install maven and dependencies
  - ps: .\dev\appveyor-install-dependencies.ps1
  # Required packages for R unit tests
  - cmd: R -e "install.packages(c('knitr', 'rmarkdown', 'e1071', 'survival'), repos='https://cloud.r-project.org/')"
  # Here, we use a fixed version of testthat. For more details, please see SPARK-22817.
  # As of devtools 2.1.0, it requires testthat higher than 2.1.1 as a dependency, but the SparkR tests require testthat 1.0.2.
  # Therefore, we don't use devtools but install testthat directly from the archive, including its dependencies.
  - cmd: R -e "install.packages(c('crayon', 'praise', 'R6'), repos='https://cloud.r-project.org/')"
  - cmd: R -e "install.packages('https://cloud.r-project.org/src/contrib/Archive/testthat/testthat_1.0.2.tar.gz', repos=NULL, type='source')"
  - cmd: R -e "packageVersion('knitr'); packageVersion('rmarkdown'); packageVersion('testthat'); packageVersion('e1071'); packageVersion('survival')"
build_script:
  # '-Djna.nosys=true' is required to avoid the kernel32.dll load failure.
  # See SPARK-28759.
  - cmd: mvn -DskipTests -Psparkr -Phive -Djna.nosys=true package
environment:
  NOT_CRAN: true
  # See SPARK-27848. Currently, installing some dependent packages causes
  # "(converted from warning) unable to identify current timezone 'C':" for an unknown reason.
  # This environment variable works around it so that SparkR can still be tested against a higher version.
  R_REMOTES_NO_ERRORS_FROM_WARNINGS: true
test_script:
  - cmd: .\bin\spark-submit2.cmd --driver-java-options "-Dlog4j.configuration=file:///%CD:\=/%/R/log4j.properties" --conf spark.hadoop.fs.defaultFS="file:///" R\pkg\tests\run-all.R
notifications:
  - provider: Email
    on_build_success: false
    on_build_failure: false
    on_build_status_changed: false