spark-instrumented-optimizer/appveyor.yml
hyukjinkwon 3210121fed [MINOR][BUILD] Remove -Phive-thriftserver profile within appveyor.yml
## What changes were proposed in this pull request?

This PR proposes to remove the `-Phive-thriftserver` profile, which does not appear to affect the SparkR tests in AppVeyor.

Originally I wanted to check whether this gives a meaningful decrease in build time; it does shave some time off, but not meaningfully.
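For reference, the change comes down to dropping the profile from the Maven command in `build_script`. The diff below is reconstructed from the commit title and the resulting file, so the exact pre-change profile list is an assumption:

```diff
 build_script:
-  - cmd: mvn -DskipTests -Psparkr -Phive -Phive-thriftserver package
+  - cmd: mvn -DskipTests -Psparkr -Phive package
```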

## How was this patch tested?

AppVeyor tests:

```
[00:40:49] Attaching package: 'SparkR'
[00:40:49]
[00:40:49] The following objects are masked from 'package:testthat':
[00:40:49]
[00:40:49]     describe, not
[00:40:49]
[00:40:49] The following objects are masked from 'package:stats':
[00:40:49]
[00:40:49]     cov, filter, lag, na.omit, predict, sd, var, window
[00:40:49]
[00:40:49] The following objects are masked from 'package:base':
[00:40:49]
[00:40:49]     as.data.frame, colnames, colnames<-, drop, endsWith, intersect,
[00:40:49]     rank, rbind, sample, startsWith, subset, summary, transform, union
[00:40:49]
[00:40:49] Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[00:41:43] basic tests for CRAN: .............
[00:41:43]
[00:41:43] DONE ===========================================================================
[00:41:43] binary functions: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[00:42:05] ...........
[00:42:05] functions on binary files: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[00:42:10] ....
[00:42:10] broadcast variables: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[00:42:12] ..
[00:42:12] functions in client.R: .....
[00:42:30] test functions in sparkR.R: ..............................................
[00:42:30] include R packages: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[00:42:31]
[00:42:31] JVM API: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[00:42:31] ..
[00:42:31] MLlib classification algorithms, except for tree-based algorithms: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[00:48:48] ......................................................................
[00:48:48] MLlib clustering algorithms: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[00:50:12] .....................................................................
[00:50:12] MLlib frequent pattern mining: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[00:50:18] .....
[00:50:18] MLlib recommendation algorithms: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[00:50:27] ........
[00:50:27] MLlib regression algorithms, except for tree-based algorithms: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[00:56:00] ................................................................................................................................
[00:56:00] MLlib statistics algorithms: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[00:56:04] ........
[00:56:04] MLlib tree-based algorithms: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[00:58:20] ..............................................................................................
[00:58:20] parallelize() and collect(): Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[00:58:20] .............................
[00:58:20] basic RDD functions: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[01:03:35] ............................................................................................................................................................................................................................................................................................................................................................................................................................................
[01:03:35] SerDe functionality: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[01:03:39] ...............................
[01:03:39] partitionBy, groupByKey, reduceByKey etc.: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[01:04:20] ....................
[01:04:20] functions in sparkR.R: ....
[01:04:20] SparkSQL functions: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[01:04:50] ........................................................................................................................................-chgrp: 'APPVYR-WIN\None' does not match expected pattern for group
[01:04:50] Usage: hadoop fs [generic options] -chgrp [-R] GROUP PATH...
[01:04:50] -chgrp: 'APPVYR-WIN\None' does not match expected pattern for group
[01:04:50] Usage: hadoop fs [generic options] -chgrp [-R] GROUP PATH...
[01:04:51] -chgrp: 'APPVYR-WIN\None' does not match expected pattern for group
[01:04:51] Usage: hadoop fs [generic options] -chgrp [-R] GROUP PATH...
[01:06:13] ............................................................................................................................................................................................................................................................................................................................................................-chgrp: 'APPVYR-WIN\None' does not match expected pattern for group
[01:06:13] Usage: hadoop fs [generic options] -chgrp [-R] GROUP PATH...
[01:06:14] .-chgrp: 'APPVYR-WIN\None' does not match expected pattern for group
[01:06:14] Usage: hadoop fs [generic options] -chgrp [-R] GROUP PATH...
[01:06:14] ....-chgrp: 'APPVYR-WIN\None' does not match expected pattern for group
[01:06:14] Usage: hadoop fs [generic options] -chgrp [-R] GROUP PATH...
[01:12:30] ...................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
[01:12:30] Structured Streaming: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[01:14:27] ..........................................
[01:14:27] tests RDD function take(): Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[01:14:28] ................
[01:14:28] the textFile() function: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[01:14:44] .............
[01:14:44] functions in utils.R: Spark package found in SPARK_HOME: C:\projects\spark\bin\..
[01:14:46] ............................................
[01:14:46] Windows-specific tests: .
[01:14:46]
[01:14:46] DONE ===========================================================================
[01:15:29] Build success
```

Author: hyukjinkwon <gurwls223@apache.org>

Closes #21894 from HyukjinKwon/wip-build.
2018-07-30 10:01:18 +08:00

# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
version: "{build}-{branch}"

shallow_clone: true

platform: x64
configuration: Debug

branches:
  only:
    - master

only_commits:
  files:
    - appveyor.yml
    - dev/appveyor-install-dependencies.ps1
    - R/
    - sql/core/src/main/scala/org/apache/spark/sql/api/r/
    - core/src/main/scala/org/apache/spark/api/r/
    - mllib/src/main/scala/org/apache/spark/ml/r/
    - core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala
    - bin/*.cmd

cache:
  - C:\Users\appveyor\.m2
install:
  # Install Maven and dependencies
  - ps: .\dev\appveyor-install-dependencies.ps1
  # Required packages for the R unit tests
  - cmd: R -e "install.packages(c('knitr', 'rmarkdown', 'devtools', 'e1071', 'survival'), repos='http://cran.us.r-project.org')"
  # Here, we use a fixed version of testthat. For more details, please see SPARK-22817.
  - cmd: R -e "devtools::install_version('testthat', version = '1.0.2', repos='http://cran.us.r-project.org')"
  - cmd: R -e "packageVersion('knitr'); packageVersion('rmarkdown'); packageVersion('testthat'); packageVersion('e1071'); packageVersion('survival')"
build_script:
  - cmd: mvn -DskipTests -Psparkr -Phive package

environment:
  NOT_CRAN: true
test_script:
  # %CD:\=/% expands the current directory with backslashes replaced by forward
  # slashes, so it can be embedded in the file:/// URL passed to log4j.
  - cmd: .\bin\spark-submit2.cmd --driver-java-options "-Dlog4j.configuration=file:///%CD:\=/%/R/log4j.properties" --conf spark.hadoop.fs.defaultFS="file:///" R\pkg\tests\run-all.R
notifications:
  - provider: Email
    on_build_success: false
    on_build_failure: false
    on_build_status_changed: false
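
As a side note on `NOT_CRAN: true` above: SparkR's tests run under testthat, and `testthat::skip_on_cran()` skips a test unless the `NOT_CRAN` environment variable is set to `"true"`. A minimal sketch of that pattern (the test itself is illustrative, not taken from SparkR):

```r
library(testthat)

test_that("full test runs only when NOT_CRAN is set", {
  # skip_on_cran() skips unless Sys.getenv("NOT_CRAN") == "true",
  # which the environment: block in appveyor.yml provides.
  skip_on_cran()
  expect_equal(1 + 1, 2)
})
```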