spark-instrumented-optimizer/repl/pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<!--
~ Licensed to the Apache Software Foundation (ASF) under one or more
~ contributor license agreements. See the NOTICE file distributed with
~ this work for additional information regarding copyright ownership.
~ The ASF licenses this file to You under the Apache License, Version 2.0
~ (the "License"); you may not use this file except in compliance with
~ the License. You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing, software
~ distributed under the License is distributed on an "AS IS" BASIS,
~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
~ See the License for the specific language governing permissions and
~ limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-parent</artifactId>
<version>1.1.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<groupId>org.apache.spark</groupId>
<artifactId>spark-repl_2.10</artifactId>
<packaging>jar</packaging>
<name>Spark Project REPL</name>
<url>http://spark.apache.org/</url>
<properties>
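    <!-- Read by Spark's SBT build (SPARK-1776), which derives its module
         names and dependencies from these Maven POMs. -->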
<sbt.project.name>repl</sbt.project.name>
<deb.install.path>/usr/share/spark</deb.install.path>
<deb.user>root</deb.user>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.binary.version}</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-bagel_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<scope>runtime</scope>
</dependency>
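    <!-- Test scope only: the REPL suite exercises imports such as
         SQLContext.createSchemaRDD (SPARK-2632). -->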
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-server</artifactId>
</dependency>
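    <!-- The REPL embeds the Scala compiler and reflection API, so both are
         pinned to ${scala.version} rather than the binary version. -->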
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-compiler</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-reflect</artifactId>
<version>${scala.version}</version>
</dependency>
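    <!-- Scala's jline fork is published under org.scala-lang and versioned
         in step with the compiler, hence ${scala.version} here as well. -->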
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>jline</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_${scala.binary.version}</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.scalacheck</groupId>
<artifactId>scalacheck_${scala.binary.version}</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
<testOutputDirectory>target/scala-${scala.binary.version}/test-classes</testOutputDirectory>
<plugins>
<plugin>
<groupId>org.scalatest</groupId>
<artifactId>scalatest-maven-plugin</artifactId>
<configuration>
<environmentVariables>
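          <!-- Point SPARK_HOME at the repository root so tests that need a
               full Spark checkout can locate it. -->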
<SPARK_HOME>${basedir}/..</SPARK_HOME>
</environmentVariables>
</configuration>
</plugin>
</plugins>
</build>
</project>