spark-instrumented-optimizer/sql/hive
jeanlyn f0e4040202 [SPARK-8379] [SQL] avoid speculative tasks write to the same file
Issue link: [SPARK-8379](https://issues.apache.org/jira/browse/SPARK-8379)
Currently, when we insert data into a dynamic partition with speculative tasks enabled, we get the following exception:
```
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException):
Lease mismatch on /tmp/hive-jeanlyn/hive_2015-06-15_15-20-44_734_8801220787219172413-1/-ext-10000/ds=2015-06-15/type=2/part-00301.lzo
owned by DFSClient_attempt_201506031520_0011_m_000189_0_-1513487243_53
but is accessed by DFSClient_attempt_201506031520_0011_m_000042_0_-1275047721_57
```
This PR writes the data to a temporary directory when using dynamic partitioning, so that speculative tasks do not write to the same file.
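
Below is a minimal Scala sketch of the idea behind the fix (not the exact patch): each task attempt resolves its output file via `FileOutputFormat.getTaskOutputPath`, which places it under the attempt's own temporary directory, so two speculative attempts of the same task never open the same final HDFS file. The helper name `attemptScopedOutputPath` is illustrative only.

```scala
import org.apache.hadoop.fs.Path
import org.apache.hadoop.mapred.{FileOutputFormat, JobConf}

// Illustrative helper (not from the patch): resolve a file name to a path
// under the current task attempt's temporary output directory, e.g.
// <output>/_temporary/_attempt_.../part-00301.lzo, instead of the final
// partition directory. Because the path is scoped to the attempt, a
// speculative duplicate writes to a different file and cannot trigger an
// HDFS LeaseExpiredException; the output committer later moves the
// successful attempt's files into place.
def attemptScopedOutputPath(jobConf: JobConf, fileName: String): Path =
  FileOutputFormat.getTaskOutputPath(jobConf, fileName)
```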

Author: jeanlyn <jeanlyn92@gmail.com>

Closes #6833 from jeanlyn/speculation and squashes the following commits:

64bbfab [jeanlyn] use FileOutputFormat.getTaskOutputPath to get the path
8860af0 [jeanlyn] remove the never using code
e19a3bd [jeanlyn] avoid speculative tasks write same file

(cherry picked from commit a1e3649c87)
Signed-off-by: Cheng Lian <lian@databricks.com>
2015-06-21 00:13:55 -07:00
| Path | Last commit | Date |
|------|-------------|------|
| compatibility/src/test/scala/org/apache/spark/sql/hive/execution | [SQL] [TEST] udf_java_method failed due to jdk version | 2015-05-21 12:32:10 -07:00 |
| src | [SPARK-8379] [SQL] avoid speculative tasks write to the same file | 2015-06-21 00:13:55 -07:00 |
| v0.13.1/src/main/scala/org/apache/spark/sql/hive | [SPARK-6505] [SQL] Remove the reflection call in HiveFunctionWrapper | 2015-04-27 14:08:05 +08:00 |
| pom.xml | [SPARK-7558] Demarcate tests in unit-tests.log (1.4) | 2015-06-03 20:46:44 -07:00 |