spark-instrumented-optimizer/python
Davies Liu 55349f9fe8 [SPARK-1740] [PySpark] kill the python worker
Kill only the Python workers associated with cancelled tasks.

The daemon starts a background thread that monitors the open sockets of all workers. If a socket is closed by the JVM, this thread kills the corresponding worker.

When a task is cancelled, the socket to its worker is closed, and the worker is then killed by the daemon.
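As a rough illustration of the mechanism (a minimal sketch, not the actual daemon.py code from this commit), the monitoring thread could look like the code below. The names worker_sockets and monitor_worker_sockets are hypothetical, and MSG_PEEK is used so the liveness probe does not consume task data on the socket.

    import os
    import select
    import signal
    import socket
    import threading
    import time

    # Hypothetical bookkeeping: socket -> worker pid, filled in by the
    # daemon whenever it forks a new worker.
    worker_sockets = {}

    def monitor_worker_sockets():
        """Kill a worker once the JVM closes its end of the socket."""
        while True:
            socks = list(worker_sockets)
            if not socks:
                time.sleep(0.1)
                continue
            # select() marks a socket readable when the peer has closed it;
            # a MSG_PEEK recv() then returns b"" (EOF) without consuming data.
            readable, _, _ = select.select(socks, [], [], 1.0)
            for sock in readable:
                try:
                    data = sock.recv(1, socket.MSG_PEEK)
                except OSError:
                    data = b""
                if not data:  # JVM closed the socket: the task was cancelled
                    pid = worker_sockets.pop(sock, None)
                    if pid is not None:
                        os.kill(pid, signal.SIGKILL)
                    sock.close()

    # Run the monitor as a daemon thread so it dies with the daemon process.
    threading.Thread(target=monitor_worker_sockets, daemon=True).start()

Killing the worker from within the Python daemon this way avoids having the JVM spawn a separate kill process, which the squashed commits note runtime.exec() makes too heavy.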

Author: Davies Liu <davies.liu@gmail.com>

Closes #1643 from davies/kill and squashes the following commits:

8ffe9f3 [Davies Liu] kill worker by daemon, because runtime.exec() is too heavy
46ca150 [Davies Liu] address comment
acd751c [Davies Liu] kill the worker when task is canceled
2014-08-03 15:52:00 -07:00
lib [SPARK-2305] [PySpark] Update Py4J to version 0.8.2.1 2014-07-29 19:02:06 -07:00
pyspark [SPARK-1740] [PySpark] kill the python worker 2014-08-03 15:52:00 -07:00
test_support License headers 2013-12-09 16:41:01 -08:00
.gitignore SPARK-1004. PySpark on YARN 2014-04-29 23:24:34 -07:00
epydoc.conf [SPARK-2538] [PySpark] Hash based disk spilling aggregation 2014-07-24 22:53:47 -07:00
run-tests [SPARK-2478] [mllib] DecisionTree Python API 2014-08-02 13:07:17 -07:00