### What changes were proposed in this pull request?
This PR proposes to disallow creating a `SparkContext` in executors, e.g., in UDFs.
### Why are the changes needed?
Currently executors can create a `SparkContext`, but they should not be able to, since a `SparkContext` should only exist on the driver. For example, the following currently succeeds on the executor side:
```scala
sc.range(0, 1).foreach { _ =>
  new SparkContext(new SparkConf().setAppName("test").setMaster("local"))
}
```
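The guard behind this change can be sketched outside Spark as a plain Scala approximation: Spark's `TaskContext` is non-null only on a thread running an executor task, so a driver-only assertion just checks for its absence. The names `TaskContextStub` and `assertOnDriver` below are illustrative stand-ins, not the exact identifiers used in the patch.

```scala
// Hypothetical sketch of the driver-only guard pattern.
// TaskContextStub mimics Spark's TaskContext: it is set only on
// threads that are executing a task (i.e., on an executor).
object TaskContextStub {
  private val ctx = new ThreadLocal[AnyRef]()
  def get(): AnyRef = ctx.get()
  def setInsideTask(): Unit = ctx.set(new AnyRef)
}

// The constructor-time check: refuse to proceed inside a task.
def assertOnDriver(): Unit = {
  if (TaskContextStub.get() != null) {
    throw new IllegalStateException(
      "SparkContext should only be created and accessed on the driver.")
  }
}

// On the "driver" thread no task context is set, so this passes.
assertOnDriver()

// Simulate being inside an executor task: the check now throws.
TaskContextStub.setInsideTask()
val failed =
  try { assertOnDriver(); false }
  catch { case _: IllegalStateException => true }
println(failed) // true
```

With a check like this invoked from the `SparkContext` constructor, the `foreach` example above fails fast with a clear error instead of silently spinning up a second context inside a task.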
### Does this PR introduce _any_ user-facing change?
Yes, users won't be able to create `SparkContext` in executors.
### How was this patch tested?
Added tests.
Closes #28986 from ueshin/issues/SPARK-32160/disallow_spark_context_in_executors.
Authored-by: Takuya UESHIN <ueshin@databricks.com>
Signed-off-by: HyukjinKwon <gurwls223@apache.org>