---
layout: global
title: "Migration Guide: Spark Core"
displayTitle: "Migration Guide: Spark Core"
license: |
  Licensed to the Apache Software Foundation (ASF) under one or more
  contributor license agreements.  See the NOTICE file distributed with
  this work for additional information regarding copyright ownership.
  The ASF licenses this file to You under the Apache License, Version 2.0
  (the "License"); you may not use this file except in compliance with
  the License.  You may obtain a copy of the License at

     http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License.
---
* Table of contents
{:toc}
## Upgrading from Core 2.4 to 3.0
- The `org.apache.spark.ExecutorPlugin` interface and related configuration have been replaced with `org.apache.spark.api.plugin.SparkPlugin`, which adds new functionality. Plugins using the old interface must be modified to extend the new interfaces. Check the [Monitoring](monitoring.html) guide for more details.
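As a rough illustration of the shape of the migration, the sketch below implements the new `SparkPlugin` interface introduced by SPARK-29397 (the class name `MyAppPlugin` is hypothetical, and the snippet assumes Spark 3.0 on the classpath):

```java
import java.util.Map;

import org.apache.spark.api.plugin.DriverPlugin;
import org.apache.spark.api.plugin.ExecutorPlugin;
import org.apache.spark.api.plugin.PluginContext;
import org.apache.spark.api.plugin.SparkPlugin;

public class MyAppPlugin implements SparkPlugin {
  @Override
  public DriverPlugin driverPlugin() {
    // Returning null means this plugin has no driver-side component.
    return null;
  }

  @Override
  public ExecutorPlugin executorPlugin() {
    return new ExecutorPlugin() {
      @Override
      public void init(PluginContext ctx, Map<String, String> extraConf) {
        // Initialization formerly done in the removed
        // org.apache.spark.ExecutorPlugin.init() goes here.
      }

      @Override
      public void shutdown() {
        // Cleanup formerly done in org.apache.spark.ExecutorPlugin.shutdown().
      }
    };
  }
}
```

Such a plugin is registered with the `spark.plugins` configuration (e.g. `spark.plugins=com.example.MyAppPlugin`) instead of the removed `spark.executor.plugins` setting.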
- The deprecated method `TaskContext.isRunningLocally` has been removed. Local execution was removed and it always returned `false`.
- The deprecated methods `shuffleBytesWritten`, `shuffleWriteTime` and `shuffleRecordsWritten` in `ShuffleWriteMetrics` have been removed. Use `bytesWritten`, `writeTime` and `recordsWritten` instead, respectively.
- The deprecated method `AccumulableInfo.apply` has been removed because creating `AccumulableInfo` instances is disallowed.
- Event log files will be written in UTF-8 encoding, and the Spark History Server will replay event log files as UTF-8. Previously, Spark wrote event log files in the default charset of the driver JVM process, so a Spark 2.x History Server is needed to read old event log files written in an incompatible encoding.
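The behavioral difference can be illustrated with a small self-contained round trip (this is plain `java.nio` code, not Spark's event log writer): text is encoded and decoded with an explicit UTF-8 charset, independent of the JVM's `file.encoding` default.

```java
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class Utf8RoundTrip {
  public static void main(String[] args) throws Exception {
    Path log = Files.createTempFile("event-log", ".txt");
    String line = "appName=r\u00e9sum\u00e9";  // non-ASCII content

    // Spark 3.0 behavior: bytes are always UTF-8, so the content
    // round-trips regardless of the driver JVM's default charset.
    Files.write(log, line.getBytes(StandardCharsets.UTF_8));
    String readBack = new String(Files.readAllBytes(log), StandardCharsets.UTF_8);

    if (!readBack.equals(line)) {
      throw new AssertionError("UTF-8 round trip failed");
    }
    Files.delete(log);
  }
}
```

A Spark 2.x writer on a JVM whose default charset was not UTF-8 (for example, a platform default of a legacy encoding) would have produced bytes that this UTF-8 read path cannot decode correctly, which is why old files must be replayed by a 2.x History Server.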