spark-instrumented-optimizer/python/pyspark/resource/profile.pyi
Fokko Driesprong e4d1c10760 [SPARK-32320][PYSPARK] Remove mutable default arguments
This is bad practice and can lead to unexpected behaviour:
https://florimond.dev/blog/articles/2018/08/python-mutable-defaults-are-the-source-of-all-evil/
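
A minimal sketch of the pitfall (the `append_to` function below is hypothetical, not from the Spark code base): the default list is evaluated once, when the `def` statement runs, so every call that omits `acc` mutates the same object.

```python
def append_to(item, acc=[]):  # the default list is created once, at definition time
    acc.append(item)
    return acc

print(append_to(1))  # [1]
print(append_to(2))  # [1, 2] -- the same list object is reused across calls
```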

```
fokkodriesprongFan spark % grep -R "={}" python | grep def

python/pyspark/resource/profile.py:    def __init__(self, _java_resource_profile=None, _exec_req={}, _task_req={}):
python/pyspark/sql/functions.py:def from_json(col, schema, options={}):
python/pyspark/sql/functions.py:def to_json(col, options={}):
python/pyspark/sql/functions.py:def schema_of_json(json, options={}):
python/pyspark/sql/functions.py:def schema_of_csv(csv, options={}):
python/pyspark/sql/functions.py:def to_csv(col, options={}):
python/pyspark/sql/functions.py:def from_csv(col, schema, options={}):
python/pyspark/sql/avro/functions.py:def from_avro(data, jsonFormatSchema, options={}):
```

```
fokkodriesprongFan spark % grep -R "=\[\]" python | grep def
python/pyspark/ml/tuning.py:    def __init__(self, bestModel, avgMetrics=[], subModels=None):
python/pyspark/ml/tuning.py:    def __init__(self, bestModel, validationMetrics=[], subModels=None):
```

### What changes were proposed in this pull request?

Removing the mutable default arguments.

### Why are the changes needed?

Mutable default arguments are evaluated once, at function definition time, so the same object is shared across calls and state can leak between invocations. The signatures are changed to `Optional[...]`, defaulting to `None`, with the dict or list created inside the function body.
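
The fix pattern can be sketched as follows (the `parse` helper is hypothetical; the real signatures live in the files listed above): the parameter defaults to `None` and a fresh dict is created per call.

```python
from typing import Dict, Optional

def parse(text: str, options: Optional[Dict[str, str]] = None) -> Dict[str, str]:
    # A fresh dict is created on every call, so no state is shared between calls.
    if options is None:
        options = {}
    options.setdefault("mode", "PERMISSIVE")
    return options

print(parse("x"))  # {'mode': 'PERMISSIVE'}
print(parse("y"))  # {'mode': 'PERMISSIVE'} -- not accumulated from the previous call
```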

### Does this PR introduce _any_ user-facing change?

No 👍

### How was this patch tested?

Using the Flake8 bugbear code analysis plugin.
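
flake8-bugbear reports mutable argument defaults as rule B006. A toy version of that check (an illustrative sketch, not bugbear's actual implementation) can be written with the `ast` module:

```python
import ast
from typing import List

CODE = "def from_json(col, schema, options={}): ..."

def mutable_defaults(source: str) -> List[str]:
    # Walk the AST and report functions whose defaults are mutable literals.
    found = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            for default in node.args.defaults + node.args.kw_defaults:
                if isinstance(default, (ast.List, ast.Dict, ast.Set)):
                    found.append(node.name)
    return found

print(mutable_defaults(CODE))  # ['from_json']
```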

Closes #29122 from Fokko/SPARK-32320.

Authored-by: Fokko Driesprong <fokko@apache.org>
Signed-off-by: Ruifeng Zheng <ruifengz@foxmail.com>
2020-12-08 09:35:36 +08:00


#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from pyspark.resource.requests import (  # noqa: F401
    ExecutorResourceRequest as ExecutorResourceRequest,
    ExecutorResourceRequests as ExecutorResourceRequests,
    TaskResourceRequest as TaskResourceRequest,
    TaskResourceRequests as TaskResourceRequests,
)
from typing import overload, Dict, Union, Optional

from py4j.java_gateway import JavaObject  # type: ignore[import]

class ResourceProfile:
    @overload
    def __init__(
        self,
        _java_resource_profile: JavaObject,
    ) -> None: ...
    @overload
    def __init__(
        self,
        _java_resource_profile: None = ...,
        _exec_req: Optional[Dict[str, ExecutorResourceRequest]] = ...,
        _task_req: Optional[Dict[str, TaskResourceRequest]] = ...,
    ) -> None: ...
    @property
    def id(self) -> int: ...
    @property
    def taskResources(self) -> Dict[str, TaskResourceRequest]: ...
    @property
    def executorResources(self) -> Dict[str, ExecutorResourceRequest]: ...

class ResourceProfileBuilder:
    def __init__(self) -> None: ...
    def require(
        self, resourceRequest: Union[ExecutorResourceRequest, TaskResourceRequests]
    ) -> ResourceProfileBuilder: ...
    def clearExecutorResourceRequests(self) -> None: ...
    def clearTaskResourceRequests(self) -> None: ...
    @property
    def taskResources(self) -> Dict[str, TaskResourceRequest]: ...
    @property
    def executorResources(self) -> Dict[str, ExecutorResourceRequest]: ...
    @property
    def build(self) -> ResourceProfile: ...