spark-instrumented-optimizer/python/pyspark/__init__.pyi
Yikun Jiang 4c1ccdabe8 [SPARK-34630][PYTHON] Add typehint for pyspark.__version__
### What changes were proposed in this pull request?
This PR adds a type hint for `pyspark.__version__`, as discussed in [SPARK-34630](https://issues.apache.org/jira/browse/SPARK-34630).

### Why are the changes needed?
There was a short discussion in https://github.com/apache/spark/pull/31823#discussion_r593830911 .

After digging into [1][2], we can see that `pyspark.__version__` is set by [setup.py](c06758834e/python/setup.py (L201)), which embeds `__version__` into the `pyspark` module. That means `__init__.pyi` is the right place to add the type hint for `__version__`.

So this patch adds the type hint for `__version__` to `pyspark/__init__.pyi`.
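For context, the single-sourcing pattern described in [2] looks roughly like this; file names and the version value below are illustrative, not Spark's exact code:
```python
# mypkg/version.py -- single source of truth for the version string
__version__ = "3.2.0.dev0"  # illustrative value

# mypkg/__init__.py -- re-export so `mypkg.__version__` exists at runtime
from mypkg.version import __version__  # noqa: F401

# mypkg/__init__.pyi -- the stub needs a matching declaration, otherwise mypy
# reports: error: Module has no attribute "__version__"
__version__: str
```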

[1] [PEP-396 Module Version Numbers](https://www.python.org/dev/peps/pep-0396/)
[2] https://packaging.python.org/guides/single-sourcing-package-version/

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
1. Disable `ignore_errors` at
ee7bf7d962/python/mypy.ini (L132)
(a sketch of this change is shown after the mypy output below)

2. Run mypy:
- Before fix
```shell
(venv) ➜  spark git:(SPARK-34629) ✗ mypy --config-file python/mypy.ini python/pyspark | grep version
python/pyspark/pandas/spark/accessors.py:884: error: Module has no attribute "__version__"
```

- After fix
```shell
(venv) ➜  spark git:(SPARK-34629) ✗ mypy --config-file python/mypy.ini python/pyspark | grep version
```
(no output; the `__version__` error is gone)
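For reference, step 1 above means temporarily flipping `ignore_errors` in that `mypy.ini` section so mypy reports the error; roughly like this (the exact section at the linked line may differ):
```ini
; python/mypy.ini -- sketch of the relevant section (see the linked line for the real one)
[mypy-pyspark.pandas.*]
ignore_errors = False  ; normally True; set to False so mypy surfaces the error above
```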

Closes #32110 from Yikun/SPARK-34629.

Authored-by: Yikun Jiang <yikunkero@gmail.com>
Signed-off-by: HyukjinKwon <gurwls223@apache.org>
2021-04-11 10:40:08 +09:00

#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

from typing import Callable, Optional, TypeVar

from pyspark.accumulators import (  # noqa: F401
    Accumulator as Accumulator,
    AccumulatorParam as AccumulatorParam,
)
from pyspark.broadcast import Broadcast as Broadcast  # noqa: F401
from pyspark.conf import SparkConf as SparkConf  # noqa: F401
from pyspark.context import SparkContext as SparkContext  # noqa: F401
from pyspark.files import SparkFiles as SparkFiles  # noqa: F401
from pyspark.status import (
    StatusTracker as StatusTracker,
    SparkJobInfo as SparkJobInfo,
    SparkStageInfo as SparkStageInfo,
)  # noqa: F401
from pyspark.profiler import (  # noqa: F401
    BasicProfiler as BasicProfiler,
    Profiler as Profiler,
)
from pyspark.rdd import RDD as RDD, RDDBarrier as RDDBarrier  # noqa: F401
from pyspark.serializers import (  # noqa: F401
    MarshalSerializer as MarshalSerializer,
    PickleSerializer as PickleSerializer,
)
from pyspark.status import (  # noqa: F401
    SparkJobInfo as SparkJobInfo,
    SparkStageInfo as SparkStageInfo,
    StatusTracker as StatusTracker,
)
from pyspark.storagelevel import StorageLevel as StorageLevel  # noqa: F401
from pyspark.taskcontext import (  # noqa: F401
    BarrierTaskContext as BarrierTaskContext,
    BarrierTaskInfo as BarrierTaskInfo,
    TaskContext as TaskContext,
)
from pyspark.util import InheritableThread as InheritableThread  # noqa: F401

# Compatibility imports
from pyspark.sql import (  # noqa: F401
    SQLContext as SQLContext,
    HiveContext as HiveContext,
    Row as Row,
)

T = TypeVar("T")
F = TypeVar("F", bound=Callable)

def since(version: str) -> Callable[[T], T]: ...
def copy_func(
    f: F,
    name: Optional[str] = ...,
    sinceversion: Optional[str] = ...,
    doc: Optional[str] = ...,
) -> F: ...
def keyword_only(func: F) -> F: ...
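
# Type hint for pyspark.__version__ (added in SPARK-34630); the runtime value is
# embedded into the pyspark module by setup.py.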
__version__: str