[SPARK-4821] [mllib] [python] [docs] Fix for pyspark.mllib.rand doc

+ small doc edit
+ include edit to make IntelliJ happy

CC: davies  mengxr

Note to davies  -- this does not fix the "WARNING: Literal block expected; none found." warnings since that seems to involve spacing which IntelliJ does not like.  (Those warnings occur when generating the Python docs.)

Author: Joseph K. Bradley <joseph@databricks.com>

Closes #3669 from jkbradley/python-warnings and squashes the following commits:

4587868 [Joseph K. Bradley] fixed warning
8cb073c [Joseph K. Bradley] Updated based on davies recommendation
c51eca4 [Joseph K. Bradley] Updated rst file for pyspark.mllib.rand doc.  Small doc edit.  Small include edit to make IntelliJ happy.
Joseph K. Bradley authored 2014-12-17 14:12:46 -08:00, committed by Xiangrui Meng
parent 636d9fc450
commit affc3f460f
3 changed files with 5 additions and 30 deletions


@@ -1,5 +1,5 @@
 pyspark.streaming module
-==================
+========================
 Module contents
 ---------------


@@ -32,29 +32,4 @@ import sys
 import rand as random
 random.__name__ = 'random'
+random.RandomRDDs.__module__ = __name__ + '.random'
-class RandomModuleHook(object):
-    """
-    Hook to import pyspark.mllib.random
-    """
-    fullname = __name__ + '.random'
-
-    def find_module(self, name, path=None):
-        # skip all other modules
-        if not name.startswith(self.fullname):
-            return
-        return self
-
-    def load_module(self, name):
-        if name == self.fullname:
-            return random
-        cname = name.rsplit('.', 1)[-1]
-        try:
-            return getattr(random, cname)
-        except AttributeError:
-            raise ImportError
-
-sys.meta_path.append(RandomModuleHook())
+sys.modules[__name__ + '.random'] = random
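The patch above replaces a PEP 302 `meta_path` import hook with a direct `sys.modules` alias. The general technique can be sketched in isolation; the `mypkg` package and `uniformRDD` stub below are hypothetical stand-ins, not the actual `pyspark.mllib` API:

```python
import sys
import types

# Hypothetical stand-in for a submodule (pyspark.mllib.rand in the patch)
# whose public name should read as 'random'.
rand = types.ModuleType("random")
rand.uniformRDD = lambda: "uniform sample"  # stub function, not the real API

# Create the parent package entry, then register the alias so that
# 'import mypkg.random' resolves via sys.modules with no import hook.
pkg = types.ModuleType("mypkg")
pkg.__path__ = []            # mark the module as a package
sys.modules["mypkg"] = pkg
sys.modules["mypkg.random"] = rand
pkg.random = rand            # make attribute access work as well

import mypkg.random
print(mypkg.random.uniformRDD())  # prints: uniform sample
```

Pre-seeding `sys.modules` is simpler than an import hook because the import machinery consults `sys.modules` first, so no custom `find_module`/`load_module` logic is needed.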


@@ -53,10 +53,10 @@ class Normalizer(VectorTransformer):
     """
     :: Experimental ::

-    Normalizes samples individually to unit L\ :sup:`p`\ norm
-    For any 1 <= `p` <= float('inf'), normalizes samples using
-    sum(abs(vector). :sup:`p`) :sup:`(1/p)` as norm.
+    Normalizes samples individually to unit L\ :sup:`p`\ norm
+    For any 1 <= `p` < float('inf'), normalizes samples using
+    sum(abs(vector) :sup:`p`) :sup:`(1/p)` as norm.
     For `p` = float('inf'), max(abs(vector)) will be used as norm for normalization.
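The corrected docstring formula can be sanity-checked in plain Python. This is an illustrative sketch of the math only, not MLlib's `Normalizer` implementation (which operates on RDDs of vectors):

```python
def p_norm(vector, p):
    """L^p norm: sum(abs(x)**p)**(1/p) for finite p, max(abs(x)) for p = inf."""
    if p == float("inf"):
        return max(abs(x) for x in vector)
    return sum(abs(x) ** p for x in vector) ** (1.0 / p)

v = [3.0, -4.0]
print(p_norm(v, 2))             # 5.0 (Euclidean norm)
print(p_norm(v, float("inf")))  # 4.0 (max-absolute-value norm)

# Normalizing divides each component by the norm, yielding a unit-norm vector.
unit = [x / p_norm(v, 2) for x in v]
print(unit)                     # [0.6, -0.8]
```

This also shows why the docstring fix matters: the `p = float('inf')` case is a separate max-based rule, so the finite-`p` sentence should read `1 <= p < float('inf')`, not `<=`.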