[SPARK-16508][SPARKR] Split docs for arrange and orderBy methods

## What changes were proposed in this pull request?

This PR splits the docs for the arrange and orderBy methods according to their functionality: the former for sorting a SparkDataFrame and the latter for defining the ordering columns of a WindowSpec.
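For reference, a minimal SparkR sketch of the two call sites whose docs are being split (not part of the patch; `mtcars` and its columns are only illustrative):

```r
library(SparkR)
sparkR.session()

df <- createDataFrame(mtcars)

# arrange (and its SparkDataFrame alias orderBy) sorts the whole SparkDataFrame
sorted <- arrange(df, "cyl", "mpg", decreasing = c(FALSE, TRUE))

# orderBy on a WindowSpec only defines the ordering used inside each window partition
ws <- orderBy(windowPartitionBy("cyl"), "mpg")
ranked <- withColumn(df, "rank_in_cyl", over(rank(), ws))
```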

## How was this patch tested?

![screen shot 2016-08-06 at 6 39 19 pm](https://cloud.githubusercontent.com/assets/15318264/17459969/51eade28-5c05-11e6-8ca1-8d8a8e344bab.png)
![screen shot 2016-08-06 at 6 39 29 pm](https://cloud.githubusercontent.com/assets/15318264/17459966/51e3c246-5c05-11e6-8d35-3e905ca48676.png)
![screen shot 2016-08-06 at 6 40 02 pm](https://cloud.githubusercontent.com/assets/15318264/17459967/51e650ec-5c05-11e6-8698-0f037f5199ff.png)

Author: Junyang Qian <junyangq@databricks.com>

Closes #14522 from junyangq/SPARK-16508-0.
Junyang Qian 2016-08-15 11:03:03 -07:00 committed by Shivaram Venkataraman
parent 3d8bfe7a39
commit 564fe614c1
4 changed files with 17 additions and 15 deletions

.gitignore

@@ -82,3 +82,4 @@ spark-warehouse/
 *.Rproj
 *.Rproj.*
+.Rproj.user

R/pkg/R/DataFrame.R

@@ -2048,14 +2048,14 @@ setMethod("rename",
 setClassUnion("characterOrColumn", c("character", "Column"))

-#' Arrange
+#' Arrange Rows by Variables
 #'
 #' Sort a SparkDataFrame by the specified column(s).
 #'
-#' @param x A SparkDataFrame to be sorted.
-#' @param col A character or Column object vector indicating the fields to sort on
-#' @param ... Additional sorting fields
-#' @param decreasing A logical argument indicating sorting order for columns when
+#' @param x a SparkDataFrame to be sorted.
+#' @param col a character or Column object indicating the fields to sort on
+#' @param ... additional sorting fields
+#' @param decreasing a logical argument indicating sorting order for columns when
 #'                   a character vector is specified for col
 #' @return A SparkDataFrame where all elements are sorted.
 #' @family SparkDataFrame functions
@@ -2120,7 +2120,6 @@ setMethod("arrange",
           })

 #' @rdname arrange
-#' @name orderBy
 #' @aliases orderBy,SparkDataFrame,characterOrColumn-method
 #' @export
 #' @note orderBy(SparkDataFrame, characterOrColumn) since 1.4.0
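For illustration only (not part of the diff, and assuming an active SparkR session as in the sketch above), the parameters documented here in use:

```r
df <- createDataFrame(mtcars)

# character columns: `decreasing` may be a logical vector, one entry per column
head(arrange(df, "cyl", "mpg", decreasing = c(FALSE, TRUE)))

# Column objects: express the direction with asc()/desc() instead of `decreasing`
head(arrange(df, asc(df$cyl), desc(df$mpg)))

# orderBy on a SparkDataFrame remains an alias of arrange and stays documented under ?arrange
head(orderBy(df, "mpg"))
```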

R/pkg/R/WindowSpec.R

@@ -82,16 +82,18 @@ setMethod("partitionBy",
   }
 })

-#' orderBy
+#' Ordering Columns in a WindowSpec
 #'
 #' Defines the ordering columns in a WindowSpec.
-#'
 #' @param x a WindowSpec
-#' @return a WindowSpec
-#' @rdname arrange
+#' @param col a character or Column object indicating an ordering column
+#' @param ... additional sorting fields
+#' @return A WindowSpec.
 #' @name orderBy
+#' @rdname orderBy
 #' @aliases orderBy,WindowSpec,character-method
 #' @family windowspec_method
+#' @seealso See \link{arrange} for use in sorting a SparkDataFrame
 #' @export
 #' @examples
 #' \dontrun{
@@ -105,7 +107,7 @@ setMethod("orderBy",
             windowSpec(callJMethod(x@sws, "orderBy", col, list(...)))
           })

-#' @rdname arrange
+#' @rdname orderBy
 #' @name orderBy
 #' @aliases orderBy,WindowSpec,Column-method
 #' @export
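A short sketch of the two `orderBy(WindowSpec, ...)` signatures documented above, with assumed column names and an active SparkR session:

```r
df <- createDataFrame(mtcars)

# character variant
ws1 <- orderBy(windowPartitionBy("cyl"), "mpg")

# Column variant
ws2 <- orderBy(windowPartitionBy(df$cyl), df$mpg)

# the resulting WindowSpec is consumed by over(), e.g. to rank rows within each partition
head(select(df, df$cyl, df$mpg, alias(over(rank(), ws2), "rank_in_cyl")))
```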
@@ -122,7 +124,7 @@ setMethod("orderBy",
 #' rowsBetween
 #'
 #' Defines the frame boundaries, from `start` (inclusive) to `end` (inclusive).
-#' 
+#'
 #' Both `start` and `end` are relative positions from the current row. For example, "0" means
 #' "current row", while "-1" means the row before the current row, and "5" means the fifth row
 #' after the current row.
@@ -154,7 +156,7 @@ setMethod("rowsBetween",
 #' rangeBetween
 #'
 #' Defines the frame boundaries, from `start` (inclusive) to `end` (inclusive).
-#' 
+#'
 #' Both `start` and `end` are relative from the current row. For example, "0" means "current row",
 #' while "-1" means one off before the current row, and "5" means the five off after the
 #' current row.
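Again for illustration only, how these frame boundaries are typically attached to an ordered WindowSpec (assumed data, active SparkR session):

```r
df <- createDataFrame(mtcars)
ws <- orderBy(windowPartitionBy("cyl"), "mpg")

# row-based frame: previous row, current row, next row
wsRows <- rowsBetween(ws, -1, 1)

# value-based frame: all rows whose mpg lies within [current mpg - 5, current mpg + 5]
wsRange <- rangeBetween(ws, -5, 5)

head(withColumn(df, "moving_avg_mpg", over(avg(df$mpg), wsRows)))
```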
@@ -188,7 +190,7 @@ setMethod("rangeBetween",
 #' over
 #'
-#' Define a windowing column. 
+#' Define a windowing column.
 #'
 #' @rdname over
 #' @name over
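And a usage sketch for `over()` itself, again with assumed data and an active SparkR session:

```r
df <- createDataFrame(mtcars)
ws <- orderBy(windowPartitionBy("cyl"), "mpg")

# over() turns a ranking or aggregate expression into a windowing column
head(select(df,
            df$cyl, df$mpg,
            alias(over(row_number(), ws), "row_in_cyl"),
            alias(over(cume_dist(), ws), "cume_dist_in_cyl")))
```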

R/pkg/R/generics.R

@@ -551,7 +551,7 @@ setGeneric("merge")
 #' @export
 setGeneric("mutate", function(.data, ...) {standardGeneric("mutate") })

-#' @rdname arrange
+#' @rdname orderBy
 #' @export
 setGeneric("orderBy", function(x, col, ...) { standardGeneric("orderBy") })