[SPARK-17770][CATALYST] making ObjectType public

## What changes were proposed in this pull request?

To facilitate writing additional Encoders, I proposed opening up the ObjectType SQL DataType. This DataType is used extensively in the JavaBean Encoder and would also be useful for writing other custom encoders.
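
For illustration, here is a minimal sketch (not part of this patch) of the kind of serializer expression a custom encoder can build once ObjectType is public. The `Person` bean and its `getName` accessor are hypothetical, and `BoundReference`, `Invoke`, and `StaticInvoke` are internal Catalyst expressions whose constructor signatures may differ between Spark versions:

```scala
import org.apache.spark.sql.catalyst.expressions.BoundReference
import org.apache.spark.sql.catalyst.expressions.objects.{Invoke, StaticInvoke}
import org.apache.spark.sql.types.{ObjectType, StringType}
import org.apache.spark.unsafe.types.UTF8String

// Hypothetical JavaBean being encoded.
class Person {
  def getName: String = "n/a"
}

object CustomEncoderSketch {
  // The incoming JVM object is modeled as a bound reference of ObjectType.
  val inputObject = BoundReference(0, ObjectType(classOf[Person]), nullable = true)

  // Calling getName on that object yields another JVM object (a String),
  // so its result type is again described with ObjectType.
  val name = Invoke(inputObject, "getName", ObjectType(classOf[String]))

  // Convert the java.lang.String into Spark's internal UTF8String representation.
  val serializedName =
    StaticInvoke(classOf[UTF8String], StringType, "fromString", name :: Nil)
}
```

This mirrors how the JavaBean Encoder builds its serializer expressions, with ObjectType describing every value that is still a plain JVM object.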

As noted by marmbrus, it is understood that the Expressions API is subject to change.

## How was this patch tested?

The change only affects the visibility of the ObjectType class, and the existing SQL test suite still runs without error.

Author: ALeksander Eskilson <alek.eskilson@cerner.com>

Closes #15453 from bdrillard/master.

@@ -19,7 +19,10 @@ package org.apache.spark.sql.types
 
 import scala.language.existentials
 
-private[sql] object ObjectType extends AbstractDataType {
+import org.apache.spark.annotation.InterfaceStability
+
+@InterfaceStability.Evolving
+object ObjectType extends AbstractDataType {
   override private[sql] def defaultConcreteType: DataType =
     throw new UnsupportedOperationException("null literals can't be casted to ObjectType")
@@ -32,11 +35,10 @@ private[sql] object ObjectType extends AbstractDataType {
 }
 
 /**
- * Represents a JVM object that is passing through Spark SQL expression evaluation. Note this
- * is only used internally while converting into the internal format and is not intended for use
- * outside of the execution engine.
+ * Represents a JVM object that is passing through Spark SQL expression evaluation.
  */
-private[sql] case class ObjectType(cls: Class[_]) extends DataType {
+@InterfaceStability.Evolving
+case class ObjectType(cls: Class[_]) extends DataType {
   override def defaultSize: Int = 4096
 
   def asNullable: DataType = this
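
As a quick illustration (not part of the patch), with the `private[sql]` modifier removed, code outside the `org.apache.spark.sql` package can now reference the type directly; the object name below is hypothetical:

```scala
import org.apache.spark.sql.types.{DataType, ObjectType}

object ObjectTypeVisibilityCheck {
  // Wrap an arbitrary JVM class in an ObjectType; defaultSize is the fixed
  // 4096 bytes seen in the diff above.
  val uriType: DataType = ObjectType(classOf[java.net.URI])

  def main(args: Array[String]): Unit = {
    assert(uriType.defaultSize == 4096)
  }
}
```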