[SPARK-29807][SQL] Rename "spark.sql.ansi.enabled" to "spark.sql.dialect.spark.ansi.enabled"

### What changes were proposed in this pull request?
Rename config "spark.sql.ansi.enabled" to "spark.sql.dialect.spark.ansi.enabled"

### Why are the changes needed?
The relationship between "spark.sql.ansi.enabled" and "spark.sql.dialect" is confusing, since the "PostgreSQL" dialect should already include the behavior enabled by "spark.sql.ansi.enabled".

To make things clearer, we rename "spark.sql.ansi.enabled" to "spark.sql.dialect.spark.ansi.enabled", so that the option applies only to the Spark dialect.

For casting and arithmetic operations, runtime exceptions should be thrown if "spark.sql.dialect" is "spark" and "spark.sql.dialect.spark.ansi.enabled" is true, or if "spark.sql.dialect" is "PostgreSQL".
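A minimal sketch of the intended behavior in spark-shell (the `spark` session and the exact error message are assumptions for illustration, not output from this patch):

```scala
// Assumes the default Spark dialect ("spark.sql.dialect" left at its default value).
spark.conf.set("spark.sql.dialect.spark.ansi.enabled", "true")
spark.sql("SELECT 2147483647 + 1").show()   // expected to fail with an ArithmeticException (integer overflow)

spark.conf.set("spark.sql.dialect.spark.ansi.enabled", "false")
spark.sql("SELECT 2147483647 + 1").show()   // expected to return the wrapped-around value instead of throwing
```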

### Does this PR introduce any user-facing change?
Yes, the config name changed.

### How was this patch tested?
Existing UT.

Closes #26444 from xuanyuanking/SPARK-29807.

Authored-by: Yuanjian Li <xyliyuanjian@gmail.com>
Signed-off-by: Wenchen Fan <wenchen@databricks.com>
Yuanjian Li 2019-11-16 17:46:39 +08:00 committed by Wenchen Fan
parent f77c10de38
commit 40ea4a11d7
19 changed files with 77 additions and 64 deletions

@@ -19,15 +19,15 @@ license: |
   limitations under the License.
 ---
-When `spark.sql.ansi.enabled` is true, Spark SQL has two kinds of keywords:
+When `spark.sql.dialect.spark.ansi.enabled` is true, Spark SQL has two kinds of keywords:
 * Reserved keywords: Keywords that are reserved and can't be used as identifiers for table, view, column, function, alias, etc.
 * Non-reserved keywords: Keywords that have a special meaning only in particular contexts and can be used as identifiers in other contexts. For example, `SELECT 1 WEEK` is an interval literal, but WEEK can be used as identifiers in other places.
-When `spark.sql.ansi.enabled` is false, Spark SQL has two kinds of keywords:
+When `spark.sql.dialect.spark.ansi.enabled` is false, Spark SQL has two kinds of keywords:
-* Non-reserved keywords: Same definition as the one when `spark.sql.ansi.enabled=true`.
+* Non-reserved keywords: Same definition as the one when `spark.sql.dialect.spark.ansi.enabled=true`.
 * Strict-non-reserved keywords: A strict version of non-reserved keywords, which can not be used as table alias.
-By default `spark.sql.ansi.enabled` is false.
+By default `spark.sql.dialect.spark.ansi.enabled` is false.
 Below is a list of all the keywords in Spark SQL.
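As a rough illustration of the reserved-keyword behavior documented above (a sketch only; the `spark` session and the exact parser error are assumptions):

```scala
spark.conf.set("spark.sql.dialect.spark.ansi.enabled", "true")
// SELECT is an ANSI reserved keyword, so using it as an identifier should be rejected by the parser.
spark.sql("SELECT 1 AS select")   // expected to fail with a ParseException while the flag is true
```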

@@ -955,7 +955,7 @@ number
     | MINUS? BIGDECIMAL_LITERAL #bigDecimalLiteral
     ;
-// When `spark.sql.ansi.enabled=true`, there are 2 kinds of keywords in Spark SQL.
+// When `spark.sql.dialect.spark.ansi.enabled=true`, there are 2 kinds of keywords in Spark SQL.
 // - Reserved keywords:
 //   Keywords that are reserved and can't be used as identifiers for table, view, column,
 //   function, alias, etc.
@@ -1155,9 +1155,9 @@ ansiNonReserved
     | YEARS
     ;
-// When `spark.sql.ansi.enabled=false`, there are 2 kinds of keywords in Spark SQL.
+// When `spark.sql.dialect.spark.ansi.enabled=false`, there are 2 kinds of keywords in Spark SQL.
 // - Non-reserved keywords:
-//   Same definition as the one when `spark.sql.ansi.enabled=true`.
+//   Same definition as the one when `spark.sql.dialect.spark.ansi.enabled=true`.
 // - Strict-non-reserved keywords:
 //   A strict version of non-reserved keywords, which can not be used as table alias.
 // You can find the full keywords list by searching "Start of the keywords list" in this file.

@@ -592,7 +592,7 @@ abstract class CastBase extends UnaryExpression with TimeZoneAwareExpression wit
    * Change the precision / scale in a given decimal to those set in `decimalType` (if any),
    * modifying `value` in-place and returning it if successful. If an overflow occurs, it
    * either returns null or throws an exception according to the value set for
-   * `spark.sql.ansi.enabled`.
+   * `spark.sql.dialect.spark.ansi.enabled`.
    *
    * NOTE: this modifies `value` in-place, so don't call it on external data.
    */
@@ -611,7 +611,7 @@ abstract class CastBase extends UnaryExpression with TimeZoneAwareExpression wit
   /**
    * Create new `Decimal` with precision and scale given in `decimalType` (if any).
-   * If overflow occurs, if `spark.sql.ansi.enabled` is false, null is returned;
+   * If overflow occurs, if `spark.sql.dialect.spark.ansi.enabled` is false, null is returned;
    * otherwise, an `ArithmeticException` is thrown.
    */
   private[this] def toPrecision(value: Decimal, decimalType: DecimalType): Decimal =

@@ -150,7 +150,7 @@ abstract class BinaryArithmetic extends BinaryOperator with NullIntolerant {
     sys.error("BinaryArithmetics must override either calendarIntervalMethod or genCode")
   // Name of the function for the exact version of this expression in [[Math]].
-  // If the option "spark.sql.ansi.enabled" is enabled and there is corresponding
+  // If the option "spark.sql.dialect.spark.ansi.enabled" is enabled and there is corresponding
   // function in [[Math]], the exact function will be called instead of evaluation with [[symbol]].
   def exactMathMethod: Option[String] = None

@@ -101,7 +101,7 @@ abstract class AbstractSqlParser(conf: SQLConf) extends ParserInterface with Log
     lexer.removeErrorListeners()
     lexer.addErrorListener(ParseErrorListener)
     lexer.legacy_setops_precedence_enbled = conf.setOpsPrecedenceEnforced
-    lexer.ansi = conf.ansiEnabled
+    lexer.ansi = conf.dialectSparkAnsiEnabled
     val tokenStream = new CommonTokenStream(lexer)
     val parser = new SqlBaseParser(tokenStream)
@@ -109,7 +109,7 @@ abstract class AbstractSqlParser(conf: SQLConf) extends ParserInterface with Log
     parser.removeErrorListeners()
     parser.addErrorListener(ParseErrorListener)
     parser.legacy_setops_precedence_enbled = conf.setOpsPrecedenceEnforced
-    parser.ansi = conf.ansiEnabled
+    parser.ansi = conf.dialectSparkAnsiEnabled
     try {
       try {

@@ -1673,6 +1673,20 @@ object SQLConf {
     .checkValues(Dialect.values.map(_.toString))
     .createWithDefault(Dialect.SPARK.toString)
+  val ANSI_ENABLED = buildConf("spark.sql.ansi.enabled")
+    .internal()
+    .doc("This configuration is deprecated and will be removed in the future releases." +
+      "It is replaced by spark.sql.dialect.spark.ansi.enabled.")
+    .booleanConf
+    .createWithDefault(false)
+  val DIALECT_SPARK_ANSI_ENABLED = buildConf("spark.sql.dialect.spark.ansi.enabled")
+    .doc("When true, Spark tries to conform to the ANSI SQL specification: 1. Spark will " +
+      "throw a runtime exception if an overflow occurs in any operation on integral/decimal " +
+      "field. 2. Spark will forbid using the reserved keywords of ANSI SQL as identifiers in " +
+      "the SQL parser.")
+    .fallbackConf(ANSI_ENABLED)
   val ALLOW_CREATING_MANAGED_TABLE_USING_NONEMPTY_LOCATION =
     buildConf("spark.sql.legacy.allowCreatingManagedTableUsingNonemptyLocation")
       .internal()
@@ -1784,14 +1798,6 @@ object SQLConf {
     .checkValues(StoreAssignmentPolicy.values.map(_.toString))
     .createWithDefault(StoreAssignmentPolicy.ANSI.toString)
-  val ANSI_ENABLED = buildConf("spark.sql.ansi.enabled")
-    .doc("When true, Spark tries to conform to the ANSI SQL specification: 1. Spark will " +
-      "throw a runtime exception if an overflow occurs in any operation on integral/decimal " +
-      "field. 2. Spark will forbid using the reserved keywords of ANSI SQL as identifiers in " +
-      "the SQL parser.")
-    .booleanConf
-    .createWithDefault(false)
   val SORT_BEFORE_REPARTITION =
     buildConf("spark.sql.execution.sortBeforeRepartition")
       .internal()
@@ -2521,9 +2527,11 @@ class SQLConf extends Serializable with Logging {
   def storeAssignmentPolicy: StoreAssignmentPolicy.Value =
     StoreAssignmentPolicy.withName(getConf(STORE_ASSIGNMENT_POLICY))
-  def ansiEnabled: Boolean = getConf(ANSI_ENABLED)
-  def usePostgreSQLDialect: Boolean = getConf(DIALECT) == Dialect.POSTGRESQL.toString()
+  def usePostgreSQLDialect: Boolean = getConf(DIALECT) == Dialect.POSTGRESQL.toString
+  def dialectSparkAnsiEnabled: Boolean = getConf(DIALECT_SPARK_ANSI_ENABLED)
+  def ansiEnabled: Boolean = usePostgreSQLDialect || dialectSparkAnsiEnabled
   def nestedSchemaPruningEnabled: Boolean = getConf(NESTED_SCHEMA_PRUNING_ENABLED)
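A brief note on the wiring above: because `DIALECT_SPARK_ANSI_ENABLED` is declared with `.fallbackConf(ANSI_ENABLED)`, the deprecated key should still take effect when the new key is not set explicitly. A minimal sketch of the expected behavior (the `spark` session is an assumption for illustration):

```scala
// Only the deprecated key is set; the new key should resolve to it via the fallback.
spark.conf.set("spark.sql.ansi.enabled", "true")
spark.conf.get("spark.sql.dialect.spark.ansi.enabled")   // expected: "true"
```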

@@ -436,7 +436,7 @@ class ExpressionEncoderSuite extends CodegenInterpretedPlanTest with AnalysisTes
     testAndVerifyNotLeakingReflectionObjects(
       s"overflowing $testName, ansiEnabled=$ansiEnabled") {
       withSQLConf(
-        SQLConf.ANSI_ENABLED.key -> ansiEnabled.toString
+        SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> ansiEnabled.toString
       ) {
         // Need to construct Encoder here rather than implicitly resolving it
         // so that SQLConf changes are respected.

@@ -169,7 +169,7 @@ class RowEncoderSuite extends CodegenInterpretedPlanTest {
   }
   private def testDecimalOverflow(schema: StructType, row: Row): Unit = {
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
       val encoder = RowEncoder(schema).resolveAndBind()
       intercept[Exception] {
         encoder.toRow(row)
@@ -182,7 +182,7 @@ class RowEncoderSuite extends CodegenInterpretedPlanTest {
       }
     }
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
       val encoder = RowEncoder(schema).resolveAndBind()
       assert(encoder.fromRow(encoder.toRow(row)).get(0) == null)
     }

@@ -61,7 +61,7 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
     checkEvaluation(Add(positiveLongLit, negativeLongLit), -1L)
     Seq("true", "false").foreach { checkOverflow =>
-      withSQLConf(SQLConf.ANSI_ENABLED.key -> checkOverflow) {
+      withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> checkOverflow) {
         DataTypeTestUtils.numericAndInterval.foreach { tpe =>
           checkConsistencyBetweenInterpretedAndCodegenAllowingException(Add, tpe, tpe)
         }
@@ -80,7 +80,7 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
     checkEvaluation(UnaryMinus(Literal(Int.MinValue)), Int.MinValue)
     checkEvaluation(UnaryMinus(Literal(Short.MinValue)), Short.MinValue)
     checkEvaluation(UnaryMinus(Literal(Byte.MinValue)), Byte.MinValue)
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
       checkExceptionInExpression[ArithmeticException](
         UnaryMinus(Literal(Long.MinValue)), "overflow")
       checkExceptionInExpression[ArithmeticException](
@@ -122,7 +122,7 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
     checkEvaluation(Subtract(positiveLongLit, negativeLongLit), positiveLong - negativeLong)
     Seq("true", "false").foreach { checkOverflow =>
-      withSQLConf(SQLConf.ANSI_ENABLED.key -> checkOverflow) {
+      withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> checkOverflow) {
         DataTypeTestUtils.numericAndInterval.foreach { tpe =>
           checkConsistencyBetweenInterpretedAndCodegenAllowingException(Subtract, tpe, tpe)
         }
@@ -144,7 +144,7 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
     checkEvaluation(Multiply(positiveLongLit, negativeLongLit), positiveLong * negativeLong)
     Seq("true", "false").foreach { checkOverflow =>
-      withSQLConf(SQLConf.ANSI_ENABLED.key -> checkOverflow) {
+      withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> checkOverflow) {
         DataTypeTestUtils.numericTypeWithoutDecimal.foreach { tpe =>
           checkConsistencyBetweenInterpretedAndCodegenAllowingException(Multiply, tpe, tpe)
         }
@@ -445,12 +445,12 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
     val e4 = Add(minLongLiteral, minLongLiteral)
     val e5 = Subtract(minLongLiteral, maxLongLiteral)
     val e6 = Multiply(minLongLiteral, minLongLiteral)
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
       Seq(e1, e2, e3, e4, e5, e6).foreach { e =>
         checkExceptionInExpression[ArithmeticException](e, "overflow")
       }
     }
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
       checkEvaluation(e1, Long.MinValue)
       checkEvaluation(e2, Long.MinValue)
       checkEvaluation(e3, -2L)
@@ -469,12 +469,12 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
     val e4 = Add(minIntLiteral, minIntLiteral)
     val e5 = Subtract(minIntLiteral, maxIntLiteral)
     val e6 = Multiply(minIntLiteral, minIntLiteral)
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
      Seq(e1, e2, e3, e4, e5, e6).foreach { e =>
        checkExceptionInExpression[ArithmeticException](e, "overflow")
      }
     }
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
      checkEvaluation(e1, Int.MinValue)
      checkEvaluation(e2, Int.MinValue)
      checkEvaluation(e3, -2)
@@ -493,12 +493,12 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
     val e4 = Add(minShortLiteral, minShortLiteral)
     val e5 = Subtract(minShortLiteral, maxShortLiteral)
     val e6 = Multiply(minShortLiteral, minShortLiteral)
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
      Seq(e1, e2, e3, e4, e5, e6).foreach { e =>
        checkExceptionInExpression[ArithmeticException](e, "overflow")
      }
     }
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
      checkEvaluation(e1, Short.MinValue)
      checkEvaluation(e2, Short.MinValue)
      checkEvaluation(e3, (-2).toShort)
@@ -517,12 +517,12 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
     val e4 = Add(minByteLiteral, minByteLiteral)
     val e5 = Subtract(minByteLiteral, maxByteLiteral)
     val e6 = Multiply(minByteLiteral, minByteLiteral)
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
      Seq(e1, e2, e3, e4, e5, e6).foreach { e =>
        checkExceptionInExpression[ArithmeticException](e, "overflow")
      }
     }
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
      checkEvaluation(e1, Byte.MinValue)
      checkEvaluation(e2, Byte.MinValue)
      checkEvaluation(e3, (-2).toByte)

@@ -891,7 +891,8 @@ abstract class CastSuiteBase extends SparkFunSuite with ExpressionEvalHelper {
   }
   test("Throw exception on casting out-of-range value to decimal type") {
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
+    withSQLConf(
+      SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
       checkExceptionInExpression[ArithmeticException](
         cast(Literal("134.12"), DecimalType(3, 2)), "cannot be represented")
       checkExceptionInExpression[ArithmeticException](
@@ -957,7 +958,8 @@ abstract class CastSuiteBase extends SparkFunSuite with ExpressionEvalHelper {
   }
   test("Throw exception on casting out-of-range value to byte type") {
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
+    withSQLConf(
+      SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
       testIntMaxAndMin(ByteType)
       Seq(Byte.MaxValue + 1, Byte.MinValue - 1).foreach { value =>
         checkExceptionInExpression[ArithmeticException](cast(value, ByteType), "overflow")
@@ -982,7 +984,8 @@ abstract class CastSuiteBase extends SparkFunSuite with ExpressionEvalHelper {
   }
   test("Throw exception on casting out-of-range value to short type") {
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
+    withSQLConf(
+      SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
       testIntMaxAndMin(ShortType)
       Seq(Short.MaxValue + 1, Short.MinValue - 1).foreach { value =>
         checkExceptionInExpression[ArithmeticException](cast(value, ShortType), "overflow")
@@ -1007,7 +1010,8 @@ abstract class CastSuiteBase extends SparkFunSuite with ExpressionEvalHelper {
   }
   test("Throw exception on casting out-of-range value to int type") {
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
+    withSQLConf(
+      SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
       testIntMaxAndMin(IntegerType)
       testLongMaxAndMin(IntegerType)
@@ -1024,7 +1028,8 @@ abstract class CastSuiteBase extends SparkFunSuite with ExpressionEvalHelper {
   }
   test("Throw exception on casting out-of-range value to long type") {
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
+    withSQLConf(
+      SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> requiredAnsiEnabledForOverflowTestCases.toString) {
       testLongMaxAndMin(LongType)
       Seq(Long.MaxValue, 0, Long.MinValue).foreach { value =>
@@ -1201,7 +1206,7 @@ class CastSuite extends CastSuiteBase {
   }
   test("SPARK-28470: Cast should honor nullOnOverflow property") {
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
      checkEvaluation(Cast(Literal("134.12"), DecimalType(3, 2)), null)
      checkEvaluation(
        Cast(Literal(Timestamp.valueOf("2019-07-25 22:04:36")), DecimalType(3, 2)), null)

@@ -32,7 +32,7 @@ class DecimalExpressionSuite extends SparkFunSuite with ExpressionEvalHelper {
   }
   test("MakeDecimal") {
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
       checkEvaluation(MakeDecimal(Literal(101L), 3, 1), Decimal("10.1"))
       checkEvaluation(MakeDecimal(Literal.create(null, LongType), 3, 1), null)
       val overflowExpr = MakeDecimal(Literal.create(1000L, LongType), 3, 1)
@@ -41,7 +41,7 @@ class DecimalExpressionSuite extends SparkFunSuite with ExpressionEvalHelper {
       evaluateWithoutCodegen(overflowExpr, null)
       checkEvaluationWithUnsafeProjection(overflowExpr, null)
     }
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
       checkEvaluation(MakeDecimal(Literal(101L), 3, 1), Decimal("10.1"))
       checkEvaluation(MakeDecimal(Literal.create(null, LongType), 3, 1), null)
       val overflowExpr = MakeDecimal(Literal.create(1000L, LongType), 3, 1)

@@ -57,7 +57,7 @@ class ScalaUDFSuite extends SparkFunSuite with ExpressionEvalHelper {
   }
   test("SPARK-28369: honor nullOnOverflow config for ScalaUDF") {
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
       val udf = ScalaUDF(
         (a: java.math.BigDecimal) => a.multiply(new java.math.BigDecimal(100)),
         DecimalType.SYSTEM_DEFAULT,
@@ -69,7 +69,7 @@ class ScalaUDFSuite extends SparkFunSuite with ExpressionEvalHelper {
       }
       assert(e2.getCause.isInstanceOf[ArithmeticException])
     }
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
       val udf = ScalaUDF(
         (a: java.math.BigDecimal) => a.multiply(new java.math.BigDecimal(100)),
         DecimalType.SYSTEM_DEFAULT,

@@ -615,7 +615,7 @@ class ExpressionParserSuite extends AnalysisTest {
       assertEqual(s"${sign}interval $intervalValue", expectedLiteral)
       // SPARK-23264 Support interval values without INTERVAL clauses if ANSI SQL enabled
-      withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
+      withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
         assertEqual(intervalValue, expected)
       }
     }
@@ -701,12 +701,12 @@
   test("SPARK-23264 Interval Compatibility tests") {
     def checkIntervals(intervalValue: String, expected: Literal): Unit = {
-      withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
+      withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
         assertEqual(intervalValue, expected)
       }
       // Compatibility tests: If ANSI SQL disabled, `intervalValue` should be parsed as an alias
-      withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
+      withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
         val aliases = defaultParser.parseExpression(intervalValue).collect {
           case a @ Alias(_: Literal, name)
             if intervalUnits.exists { unit => name.startsWith(unit.toString) } => a
@@ -804,12 +804,12 @@
   }
   test("current date/timestamp braceless expressions") {
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
       assertEqual("current_date", CurrentDate())
       assertEqual("current_timestamp", CurrentTimestamp())
     }
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "false") {
       assertEqual("current_date", UnresolvedAttribute.quoted("current_date"))
       assertEqual("current_timestamp", UnresolvedAttribute.quoted("current_timestamp"))
     }

@@ -658,7 +658,7 @@ class TableIdentifierParserSuite extends SparkFunSuite with SQLHelper {
   }
   test("table identifier - reserved/non-reserved keywords if ANSI mode enabled") {
-    withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
+    withSQLConf(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key -> "true") {
       reservedKeywordsInAnsiMode.foreach { keyword =>
         val errMsg = intercept[ParseException] {
           parseTableIdentifier(keyword)

@@ -46,9 +46,9 @@ select concat_ws(NULL,10,20,null,30) is null;
 select reverse('abcde');
 -- [SPARK-28036] Built-in udf left/right has inconsistent behavior
 -- [SPARK-28479][SPARK-28989] Parser error when enabling ANSI mode
-set spark.sql.ansi.enabled=false;
+set spark.sql.dialect.spark.ansi.enabled=false;
 select i, left('ahoj', i), right('ahoj', i) from range(-5, 6) t(i) order by i;
-set spark.sql.ansi.enabled=true;
+set spark.sql.dialect.spark.ansi.enabled=true;
 -- [SPARK-28037] Add built-in String Functions: quote_literal
 -- select quote_literal('');
 -- select quote_literal('abc''');

@@ -151,11 +151,11 @@ edcba
 -- !query 18
-set spark.sql.ansi.enabled=false
+set spark.sql.dialect.spark.ansi.enabled=false
 -- !query 18 schema
 struct<key:string,value:string>
 -- !query 18 output
-spark.sql.ansi.enabled false
+spark.sql.dialect.spark.ansi.enabled false
 -- !query 19
@@ -177,11 +177,11 @@ struct<i:bigint,left('ahoj', t.`i`):string,right('ahoj', t.`i`):string>
 -- !query 20
-set spark.sql.ansi.enabled=true
+set spark.sql.dialect.spark.ansi.enabled=true
 -- !query 20 schema
 struct<key:string,value:string>
 -- !query 20 output
-spark.sql.ansi.enabled true
+spark.sql.dialect.spark.ansi.enabled true
 -- !query 21

@@ -163,7 +163,7 @@ class DataFrameSuite extends QueryTest with SharedSparkSession {
       DecimalData(BigDecimal("9"* 20 + ".123"), BigDecimal("9"* 20 + ".123")) :: Nil).toDF()
     Seq(true, false).foreach { ansiEnabled =>
-      withSQLConf((SQLConf.ANSI_ENABLED.key, ansiEnabled.toString)) {
+      withSQLConf((SQLConf.DIALECT_SPARK_ANSI_ENABLED.key, ansiEnabled.toString)) {
         val structDf = largeDecimals.select("a").agg(sum("a"))
         if (!ansiEnabled) {
           checkAnswer(structDf, Row(null))

@@ -343,10 +343,10 @@ class SQLQueryTestSuite extends QueryTest with SharedSparkSession {
         localSparkSession.udf.register("boolne", (b1: Boolean, b2: Boolean) => b1 != b2)
         // vol used by boolean.sql and case.sql.
         localSparkSession.udf.register("vol", (s: String) => s)
-        localSparkSession.conf.set(SQLConf.ANSI_ENABLED.key, true)
+        localSparkSession.conf.set(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key, true)
         localSparkSession.conf.set(SQLConf.DIALECT.key, SQLConf.Dialect.POSTGRESQL.toString)
       case _: AnsiTest =>
-        localSparkSession.conf.set(SQLConf.ANSI_ENABLED.key, true)
+        localSparkSession.conf.set(SQLConf.DIALECT_SPARK_ANSI_ENABLED.key, true)
       case _ =>
     }

@@ -107,10 +107,10 @@ class ThriftServerQueryTestSuite extends SQLQueryTestSuite {
     testCase match {
       case _: PgSQLTest =>
-        statement.execute(s"SET ${SQLConf.ANSI_ENABLED.key} = true")
+        statement.execute(s"SET ${SQLConf.DIALECT_SPARK_ANSI_ENABLED.key} = true")
         statement.execute(s"SET ${SQLConf.DIALECT.key} = ${SQLConf.Dialect.POSTGRESQL.toString}")
      case _: AnsiTest =>
-        statement.execute(s"SET ${SQLConf.ANSI_ENABLED.key} = true")
+        statement.execute(s"SET ${SQLConf.DIALECT_SPARK_ANSI_ENABLED.key} = true")
      case _ =>
    }