[SPARK-35749][SPARK-35773][SQL] Parse unit list interval literals as tightest year-month/day-time interval types
### What changes were proposed in this pull request?
This PR allows the parser to parse unit list interval literals like `'3' day '10' hours '3' seconds` or `'8' years '3' months` as `YearMonthIntervalType` or `DayTimeIntervalType`.

### Why are the changes needed?
For ANSI compliance.

### Does this PR introduce _any_ user-facing change?
Yes. I noted the following things in the `sql-migration-guide.md`.
* Unit list interval literals are parsed as `YearMonthIntervalType` or `DayTimeIntervalType` instead of `CalendarIntervalType`.
* `WEEK`, `MILLISECOND`, `MICROSECOND` and `NANOSECOND` are not valid units for unit list interval literals.
* Units of year-month and day-time cannot be mixed like `1 YEAR 2 MINUTES`.

### How was this patch tested?
New tests and modified tests.

Closes #32949 from sarutak/day-time-multi-units.

Authored-by: Kousuke Saruta <sarutak@oss.nttdata.com>
Signed-off-by: Wenchen Fan <wenchen@databricks.com>
This commit is contained in:
parent 01ddaf3918
commit 8e92ef825a
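As a quick illustration of the new behavior, the unit classification the parser now performs can be sketched in plain Scala. This is a hedged, standalone re-implementation for illustration only, not Spark's actual `AstBuilder` code; the `classify` helper and its return shape are assumptions:

```scala
// Sketch: classify a unit list interval's units the way the new parser does,
// rejecting lists that mix year-month and day-time units.
val yearMonthUnits = Set("year", "month")
val dayTimeUnits =
  Set("week", "day", "hour", "minute", "second", "millisecond", "microsecond")

// Left(error message) for a mixed list, Right(kind) otherwise.
def classify(units: Seq[String]): Either[String, String] = {
  // The parser lowercases and strips a trailing "s" ("hours" -> "hour").
  val normalized = units.map(_.toLowerCase.stripSuffix("s"))
  val hasYm = normalized.exists(yearMonthUnits)
  val hasDt = normalized.exists(dayTimeUnits)
  if (hasYm && hasDt) {
    Left(s"Cannot mix year-month and day-time fields: ${units.mkString(" ")}")
  } else if (hasYm) {
    Right("year-month") // would become a YearMonthIntervalType literal
  } else {
    Right("day-time") // would become a DayTimeIntervalType literal
  }
}
```

Under this sketch, `'8' years '3' months` classifies as year-month while something like `1 YEAR 2 MINUTES` is rejected, matching the migration notes in the commit message.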
@@ -83,7 +83,9 @@ license: |
 - In Spark 3.2, `TRANSFORM` operator can support `ArrayType/MapType/StructType` without Hive SerDe, in this mode, we use `StructsToJson` to convert `ArrayType/MapType/StructType` column to `STRING` and use `JsonToStructs` to parse `STRING` to `ArrayType/MapType/StructType`. In Spark 3.1, Spark just supports casting `ArrayType/MapType/StructType` columns to `STRING` but does not support parsing `STRING` into `ArrayType/MapType/StructType` output columns.
 
-- In Spark 3.2, the unit-to-unit interval literals like `INTERVAL '1-1' YEAR TO MONTH` are converted to ANSI interval types: `YearMonthIntervalType` or `DayTimeIntervalType`. In Spark 3.1 and earlier, such interval literals are converted to `CalendarIntervalType`. To restore the behavior before Spark 3.2, you can set `spark.sql.legacy.interval.enabled` to `true`.
+- In Spark 3.2, the unit-to-unit interval literals like `INTERVAL '1-1' YEAR TO MONTH` and the unit list interval literals like `INTERVAL '3' DAYS '1' HOUR` are converted to ANSI interval types: `YearMonthIntervalType` or `DayTimeIntervalType`. In Spark 3.1 and earlier, such interval literals are converted to `CalendarIntervalType`. To restore the behavior before Spark 3.2, you can set `spark.sql.legacy.interval.enabled` to `true`.
+
+- In Spark 3.2, the unit list interval literals cannot mix year-month fields (YEAR and MONTH) and day-time fields (WEEK, DAY, ..., MICROSECOND). For example, `INTERVAL 1 day 1 hour` is invalid in Spark 3.2. In Spark 3.1 and earlier, there is no such limitation and the literal returns a value of `CalendarIntervalType`. To restore the behavior before Spark 3.2, you can set `spark.sql.legacy.interval.enabled` to `true`.
 
 - In Spark 3.2, Spark supports `DayTimeIntervalType` and `YearMonthIntervalType` as inputs and outputs of `TRANSFORM` clause in Hive `SERDE` mode, the behavior is different between Hive `SERDE` mode and `ROW FORMAT DELIMITED` mode when these two types are used as inputs. In Hive `SERDE` mode, `DayTimeIntervalType` column is converted to `HiveIntervalDayTime`, its string format is `[-]?d h:m:s.n`, but in `ROW FORMAT DELIMITED` mode the format is `INTERVAL '[-]?d h:m:s.n' DAY TO TIME`. In Hive `SERDE` mode, `YearMonthIntervalType` column is converted to `HiveIntervalYearMonth`, its string format is `[-]?y-m`, but in `ROW FORMAT DELIMITED` mode the format is `INTERVAL '[-]?y-m' YEAR TO MONTH`.
@@ -877,7 +877,7 @@ interval
     ;
 
 errorCapturingMultiUnitsInterval
-    : multiUnitsInterval unitToUnitInterval?
+    : body=multiUnitsInterval unitToUnitInterval?
     ;
 
 multiUnitsInterval
@@ -2695,7 +2695,7 @@ object DatePart {
       224
      > SELECT _FUNC_('SECONDS', timestamp'2019-10-01 00:00:01.000001');
       1.000001
-     > SELECT _FUNC_('days', interval 1 year 10 months 5 days);
+     > SELECT _FUNC_('days', interval 5 days 3 hours 7 minutes);
       5
      > SELECT _FUNC_('seconds', interval 5 hours 30 seconds 1 milliseconds 1 microseconds);
       30.001001
@@ -2764,7 +2764,7 @@ case class DatePart(field: Expression, source: Expression, child: Expression)
       224
      > SELECT _FUNC_(SECONDS FROM timestamp'2019-10-01 00:00:01.000001');
       1.000001
-     > SELECT _FUNC_(days FROM interval 1 year 10 months 5 days);
+     > SELECT _FUNC_(days FROM interval 5 days 3 hours 7 minutes);
       5
      > SELECT _FUNC_(seconds FROM interval 5 hours 30 seconds 1 milliseconds 1 microseconds);
       30.001001
@@ -22,7 +22,7 @@ import java.util.concurrent.TimeUnit
 import javax.xml.bind.DatatypeConverter
 
 import scala.collection.JavaConverters._
-import scala.collection.mutable.ArrayBuffer
+import scala.collection.mutable.{ArrayBuffer, Set}
 
 import org.antlr.v4.runtime.{ParserRuleContext, Token}
 import org.antlr.v4.runtime.tree.{ParseTree, RuleNode, TerminalNode}
@@ -2385,20 +2385,50 @@ class AstBuilder extends SqlBaseBaseVisitor[AnyRef] with SQLConfHelper with Logg
       // `CalendarInterval` doesn't have enough info. For instance, new CalendarInterval(0, 0, 0)
       // can be derived from INTERVAL '0-0' YEAR TO MONTH as well as from
       // INTERVAL '0 00:00:00' DAY TO SECOND.
+      val fromUnit =
+        ctx.errorCapturingUnitToUnitInterval.body.from.getText.toLowerCase(Locale.ROOT)
       val toUnit = ctx.errorCapturingUnitToUnitInterval.body.to.getText.toLowerCase(Locale.ROOT)
       if (toUnit == "month") {
         assert(calendarInterval.days == 0 && calendarInterval.microseconds == 0)
-        // TODO(SPARK-35773): Parse year-month interval literals to tightest types
-        Literal(calendarInterval.months, YearMonthIntervalType())
+        val start = YearMonthIntervalType.stringToField(fromUnit)
+        Literal(calendarInterval.months, YearMonthIntervalType(start, YearMonthIntervalType.MONTH))
       } else {
         assert(calendarInterval.months == 0)
-        val fromUnit =
-          ctx.errorCapturingUnitToUnitInterval.body.from.getText.toLowerCase(Locale.ROOT)
         val micros = IntervalUtils.getDuration(calendarInterval, TimeUnit.MICROSECONDS)
         val start = DayTimeIntervalType.stringToField(fromUnit)
         val end = DayTimeIntervalType.stringToField(toUnit)
         Literal(micros, DayTimeIntervalType(start, end))
       }
+    } else if (ctx.errorCapturingMultiUnitsInterval != null && !conf.legacyIntervalEnabled) {
+      val units =
+        ctx.errorCapturingMultiUnitsInterval.body.unit.asScala.map(
+          _.getText.toLowerCase(Locale.ROOT).stripSuffix("s"))
+      val yearMonthFields = Set.empty[Byte]
+      val dayTimeFields = Set.empty[Byte]
+      for (unit <- units) {
+        if (YearMonthIntervalType.stringToField.contains(unit)) {
+          yearMonthFields += YearMonthIntervalType.stringToField(unit)
+        } else if (DayTimeIntervalType.stringToField.contains(unit)) {
+          dayTimeFields += DayTimeIntervalType.stringToField(unit)
+        } else if (unit == "week") {
+          dayTimeFields += DayTimeIntervalType.DAY
+        } else {
+          assert(unit == "millisecond" || unit == "microsecond")
+          dayTimeFields += DayTimeIntervalType.SECOND
+        }
+      }
+      if (yearMonthFields.nonEmpty) {
+        if (dayTimeFields.nonEmpty) {
+          val literalStr = source(ctx)
+          throw QueryParsingErrors.mixedIntervalUnitsError(literalStr, ctx)
+        }
+        Literal(
+          calendarInterval.months, YearMonthIntervalType(yearMonthFields.min, yearMonthFields.max))
+      } else {
+        Literal(
+          IntervalUtils.getDuration(calendarInterval, TimeUnit.MICROSECONDS),
+          DayTimeIntervalType(dayTimeFields.min, dayTimeFields.max))
+      }
     } else {
       Literal(calendarInterval, CalendarIntervalType)
     }
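The "tightest type" idea in the AstBuilder change, collect the field of every listed unit, then span from the smallest to the largest, can be sketched standalone. The field ordinals below are illustrative assumptions, not Spark's actual `DayTimeIntervalType` encoding, and `toField`/`tightestBounds` are hypothetical helper names:

```scala
// Sketch: map each day-time unit to a field ordinal (assumed here:
// DAY=0, HOUR=1, MINUTE=2, SECOND=3) and take the min/max over the list.
// "week" narrows to DAY; "millisecond"/"microsecond" narrow to SECOND.
def toField(unit: String): Int = unit match {
  case "week" | "day" => 0
  case "hour" => 1
  case "minute" => 2
  case "second" | "millisecond" | "microsecond" => 3
}

// (startField, endField) of the resulting day-time interval type.
def tightestBounds(units: Seq[String]): (Int, Int) = {
  val fields = units.map(toField)
  (fields.min, fields.max)
}
```

Under this sketch, `interval '3' day '1' hour` yields bounds (0, 1), i.e. a DAY TO HOUR interval type.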
@@ -210,6 +210,10 @@ object QueryParsingErrors {
     new ParseException(s"Intervals FROM $from TO $to are not supported.", ctx)
   }
 
+  def mixedIntervalUnitsError(literal: String, ctx: IntervalContext): Throwable = {
+    new ParseException(s"Cannot mix year-month and day-time fields: $literal", ctx)
+  }
+
   def dataTypeUnsupportedError(dataType: String, ctx: PrimitiveDataTypeContext): Throwable = {
     new ParseException(s"DataType $dataType is not supported.", ctx)
   }
@@ -17,7 +17,7 @@
 package org.apache.spark.sql.catalyst.parser
 
 import java.sql.{Date, Timestamp}
-import java.time.LocalDateTime
+import java.time.{Duration, LocalDateTime, Period}
 import java.util.concurrent.TimeUnit
 
 import scala.language.implicitConversions
@@ -27,9 +27,11 @@ import org.apache.spark.sql.catalyst.analysis.{UnresolvedAttribute, _}
 import org.apache.spark.sql.catalyst.expressions._
 import org.apache.spark.sql.catalyst.expressions.aggregate.{First, Last}
 import org.apache.spark.sql.catalyst.util.{DateTimeTestUtils, IntervalUtils}
+import org.apache.spark.sql.catalyst.util.DateTimeConstants._
 import org.apache.spark.sql.internal.SQLConf
 import org.apache.spark.sql.internal.SQLConf.TimestampTypes
 import org.apache.spark.sql.types._
+import org.apache.spark.sql.types.{DayTimeIntervalType => DT, YearMonthIntervalType => YM}
 import org.apache.spark.unsafe.types.{CalendarInterval, UTF8String}
 
 /**
@@ -678,7 +680,36 @@ class ExpressionParserSuite extends AnalysisTest {
     }
   }
 
-  def intervalLiteral(u: UTF8String, s: String): Literal = {
+  val ymIntervalUnits = Seq("year", "month")
+  val dtIntervalUnits = Seq("week", "day", "hour", "minute", "second", "millisecond", "microsecond")
+
+  def ymIntervalLiteral(u: String, s: String): Literal = {
+    val period = u match {
+      case "year" => Period.ofYears(Integer.parseInt(s))
+      case "month" => Period.ofMonths(Integer.parseInt(s))
+    }
+    Literal.create(period, YearMonthIntervalType(YM.stringToField(u)))
+  }
+
+  def dtIntervalLiteral(u: String, s: String): Literal = {
+    val value = if (u == "second") {
+      (BigDecimal(s) * NANOS_PER_SECOND).toLong
+    } else {
+      java.lang.Long.parseLong(s)
+    }
+    val (duration, field) = u match {
+      case "week" => (Duration.ofDays(value * 7), DT.DAY)
+      case "day" => (Duration.ofDays(value), DT.DAY)
+      case "hour" => (Duration.ofHours(value), DT.HOUR)
+      case "minute" => (Duration.ofMinutes(value), DT.MINUTE)
+      case "second" => (Duration.ofNanos(value), DT.SECOND)
+      case "millisecond" => (Duration.ofMillis(value), DT.SECOND)
+      case "microsecond" => (Duration.ofNanos(value * NANOS_PER_MICROS), DT.SECOND)
+    }
+    Literal.create(duration, DayTimeIntervalType(field))
+  }
+
+  def legacyIntervalLiteral(u: String, s: String): Literal = {
     Literal(IntervalUtils.stringToInterval(s + " " + u.toString))
   }
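The `dtIntervalLiteral` helper above parses seconds through `BigDecimal` so that Hive-style nanosecond notation like `13.123456789` keeps its full precision. A minimal standalone sketch of that conversion, assuming a `NANOS_PER_SECOND` constant like the one Spark takes from `DateTimeConstants` (the `secondsToDuration` name is hypothetical):

```scala
import java.time.Duration

// Sketch of the fractional-second path: parse through BigDecimal,
// scale to nanoseconds, and build a java.time.Duration.
val NANOS_PER_SECOND = 1000000000L

def secondsToDuration(s: String): Duration =
  Duration.ofNanos((BigDecimal(s) * NANOS_PER_SECOND).toLong)
```

Going through `BigDecimal` rather than `Double` avoids binary floating-point rounding, so all nine fractional digits survive the scaling.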
@@ -698,33 +729,55 @@ class ExpressionParserSuite extends AnalysisTest {
     // Single Intervals.
     val forms = Seq("", "s")
     val values = Seq("0", "10", "-7", "21")
-    Seq("year", "month", "week", "day", "hour", "minute", "second", "millisecond", "microsecond")
-      .foreach { unit =>
+
+    ymIntervalUnits.foreach { unit =>
+      forms.foreach { form =>
+        values.foreach { value =>
+          val expected = ymIntervalLiteral(unit, value)
+          checkIntervals(s"$value $unit$form", expected)
+          checkIntervals(s"'$value' $unit$form", expected)
+        }
+      }
+    }
+
+    dtIntervalUnits.foreach { unit =>
+      forms.foreach { form =>
+        values.foreach { value =>
+          val expected = dtIntervalLiteral(unit, value)
+          checkIntervals(s"$value $unit$form", expected)
+          checkIntervals(s"'$value' $unit$form", expected)
+        }
+      }
+    }
+
+    // Hive nanosecond notation.
+    checkIntervals("13.123456789 seconds", dtIntervalLiteral("second", "13.123456789"))
+
+    withSQLConf(SQLConf.LEGACY_INTERVAL_ENABLED.key -> "true") {
+      (ymIntervalUnits ++ dtIntervalUnits).foreach { unit =>
         forms.foreach { form =>
           values.foreach { value =>
-            val expected = intervalLiteral(unit, value)
+            val expected = legacyIntervalLiteral(unit, value)
             checkIntervals(s"$value $unit$form", expected)
             checkIntervals(s"'$value' $unit$form", expected)
           }
         }
       }
 
-    // Hive nanosecond notation.
-    checkIntervals("13.123456789 seconds", intervalLiteral("second", "13.123456789"))
-    checkIntervals(
-      "-13.123456789 second",
-      Literal(new CalendarInterval(
-        0,
-        0,
-        DateTimeTestUtils.secFrac(-13, -123, -456))))
-    checkIntervals(
-      "13.123456 second",
-      Literal(new CalendarInterval(
-        0,
-        0,
-        DateTimeTestUtils.secFrac(13, 123, 456))))
-    checkIntervals("1.001 second",
-      Literal(IntervalUtils.stringToInterval("1 second 1 millisecond")))
+      // Hive nanosecond notation.
+      checkIntervals(
+        "-13.123456789 second",
+        Literal(new CalendarInterval(
+          0,
+          0,
+          DateTimeTestUtils.secFrac(-13, -123, -456))))
+      checkIntervals(
+        "13.123456 second",
+        Literal(new CalendarInterval(
+          0,
+          0,
+          DateTimeTestUtils.secFrac(13, 123, 456))))
+      checkIntervals("1.001 second",
+        Literal(IntervalUtils.stringToInterval("1 second 1 millisecond")))
+    }
 
     // Non Existing unit
     intercept("interval 10 nanoseconds", "invalid unit 'nanoseconds'")
@@ -771,8 +824,22 @@
 
     // Composed intervals.
     checkIntervals(
-      "3 months 4 days 22 seconds 1 millisecond",
-      Literal(new CalendarInterval(3, 4, 22001000L)))
+      "10 years 3 months", Literal.create(Period.of(10, 3, 0), YearMonthIntervalType()))
+    checkIntervals(
+      "8 days 2 hours 3 minutes 21 seconds",
+      Literal.create(Duration.ofDays(8).plusHours(2).plusMinutes(3).plusSeconds(21)))
+
+    Seq(true, false).foreach { legacyEnabled =>
+      withSQLConf(SQLConf.LEGACY_INTERVAL_ENABLED.key -> legacyEnabled.toString) {
+        val intervalStr = "3 monThs 4 dayS 22 sEcond 1 millisecond"
+        if (legacyEnabled) {
+          checkIntervals(intervalStr, Literal(new CalendarInterval(3, 4, 22001000L)))
+        } else {
+          intercept(s"interval $intervalStr",
+            s"Cannot mix year-month and day-time fields: interval $intervalStr")
+        }
+      }
+    }
   }
 
   test("composed expressions") {
@@ -59,7 +59,11 @@ DESC FUNCTION EXTENDED boolean;
 
 -- cast string to interval and interval to string
 SELECT CAST('interval 3 month 1 hour' AS interval);
+SELECT CAST("interval '3-1' year to month" AS interval year to month);
+SELECT CAST("interval '3 00:00:01' day to second" AS interval day to second);
-SELECT CAST(interval 3 month 1 hour AS string);
+SELECT CAST(interval 3 year 1 month AS string);
+SELECT CAST(interval 3 day 1 second AS string);
 
 -- trim string before cast to numeric
 select cast(' 1' as tinyint);
@@ -1,4 +1,4 @@
-CREATE TEMPORARY VIEW t AS select '2011-05-06 07:08:09.1234567' as c, to_timestamp_ntz('2011-05-06 07:08:09.1234567') as ntz,interval 10 year 20 month 30 day 40 hour 50 minute 6.7890 second as i;
+CREATE TEMPORARY VIEW t AS select '2011-05-06 07:08:09.1234567' as c, to_timestamp_ntz('2011-05-06 07:08:09.1234567') as ntz, interval 10 year 20 month as i, interval 30 day 40 hour 50 minute 6.7890 second as j;
 
 select extract(year from c), extract(year from ntz), extract(year from i) from t;
 select extract(y from c), extract(y from ntz), extract(y from i) from t;
@@ -20,9 +20,9 @@ select extract(week from c), extract(week from ntz) from t;
 select extract(w from c), extract(w from ntz) from t;
 select extract(weeks from c), extract(weeks from ntz) from t;
 
-select extract(day from c), extract(day from ntz), extract(day from i) from t;
-select extract(d from c), extract(d from ntz), extract(d from i) from t;
-select extract(days from c), extract(days from ntz), extract(days from i) from t;
+select extract(day from c), extract(day from ntz), extract(day from j) from t;
+select extract(d from c), extract(d from ntz), extract(d from j) from t;
+select extract(days from c), extract(days from ntz), extract(days from j) from t;
 
 select extract(dayofweek from c), extract(dayofweek from ntz) from t;
 select extract(dow from c), extract(dow from ntz) from t;
@@ -32,26 +32,27 @@ select extract(dow_iso from c), extract(dow_iso from ntz) from t;
 
 select extract(doy from c), extract(doy from ntz) from t;
 
-select extract(hour from c), extract(hour from ntz), extract(hour from i) from t;
-select extract(h from c), extract(h from ntz), extract(h from i) from t;
-select extract(hours from c), extract(hours from ntz), extract(hours from i) from t;
-select extract(hr from c), extract(hr from ntz), extract(hr from i) from t;
-select extract(hrs from c), extract(hrs from ntz), extract(hrs from i) from t;
+select extract(hour from c), extract(hour from ntz), extract(hour from j) from t;
+select extract(h from c), extract(h from ntz), extract(h from j) from t;
+select extract(hours from c), extract(hours from ntz), extract(hours from j) from t;
+select extract(hr from c), extract(hr from ntz), extract(hr from j) from t;
+select extract(hrs from c), extract(hrs from ntz), extract(hrs from j) from t;
 
-select extract(minute from c), extract(minute from ntz), extract(minute from i) from t;
-select extract(m from c), extract(m from ntz), extract(m from i) from t;
-select extract(min from c), extract(min from ntz), extract(min from i) from t;
-select extract(mins from c), extract(mins from ntz), extract(mins from i) from t;
-select extract(minutes from c), extract(minutes from ntz), extract(minutes from i) from t;
+select extract(minute from c), extract(minute from ntz), extract(minute from j) from t;
+select extract(m from c), extract(m from ntz), extract(m from j) from t;
+select extract(min from c), extract(min from ntz), extract(min from j) from t;
+select extract(mins from c), extract(mins from ntz), extract(mins from j) from t;
+select extract(minutes from c), extract(minutes from ntz), extract(minutes from j) from t;
 
-select extract(second from c), extract(second from ntz), extract(second from i) from t;
-select extract(s from c), extract(s from ntz), extract(s from i) from t;
-select extract(sec from c), extract(sec from ntz), extract(sec from i) from t;
-select extract(seconds from c), extract(seconds from ntz), extract(seconds from i) from t;
-select extract(secs from c), extract(secs from ntz), extract(secs from i) from t;
+select extract(second from c), extract(second from ntz), extract(second from j) from t;
+select extract(s from c), extract(s from ntz), extract(s from j) from t;
+select extract(sec from c), extract(sec from ntz), extract(sec from j) from t;
+select extract(seconds from c), extract(seconds from ntz), extract(seconds from j) from t;
+select extract(secs from c), extract(secs from ntz), extract(secs from j) from t;
 
 select extract(not_supported from c) from t;
 select extract(not_supported from i) from t;
+select extract(not_supported from j) from t;
 
 select date_part('year', c), date_part('year', ntz), date_part('year', i) from t;
 select date_part('y', c), date_part('y', ntz), date_part('y', i) from t;
@@ -73,9 +74,9 @@ select date_part('week', c), date_part('week', ntz) from t;
 select date_part('w', c), date_part('w', ntz) from t;
 select date_part('weeks', c), date_part('weeks', ntz) from t;
 
-select date_part('day', c), date_part('day', ntz), date_part('day', i) from t;
-select date_part('d', c), date_part('d', ntz), date_part('d', i) from t;
-select date_part('days', c), date_part('days', ntz), date_part('days', i) from t;
+select date_part('day', c), date_part('day', ntz), date_part('day', j) from t;
+select date_part('d', c), date_part('d', ntz), date_part('d', j) from t;
+select date_part('days', c), date_part('days', ntz), date_part('days', j) from t;
 
 select date_part('dayofweek', c), date_part('dayofweek', ntz) from t;
 select date_part('dow', c), date_part('dow', ntz) from t;
@@ -85,23 +86,23 @@ select date_part('dow_iso', c), date_part('dow_iso', ntz) from t;
 
 select date_part('doy', c), date_part('doy', ntz) from t;
 
-select date_part('hour', c), date_part('hour', ntz), date_part('hour', i) from t;
-select date_part('h', c), date_part('h', ntz), date_part('h', i) from t;
-select date_part('hours', c), date_part('hours', ntz), date_part('hours', i) from t;
-select date_part('hr', c), date_part('hr', ntz), date_part('hr', i) from t;
-select date_part('hrs', c), date_part('hrs', ntz), date_part('hrs', i) from t;
+select date_part('hour', c), date_part('hour', ntz), date_part('hour', j) from t;
+select date_part('h', c), date_part('h', ntz), date_part('h', j) from t;
+select date_part('hours', c), date_part('hours', ntz), date_part('hours', j) from t;
+select date_part('hr', c), date_part('hr', ntz), date_part('hr', j) from t;
+select date_part('hrs', c), date_part('hrs', ntz), date_part('hrs', j) from t;
 
-select date_part('minute', c), date_part('minute', ntz), date_part('minute', i) from t;
-select date_part('m', c), date_part('m', ntz), date_part('m', i) from t;
-select date_part('min', c), date_part('min', ntz), date_part('min', i) from t;
-select date_part('mins', c), date_part('mins', ntz), date_part('mins', i) from t;
-select date_part('minutes', c), date_part('minutes', ntz), date_part('minutes', i) from t;
+select date_part('minute', c), date_part('minute', ntz), date_part('minute', j) from t;
+select date_part('m', c), date_part('m', ntz), date_part('m', j) from t;
+select date_part('min', c), date_part('min', ntz), date_part('min', j) from t;
+select date_part('mins', c), date_part('mins', ntz), date_part('mins', j) from t;
+select date_part('minutes', c), date_part('minutes', ntz), date_part('minutes', j) from t;
 
-select date_part('second', c), date_part('second', ntz), date_part('second', i) from t;
-select date_part('s', c), date_part('s', ntz), date_part('s', i) from t;
-select date_part('sec', c), date_part('sec', ntz), date_part('sec', i) from t;
-select date_part('seconds', c), date_part('seconds', ntz), date_part('seconds', i) from t;
-select date_part('secs', c), date_part('secs', ntz), date_part('secs', i) from t;
+select date_part('second', c), date_part('second', ntz), date_part('second', j) from t;
+select date_part('s', c), date_part('s', ntz), date_part('s', j) from t;
+select date_part('sec', c), date_part('sec', ntz), date_part('sec', j) from t;
+select date_part('seconds', c), date_part('seconds', ntz), date_part('seconds', j) from t;
+select date_part('secs', c), date_part('secs', ntz), date_part('secs', j) from t;
 
 select date_part('not_supported', c) from t;
 select date_part(c, c) from t;
@@ -124,10 +125,11 @@ select extract('hour', c) from t;
 select extract('minute', c) from t;
 select extract('second', c) from t;
 
-select c - i from t;
-select year(c - i) from t;
-select extract(year from c - i) from t;
+select c - j from t;
+select day(c - j) from t;
+select extract(day from c - j) from t;
 select extract(month from to_timestamp(c) - i) from t;
+select extract(second from to_timestamp(c) - j) from t;
 
 -- extract fields from year-month/day-time intervals
 select extract(YEAR from interval '2-1' YEAR TO MONTH);
@@ -3,6 +3,8 @@
 
 -- multiply and divide an interval by a number
 select 3 * (timestamp'2019-10-15 10:11:12.001002' - date'2019-10-15');
-select interval 4 month 2 weeks 3 microseconds * 1.5;
+select interval 2 years 4 months;
+select interval 2 weeks 3 microseconds * 1.5;
 select (timestamp'2019-10-15' - timestamp'2019-10-14') / 1.5;
 select interval 2147483647 month * 2;
 select interval 2147483647 month / 0.5;
@@ -17,9 +19,17 @@ select null * interval '2 seconds';
 
 -- interval with a positive/negative sign
-select -interval '-1 month 1 day -1 second';
+select -interval '-1 year 1 month';
+select -interval '-1 day 1 hour -1 minute 1 second';
-select -interval -1 month 1 day -1 second;
+select -interval -1 year 1 month;
+select -interval -1 day 1 hour -1 minute 1 second;
-select +interval '-1 month 1 day -1 second';
+select +interval '-1 year 1 month';
+select +interval '-1 day 1 hour -1 minute 1 second';
-select +interval -1 month 1 day -1 second;
+select +interval -1 year 1 month;
+select +interval -1 day 1 hour -1 minute 1 second;
 select interval -'1-1' year to month;
 select interval -'-1-1' year to month;
 select interval +'-1-1' year to month;
@@ -67,7 +77,12 @@ select cast('- +1 second' as interval);
 
 -- interval literal
 select interval 13.123456789 seconds, interval -13.123456789 second;
-select interval 1 year 2 month 3 week 4 day 5 hour 6 minute 7 seconds 8 millisecond 9 microsecond;
+select interval 1 year 2 month;
+select interval 4 day 5 hour 6 minute 7 seconds;
+select interval 3 week 8 millisecond 9 microsecond;
-select interval '30' year '25' month '-100' day '40' hour '80' minute '299.889987299' second;
+select interval '30' year '25' month;
+select interval '-100' day '40' hour '80' minute '299.889987299' second;
 select interval '0-0' year to month;
 select interval '0 0:0:0' day to second;
 select interval '0 0:0:0.1' day to second;
@@ -95,6 +110,11 @@ select interval 10 nanoseconds;
 
 -- map + interval test
-select map(1, interval 1 day, 2, interval 3 week);
+select map(1, interval 1 day, 2, interval 2 day);
+select map(1, interval 1 year, 2, interval 2 month);
+select map(1, interval 1 month, 2, interval 2 month);
+select map(1, interval 1 week, 2, interval 2 day);
+select map(1, interval 2 millisecond, 3, interval 3 microsecond);
 
 -- typed interval expression
 select interval 'interval 3 year 1 hour';
@@ -219,14 +239,28 @@ select a / 0.5 from values (interval '-2147483648 months', interval '2147483647
 -- interval support for csv and json functions
 SELECT
-  from_csv('1, 1 day', 'a INT, b interval'),
+  from_csv('1, 1', 'a INT, b interval day'),
-  to_csv(from_csv('1, 1 day', 'a INT, b interval')),
-  to_csv(named_struct('a', interval 32 month, 'b', interval 70 minute)),
-  from_csv(to_csv(named_struct('a', interval 32 month, 'b', interval 70 minute)), 'a interval, b interval');
+  to_csv(from_csv('1, 1', 'a INT, b interval day')),
+  to_csv(named_struct('a', interval 32 hour, 'b', interval 70 minute)),
+  from_csv(to_csv(named_struct('a', interval 32 hour, 'b', interval 70 minute)), 'a interval hour, b interval minute');
 SELECT
-  from_json('{"a":"1 days"}', 'a interval'),
+  from_csv('1, 1', 'a INT, b interval year'),
-  to_json(from_json('{"a":"1 days"}', 'a interval')),
-  to_json(map('a', interval 25 month 100 day 130 minute)),
-  from_json(to_json(map('a', interval 25 month 100 day 130 minute)), 'a interval');
+  to_csv(from_csv('1, 1', 'a INT, b interval year')),
+  to_csv(named_struct('a', interval 32 year, 'b', interval 10 month)),
+  from_csv(to_csv(named_struct('a', interval 32 year, 'b', interval 10 month)), 'a interval year, b interval month');
+SELECT
+  from_json('{"a":"1"}', 'a interval day'),
+  to_json(from_json('{"a":"1"}', 'a interval day')),
+  to_json(map('a', interval 100 day 130 minute)),
+  from_json(to_json(map('a', interval 100 day 130 minute)), 'a interval day to minute');
+SELECT
+  from_json('{"a":"1"}', 'a interval year'),
+  to_json(from_json('{"a":"1"}', 'a interval year')),
+  to_json(map('a', interval 32 year 10 month)),
+  from_json(to_json(map('a', interval 32 year 10 month)), 'a interval year to month');
 
 select interval '+';
 select interval '+.';
@@ -274,3 +308,16 @@ SELECT INTERVAL '-2562047789:00' HOUR TO MINUTE;
 SELECT INTERVAL '153722867281:54.775808' MINUTE TO SECOND;
 SELECT INTERVAL '-153722867281:54.775808' MINUTE TO SECOND;
 
+SELECT INTERVAL '178956970' YEAR;
+SELECT INTERVAL '-178956970' YEAR;
+SELECT INTERVAL '2147483647' MONTH;
+SELECT INTERVAL '-2147483647' MONTH;
+
+SELECT INTERVAL '106751991' DAY;
+SELECT INTERVAL '-106751991' DAY;
+SELECT INTERVAL '2562047788' HOUR;
+SELECT INTERVAL '-2562047788' HOUR;
+SELECT INTERVAL '153722867280' MINUTE;
+SELECT INTERVAL '-153722867280' MINUTE;
+SELECT INTERVAL '54.775807' SECOND;
+SELECT INTERVAL '-54.775807' SECOND;
@@ -339,7 +339,7 @@ select timestamp '2019-01-01中文'
 -- !query
 select timestamp'2011-11-11 11:11:11' + interval '2' day
 -- !query schema
-struct<TIMESTAMP '2011-11-11 11:11:11' + INTERVAL '2 days':timestamp>
+struct<TIMESTAMP '2011-11-11 11:11:11' + INTERVAL '2' DAY:timestamp>
 -- !query output
 2011-11-13 11:11:11
@@ -347,7 +347,7 @@ struct<TIMESTAMP '2011-11-11 11:11:11' + INTERVAL '2 days':timestamp>
 -- !query
 select timestamp'2011-11-11 11:11:11' - interval '2' day
 -- !query schema
-struct<TIMESTAMP '2011-11-11 11:11:11' - INTERVAL '2 days':timestamp>
+struct<TIMESTAMP '2011-11-11 11:11:11' - INTERVAL '2' DAY:timestamp>
 -- !query output
 2011-11-09 11:11:11
@@ -355,25 +355,23 @@ struct<TIMESTAMP '2011-11-11 11:11:11' - INTERVAL '2 days':timestamp>
 -- !query
 select date'2011-11-11 11:11:11' + interval '2' second
 -- !query schema
-struct<>
+struct<DATE '2011-11-11' + INTERVAL '02' SECOND:timestamp>
 -- !query output
-java.lang.IllegalArgumentException
-requirement failed: Cannot add hours, minutes or seconds, milliseconds, microseconds to a date
+2011-11-11 00:00:02
 
 
 -- !query
 select date'2011-11-11 11:11:11' - interval '2' second
 -- !query schema
-struct<>
+struct<DATE '2011-11-11' - INTERVAL '02' SECOND:timestamp>
 -- !query output
-java.lang.IllegalArgumentException
-requirement failed: Cannot add hours, minutes or seconds, milliseconds, microseconds to a date
+2011-11-10 23:59:58
 
 
 -- !query
 select '2011-11-11' - interval '2' day
 -- !query schema
-struct<2011-11-11 - INTERVAL '2 days':string>
+struct<2011-11-11 - INTERVAL '2' DAY:string>
 -- !query output
 2011-11-09 00:00:00
@@ -381,7 +379,7 @@ struct<2011-11-11 - INTERVAL '2 days':string>
 -- !query
 select '2011-11-11 11:11:11' - interval '2' second
 -- !query schema
-struct<2011-11-11 11:11:11 - INTERVAL '2 seconds':string>
+struct<2011-11-11 11:11:11 - INTERVAL '02' SECOND:string>
 -- !query output
 2011-11-11 11:11:09
@ -401,7 +399,7 @@ select 1 - interval '2' second
|
|||
struct<>
|
||||
-- !query output
|
||||
org.apache.spark.sql.AnalysisException
|
||||
cannot resolve '1 + (- INTERVAL '2 seconds')' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, '1' is of int type.; line 1 pos 7
|
||||
cannot resolve '1 + (- INTERVAL '02' SECOND)' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, '1' is of int type.; line 1 pos 7
|
||||
|
||||
|
||||
-- !query
|
||||
|
@ -1527,7 +1525,7 @@ NULL
|
|||
-- !query
|
||||
select to_timestamp_ntz('2021-06-25 10:11:12') + interval 2 day
|
||||
-- !query schema
|
||||
struct<to_timestamp_ntz(2021-06-25 10:11:12) + INTERVAL '2 days':timestamp_ntz>
|
||||
struct<to_timestamp_ntz(2021-06-25 10:11:12) + INTERVAL '2' DAY:timestamp_ntz>
|
||||
-- !query output
|
||||
2021-06-27 10:11:12
|
||||
|
||||
|
@ -1599,7 +1597,7 @@ struct<to_timestamp_ntz(2021-06-25 10:11:12) + INTERVAL '20 15:40:32.998999' DAY
|
|||
-- !query
|
||||
select to_timestamp_ntz('2021-06-25 10:11:12') - interval 2 day
|
||||
-- !query schema
|
||||
struct<to_timestamp_ntz(2021-06-25 10:11:12) - INTERVAL '2 days':timestamp_ntz>
|
||||
struct<to_timestamp_ntz(2021-06-25 10:11:12) - INTERVAL '2' DAY:timestamp_ntz>
|
||||
-- !query output
|
||||
2021-06-23 10:11:12
|
||||
|
||||
|
|
|
@@ -1,5 +1,5 @@
 -- Automatically generated by SQLQueryTestSuite
--- Number of queries: 165
+-- Number of queries: 199


 -- !query
@@ -13,9 +13,31 @@ struct<((TIMESTAMP '2019-10-15 10:11:12.001002' - DATE '2019-10-15') * 3):interv
 -- !query
 select interval 4 month 2 weeks 3 microseconds * 1.5
 -- !query schema
-struct<multiply_interval(INTERVAL '4 months 14 days 0.000003 seconds', 1.5):interval>
+struct<>
 -- !query output
-6 months 21 days 0.000005 seconds
+org.apache.spark.sql.catalyst.parser.ParseException
+
+Cannot mix year-month and day-time fields: interval 4 month 2 weeks 3 microseconds(line 1, pos 7)
+
+== SQL ==
+select interval 4 month 2 weeks 3 microseconds * 1.5
+-------^^^
+
+
+-- !query
+select interval 2 years 4 months
+-- !query schema
+struct<INTERVAL '2-4' YEAR TO MONTH:interval year to month>
+-- !query output
+2-4
+
+
+-- !query
+select interval 2 weeks 3 microseconds * 1.5
+-- !query schema
+struct<(INTERVAL '14 00:00:00.000003' DAY TO SECOND * 1.5):interval day to second>
+-- !query output
+21 00:00:00.000005000


 -- !query
@@ -41,7 +63,7 @@ select interval 2147483647 month / 0.5
 struct<>
 -- !query output
 java.lang.ArithmeticException
-integer overflow
+Overflow


 -- !query
@@ -50,7 +72,7 @@ select interval 2147483647 day * 2
 struct<>
 -- !query output
 java.lang.ArithmeticException
-integer overflow
+long overflow


 -- !query
@@ -59,7 +81,7 @@ select interval 2147483647 day / 0.5
 struct<>
 -- !query output
 java.lang.ArithmeticException
-integer overflow
+long overflow


 -- !query
@@ -103,12 +125,50 @@ struct<(- INTERVAL '-1 months 1 days -1 seconds'):interval>
 1 months -1 days 1 seconds


+-- !query
+select -interval '-1 year 1 month'
+-- !query schema
+struct<(- INTERVAL '-11 months'):interval>
+-- !query output
+11 months
+
+
+-- !query
+select -interval '-1 day 1 hour -1 minute 1 second'
+-- !query schema
+struct<(- INTERVAL '-1 days 59 minutes 1 seconds'):interval>
+-- !query output
+1 days -59 minutes -1 seconds
+
+
 -- !query
 select -interval -1 month 1 day -1 second
 -- !query schema
-struct<(- INTERVAL '-1 months 1 days -1 seconds'):interval>
+struct<>
 -- !query output
-1 months -1 days 1 seconds
+org.apache.spark.sql.catalyst.parser.ParseException
+
+Cannot mix year-month and day-time fields: interval -1 month 1 day -1 second(line 1, pos 8)
+
+== SQL ==
+select -interval -1 month 1 day -1 second
+--------^^^
+
+
+-- !query
+select -interval -1 year 1 month
+-- !query schema
+struct<(- INTERVAL '-0-11' YEAR TO MONTH):interval year to month>
+-- !query output
+0-11
+
+
+-- !query
+select -interval -1 day 1 hour -1 minute 1 second
+-- !query schema
+struct<(- INTERVAL '-0 23:00:59' DAY TO SECOND):interval day to second>
+-- !query output
+0 23:00:59.000000000


 -- !query
@@ -119,12 +179,50 @@ struct<(+ INTERVAL '-1 months 1 days -1 seconds'):interval>
 -1 months 1 days -1 seconds


+-- !query
+select +interval '-1 year 1 month'
+-- !query schema
+struct<(+ INTERVAL '-11 months'):interval>
+-- !query output
+-11 months
+
+
+-- !query
+select +interval '-1 day 1 hour -1 minute 1 second'
+-- !query schema
+struct<(+ INTERVAL '-1 days 59 minutes 1 seconds'):interval>
+-- !query output
+-1 days 59 minutes 1 seconds
+
+
 -- !query
 select +interval -1 month 1 day -1 second
 -- !query schema
-struct<(+ INTERVAL '-1 months 1 days -1 seconds'):interval>
+struct<>
 -- !query output
--1 months 1 days -1 seconds
+org.apache.spark.sql.catalyst.parser.ParseException
+
+Cannot mix year-month and day-time fields: interval -1 month 1 day -1 second(line 1, pos 8)
+
+== SQL ==
+select +interval -1 month 1 day -1 second
+--------^^^
+
+
+-- !query
+select +interval -1 year 1 month
+-- !query schema
+struct<(+ INTERVAL '-0-11' YEAR TO MONTH):interval year to month>
+-- !query output
+-0-11
+
+
+-- !query
+select +interval -1 day 1 hour -1 minute 1 second
+-- !query schema
+struct<(+ INTERVAL '-0 23:00:59' DAY TO SECOND):interval day to second>
+-- !query output
+-0 23:00:59.000000000


 -- !query
@@ -414,25 +512,77 @@ NULL
 -- !query
 select interval 13.123456789 seconds, interval -13.123456789 second
 -- !query schema
-struct<INTERVAL '13.123456 seconds':interval,INTERVAL '-13.123456 seconds':interval>
+struct<INTERVAL '13.123456' SECOND:interval second,INTERVAL '-13.123456' SECOND:interval second>
 -- !query output
-13.123456 seconds -13.123456 seconds
+0 00:00:13.123456000 -0 00:00:13.123456000


 -- !query
 select interval 1 year 2 month 3 week 4 day 5 hour 6 minute 7 seconds 8 millisecond 9 microsecond
 -- !query schema
-struct<INTERVAL '1 years 2 months 25 days 5 hours 6 minutes 7.008009 seconds':interval>
+struct<>
 -- !query output
-1 years 2 months 25 days 5 hours 6 minutes 7.008009 seconds
+org.apache.spark.sql.catalyst.parser.ParseException
+
+Cannot mix year-month and day-time fields: interval 1 year 2 month 3 week 4 day 5 hour 6 minute 7 seconds 8 millisecond 9 microsecond(line 1, pos 7)
+
+== SQL ==
+select interval 1 year 2 month 3 week 4 day 5 hour 6 minute 7 seconds 8 millisecond 9 microsecond
+-------^^^
+
+
+-- !query
+select interval 1 year 2 month
+-- !query schema
+struct<INTERVAL '1-2' YEAR TO MONTH:interval year to month>
+-- !query output
+1-2
+
+
+-- !query
+select interval 4 day 5 hour 6 minute 7 seconds
+-- !query schema
+struct<INTERVAL '4 05:06:07' DAY TO SECOND:interval day to second>
+-- !query output
+4 05:06:07.000000000
+
+
+-- !query
+select interval 3 week 8 millisecond 9 microsecond
+-- !query schema
+struct<INTERVAL '21 00:00:00.008009' DAY TO SECOND:interval day to second>
+-- !query output
+21 00:00:00.008009000


 -- !query
 select interval '30' year '25' month '-100' day '40' hour '80' minute '299.889987299' second
 -- !query schema
-struct<INTERVAL '32 years 1 months -100 days 41 hours 24 minutes 59.889987 seconds':interval>
+struct<>
 -- !query output
-32 years 1 months -100 days 41 hours 24 minutes 59.889987 seconds
+org.apache.spark.sql.catalyst.parser.ParseException
+
+Cannot mix year-month and day-time fields: interval '30' year '25' month '-100' day '40' hour '80' minute '299.889987299' second(line 1, pos 7)
+
+== SQL ==
+select interval '30' year '25' month '-100' day '40' hour '80' minute '299.889987299' second
+-------^^^
+
+
+-- !query
+select interval '30' year '25' month
+-- !query schema
+struct<INTERVAL '32-1' YEAR TO MONTH:interval year to month>
+-- !query output
+32-1
+
+
+-- !query
+select interval '-100' day '40' hour '80' minute '299.889987299' second
+-- !query schema
+struct<INTERVAL '-98 06:35:00.110013' DAY TO SECOND:interval day to second>
+-- !query output
+-98 06:35:00.110013000


 -- !query
@@ -526,17 +676,17 @@ struct<INTERVAL '40:32' MINUTE TO SECOND:interval minute to second>
 -- !query
 select interval 30 day day
 -- !query schema
-struct<day:interval>
+struct<day:interval day>
 -- !query output
-30 days
+30 00:00:00.000000000


 -- !query
 select interval 30 days days
 -- !query schema
-struct<days:interval>
+struct<days:interval day>
 -- !query output
-30 days
+30 00:00:00.000000000


 -- !query
@@ -640,9 +790,50 @@ select interval 10 nanoseconds
 -- !query
 select map(1, interval 1 day, 2, interval 3 week)
 -- !query schema
-struct<map(1, INTERVAL '1 days', 2, INTERVAL '21 days'):map<int,interval>>
+struct<map(1, INTERVAL '1' DAY, 2, INTERVAL '21' DAY):map<int,interval day>>
 -- !query output
-{1:1 days,2:21 days}
+{1:1 00:00:00.000000000,2:21 00:00:00.000000000}
+
+
+-- !query
+select map(1, interval 1 day, 2, interval 2 day)
+-- !query schema
+struct<map(1, INTERVAL '1' DAY, 2, INTERVAL '2' DAY):map<int,interval day>>
+-- !query output
+{1:1 00:00:00.000000000,2:2 00:00:00.000000000}
+
+
+-- !query
+select map(1, interval 1 year, 2, interval 2 month)
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.sql.AnalysisException
+cannot resolve 'map(1, INTERVAL '1' YEAR, 2, INTERVAL '2' MONTH)' due to data type mismatch: The given values of function map should all be the same type, but they are [interval year, interval month]; line 1 pos 7
+
+
+-- !query
+select map(1, interval 1 month, 2, interval 2 month)
+-- !query schema
+struct<map(1, INTERVAL '1' MONTH, 2, INTERVAL '2' MONTH):map<int,interval month>>
+-- !query output
+{1:0-1,2:0-2}
+
+
+-- !query
+select map(1, interval 1 week, 2, interval 2 day)
+-- !query schema
+struct<map(1, INTERVAL '7' DAY, 2, INTERVAL '2' DAY):map<int,interval day>>
+-- !query output
+{1:7 00:00:00.000000000,2:2 00:00:00.000000000}
+
+
+-- !query
+select map(1, interval 2 millisecond, 3, interval 3 microsecond)
+-- !query schema
+struct<map(1, INTERVAL '00.002' SECOND, 3, INTERVAL '00.000003' SECOND):map<int,interval second>>
+-- !query output
+{1:0 00:00:00.002000000,3:0 00:00:00.000003000}


 -- !query
@@ -1065,17 +1256,17 @@ struct<INTERVAL '1 days':interval>
 -- !query
 select interval '1\t' day
 -- !query schema
-struct<INTERVAL '1 days':interval>
+struct<INTERVAL '1' DAY:interval day>
 -- !query output
-1 days
+1 00:00:00.000000000


 -- !query
 select interval '1 ' day
 -- !query schema
-struct<INTERVAL '1 days':interval>
+struct<INTERVAL '1' DAY:interval day>
 -- !query output
-1 days
+1 00:00:00.000000000


 -- !query
@@ -1213,25 +1404,53 @@ integer overflow
 -- !query
 SELECT
 from_csv('1, 1 day', 'a INT, b interval'),
+from_csv('1, 1', 'a INT, b interval day'),
 to_csv(from_csv('1, 1 day', 'a INT, b interval')),
-to_csv(named_struct('a', interval 32 month, 'b', interval 70 minute)),
-from_csv(to_csv(named_struct('a', interval 32 month, 'b', interval 70 minute)), 'a interval, b interval')
+to_csv(from_csv('1, 1', 'a INT, b interval day')),
+to_csv(named_struct('a', interval 32 hour, 'b', interval 70 minute)),
+from_csv(to_csv(named_struct('a', interval 32 hour, 'b', interval 70 minute)), 'a interval hour, b interval minute')
 -- !query schema
-struct<from_csv(1, 1 day):struct<a:int,b:interval>,to_csv(from_csv(1, 1 day)):string,to_csv(named_struct(a, INTERVAL '2 years 8 months', b, INTERVAL '1 hours 10 minutes')):string,from_csv(to_csv(named_struct(a, INTERVAL '2 years 8 months', b, INTERVAL '1 hours 10 minutes'))):struct<a:interval,b:interval>>
+struct<from_csv(1, 1 day):struct<a:int,b:interval>,from_csv(1, 1):struct<a:int,b:interval day>,to_csv(from_csv(1, 1 day)):string,to_csv(from_csv(1, 1)):string,to_csv(named_struct(a, INTERVAL '32' HOUR, b, INTERVAL '70' MINUTE)):string,from_csv(to_csv(named_struct(a, INTERVAL '32' HOUR, b, INTERVAL '70' MINUTE))):struct<a:interval hour,b:interval minute>>
 -- !query output
-{"a":1,"b":1 days} 1,1 days 2 years 8 months,1 hours 10 minutes {"a":2 years 8 months,"b":1 hours 10 minutes}
+{"a":1,"b":1 days} {"a":1,"b":1 00:00:00.000000000} 1,1 days 1,INTERVAL '1' DAY INTERVAL '32' HOUR,INTERVAL '70' MINUTE {"a":1 08:00:00.000000000,"b":0 01:10:00.000000000}


 -- !query
 SELECT
 from_json('{"a":"1 days"}', 'a interval'),
+from_csv('1, 1', 'a INT, b interval year'),
 to_json(from_json('{"a":"1 days"}', 'a interval')),
-to_json(map('a', interval 25 month 100 day 130 minute)),
-from_json(to_json(map('a', interval 25 month 100 day 130 minute)), 'a interval')
+to_csv(from_csv('1, 1', 'a INT, b interval year')),
+to_csv(named_struct('a', interval 32 year, 'b', interval 10 month)),
+from_csv(to_csv(named_struct('a', interval 32 year, 'b', interval 10 month)), 'a interval year, b interval month')
 -- !query schema
-struct<from_json({"a":"1 days"}):struct<a:interval>,to_json(from_json({"a":"1 days"})):string,to_json(map(a, INTERVAL '2 years 1 months 100 days 2 hours 10 minutes')):string,from_json(to_json(map(a, INTERVAL '2 years 1 months 100 days 2 hours 10 minutes'))):struct<a:interval>>
+struct<from_json({"a":"1 days"}):struct<a:interval>,from_csv(1, 1):struct<a:int,b:interval year>,to_json(from_json({"a":"1 days"})):string,to_csv(from_csv(1, 1)):string,to_csv(named_struct(a, INTERVAL '32' YEAR, b, INTERVAL '10' MONTH)):string,from_csv(to_csv(named_struct(a, INTERVAL '32' YEAR, b, INTERVAL '10' MONTH))):struct<a:interval year,b:interval month>>
 -- !query output
-{"a":1 days} {"a":"1 days"} {"a":"2 years 1 months 100 days 2 hours 10 minutes"} {"a":2 years 1 months 100 days 2 hours 10 minutes}
+{"a":1 days} {"a":1,"b":1-0} {"a":"1 days"} 1,INTERVAL '1' YEAR INTERVAL '32' YEAR,INTERVAL '10' MONTH {"a":32-0,"b":0-10}
+
+
+-- !query
+SELECT
+from_json('{"a":"1"}', 'a interval day'),
+to_json(from_json('{"a":"1"}', 'a interval day')),
+to_json(map('a', interval 100 day 130 minute)),
+from_json(to_json(map('a', interval 100 day 130 minute)), 'a interval day to minute')
+-- !query schema
+struct<from_json({"a":"1"}):struct<a:interval day>,to_json(from_json({"a":"1"})):string,to_json(map(a, INTERVAL '100 02:10' DAY TO MINUTE)):string,from_json(to_json(map(a, INTERVAL '100 02:10' DAY TO MINUTE))):struct<a:interval day to minute>>
+-- !query output
+{"a":1 00:00:00.000000000} {"a":"INTERVAL '1' DAY"} {"a":"INTERVAL '100 02:10' DAY TO MINUTE"} {"a":100 02:10:00.000000000}
+
+
+-- !query
+SELECT
+from_json('{"a":"1"}', 'a interval year'),
+to_json(from_json('{"a":"1"}', 'a interval year')),
+to_json(map('a', interval 32 year 10 month)),
+from_json(to_json(map('a', interval 32 year 10 month)), 'a interval year to month')
+-- !query schema
+struct<from_json({"a":"1"}):struct<a:interval year>,to_json(from_json({"a":"1"})):string,to_json(map(a, INTERVAL '32-10' YEAR TO MONTH)):string,from_json(to_json(map(a, INTERVAL '32-10' YEAR TO MONTH))):struct<a:interval year to month>>
+-- !query output
+{"a":1-0} {"a":"INTERVAL '1' YEAR"} {"a":"INTERVAL '32-10' YEAR TO MONTH"} {"a":32-10}


 -- !query
@@ -1363,9 +1582,9 @@ select interval 'interval 1' day
 -- !query
 select interval '-\t 1' day
 -- !query schema
-struct<INTERVAL '-1 days':interval>
+struct<INTERVAL '-1' DAY:interval day>
 -- !query output
--1 days
+-1 00:00:00.000000000


 -- !query
@@ -1650,3 +1869,99 @@ requirement failed: minute 153722867281 outside range [0, 153722867280](line 1,
 == SQL ==
 SELECT INTERVAL '-153722867281:54.775808' MINUTE TO SECOND
 ----------------^^^
+
+
+-- !query
+SELECT INTERVAL '178956970' YEAR
+-- !query schema
+struct<INTERVAL '178956970' YEAR:interval year>
+-- !query output
+178956970-0
+
+
+-- !query
+SELECT INTERVAL '-178956970' YEAR
+-- !query schema
+struct<INTERVAL '-178956970' YEAR:interval year>
+-- !query output
+-178956970-0
+
+
+-- !query
+SELECT INTERVAL '2147483647' MONTH
+-- !query schema
+struct<INTERVAL '2147483647' MONTH:interval month>
+-- !query output
+178956970-7
+
+
+-- !query
+SELECT INTERVAL '-2147483647' MONTH
+-- !query schema
+struct<INTERVAL '-2147483647' MONTH:interval month>
+-- !query output
+-178956970-7
+
+
+-- !query
+SELECT INTERVAL '106751991' DAY
+-- !query schema
+struct<INTERVAL '106751991' DAY:interval day>
+-- !query output
+106751991 00:00:00.000000000
+
+
+-- !query
+SELECT INTERVAL '-106751991' DAY
+-- !query schema
+struct<INTERVAL '-106751991' DAY:interval day>
+-- !query output
+-106751991 00:00:00.000000000
+
+
+-- !query
+SELECT INTERVAL '2562047788' HOUR
+-- !query schema
+struct<INTERVAL '2562047788' HOUR:interval hour>
+-- !query output
+106751991 04:00:00.000000000
+
+
+-- !query
+SELECT INTERVAL '-2562047788' HOUR
+-- !query schema
+struct<INTERVAL '-2562047788' HOUR:interval hour>
+-- !query output
+-106751991 04:00:00.000000000
+
+
+-- !query
+SELECT INTERVAL '153722867280' MINUTE
+-- !query schema
+struct<INTERVAL '153722867280' MINUTE:interval minute>
+-- !query output
+106751991 04:00:00.000000000
+
+
+-- !query
+SELECT INTERVAL '-153722867280' MINUTE
+-- !query schema
+struct<INTERVAL '-153722867280' MINUTE:interval minute>
+-- !query output
+-106751991 04:00:00.000000000
+
+
+-- !query
+SELECT INTERVAL '54.775807' SECOND
+-- !query schema
+struct<INTERVAL '54.775807' SECOND:interval second>
+-- !query output
+0 00:00:54.775807000
+
+
+-- !query
+SELECT INTERVAL '-54.775807' SECOND
+-- !query schema
+struct<INTERVAL '-54.775807' SECOND:interval second>
+-- !query output
+-0 00:00:54.775807000
@@ -1,5 +1,5 @@
 -- Automatically generated by SQLQueryTestSuite
--- Number of queries: 51
+-- Number of queries: 55


 -- !query
@@ -283,12 +283,50 @@ struct<CAST(interval 3 month 1 hour AS INTERVAL):interval>
 3 months 1 hours


+-- !query
+SELECT CAST("interval '3-1' year to month" AS interval year to month)
+-- !query schema
+struct<CAST(interval '3-1' year to month AS INTERVAL YEAR TO MONTH):interval year to month>
+-- !query output
+3-1
+
+
+-- !query
+SELECT CAST("interval '3 00:00:01' day to second" AS interval day to second)
+-- !query schema
+struct<CAST(interval '3 00:00:01' day to second AS INTERVAL DAY TO SECOND):interval day to second>
+-- !query output
+3 00:00:01.000000000
+
+
 -- !query
 SELECT CAST(interval 3 month 1 hour AS string)
 -- !query schema
-struct<CAST(INTERVAL '3 months 1 hours' AS STRING):string>
+struct<>
 -- !query output
-3 months 1 hours
+org.apache.spark.sql.catalyst.parser.ParseException
+
+Cannot mix year-month and day-time fields: interval 3 month 1 hour(line 1, pos 12)
+
+== SQL ==
+SELECT CAST(interval 3 month 1 hour AS string)
+------------^^^
+
+
+-- !query
+SELECT CAST(interval 3 year 1 month AS string)
+-- !query schema
+struct<CAST(INTERVAL '3-1' YEAR TO MONTH AS STRING):string>
+-- !query output
+INTERVAL '3-1' YEAR TO MONTH
+
+
+-- !query
+SELECT CAST(interval 3 day 1 second AS string)
+-- !query schema
+struct<CAST(INTERVAL '3 00:00:01' DAY TO SECOND AS STRING):string>
+-- !query output
+INTERVAL '3 00:00:01' DAY TO SECOND


 -- !query
@@ -319,7 +319,7 @@ select timestamp '2019-01-01中文'
 -- !query
 select timestamp'2011-11-11 11:11:11' + interval '2' day
 -- !query schema
-struct<TIMESTAMP '2011-11-11 11:11:11' + INTERVAL '2 days':timestamp>
+struct<TIMESTAMP '2011-11-11 11:11:11' + INTERVAL '2' DAY:timestamp>
 -- !query output
 2011-11-13 11:11:11


@@ -327,7 +327,7 @@ struct<TIMESTAMP '2011-11-11 11:11:11' + INTERVAL '2 days':timestamp>
 -- !query
 select timestamp'2011-11-11 11:11:11' - interval '2' day
 -- !query schema
-struct<TIMESTAMP '2011-11-11 11:11:11' - INTERVAL '2 days':timestamp>
+struct<TIMESTAMP '2011-11-11 11:11:11' - INTERVAL '2' DAY:timestamp>
 -- !query output
 2011-11-09 11:11:11


@@ -335,23 +335,23 @@ struct<TIMESTAMP '2011-11-11 11:11:11' - INTERVAL '2 days':timestamp>
 -- !query
 select date'2011-11-11 11:11:11' + interval '2' second
 -- !query schema
-struct<DATE '2011-11-11' + INTERVAL '2 seconds':date>
+struct<DATE '2011-11-11' + INTERVAL '02' SECOND:timestamp>
 -- !query output
-2011-11-11
+2011-11-11 00:00:02


 -- !query
 select date'2011-11-11 11:11:11' - interval '2' second
 -- !query schema
-struct<DATE '2011-11-11' - INTERVAL '2 seconds':date>
+struct<DATE '2011-11-11' - INTERVAL '02' SECOND:timestamp>
 -- !query output
-2011-11-10
+2011-11-10 23:59:58


 -- !query
 select '2011-11-11' - interval '2' day
 -- !query schema
-struct<2011-11-11 - INTERVAL '2 days':string>
+struct<2011-11-11 - INTERVAL '2' DAY:string>
 -- !query output
 2011-11-09 00:00:00


@@ -359,7 +359,7 @@ struct<2011-11-11 - INTERVAL '2 days':string>
 -- !query
 select '2011-11-11 11:11:11' - interval '2' second
 -- !query schema
-struct<2011-11-11 11:11:11 - INTERVAL '2 seconds':string>
+struct<2011-11-11 11:11:11 - INTERVAL '02' SECOND:string>
 -- !query output
 2011-11-11 11:11:09


@@ -367,7 +367,7 @@ struct<2011-11-11 11:11:11 - INTERVAL '2 seconds':string>
 -- !query
 select '1' - interval '2' second
 -- !query schema
-struct<1 - INTERVAL '2 seconds':string>
+struct<1 - INTERVAL '02' SECOND:string>
 -- !query output
 NULL


@@ -378,7 +378,7 @@ select 1 - interval '2' second
 struct<>
 -- !query output
 org.apache.spark.sql.AnalysisException
-cannot resolve '1 + (- INTERVAL '2 seconds')' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, '1' is of int type.; line 1 pos 7
+cannot resolve '1 + (- INTERVAL '02' SECOND)' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, '1' is of int type.; line 1 pos 7


 -- !query
@@ -1469,7 +1469,7 @@ NULL
 -- !query
 select to_timestamp_ntz('2021-06-25 10:11:12') + interval 2 day
 -- !query schema
-struct<to_timestamp_ntz(2021-06-25 10:11:12) + INTERVAL '2 days':timestamp_ntz>
+struct<to_timestamp_ntz(2021-06-25 10:11:12) + INTERVAL '2' DAY:timestamp_ntz>
 -- !query output
 2021-06-27 10:11:12


@@ -1541,7 +1541,7 @@ struct<to_timestamp_ntz(2021-06-25 10:11:12) + INTERVAL '20 15:40:32.998999' DAY
 -- !query
 select to_timestamp_ntz('2021-06-25 10:11:12') - interval 2 day
 -- !query schema
-struct<to_timestamp_ntz(2021-06-25 10:11:12) - INTERVAL '2 days':timestamp_ntz>
+struct<to_timestamp_ntz(2021-06-25 10:11:12) - INTERVAL '2' DAY:timestamp_ntz>
 -- !query output
 2021-06-23 10:11:12
@@ -319,7 +319,7 @@ select timestamp '2019-01-01中文'
 -- !query
 select timestamp'2011-11-11 11:11:11' + interval '2' day
 -- !query schema
-struct<TIMESTAMP '2011-11-11 11:11:11' + INTERVAL '2 days':timestamp>
+struct<TIMESTAMP '2011-11-11 11:11:11' + INTERVAL '2' DAY:timestamp>
 -- !query output
 2011-11-13 11:11:11


@@ -327,7 +327,7 @@ struct<TIMESTAMP '2011-11-11 11:11:11' + INTERVAL '2 days':timestamp>
 -- !query
 select timestamp'2011-11-11 11:11:11' - interval '2' day
 -- !query schema
-struct<TIMESTAMP '2011-11-11 11:11:11' - INTERVAL '2 days':timestamp>
+struct<TIMESTAMP '2011-11-11 11:11:11' - INTERVAL '2' DAY:timestamp>
 -- !query output
 2011-11-09 11:11:11


@@ -335,23 +335,23 @@ struct<TIMESTAMP '2011-11-11 11:11:11' - INTERVAL '2 days':timestamp>
 -- !query
 select date'2011-11-11 11:11:11' + interval '2' second
 -- !query schema
-struct<DATE '2011-11-11' + INTERVAL '2 seconds':date>
+struct<DATE '2011-11-11' + INTERVAL '02' SECOND:timestamp>
 -- !query output
-2011-11-11
+2011-11-11 00:00:02


 -- !query
 select date'2011-11-11 11:11:11' - interval '2' second
 -- !query schema
-struct<DATE '2011-11-11' - INTERVAL '2 seconds':date>
+struct<DATE '2011-11-11' - INTERVAL '02' SECOND:timestamp>
 -- !query output
-2011-11-10
+2011-11-10 23:59:58


 -- !query
 select '2011-11-11' - interval '2' day
 -- !query schema
-struct<2011-11-11 - INTERVAL '2 days':string>
+struct<2011-11-11 - INTERVAL '2' DAY:string>
 -- !query output
 2011-11-09 00:00:00


@@ -359,7 +359,7 @@ struct<2011-11-11 - INTERVAL '2 days':string>
 -- !query
 select '2011-11-11 11:11:11' - interval '2' second
 -- !query schema
-struct<2011-11-11 11:11:11 - INTERVAL '2 seconds':string>
+struct<2011-11-11 11:11:11 - INTERVAL '02' SECOND:string>
 -- !query output
 2011-11-11 11:11:09


@@ -367,7 +367,7 @@ struct<2011-11-11 11:11:11 - INTERVAL '2 seconds':string>
 -- !query
 select '1' - interval '2' second
 -- !query schema
-struct<1 - INTERVAL '2 seconds':string>
+struct<1 - INTERVAL '02' SECOND:string>
 -- !query output
 NULL


@@ -378,7 +378,7 @@ select 1 - interval '2' second
 struct<>
 -- !query output
 org.apache.spark.sql.AnalysisException
-cannot resolve '1 + (- INTERVAL '2 seconds')' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, '1' is of int type.; line 1 pos 7
+cannot resolve '1 + (- INTERVAL '02' SECOND)' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, '1' is of int type.; line 1 pos 7


 -- !query
@@ -1477,7 +1477,7 @@ NULL
 -- !query
 select to_timestamp_ntz('2021-06-25 10:11:12') + interval 2 day
 -- !query schema
-struct<to_timestamp_ntz(2021-06-25 10:11:12) + INTERVAL '2 days':timestamp_ntz>
+struct<to_timestamp_ntz(2021-06-25 10:11:12) + INTERVAL '2' DAY:timestamp_ntz>
 -- !query output
 2021-06-27 10:11:12


@@ -1549,7 +1549,7 @@ struct<to_timestamp_ntz(2021-06-25 10:11:12) + INTERVAL '20 15:40:32.998999' DAY
 -- !query
 select to_timestamp_ntz('2021-06-25 10:11:12') - interval 2 day
 -- !query schema
-struct<to_timestamp_ntz(2021-06-25 10:11:12) - INTERVAL '2 days':timestamp_ntz>
+struct<to_timestamp_ntz(2021-06-25 10:11:12) - INTERVAL '2' DAY:timestamp_ntz>
 -- !query output
 2021-06-23 10:11:12
@@ -1,9 +1,9 @@
 -- Automatically generated by SQLQueryTestSuite
--- Number of queries: 125
+-- Number of queries: 127


 -- !query
-CREATE TEMPORARY VIEW t AS select '2011-05-06 07:08:09.1234567' as c, to_timestamp_ntz('2011-05-06 07:08:09.1234567') as ntz,interval 10 year 20 month 30 day 40 hour 50 minute 6.7890 second as i
+CREATE TEMPORARY VIEW t AS select '2011-05-06 07:08:09.1234567' as c, to_timestamp_ntz('2011-05-06 07:08:09.1234567') as ntz, interval 10 year 20 month as i, interval 30 day 40 hour 50 minute 6.7890 second as j
 -- !query schema
 struct<>
 -- !query output
@@ -131,25 +131,25 @@ struct<extract(weeks FROM c):int,extract(weeks FROM ntz):int>


 -- !query
-select extract(day from c), extract(day from ntz), extract(day from i) from t
+select extract(day from c), extract(day from ntz), extract(day from j) from t
 -- !query schema
-struct<extract(day FROM c):int,extract(day FROM ntz):int,extract(day FROM i):int>
+struct<extract(day FROM c):int,extract(day FROM ntz):int,extract(day FROM j):int>
 -- !query output
 6 6 31


 -- !query
-select extract(d from c), extract(d from ntz), extract(d from i) from t
+select extract(d from c), extract(d from ntz), extract(d from j) from t
 -- !query schema
-struct<extract(d FROM c):int,extract(d FROM ntz):int,extract(d FROM i):int>
+struct<extract(d FROM c):int,extract(d FROM ntz):int,extract(d FROM j):int>
 -- !query output
 6 6 31


 -- !query
-select extract(days from c), extract(days from ntz), extract(days from i) from t
+select extract(days from c), extract(days from ntz), extract(days from j) from t
 -- !query schema
-struct<extract(days FROM c):int,extract(days FROM ntz):int,extract(days FROM i):int>
+struct<extract(days FROM c):int,extract(days FROM ntz):int,extract(days FROM j):int>
 -- !query output
 6 6 31


@@ -195,121 +195,121 @@ struct<extract(doy FROM c):int,extract(doy FROM ntz):int>


 -- !query
-select extract(hour from c), extract(hour from ntz), extract(hour from i) from t
+select extract(hour from c), extract(hour from ntz), extract(hour from j) from t
 -- !query schema
-struct<extract(hour FROM c):int,extract(hour FROM ntz):int,extract(hour FROM i):tinyint>
+struct<extract(hour FROM c):int,extract(hour FROM ntz):int,extract(hour FROM j):tinyint>
 -- !query output
 7 7 16


 -- !query
-select extract(h from c), extract(h from ntz), extract(h from i) from t
+select extract(h from c), extract(h from ntz), extract(h from j) from t
 -- !query schema
-struct<extract(h FROM c):int,extract(h FROM ntz):int,extract(h FROM i):tinyint>
+struct<extract(h FROM c):int,extract(h FROM ntz):int,extract(h FROM j):tinyint>
 -- !query output
 7 7 16


 -- !query
-select extract(hours from c), extract(hours from ntz), extract(hours from i) from t
+select extract(hours from c), extract(hours from ntz), extract(hours from j) from t
 -- !query schema
-struct<extract(hours FROM c):int,extract(hours FROM ntz):int,extract(hours FROM i):tinyint>
+struct<extract(hours FROM c):int,extract(hours FROM ntz):int,extract(hours FROM j):tinyint>
 -- !query output
 7 7 16


 -- !query
-select extract(hr from c), extract(hr from ntz), extract(hr from i) from t
+select extract(hr from c), extract(hr from ntz), extract(hr from j) from t
 -- !query schema
-struct<extract(hr FROM c):int,extract(hr FROM ntz):int,extract(hr FROM i):tinyint>
+struct<extract(hr FROM c):int,extract(hr FROM ntz):int,extract(hr FROM j):tinyint>
 -- !query output
|
||||
7 7 16
|
||||
|
||||
|
||||
-- !query
|
||||
select extract(hrs from c), extract(hrs from ntz), extract(hrs from i) from t
|
||||
select extract(hrs from c), extract(hrs from ntz), extract(hrs from j) from t
|
||||
-- !query schema
|
||||
struct<extract(hrs FROM c):int,extract(hrs FROM ntz):int,extract(hrs FROM i):tinyint>
|
||||
struct<extract(hrs FROM c):int,extract(hrs FROM ntz):int,extract(hrs FROM j):tinyint>
|
||||
-- !query output
|
||||
7 7 16
|
||||
|
||||
|
||||
-- !query
|
||||
select extract(minute from c), extract(minute from ntz), extract(minute from i) from t
|
||||
select extract(minute from c), extract(minute from ntz), extract(minute from j) from t
|
||||
-- !query schema
|
||||
struct<extract(minute FROM c):int,extract(minute FROM ntz):int,extract(minute FROM i):tinyint>
|
||||
struct<extract(minute FROM c):int,extract(minute FROM ntz):int,extract(minute FROM j):tinyint>
|
||||
-- !query output
|
||||
8 8 50
|
||||
|
||||
|
||||
-- !query
|
||||
select extract(m from c), extract(m from ntz), extract(m from i) from t
|
||||
select extract(m from c), extract(m from ntz), extract(m from j) from t
|
||||
-- !query schema
|
||||
struct<extract(m FROM c):int,extract(m FROM ntz):int,extract(m FROM i):tinyint>
|
||||
struct<extract(m FROM c):int,extract(m FROM ntz):int,extract(m FROM j):tinyint>
|
||||
-- !query output
|
||||
8 8 50
|
||||
|
||||
|
||||
-- !query
|
||||
select extract(min from c), extract(min from ntz), extract(min from i) from t
|
||||
select extract(min from c), extract(min from ntz), extract(min from j) from t
|
||||
-- !query schema
|
||||
struct<extract(min FROM c):int,extract(min FROM ntz):int,extract(min FROM i):tinyint>
|
||||
struct<extract(min FROM c):int,extract(min FROM ntz):int,extract(min FROM j):tinyint>
|
||||
-- !query output
|
||||
8 8 50
|
||||
|
||||
|
||||
-- !query
|
||||
select extract(mins from c), extract(mins from ntz), extract(mins from i) from t
|
||||
select extract(mins from c), extract(mins from ntz), extract(mins from j) from t
|
||||
-- !query schema
|
||||
struct<extract(mins FROM c):int,extract(mins FROM ntz):int,extract(mins FROM i):tinyint>
|
||||
struct<extract(mins FROM c):int,extract(mins FROM ntz):int,extract(mins FROM j):tinyint>
|
||||
-- !query output
|
||||
8 8 50
|
||||
|
||||
|
||||
-- !query
|
||||
select extract(minutes from c), extract(minutes from ntz), extract(minutes from i) from t
|
||||
select extract(minutes from c), extract(minutes from ntz), extract(minutes from j) from t
|
||||
-- !query schema
|
||||
struct<extract(minutes FROM c):int,extract(minutes FROM ntz):int,extract(minutes FROM i):tinyint>
|
||||
struct<extract(minutes FROM c):int,extract(minutes FROM ntz):int,extract(minutes FROM j):tinyint>
|
||||
-- !query output
|
||||
8 8 50
|
||||
|
||||
|
||||
-- !query
|
||||
select extract(second from c), extract(second from ntz), extract(second from i) from t
|
||||
select extract(second from c), extract(second from ntz), extract(second from j) from t
|
||||
-- !query schema
|
||||
struct<extract(second FROM c):decimal(8,6),extract(second FROM ntz):decimal(8,6),extract(second FROM i):decimal(8,6)>
|
||||
struct<extract(second FROM c):decimal(8,6),extract(second FROM ntz):decimal(8,6),extract(second FROM j):decimal(8,6)>
|
||||
-- !query output
|
||||
9.123456 9.123456 6.789000
|
||||
|
||||
|
||||
-- !query
|
||||
select extract(s from c), extract(s from ntz), extract(s from i) from t
|
||||
select extract(s from c), extract(s from ntz), extract(s from j) from t
|
||||
-- !query schema
|
||||
struct<extract(s FROM c):decimal(8,6),extract(s FROM ntz):decimal(8,6),extract(s FROM i):decimal(8,6)>
|
||||
struct<extract(s FROM c):decimal(8,6),extract(s FROM ntz):decimal(8,6),extract(s FROM j):decimal(8,6)>
|
||||
-- !query output
|
||||
9.123456 9.123456 6.789000
|
||||
|
||||
|
||||
-- !query
|
||||
select extract(sec from c), extract(sec from ntz), extract(sec from i) from t
|
||||
select extract(sec from c), extract(sec from ntz), extract(sec from j) from t
|
||||
-- !query schema
|
||||
struct<extract(sec FROM c):decimal(8,6),extract(sec FROM ntz):decimal(8,6),extract(sec FROM i):decimal(8,6)>
|
||||
struct<extract(sec FROM c):decimal(8,6),extract(sec FROM ntz):decimal(8,6),extract(sec FROM j):decimal(8,6)>
|
||||
-- !query output
|
||||
9.123456 9.123456 6.789000
|
||||
|
||||
|
||||
-- !query
|
||||
select extract(seconds from c), extract(seconds from ntz), extract(seconds from i) from t
|
||||
select extract(seconds from c), extract(seconds from ntz), extract(seconds from j) from t
|
||||
-- !query schema
|
||||
struct<extract(seconds FROM c):decimal(8,6),extract(seconds FROM ntz):decimal(8,6),extract(seconds FROM i):decimal(8,6)>
|
||||
struct<extract(seconds FROM c):decimal(8,6),extract(seconds FROM ntz):decimal(8,6),extract(seconds FROM j):decimal(8,6)>
|
||||
-- !query output
|
||||
9.123456 9.123456 6.789000
|
||||
|
||||
|
||||
-- !query
|
||||
select extract(secs from c), extract(secs from ntz), extract(secs from i) from t
|
||||
select extract(secs from c), extract(secs from ntz), extract(secs from j) from t
|
||||
-- !query schema
|
||||
struct<extract(secs FROM c):decimal(8,6),extract(secs FROM ntz):decimal(8,6),extract(secs FROM i):decimal(8,6)>
|
||||
struct<extract(secs FROM c):decimal(8,6),extract(secs FROM ntz):decimal(8,6),extract(secs FROM j):decimal(8,6)>
|
||||
-- !query output
|
||||
9.123456 9.123456 6.789000
|
||||
|
||||
|
@ -329,7 +329,16 @@ select extract(not_supported from i) from t
|
|||
struct<>
|
||||
-- !query output
|
||||
org.apache.spark.sql.AnalysisException
|
||||
Literals of type 'not_supported' are currently not supported for the interval type.; line 1 pos 7
|
||||
Literals of type 'not_supported' are currently not supported for the interval year to month type.; line 1 pos 7
|
||||
|
||||
|
||||
-- !query
|
||||
select extract(not_supported from j) from t
|
||||
-- !query schema
|
||||
struct<>
|
||||
-- !query output
|
||||
org.apache.spark.sql.AnalysisException
|
||||
Literals of type 'not_supported' are currently not supported for the interval day to second type.; line 1 pos 7
|
||||
|
||||
|
||||
-- !query
|
||||
|
@ -453,25 +462,25 @@ struct<date_part(weeks, c):int,date_part(weeks, ntz):int>
|
|||
|
||||
|
||||
-- !query
|
||||
select date_part('day', c), date_part('day', ntz), date_part('day', i) from t
|
||||
select date_part('day', c), date_part('day', ntz), date_part('day', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(day, c):int,date_part(day, ntz):int,date_part(day, i):int>
|
||||
struct<date_part(day, c):int,date_part(day, ntz):int,date_part(day, j):int>
|
||||
-- !query output
|
||||
6 6 31
|
||||
|
||||
|
||||
-- !query
|
||||
select date_part('d', c), date_part('d', ntz), date_part('d', i) from t
|
||||
select date_part('d', c), date_part('d', ntz), date_part('d', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(d, c):int,date_part(d, ntz):int,date_part(d, i):int>
|
||||
struct<date_part(d, c):int,date_part(d, ntz):int,date_part(d, j):int>
|
||||
-- !query output
|
||||
6 6 31
|
||||
|
||||
|
||||
-- !query
|
||||
select date_part('days', c), date_part('days', ntz), date_part('days', i) from t
|
||||
select date_part('days', c), date_part('days', ntz), date_part('days', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(days, c):int,date_part(days, ntz):int,date_part(days, i):int>
|
||||
struct<date_part(days, c):int,date_part(days, ntz):int,date_part(days, j):int>
|
||||
-- !query output
|
||||
6 6 31
|
||||
|
||||
|
@ -517,121 +526,121 @@ struct<date_part(doy, c):int,date_part(doy, ntz):int>
|
|||
|
||||
|
||||
-- !query
|
||||
select date_part('hour', c), date_part('hour', ntz), date_part('hour', i) from t
|
||||
select date_part('hour', c), date_part('hour', ntz), date_part('hour', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(hour, c):int,date_part(hour, ntz):int,date_part(hour, i):tinyint>
|
||||
struct<date_part(hour, c):int,date_part(hour, ntz):int,date_part(hour, j):tinyint>
|
||||
-- !query output
|
||||
7 7 16
|
||||
|
||||
|
||||
-- !query
|
||||
select date_part('h', c), date_part('h', ntz), date_part('h', i) from t
|
||||
select date_part('h', c), date_part('h', ntz), date_part('h', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(h, c):int,date_part(h, ntz):int,date_part(h, i):tinyint>
|
||||
struct<date_part(h, c):int,date_part(h, ntz):int,date_part(h, j):tinyint>
|
||||
-- !query output
|
||||
7 7 16
|
||||
|
||||
|
||||
-- !query
|
||||
select date_part('hours', c), date_part('hours', ntz), date_part('hours', i) from t
|
||||
select date_part('hours', c), date_part('hours', ntz), date_part('hours', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(hours, c):int,date_part(hours, ntz):int,date_part(hours, i):tinyint>
|
||||
struct<date_part(hours, c):int,date_part(hours, ntz):int,date_part(hours, j):tinyint>
|
||||
-- !query output
|
||||
7 7 16
|
||||
|
||||
|
||||
-- !query
|
||||
select date_part('hr', c), date_part('hr', ntz), date_part('hr', i) from t
|
||||
select date_part('hr', c), date_part('hr', ntz), date_part('hr', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(hr, c):int,date_part(hr, ntz):int,date_part(hr, i):tinyint>
|
||||
struct<date_part(hr, c):int,date_part(hr, ntz):int,date_part(hr, j):tinyint>
|
||||
-- !query output
|
||||
7 7 16
|
||||
|
||||
|
||||
-- !query
|
||||
select date_part('hrs', c), date_part('hrs', ntz), date_part('hrs', i) from t
|
||||
select date_part('hrs', c), date_part('hrs', ntz), date_part('hrs', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(hrs, c):int,date_part(hrs, ntz):int,date_part(hrs, i):tinyint>
|
||||
struct<date_part(hrs, c):int,date_part(hrs, ntz):int,date_part(hrs, j):tinyint>
|
||||
-- !query output
|
||||
7 7 16
|
||||
|
||||
|
||||
-- !query
|
||||
select date_part('minute', c), date_part('minute', ntz), date_part('minute', i) from t
|
||||
select date_part('minute', c), date_part('minute', ntz), date_part('minute', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(minute, c):int,date_part(minute, ntz):int,date_part(minute, i):tinyint>
|
||||
struct<date_part(minute, c):int,date_part(minute, ntz):int,date_part(minute, j):tinyint>
|
||||
-- !query output
|
||||
8 8 50
|
||||
|
||||
|
||||
-- !query
|
||||
select date_part('m', c), date_part('m', ntz), date_part('m', i) from t
|
||||
select date_part('m', c), date_part('m', ntz), date_part('m', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(m, c):int,date_part(m, ntz):int,date_part(m, i):tinyint>
|
||||
struct<date_part(m, c):int,date_part(m, ntz):int,date_part(m, j):tinyint>
|
||||
-- !query output
|
||||
8 8 50
|
||||
|
||||
|
||||
-- !query
|
||||
select date_part('min', c), date_part('min', ntz), date_part('min', i) from t
|
||||
select date_part('min', c), date_part('min', ntz), date_part('min', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(min, c):int,date_part(min, ntz):int,date_part(min, i):tinyint>
|
||||
struct<date_part(min, c):int,date_part(min, ntz):int,date_part(min, j):tinyint>
|
||||
-- !query output
|
||||
8 8 50
|
||||
|
||||
|
||||
-- !query
|
||||
select date_part('mins', c), date_part('mins', ntz), date_part('mins', i) from t
|
||||
select date_part('mins', c), date_part('mins', ntz), date_part('mins', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(mins, c):int,date_part(mins, ntz):int,date_part(mins, i):tinyint>
|
||||
struct<date_part(mins, c):int,date_part(mins, ntz):int,date_part(mins, j):tinyint>
|
||||
-- !query output
|
||||
8 8 50
|
||||
|
||||
|
||||
-- !query
|
||||
select date_part('minutes', c), date_part('minutes', ntz), date_part('minutes', i) from t
|
||||
select date_part('minutes', c), date_part('minutes', ntz), date_part('minutes', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(minutes, c):int,date_part(minutes, ntz):int,date_part(minutes, i):tinyint>
|
||||
struct<date_part(minutes, c):int,date_part(minutes, ntz):int,date_part(minutes, j):tinyint>
|
||||
-- !query output
|
||||
8 8 50
|
||||
|
||||
|
||||
-- !query
|
||||
select date_part('second', c), date_part('second', ntz), date_part('second', i) from t
|
||||
select date_part('second', c), date_part('second', ntz), date_part('second', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(second, c):decimal(8,6),date_part(second, ntz):decimal(8,6),date_part(second, i):decimal(8,6)>
|
||||
struct<date_part(second, c):decimal(8,6),date_part(second, ntz):decimal(8,6),date_part(second, j):decimal(8,6)>
|
||||
-- !query output
|
||||
9.123456 9.123456 6.789000
|
||||
|
||||
|
||||
-- !query
|
||||
select date_part('s', c), date_part('s', ntz), date_part('s', i) from t
|
||||
select date_part('s', c), date_part('s', ntz), date_part('s', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(s, c):decimal(8,6),date_part(s, ntz):decimal(8,6),date_part(s, i):decimal(8,6)>
|
||||
struct<date_part(s, c):decimal(8,6),date_part(s, ntz):decimal(8,6),date_part(s, j):decimal(8,6)>
|
||||
-- !query output
|
||||
9.123456 9.123456 6.789000
|
||||
|
||||
|
||||
-- !query
|
||||
select date_part('sec', c), date_part('sec', ntz), date_part('sec', i) from t
|
||||
select date_part('sec', c), date_part('sec', ntz), date_part('sec', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(sec, c):decimal(8,6),date_part(sec, ntz):decimal(8,6),date_part(sec, i):decimal(8,6)>
|
||||
struct<date_part(sec, c):decimal(8,6),date_part(sec, ntz):decimal(8,6),date_part(sec, j):decimal(8,6)>
|
||||
-- !query output
|
||||
9.123456 9.123456 6.789000
|
||||
|
||||
|
||||
-- !query
|
||||
select date_part('seconds', c), date_part('seconds', ntz), date_part('seconds', i) from t
|
||||
select date_part('seconds', c), date_part('seconds', ntz), date_part('seconds', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(seconds, c):decimal(8,6),date_part(seconds, ntz):decimal(8,6),date_part(seconds, i):decimal(8,6)>
|
||||
struct<date_part(seconds, c):decimal(8,6),date_part(seconds, ntz):decimal(8,6),date_part(seconds, j):decimal(8,6)>
|
||||
-- !query output
|
||||
9.123456 9.123456 6.789000
|
||||
|
||||
|
||||
-- !query
|
||||
select date_part('secs', c), date_part('secs', ntz), date_part('secs', i) from t
|
||||
select date_part('secs', c), date_part('secs', ntz), date_part('secs', j) from t
|
||||
-- !query schema
|
||||
struct<date_part(secs, c):decimal(8,6),date_part(secs, ntz):decimal(8,6),date_part(secs, i):decimal(8,6)>
|
||||
struct<date_part(secs, c):decimal(8,6),date_part(secs, ntz):decimal(8,6),date_part(secs, j):decimal(8,6)>
|
||||
-- !query output
|
||||
9.123456 9.123456 6.789000
|
||||
|
||||
|
@ -776,27 +785,27 @@ struct<extract(second FROM c):decimal(8,6)>
|
|||
|
||||
|
||||
-- !query
|
||||
select c - i from t
|
||||
select c - j from t
|
||||
-- !query schema
|
||||
struct<c - i:string>
|
||||
struct<c - j:string>
|
||||
-- !query output
|
||||
1999-08-05 14:18:02.334456
|
||||
2011-04-04 14:18:02.334456
|
||||
|
||||
|
||||
-- !query
|
||||
select year(c - i) from t
|
||||
select day(c - j) from t
|
||||
-- !query schema
|
||||
struct<year(c - i):int>
|
||||
struct<day(c - j):int>
|
||||
-- !query output
|
||||
1999
|
||||
4
|
||||
|
||||
|
||||
-- !query
|
||||
select extract(year from c - i) from t
|
||||
select extract(day from c - j) from t
|
||||
-- !query schema
|
||||
struct<extract(year FROM c - i):int>
|
||||
struct<extract(day FROM c - j):int>
|
||||
-- !query output
|
||||
1999
|
||||
4
|
||||
|
||||
|
||||
-- !query
|
||||
|
@ -804,7 +813,15 @@ select extract(month from to_timestamp(c) - i) from t
|
|||
-- !query schema
|
||||
struct<extract(month FROM to_timestamp(c) - i):int>
|
||||
-- !query output
|
||||
8
|
||||
9
|
||||
|
||||
|
||||
-- !query
|
||||
select extract(second from to_timestamp(c) - j) from t
|
||||
-- !query schema
|
||||
struct<extract(second FROM to_timestamp(c) - j):decimal(8,6)>
|
||||
-- !query output
|
||||
2.334456
|
||||
|
||||
|
||||
-- !query
|
||||
|
|
|
@ -1,5 +1,5 @@
|
|||
-- Automatically generated by SQLQueryTestSuite
|
||||
-- Number of queries: 165
|
||||
-- Number of queries: 199
|
||||
|
||||
|
||||
-- !query
|
||||
|
@ -13,9 +13,31 @@ struct<((TIMESTAMP '2019-10-15 10:11:12.001002' - DATE '2019-10-15') * 3):interv
|
|||
-- !query
|
||||
select interval 4 month 2 weeks 3 microseconds * 1.5
|
||||
-- !query schema
|
||||
struct<multiply_interval(INTERVAL '4 months 14 days 0.000003 seconds', 1.5):interval>
|
||||
struct<>
|
||||
-- !query output
|
||||
6 months 21 days 0.000005 seconds
|
||||
org.apache.spark.sql.catalyst.parser.ParseException
|
||||
|
||||
Cannot mix year-month and day-time fields: interval 4 month 2 weeks 3 microseconds(line 1, pos 7)
|
||||
|
||||
== SQL ==
|
||||
select interval 4 month 2 weeks 3 microseconds * 1.5
|
||||
-------^^^
|
||||
|
||||
|
||||
-- !query
|
||||
select interval 2 years 4 months
|
||||
-- !query schema
|
||||
struct<INTERVAL '2-4' YEAR TO MONTH:interval year to month>
|
||||
-- !query output
|
||||
2-4
|
||||
|
||||
|
||||
-- !query
|
||||
select interval 2 weeks 3 microseconds * 1.5
|
||||
-- !query schema
|
||||
struct<(INTERVAL '14 00:00:00.000003' DAY TO SECOND * 1.5):interval day to second>
|
||||
-- !query output
|
||||
21 00:00:00.000005000
|
||||
|
||||
|
||||
-- !query
|
||||
|
@ -29,33 +51,37 @@ struct<((TIMESTAMP '2019-10-15 00:00:00' - TIMESTAMP '2019-10-14 00:00:00') / 1.
|
|||
-- !query
|
||||
select interval 2147483647 month * 2
|
||||
-- !query schema
|
||||
struct<multiply_interval(INTERVAL '178956970 years 7 months', 2):interval>
|
||||
struct<>
|
||||
-- !query output
|
||||
178956970 years 7 months
|
||||
java.lang.ArithmeticException
|
||||
integer overflow
|
||||
|
||||
|
||||
-- !query
|
||||
select interval 2147483647 month / 0.5
|
||||
-- !query schema
|
||||
struct<divide_interval(INTERVAL '178956970 years 7 months', 0.5):interval>
|
||||
struct<>
|
||||
-- !query output
|
||||
178956970 years 7 months
|
||||
java.lang.ArithmeticException
|
||||
Overflow
|
||||
|
||||
|
||||
-- !query
|
||||
select interval 2147483647 day * 2
|
||||
-- !query schema
|
||||
struct<multiply_interval(INTERVAL '2147483647 days', 2):interval>
|
||||
struct<>
|
||||
-- !query output
|
||||
2147483647 days 2562047788 hours 54.775807 seconds
|
||||
java.lang.ArithmeticException
|
||||
long overflow
|
||||
|
||||
|
||||
-- !query
|
||||
select interval 2147483647 day / 0.5
|
||||
-- !query schema
|
||||
struct<divide_interval(INTERVAL '2147483647 days', 0.5):interval>
|
||||
struct<>
|
||||
-- !query output
|
||||
2147483647 days 2562047788 hours 54.775807 seconds
|
||||
java.lang.ArithmeticException
|
||||
long overflow
|
||||
|
||||
|
||||
-- !query
|
||||
|
@ -98,12 +124,50 @@ struct<(- INTERVAL '-1 months 1 days -1 seconds'):interval>
|
|||
1 months -1 days 1 seconds
|
||||
|
||||
|
||||
-- !query
|
||||
select -interval '-1 year 1 month'
|
||||
-- !query schema
|
||||
struct<(- INTERVAL '-11 months'):interval>
|
||||
-- !query output
|
||||
11 months
|
||||
|
||||
|
||||
-- !query
|
||||
select -interval '-1 day 1 hour -1 minute 1 second'
|
||||
-- !query schema
|
||||
struct<(- INTERVAL '-1 days 59 minutes 1 seconds'):interval>
|
||||
-- !query output
|
||||
1 days -59 minutes -1 seconds
|
||||
|
||||
|
||||
-- !query
|
||||
select -interval -1 month 1 day -1 second
|
||||
-- !query schema
|
||||
struct<(- INTERVAL '-1 months 1 days -1 seconds'):interval>
|
||||
struct<>
|
||||
-- !query output
|
||||
1 months -1 days 1 seconds
|
||||
org.apache.spark.sql.catalyst.parser.ParseException
|
||||
|
||||
Cannot mix year-month and day-time fields: interval -1 month 1 day -1 second(line 1, pos 8)
|
||||
|
||||
== SQL ==
|
||||
select -interval -1 month 1 day -1 second
|
||||
--------^^^
|
||||
|
||||
|
||||
-- !query
|
||||
select -interval -1 year 1 month
|
||||
-- !query schema
|
||||
struct<(- INTERVAL '-0-11' YEAR TO MONTH):interval year to month>
|
||||
-- !query output
|
||||
0-11
|
||||
|
||||
|
||||
-- !query
|
||||
select -interval -1 day 1 hour -1 minute 1 second
|
||||
-- !query schema
|
||||
struct<(- INTERVAL '-0 23:00:59' DAY TO SECOND):interval day to second>
|
||||
-- !query output
|
||||
0 23:00:59.000000000
|
||||
|
||||
|
||||
-- !query
|
||||
|
@ -114,12 +178,50 @@ struct<(+ INTERVAL '-1 months 1 days -1 seconds'):interval>
|
|||
-1 months 1 days -1 seconds
|
||||
|
||||
|
||||
-- !query
|
||||
select +interval '-1 year 1 month'
|
||||
-- !query schema
|
||||
struct<(+ INTERVAL '-11 months'):interval>
|
||||
-- !query output
|
||||
-11 months
|
||||
|
||||
|
||||
-- !query
|
||||
select +interval '-1 day 1 hour -1 minute 1 second'
|
||||
-- !query schema
|
||||
struct<(+ INTERVAL '-1 days 59 minutes 1 seconds'):interval>
|
||||
-- !query output
|
||||
-1 days 59 minutes 1 seconds
|
||||
|
||||
|
||||
-- !query
|
||||
select +interval -1 month 1 day -1 second
|
||||
-- !query schema
|
||||
struct<(+ INTERVAL '-1 months 1 days -1 seconds'):interval>
|
||||
struct<>
|
||||
-- !query output
|
||||
-1 months 1 days -1 seconds
|
||||
org.apache.spark.sql.catalyst.parser.ParseException
|
||||
|
||||
Cannot mix year-month and day-time fields: interval -1 month 1 day -1 second(line 1, pos 8)
|
||||
|
||||
== SQL ==
|
||||
select +interval -1 month 1 day -1 second
|
||||
--------^^^
|
||||
|
||||
|
||||
-- !query
|
||||
select +interval -1 year 1 month
|
||||
-- !query schema
|
||||
struct<(+ INTERVAL '-0-11' YEAR TO MONTH):interval year to month>
|
||||
-- !query output
|
||||
-0-11
|
||||
|
||||
|
||||
-- !query
|
||||
select +interval -1 day 1 hour -1 minute 1 second
|
||||
-- !query schema
|
||||
struct<(+ INTERVAL '-0 23:00:59' DAY TO SECOND):interval day to second>
|
||||
-- !query output
|
||||
-0 23:00:59.000000000
|
||||
|
||||
|
||||
-- !query
|
||||
|
@ -408,25 +510,77 @@ NULL
|
|||
-- !query
|
||||
select interval 13.123456789 seconds, interval -13.123456789 second
|
||||
-- !query schema
|
||||
struct<INTERVAL '13.123456 seconds':interval,INTERVAL '-13.123456 seconds':interval>
|
||||
struct<INTERVAL '13.123456' SECOND:interval second,INTERVAL '-13.123456' SECOND:interval second>
|
||||
-- !query output
|
||||
13.123456 seconds -13.123456 seconds
|
||||
0 00:00:13.123456000 -0 00:00:13.123456000
|
||||
|
||||
|
||||
-- !query
|
||||
select interval 1 year 2 month 3 week 4 day 5 hour 6 minute 7 seconds 8 millisecond 9 microsecond
|
||||
-- !query schema
|
||||
struct<INTERVAL '1 years 2 months 25 days 5 hours 6 minutes 7.008009 seconds':interval>
|
||||
struct<>
|
||||
-- !query output
|
||||
1 years 2 months 25 days 5 hours 6 minutes 7.008009 seconds
|
||||
org.apache.spark.sql.catalyst.parser.ParseException
|
||||
|
||||
Cannot mix year-month and day-time fields: interval 1 year 2 month 3 week 4 day 5 hour 6 minute 7 seconds 8 millisecond 9 microsecond(line 1, pos 7)
|
||||
|
||||
== SQL ==
|
||||
select interval 1 year 2 month 3 week 4 day 5 hour 6 minute 7 seconds 8 millisecond 9 microsecond
|
||||
-------^^^
|
||||
|
||||
|
||||
-- !query
|
||||
select interval 1 year 2 month
|
||||
-- !query schema
|
||||
struct<INTERVAL '1-2' YEAR TO MONTH:interval year to month>
|
||||
-- !query output
|
||||
1-2
|
||||
|
||||
|
||||
-- !query
|
||||
select interval 4 day 5 hour 6 minute 7 seconds
|
||||
-- !query schema
|
||||
struct<INTERVAL '4 05:06:07' DAY TO SECOND:interval day to second>
|
||||
-- !query output
|
||||
4 05:06:07.000000000
|
||||
|
||||
|
||||
-- !query
|
||||
select interval 3 week 8 millisecond 9 microsecond
|
||||
-- !query schema
|
||||
struct<INTERVAL '21 00:00:00.008009' DAY TO SECOND:interval day to second>
|
||||
-- !query output
|
||||
21 00:00:00.008009000
|
||||
|
||||
|
||||
-- !query
|
||||
select interval '30' year '25' month '-100' day '40' hour '80' minute '299.889987299' second
|
||||
-- !query schema
|
||||
struct<INTERVAL '32 years 1 months -100 days 41 hours 24 minutes 59.889987 seconds':interval>
|
||||
struct<>
|
||||
-- !query output
|
||||
32 years 1 months -100 days 41 hours 24 minutes 59.889987 seconds
|
||||
org.apache.spark.sql.catalyst.parser.ParseException
|
||||
|
||||
Cannot mix year-month and day-time fields: interval '30' year '25' month '-100' day '40' hour '80' minute '299.889987299' second(line 1, pos 7)
|
||||
|
||||
== SQL ==
|
||||
select interval '30' year '25' month '-100' day '40' hour '80' minute '299.889987299' second
|
||||
-------^^^
|
||||
|
||||
|
||||
-- !query
|
||||
select interval '30' year '25' month
|
||||
-- !query schema
|
||||
struct<INTERVAL '32-1' YEAR TO MONTH:interval year to month>
|
||||
-- !query output
|
||||
32-1
|
||||
|
||||
|
||||
-- !query
|
||||
select interval '-100' day '40' hour '80' minute '299.889987299' second
|
||||
-- !query schema
|
||||
struct<INTERVAL '-98 06:35:00.110013' DAY TO SECOND:interval day to second>
|
||||
-- !query output
|
||||
-98 06:35:00.110013000
|
||||
|
||||
|
||||
-- !query
|
||||
|
@ -520,17 +674,17 @@ struct<INTERVAL '40:32' MINUTE TO SECOND:interval minute to second>
|
|||
-- !query
|
||||
select interval 30 day day
|
||||
-- !query schema
|
||||
struct<day:interval>
|
||||
struct<day:interval day>
|
||||
-- !query output
|
||||
30 days
|
||||
30 00:00:00.000000000
|
||||
|
||||
|
||||
-- !query
|
||||
select interval 30 days days
|
||||
-- !query schema
|
||||
struct<days:interval>
|
||||
struct<days:interval day>
|
||||
-- !query output
|
||||
30 days
|
||||
30 00:00:00.000000000
|
||||
|
||||
|
||||
-- !query
|
||||
|
@ -634,9 +788,50 @@ select interval 10 nanoseconds
|
|||
-- !query
|
||||
select map(1, interval 1 day, 2, interval 3 week)
|
||||
-- !query schema
|
||||
struct<map(1, INTERVAL '1 days', 2, INTERVAL '21 days'):map<int,interval>>
|
||||
struct<map(1, INTERVAL '1' DAY, 2, INTERVAL '21' DAY):map<int,interval day>>
|
||||
-- !query output
|
||||
{1:1 days,2:21 days}
|
||||
{1:1 00:00:00.000000000,2:21 00:00:00.000000000}
|
||||
|
||||
|
||||
-- !query
|
||||
select map(1, interval 1 day, 2, interval 2 day)
|
||||
-- !query schema
|
||||
struct<map(1, INTERVAL '1' DAY, 2, INTERVAL '2' DAY):map<int,interval day>>
|
||||
-- !query output
|
||||
{1:1 00:00:00.000000000,2:2 00:00:00.000000000}
|
||||
|
||||
|
||||
-- !query
|
||||
select map(1, interval 1 year, 2, interval 2 month)
|
||||
-- !query schema
|
||||
struct<>
|
||||
-- !query output
|
||||
org.apache.spark.sql.AnalysisException
|
||||
cannot resolve 'map(1, INTERVAL '1' YEAR, 2, INTERVAL '2' MONTH)' due to data type mismatch: The given values of function map should all be the same type, but they are [interval year, interval month]; line 1 pos 7
|
||||
|
||||
|
||||
-- !query
|
||||
select map(1, interval 1 month, 2, interval 2 month)
|
||||
-- !query schema
|
||||
struct<map(1, INTERVAL '1' MONTH, 2, INTERVAL '2' MONTH):map<int,interval month>>
|
||||
-- !query output
|
||||
{1:0-1,2:0-2}
|
||||
|
||||
|
||||
-- !query
|
||||
select map(1, interval 1 week, 2, interval 2 day)
|
||||
-- !query schema
|
||||
struct<map(1, INTERVAL '7' DAY, 2, INTERVAL '2' DAY):map<int,interval day>>
|
||||
-- !query output
|
||||
{1:7 00:00:00.000000000,2:2 00:00:00.000000000}
|
||||
|
||||
|
||||
-- !query
|
||||
select map(1, interval 2 millisecond, 3, interval 3 microsecond)
|
||||
-- !query schema
|
||||
struct<map(1, INTERVAL '00.002' SECOND, 3, INTERVAL '00.000003' SECOND):map<int,interval second>>
|
||||
-- !query output
|
||||
{1:0 00:00:00.002000000,3:0 00:00:00.000003000}
|
||||
|
||||
|
||||
-- !query
|
||||
|
@ -1059,17 +1254,17 @@ struct<INTERVAL '1 days':interval>
|
|||
-- !query
|
||||
select interval '1\t' day
|
||||
-- !query schema
|
||||
struct<INTERVAL '1 days':interval>
|
||||
struct<INTERVAL '1' DAY:interval day>
|
||||
-- !query output
|
||||
1 days
|
||||
1 00:00:00.000000000
|
||||
|
||||
|
||||
-- !query
|
||||
select interval '1 ' day
|
||||
-- !query schema
|
||||
struct<INTERVAL '1 days':interval>
|
||||
struct<INTERVAL '1' DAY:interval day>
|
||||
-- !query output
|
||||
1 days
|
||||
1 00:00:00.000000000
|
||||
|
||||
|
||||
-- !query
|
||||
|
@ -1202,25 +1397,53 @@ struct<divide_interval(a, 0.5):interval>
|
|||
-- !query
SELECT
from_csv('1, 1 day', 'a INT, b interval'),
from_csv('1, 1', 'a INT, b interval day'),
to_csv(from_csv('1, 1 day', 'a INT, b interval')),
to_csv(named_struct('a', interval 32 month, 'b', interval 70 minute)),
from_csv(to_csv(named_struct('a', interval 32 month, 'b', interval 70 minute)), 'a interval, b interval')
to_csv(from_csv('1, 1', 'a INT, b interval day')),
to_csv(named_struct('a', interval 32 hour, 'b', interval 70 minute)),
from_csv(to_csv(named_struct('a', interval 32 hour, 'b', interval 70 minute)), 'a interval hour, b interval minute')
-- !query schema
struct<from_csv(1, 1 day):struct<a:int,b:interval>,to_csv(from_csv(1, 1 day)):string,to_csv(named_struct(a, INTERVAL '2 years 8 months', b, INTERVAL '1 hours 10 minutes')):string,from_csv(to_csv(named_struct(a, INTERVAL '2 years 8 months', b, INTERVAL '1 hours 10 minutes'))):struct<a:interval,b:interval>>
struct<from_csv(1, 1 day):struct<a:int,b:interval>,from_csv(1, 1):struct<a:int,b:interval day>,to_csv(from_csv(1, 1 day)):string,to_csv(from_csv(1, 1)):string,to_csv(named_struct(a, INTERVAL '32' HOUR, b, INTERVAL '70' MINUTE)):string,from_csv(to_csv(named_struct(a, INTERVAL '32' HOUR, b, INTERVAL '70' MINUTE))):struct<a:interval hour,b:interval minute>>
-- !query output
{"a":1,"b":1 days}	1,1 days	2 years 8 months,1 hours 10 minutes	{"a":2 years 8 months,"b":1 hours 10 minutes}
{"a":1,"b":1 days}	{"a":1,"b":1 00:00:00.000000000}	1,1 days	1,INTERVAL '1' DAY	INTERVAL '32' HOUR,INTERVAL '70' MINUTE	{"a":1 08:00:00.000000000,"b":0 01:10:00.000000000}

-- !query
SELECT
from_json('{"a":"1 days"}', 'a interval'),
from_csv('1, 1', 'a INT, b interval year'),
to_json(from_json('{"a":"1 days"}', 'a interval')),
to_json(map('a', interval 25 month 100 day 130 minute)),
from_json(to_json(map('a', interval 25 month 100 day 130 minute)), 'a interval')
to_csv(from_csv('1, 1', 'a INT, b interval year')),
to_csv(named_struct('a', interval 32 year, 'b', interval 10 month)),
from_csv(to_csv(named_struct('a', interval 32 year, 'b', interval 10 month)), 'a interval year, b interval month')
-- !query schema
struct<from_json({"a":"1 days"}):struct<a:interval>,to_json(from_json({"a":"1 days"})):string,to_json(map(a, INTERVAL '2 years 1 months 100 days 2 hours 10 minutes')):string,from_json(to_json(map(a, INTERVAL '2 years 1 months 100 days 2 hours 10 minutes'))):struct<a:interval>>
struct<from_json({"a":"1 days"}):struct<a:interval>,from_csv(1, 1):struct<a:int,b:interval year>,to_json(from_json({"a":"1 days"})):string,to_csv(from_csv(1, 1)):string,to_csv(named_struct(a, INTERVAL '32' YEAR, b, INTERVAL '10' MONTH)):string,from_csv(to_csv(named_struct(a, INTERVAL '32' YEAR, b, INTERVAL '10' MONTH))):struct<a:interval year,b:interval month>>
-- !query output
{"a":1 days}	{"a":"1 days"}	{"a":"2 years 1 months 100 days 2 hours 10 minutes"}	{"a":2 years 1 months 100 days 2 hours 10 minutes}
{"a":1 days}	{"a":1,"b":1-0}	{"a":"1 days"}	1,INTERVAL '1' YEAR	INTERVAL '32' YEAR,INTERVAL '10' MONTH	{"a":32-0,"b":0-10}

-- !query
SELECT
from_json('{"a":"1"}', 'a interval day'),
to_json(from_json('{"a":"1"}', 'a interval day')),
to_json(map('a', interval 100 day 130 minute)),
from_json(to_json(map('a', interval 100 day 130 minute)), 'a interval day to minute')
-- !query schema
struct<from_json({"a":"1"}):struct<a:interval day>,to_json(from_json({"a":"1"})):string,to_json(map(a, INTERVAL '100 02:10' DAY TO MINUTE)):string,from_json(to_json(map(a, INTERVAL '100 02:10' DAY TO MINUTE))):struct<a:interval day to minute>>
-- !query output
{"a":1 00:00:00.000000000}	{"a":"INTERVAL '1' DAY"}	{"a":"INTERVAL '100 02:10' DAY TO MINUTE"}	{"a":100 02:10:00.000000000}

-- !query
SELECT
from_json('{"a":"1"}', 'a interval year'),
to_json(from_json('{"a":"1"}', 'a interval year')),
to_json(map('a', interval 32 year 10 month)),
from_json(to_json(map('a', interval 32 year 10 month)), 'a interval year to month')
-- !query schema
struct<from_json({"a":"1"}):struct<a:interval year>,to_json(from_json({"a":"1"})):string,to_json(map(a, INTERVAL '32-10' YEAR TO MONTH)):string,from_json(to_json(map(a, INTERVAL '32-10' YEAR TO MONTH))):struct<a:interval year to month>>
-- !query output
{"a":1-0}	{"a":"INTERVAL '1' YEAR"}	{"a":"INTERVAL '32-10' YEAR TO MONTH"}	{"a":32-10}

-- !query
@ -1352,9 +1575,9 @@ select interval 'interval 1' day
-- !query
select interval '-\t 1' day
-- !query schema
struct<INTERVAL '-1 days':interval>
struct<INTERVAL '-1' DAY:interval day>
-- !query output
-1 days
-1 00:00:00.000000000

-- !query
@ -1639,3 +1862,99 @@ requirement failed: minute 153722867281 outside range [0, 153722867280](line 1,
== SQL ==
SELECT INTERVAL '-153722867281:54.775808' MINUTE TO SECOND
----------------^^^

-- !query
SELECT INTERVAL '178956970' YEAR
-- !query schema
struct<INTERVAL '178956970' YEAR:interval year>
-- !query output
178956970-0

-- !query
SELECT INTERVAL '-178956970' YEAR
-- !query schema
struct<INTERVAL '-178956970' YEAR:interval year>
-- !query output
-178956970-0

-- !query
SELECT INTERVAL '2147483647' MONTH
-- !query schema
struct<INTERVAL '2147483647' MONTH:interval month>
-- !query output
178956970-7

-- !query
SELECT INTERVAL '-2147483647' MONTH
-- !query schema
struct<INTERVAL '-2147483647' MONTH:interval month>
-- !query output
-178956970-7

-- !query
SELECT INTERVAL '106751991' DAY
-- !query schema
struct<INTERVAL '106751991' DAY:interval day>
-- !query output
106751991 00:00:00.000000000

-- !query
SELECT INTERVAL '-106751991' DAY
-- !query schema
struct<INTERVAL '-106751991' DAY:interval day>
-- !query output
-106751991 00:00:00.000000000

-- !query
SELECT INTERVAL '2562047788' HOUR
-- !query schema
struct<INTERVAL '2562047788' HOUR:interval hour>
-- !query output
106751991 04:00:00.000000000

-- !query
SELECT INTERVAL '-2562047788' HOUR
-- !query schema
struct<INTERVAL '-2562047788' HOUR:interval hour>
-- !query output
-106751991 04:00:00.000000000

-- !query
SELECT INTERVAL '153722867280' MINUTE
-- !query schema
struct<INTERVAL '153722867280' MINUTE:interval minute>
-- !query output
106751991 04:00:00.000000000

-- !query
SELECT INTERVAL '-153722867280' MINUTE
-- !query schema
struct<INTERVAL '-153722867280' MINUTE:interval minute>
-- !query output
-106751991 04:00:00.000000000

-- !query
SELECT INTERVAL '54.775807' SECOND
-- !query schema
struct<INTERVAL '54.775807' SECOND:interval second>
-- !query output
0 00:00:54.775807000

-- !query
SELECT INTERVAL '-54.775807' SECOND
-- !query schema
struct<INTERVAL '-54.775807' SECOND:interval second>
-- !query output
-0 00:00:54.775807000
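The interval golden-file diffs above all follow one resolution rule introduced by this PR: a unit list made only of year/month units is parsed as `YearMonthIntervalType`, one made only of day/hour/minute/second units as `DayTimeIntervalType`, mixing the two groups is rejected, and `WEEK`, `MILLISECOND`, `MICROSECOND` and `NANOSECOND` are no longer valid in unit lists. A minimal Python sketch of that rule (illustrative only; the function name and shape are not Spark's internal API):

```python
YEAR_MONTH = {"year", "month"}
DAY_TIME = {"day", "hour", "minute", "second"}

def resolve_interval_type(units):
    """Pick the tightest ANSI interval type for a unit list interval literal."""
    normalized = {u.lower().rstrip("s") for u in units}  # 'hours' -> 'hour'
    invalid = normalized - YEAR_MONTH - DAY_TIME
    if invalid:
        # e.g. WEEK, MILLISECOND: no longer valid in unit lists
        raise ValueError(f"invalid unit(s) for unit list interval: {sorted(invalid)}")
    if normalized <= YEAR_MONTH:
        return "YearMonthIntervalType"
    if normalized <= DAY_TIME:
        return "DayTimeIntervalType"
    # e.g. INTERVAL '1' YEAR '2' MINUTES
    raise ValueError("cannot mix year-month and day-time units")

# '8' years '3' months -> YearMonthIntervalType
# '3' day '10' hours '3' seconds -> DayTimeIntervalType
```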
@ -319,7 +319,7 @@ select timestamp '2019-01-01中文'
-- !query
select timestamp'2011-11-11 11:11:11' + interval '2' day
-- !query schema
struct<TIMESTAMP_NTZ '2011-11-11 11:11:11' + INTERVAL '2 days':timestamp_ntz>
struct<TIMESTAMP_NTZ '2011-11-11 11:11:11' + INTERVAL '2' DAY:timestamp_ntz>
-- !query output
2011-11-13 11:11:11

@ -327,7 +327,7 @@ struct<TIMESTAMP_NTZ '2011-11-11 11:11:11' + INTERVAL '2 days':timestamp_ntz>
-- !query
select timestamp'2011-11-11 11:11:11' - interval '2' day
-- !query schema
struct<TIMESTAMP_NTZ '2011-11-11 11:11:11' - INTERVAL '2 days':timestamp_ntz>
struct<TIMESTAMP_NTZ '2011-11-11 11:11:11' - INTERVAL '2' DAY:timestamp_ntz>
-- !query output
2011-11-09 11:11:11

@ -335,23 +335,23 @@ struct<TIMESTAMP_NTZ '2011-11-11 11:11:11' - INTERVAL '2 days':timestamp_ntz>
-- !query
select date'2011-11-11 11:11:11' + interval '2' second
-- !query schema
struct<DATE '2011-11-11' + INTERVAL '2 seconds':date>
struct<DATE '2011-11-11' + INTERVAL '02' SECOND:timestamp>
-- !query output
2011-11-11
2011-11-11 00:00:02

-- !query
select date'2011-11-11 11:11:11' - interval '2' second
-- !query schema
struct<DATE '2011-11-11' - INTERVAL '2 seconds':date>
struct<DATE '2011-11-11' - INTERVAL '02' SECOND:timestamp>
-- !query output
2011-11-10
2011-11-10 23:59:58

-- !query
select '2011-11-11' - interval '2' day
-- !query schema
struct<2011-11-11 - INTERVAL '2 days':string>
struct<2011-11-11 - INTERVAL '2' DAY:string>
-- !query output
2011-11-09 00:00:00

@ -359,7 +359,7 @@ struct<2011-11-11 - INTERVAL '2 days':string>
-- !query
select '2011-11-11 11:11:11' - interval '2' second
-- !query schema
struct<2011-11-11 11:11:11 - INTERVAL '2 seconds':string>
struct<2011-11-11 11:11:11 - INTERVAL '02' SECOND:string>
-- !query output
2011-11-11 11:11:09

@ -367,7 +367,7 @@ struct<2011-11-11 11:11:11 - INTERVAL '2 seconds':string>
-- !query
select '1' - interval '2' second
-- !query schema
struct<1 - INTERVAL '2 seconds':string>
struct<1 - INTERVAL '02' SECOND:string>
-- !query output
NULL

@ -378,7 +378,7 @@ select 1 - interval '2' second
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve '1 + (- INTERVAL '2 seconds')' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, '1' is of int type.; line 1 pos 7
cannot resolve '1 + (- INTERVAL '02' SECOND)' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, '1' is of int type.; line 1 pos 7

-- !query

@ -1478,7 +1478,7 @@ NULL
-- !query
select to_timestamp_ntz('2021-06-25 10:11:12') + interval 2 day
-- !query schema
struct<to_timestamp_ntz(2021-06-25 10:11:12) + INTERVAL '2 days':timestamp_ntz>
struct<to_timestamp_ntz(2021-06-25 10:11:12) + INTERVAL '2' DAY:timestamp_ntz>
-- !query output
2021-06-27 10:11:12

@ -1550,7 +1550,7 @@ struct<to_timestamp_ntz(2021-06-25 10:11:12) + INTERVAL '20 15:40:32.998999' DAY
-- !query
select to_timestamp_ntz('2021-06-25 10:11:12') - interval 2 day
-- !query schema
struct<to_timestamp_ntz(2021-06-25 10:11:12) - INTERVAL '2 days':timestamp_ntz>
struct<to_timestamp_ntz(2021-06-25 10:11:12) - INTERVAL '2' DAY:timestamp_ntz>
-- !query output
2021-06-23 10:11:12
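The datetime.sql diffs above show a user-visible typing change: `DATE '2011-11-11' + INTERVAL '02' SECOND` is now typed `timestamp` and yields `2011-11-11 00:00:02`, where the old `CalendarIntervalType` result stayed a `date` and dropped the sub-day part. A plain-Python analogue of the two behaviors (the Python code only models the semantics; it does not use Spark):

```python
from datetime import date, datetime, timedelta

d = date(2011, 11, 11)

# Old typing: the result stayed a DATE, so a 2-second interval changed nothing.
old_result = d + timedelta(seconds=2)  # timedelta has 0 whole days -> same date

# New typing: adding a day-time interval promotes DATE to TIMESTAMP.
new_result = datetime.combine(d, datetime.min.time()) + timedelta(seconds=2)
print(new_result)  # 2011-11-11 00:00:02
```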
@ -16,7 +16,7 @@ select cast(1 as tinyint) + interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS TINYINT) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS TINYINT)' is of tinyint type.; line 1 pos 7
cannot resolve 'CAST(1 AS TINYINT) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS TINYINT)' is of tinyint type.; line 1 pos 7

-- !query

@ -25,7 +25,7 @@ select cast(1 as smallint) + interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS SMALLINT) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS SMALLINT)' is of smallint type.; line 1 pos 7
cannot resolve 'CAST(1 AS SMALLINT) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS SMALLINT)' is of smallint type.; line 1 pos 7

-- !query

@ -34,7 +34,7 @@ select cast(1 as int) + interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS INT) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS INT)' is of int type.; line 1 pos 7
cannot resolve 'CAST(1 AS INT) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS INT)' is of int type.; line 1 pos 7

-- !query

@ -43,7 +43,7 @@ select cast(1 as bigint) + interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS BIGINT) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS BIGINT)' is of bigint type.; line 1 pos 7
cannot resolve 'CAST(1 AS BIGINT) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS BIGINT)' is of bigint type.; line 1 pos 7

-- !query

@ -52,7 +52,7 @@ select cast(1 as float) + interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS FLOAT) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS FLOAT)' is of float type.; line 1 pos 7
cannot resolve 'CAST(1 AS FLOAT) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS FLOAT)' is of float type.; line 1 pos 7

-- !query

@ -61,7 +61,7 @@ select cast(1 as double) + interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS DOUBLE) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS DOUBLE)' is of double type.; line 1 pos 7
cannot resolve 'CAST(1 AS DOUBLE) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS DOUBLE)' is of double type.; line 1 pos 7

-- !query

@ -70,13 +70,13 @@ select cast(1 as decimal(10, 0)) + interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS DECIMAL(10,0)) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS DECIMAL(10,0))' is of decimal(10,0) type.; line 1 pos 7
cannot resolve 'CAST(1 AS DECIMAL(10,0)) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS DECIMAL(10,0))' is of decimal(10,0) type.; line 1 pos 7

-- !query
select cast('2017-12-11' as string) + interval 2 day
-- !query schema
struct<CAST(2017-12-11 AS STRING) + INTERVAL '2 days':string>
struct<CAST(2017-12-11 AS STRING) + INTERVAL '2' DAY:string>
-- !query output
2017-12-13 00:00:00

@ -84,7 +84,7 @@ struct<CAST(2017-12-11 AS STRING) + INTERVAL '2 days':string>
-- !query
select cast('2017-12-11 09:30:00' as string) + interval 2 day
-- !query schema
struct<CAST(2017-12-11 09:30:00 AS STRING) + INTERVAL '2 days':string>
struct<CAST(2017-12-11 09:30:00 AS STRING) + INTERVAL '2' DAY:string>
-- !query output
2017-12-13 09:30:00

@ -95,7 +95,7 @@ select cast('1' as binary) + interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST('1' AS BINARY) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST('1' AS BINARY)' is of binary type.; line 1 pos 7
cannot resolve 'CAST('1' AS BINARY) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST('1' AS BINARY)' is of binary type.; line 1 pos 7

-- !query

@ -104,13 +104,13 @@ select cast(1 as boolean) + interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS BOOLEAN) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS BOOLEAN)' is of boolean type.; line 1 pos 7
cannot resolve 'CAST(1 AS BOOLEAN) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS BOOLEAN)' is of boolean type.; line 1 pos 7

-- !query
select cast('2017-12-11 09:30:00.0' as timestamp) + interval 2 day
-- !query schema
struct<CAST(2017-12-11 09:30:00.0 AS TIMESTAMP) + INTERVAL '2 days':timestamp>
struct<CAST(2017-12-11 09:30:00.0 AS TIMESTAMP) + INTERVAL '2' DAY:timestamp>
-- !query output
2017-12-13 09:30:00

@ -118,7 +118,7 @@ struct<CAST(2017-12-11 09:30:00.0 AS TIMESTAMP) + INTERVAL '2 days':timestamp>
-- !query
select cast('2017-12-11 09:30:00' as date) + interval 2 day
-- !query schema
struct<CAST(2017-12-11 09:30:00 AS DATE) + INTERVAL '2 days':date>
struct<date_add(CAST(2017-12-11 09:30:00 AS DATE), extractansiintervaldays(INTERVAL '2' DAY)):date>
-- !query output
2017-12-13
@ -129,7 +129,7 @@ select interval 2 day + cast(1 as tinyint)
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS TINYINT) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS TINYINT)' is of tinyint type.; line 1 pos 7
cannot resolve 'CAST(1 AS TINYINT) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS TINYINT)' is of tinyint type.; line 1 pos 7

-- !query

@ -138,7 +138,7 @@ select interval 2 day + cast(1 as smallint)
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS SMALLINT) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS SMALLINT)' is of smallint type.; line 1 pos 7
cannot resolve 'CAST(1 AS SMALLINT) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS SMALLINT)' is of smallint type.; line 1 pos 7

-- !query

@ -147,7 +147,7 @@ select interval 2 day + cast(1 as int)
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS INT) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS INT)' is of int type.; line 1 pos 7
cannot resolve 'CAST(1 AS INT) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS INT)' is of int type.; line 1 pos 7

-- !query

@ -156,7 +156,7 @@ select interval 2 day + cast(1 as bigint)
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS BIGINT) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS BIGINT)' is of bigint type.; line 1 pos 7
cannot resolve 'CAST(1 AS BIGINT) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS BIGINT)' is of bigint type.; line 1 pos 7

-- !query

@ -165,7 +165,7 @@ select interval 2 day + cast(1 as float)
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS FLOAT) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS FLOAT)' is of float type.; line 1 pos 7
cannot resolve 'CAST(1 AS FLOAT) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS FLOAT)' is of float type.; line 1 pos 7

-- !query

@ -174,7 +174,7 @@ select interval 2 day + cast(1 as double)
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS DOUBLE) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS DOUBLE)' is of double type.; line 1 pos 7
cannot resolve 'CAST(1 AS DOUBLE) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS DOUBLE)' is of double type.; line 1 pos 7

-- !query

@ -183,13 +183,13 @@ select interval 2 day + cast(1 as decimal(10, 0))
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS DECIMAL(10,0)) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS DECIMAL(10,0))' is of decimal(10,0) type.; line 1 pos 7
cannot resolve 'CAST(1 AS DECIMAL(10,0)) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS DECIMAL(10,0))' is of decimal(10,0) type.; line 1 pos 7

-- !query
select interval 2 day + cast('2017-12-11' as string)
-- !query schema
struct<CAST(2017-12-11 AS STRING) + INTERVAL '2 days':string>
struct<CAST(2017-12-11 AS STRING) + INTERVAL '2' DAY:string>
-- !query output
2017-12-13 00:00:00

@ -197,7 +197,7 @@ struct<CAST(2017-12-11 AS STRING) + INTERVAL '2 days':string>
-- !query
select interval 2 day + cast('2017-12-11 09:30:00' as string)
-- !query schema
struct<CAST(2017-12-11 09:30:00 AS STRING) + INTERVAL '2 days':string>
struct<CAST(2017-12-11 09:30:00 AS STRING) + INTERVAL '2' DAY:string>
-- !query output
2017-12-13 09:30:00

@ -208,7 +208,7 @@ select interval 2 day + cast('1' as binary)
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST('1' AS BINARY) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST('1' AS BINARY)' is of binary type.; line 1 pos 7
cannot resolve 'CAST('1' AS BINARY) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST('1' AS BINARY)' is of binary type.; line 1 pos 7

-- !query

@ -217,13 +217,13 @@ select interval 2 day + cast(1 as boolean)
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS BOOLEAN) + INTERVAL '2 days'' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS BOOLEAN)' is of boolean type.; line 1 pos 7
cannot resolve 'CAST(1 AS BOOLEAN) + INTERVAL '2' DAY' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS BOOLEAN)' is of boolean type.; line 1 pos 7

-- !query
select interval 2 day + cast('2017-12-11 09:30:00.0' as timestamp)
-- !query schema
struct<CAST(2017-12-11 09:30:00.0 AS TIMESTAMP) + INTERVAL '2 days':timestamp>
struct<CAST(2017-12-11 09:30:00.0 AS TIMESTAMP) + INTERVAL '2' DAY:timestamp>
-- !query output
2017-12-13 09:30:00

@ -231,7 +231,7 @@ struct<CAST(2017-12-11 09:30:00.0 AS TIMESTAMP) + INTERVAL '2 days':timestamp>
-- !query
select interval 2 day + cast('2017-12-11 09:30:00' as date)
-- !query schema
struct<CAST(2017-12-11 09:30:00 AS DATE) + INTERVAL '2 days':date>
struct<date_add(CAST(2017-12-11 09:30:00 AS DATE), extractansiintervaldays(INTERVAL '2' DAY)):date>
-- !query output
2017-12-13
@ -242,7 +242,7 @@ select cast(1 as tinyint) - interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS TINYINT) + (- INTERVAL '2 days')' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS TINYINT)' is of tinyint type.; line 1 pos 7
cannot resolve 'CAST(1 AS TINYINT) + (- INTERVAL '2' DAY)' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS TINYINT)' is of tinyint type.; line 1 pos 7

-- !query

@ -251,7 +251,7 @@ select cast(1 as smallint) - interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS SMALLINT) + (- INTERVAL '2 days')' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS SMALLINT)' is of smallint type.; line 1 pos 7
cannot resolve 'CAST(1 AS SMALLINT) + (- INTERVAL '2' DAY)' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS SMALLINT)' is of smallint type.; line 1 pos 7

-- !query

@ -260,7 +260,7 @@ select cast(1 as int) - interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS INT) + (- INTERVAL '2 days')' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS INT)' is of int type.; line 1 pos 7
cannot resolve 'CAST(1 AS INT) + (- INTERVAL '2' DAY)' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS INT)' is of int type.; line 1 pos 7

-- !query

@ -269,7 +269,7 @@ select cast(1 as bigint) - interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS BIGINT) + (- INTERVAL '2 days')' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS BIGINT)' is of bigint type.; line 1 pos 7
cannot resolve 'CAST(1 AS BIGINT) + (- INTERVAL '2' DAY)' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS BIGINT)' is of bigint type.; line 1 pos 7

-- !query

@ -278,7 +278,7 @@ select cast(1 as float) - interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS FLOAT) + (- INTERVAL '2 days')' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS FLOAT)' is of float type.; line 1 pos 7
cannot resolve 'CAST(1 AS FLOAT) + (- INTERVAL '2' DAY)' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS FLOAT)' is of float type.; line 1 pos 7

-- !query

@ -287,7 +287,7 @@ select cast(1 as double) - interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS DOUBLE) + (- INTERVAL '2 days')' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS DOUBLE)' is of double type.; line 1 pos 7
cannot resolve 'CAST(1 AS DOUBLE) + (- INTERVAL '2' DAY)' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS DOUBLE)' is of double type.; line 1 pos 7

-- !query

@ -296,13 +296,13 @@ select cast(1 as decimal(10, 0)) - interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS DECIMAL(10,0)) + (- INTERVAL '2 days')' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS DECIMAL(10,0))' is of decimal(10,0) type.; line 1 pos 7
cannot resolve 'CAST(1 AS DECIMAL(10,0)) + (- INTERVAL '2' DAY)' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS DECIMAL(10,0))' is of decimal(10,0) type.; line 1 pos 7

-- !query
select cast('2017-12-11' as string) - interval 2 day
-- !query schema
struct<CAST(2017-12-11 AS STRING) - INTERVAL '2 days':string>
struct<CAST(2017-12-11 AS STRING) - INTERVAL '2' DAY:string>
-- !query output
2017-12-09 00:00:00

@ -310,7 +310,7 @@ struct<CAST(2017-12-11 AS STRING) - INTERVAL '2 days':string>
-- !query
select cast('2017-12-11 09:30:00' as string) - interval 2 day
-- !query schema
struct<CAST(2017-12-11 09:30:00 AS STRING) - INTERVAL '2 days':string>
struct<CAST(2017-12-11 09:30:00 AS STRING) - INTERVAL '2' DAY:string>
-- !query output
2017-12-09 09:30:00

@ -321,7 +321,7 @@ select cast('1' as binary) - interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST('1' AS BINARY) + (- INTERVAL '2 days')' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST('1' AS BINARY)' is of binary type.; line 1 pos 7
cannot resolve 'CAST('1' AS BINARY) + (- INTERVAL '2' DAY)' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST('1' AS BINARY)' is of binary type.; line 1 pos 7

-- !query

@ -330,13 +330,13 @@ select cast(1 as boolean) - interval 2 day
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'CAST(1 AS BOOLEAN) + (- INTERVAL '2 days')' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS BOOLEAN)' is of boolean type.; line 1 pos 7
cannot resolve 'CAST(1 AS BOOLEAN) + (- INTERVAL '2' DAY)' due to data type mismatch: argument 1 requires (timestamp or timestamp without time zone) type, however, 'CAST(1 AS BOOLEAN)' is of boolean type.; line 1 pos 7

-- !query
select cast('2017-12-11 09:30:00.0' as timestamp) - interval 2 day
-- !query schema
struct<CAST(2017-12-11 09:30:00.0 AS TIMESTAMP) - INTERVAL '2 days':timestamp>
struct<CAST(2017-12-11 09:30:00.0 AS TIMESTAMP) - INTERVAL '2' DAY:timestamp>
-- !query output
2017-12-09 09:30:00

@ -344,6 +344,6 @@ struct<CAST(2017-12-11 09:30:00.0 AS TIMESTAMP) - INTERVAL '2 days':timestamp>
-- !query
select cast('2017-12-11 09:30:00' as date) - interval 2 day
-- !query schema
struct<CAST(2017-12-11 09:30:00 AS DATE) - INTERVAL '2 days':date>
struct<date_add(CAST(2017-12-11 09:30:00 AS DATE), (- extractansiintervaldays(INTERVAL '2' DAY))):date>
-- !query output
2017-12-09
@ -154,7 +154,7 @@ SELECT val_timestamp, udf(cate), avg(val_timestamp) OVER(PARTITION BY udf(cate)
|
|||
RANGE BETWEEN CURRENT ROW AND interval 23 days 4 hours FOLLOWING) FROM testData
|
||||
ORDER BY udf(cate), val_timestamp
|
||||
-- !query schema
|
||||
struct<val_timestamp:timestamp,udf(cate):string,avg(val_timestamp) OVER (PARTITION BY udf(cate) ORDER BY val_timestamp ASC NULLS FIRST RANGE BETWEEN CURRENT ROW AND INTERVAL '23 days 4 hours' FOLLOWING):double>
|
||||
struct<val_timestamp:timestamp,udf(cate):string,avg(val_timestamp) OVER (PARTITION BY udf(cate) ORDER BY val_timestamp ASC NULLS FIRST RANGE BETWEEN CURRENT ROW AND INTERVAL '23 04' DAY TO HOUR FOLLOWING):double>
|
||||
-- !query output
|
||||
NULL NULL NULL
|
||||
2017-07-31 17:00:00 NULL 1.5015456E9
|
||||
|
|
|
@@ -198,7 +198,7 @@ SELECT val_timestamp, cate, avg(val_timestamp) OVER(PARTITION BY cate ORDER BY v
 RANGE BETWEEN CURRENT ROW AND interval 23 days 4 hours FOLLOWING) FROM testData
 ORDER BY cate, val_timestamp
 -- !query schema
-struct<val_timestamp:timestamp,cate:string,avg(val_timestamp) OVER (PARTITION BY cate ORDER BY val_timestamp ASC NULLS FIRST RANGE BETWEEN CURRENT ROW AND INTERVAL '23 days 4 hours' FOLLOWING):double>
+struct<val_timestamp:timestamp,cate:string,avg(val_timestamp) OVER (PARTITION BY cate ORDER BY val_timestamp ASC NULLS FIRST RANGE BETWEEN CURRENT ROW AND INTERVAL '23 04' DAY TO HOUR FOLLOWING):double>
 -- !query output
 NULL	NULL	NULL
 2017-07-31 17:00:00	NULL	1.5015456E9
@@ -216,7 +216,7 @@ SELECT val_timestamp, cate, avg(val_timestamp) OVER(PARTITION BY cate ORDER BY t
 RANGE BETWEEN CURRENT ROW AND interval 23 days 4 hours FOLLOWING) FROM testData
 ORDER BY cate, to_timestamp_ntz(val_timestamp)
 -- !query schema
-struct<val_timestamp:timestamp,cate:string,avg(val_timestamp) OVER (PARTITION BY cate ORDER BY to_timestamp_ntz(val_timestamp) ASC NULLS FIRST RANGE BETWEEN CURRENT ROW AND INTERVAL '23 days 4 hours' FOLLOWING):double>
+struct<val_timestamp:timestamp,cate:string,avg(val_timestamp) OVER (PARTITION BY cate ORDER BY to_timestamp_ntz(val_timestamp) ASC NULLS FIRST RANGE BETWEEN CURRENT ROW AND INTERVAL '23 04' DAY TO HOUR FOLLOWING):double>
 -- !query output
 NULL	NULL	NULL
 2017-07-31 17:00:00	NULL	1.5015456E9
@@ -220,7 +220,7 @@ Output [4]: [d_date_sk#23, d_date#24, d_week_seq#25, d_date_sk#26]
 (34) BroadcastHashJoin [codegen id : 10]
 Left keys [1]: [cs_sold_date_sk#8]
 Right keys [1]: [d_date_sk#23]
-Join condition: (d_date#17 > d_date#24 + 5 days)
+Join condition: (d_date#17 > date_add(d_date#24, 5))

 (35) Project [codegen id : 10]
 Output [7]: [cs_item_sk#4, cs_promo_sk#5, cs_order_number#6, cs_quantity#7, i_item_desc#21, d_week_seq#25, d_date_sk#26]
@@ -290,7 +290,7 @@ Arguments: HashedRelationBroadcastMode(List(cast(input[0, int, false] as bigint)
 (49) BroadcastHashJoin [codegen id : 10]
 Left keys [1]: [cs_ship_date_sk#1]
 Right keys [1]: [d_date_sk#33]
-Join condition: (d_date#34 > d_date#28 + 5 days)
+Join condition: (d_date#34 > date_add(d_date#28, 5))

 (50) Project [codegen id : 10]
 Output [6]: [cs_item_sk#4, cs_promo_sk#5, cs_order_number#6, w_warehouse_name#16, i_item_desc#19, d_week_seq#29]
@@ -220,7 +220,7 @@ Output [4]: [d_date_sk#23, d_date#24, d_week_seq#25, d_date_sk#26]
 (34) BroadcastHashJoin [codegen id : 10]
 Left keys [1]: [cs_sold_date_sk#8]
 Right keys [1]: [d_date_sk#23]
-Join condition: (d_date#17 > d_date#24 + 5 days)
+Join condition: (d_date#17 > date_add(d_date#24, 5))

 (35) Project [codegen id : 10]
 Output [7]: [cs_item_sk#4, cs_promo_sk#5, cs_order_number#6, cs_quantity#7, i_item_desc#21, d_week_seq#25, d_date_sk#26]
@@ -290,7 +290,7 @@ Arguments: HashedRelationBroadcastMode(List(cast(input[0, int, false] as bigint)
 (49) BroadcastHashJoin [codegen id : 10]
 Left keys [1]: [cs_ship_date_sk#1]
 Right keys [1]: [d_date_sk#33]
-Join condition: (d_date#34 > d_date#28 + 5 days)
+Join condition: (d_date#34 > date_add(d_date#28, 5))

 (50) Project [codegen id : 10]
 Output [6]: [cs_item_sk#4, cs_promo_sk#5, cs_order_number#6, w_warehouse_name#16, i_item_desc#19, d_week_seq#29]
@@ -20,6 +20,8 @@ package org.apache.spark.sql
 import java.io.File
 import java.net.{MalformedURLException, URL}
 import java.sql.{Date, Timestamp}
+import java.time.{Duration, Period}
+import java.util.Locale
 import java.util.concurrent.atomic.AtomicBoolean

 import org.apache.commons.io.FileUtils
@@ -1441,30 +1443,52 @@ class SQLQuerySuite extends QueryTest with SharedSparkSession with AdaptiveSpark
   test("SPARK-8753: add interval type") {
     import org.apache.spark.unsafe.types.CalendarInterval

-    val df = sql("select interval 3 years -3 month 7 week 123 microseconds")
-    checkAnswer(df, Row(new CalendarInterval(12 * 3 - 3, 7 * 7, 123 )))
+    val ymDF = sql("select interval 3 years -3 month")
+    checkAnswer(ymDF, Row(Period.of(2, 9, 0)))
     withTempPath(f => {
       // Currently we don't yet support saving out values of interval data type.
       val e = intercept[AnalysisException] {
-        df.write.json(f.getCanonicalPath)
+        ymDF.write.json(f.getCanonicalPath)
       }
       e.message.contains("Cannot save interval data type into external storage")
     })
+
+    val dtDF = sql("select interval 5 days 8 hours 12 minutes 50 seconds")
+    checkAnswer(dtDF, Row(Duration.ofDays(5).plusHours(8).plusMinutes(12).plusSeconds(50)))
+    withTempPath(f => {
+      val e = intercept[AnalysisException] {
+        dtDF.write.json(f.getCanonicalPath)
+      }
+      e.message.contains("Cannot save interval data type into external storage")
+    })
+
+    withSQLConf(SQLConf.LEGACY_INTERVAL_ENABLED.key -> "true") {
+      val df = sql("select interval 3 years -3 month 7 week 123 microseconds")
+      checkAnswer(df, Row(new CalendarInterval(12 * 3 - 3, 7 * 7, 123)))
+      withTempPath(f => {
+        // Currently we don't yet support saving out values of interval data type.
+        val e = intercept[AnalysisException] {
+          df.write.json(f.getCanonicalPath)
+        }
+        e.message.contains("Cannot save interval data type into external storage")
+      })
+    }
   }

   test("SPARK-8945: add and subtract expressions for interval type") {
-    val df = sql("select interval 3 years -3 month 7 week 123 microseconds as i")
-    checkAnswer(df, Row(new CalendarInterval(12 * 3 - 3, 7 * 7, 123)))
+    withSQLConf(SQLConf.LEGACY_INTERVAL_ENABLED.key -> "true") {
+      val df = sql("select interval 3 years -3 month 7 week 123 microseconds as i")
+      checkAnswer(df, Row(new CalendarInterval(12 * 3 - 3, 7 * 7, 123)))

-    checkAnswer(df.select(df("i") + new CalendarInterval(2, 1, 123)),
-      Row(new CalendarInterval(12 * 3 - 3 + 2, 7 * 7 + 1, 123 + 123)))
+      checkAnswer(df.select(df("i") + new CalendarInterval(2, 1, 123)),
+        Row(new CalendarInterval(12 * 3 - 3 + 2, 7 * 7 + 1, 123 + 123)))

-    checkAnswer(df.select(df("i") - new CalendarInterval(2, 1, 123)),
-      Row(new CalendarInterval(12 * 3 - 3 - 2, 7 * 7 - 1, 123 - 123)))
+      checkAnswer(df.select(df("i") - new CalendarInterval(2, 1, 123)),
+        Row(new CalendarInterval(12 * 3 - 3 - 2, 7 * 7 - 1, 123 - 123)))

-    // unary minus
-    checkAnswer(df.select(-df("i")),
-      Row(new CalendarInterval(-(12 * 3 - 3), -7 * 7, -123)))
+      // unary minus
+      checkAnswer(df.select(-df("i")),
+        Row(new CalendarInterval(-(12 * 3 - 3), -7 * 7, -123)))
+    }
   }

   test("aggregation with codegen updates peak execution memory") {
@@ -4011,18 +4035,102 @@ class SQLQuerySuite extends QueryTest with SharedSparkSession with AdaptiveSpark
   }

   test("SPARK-35737: Parse day-time interval literals to tightest types") {
+    import DayTimeIntervalType._
     val dayToSecDF = spark.sql("SELECT INTERVAL '13 02:02:10' DAY TO SECOND")
-    assert(dayToSecDF.schema.head.dataType === DayTimeIntervalType(0, 3))
+    assert(dayToSecDF.schema.head.dataType === DayTimeIntervalType(DAY, SECOND))
     val dayToMinuteDF = spark.sql("SELECT INTERVAL '-2 13:00' DAY TO MINUTE")
-    assert(dayToMinuteDF.schema.head.dataType === DayTimeIntervalType(0, 2))
+    assert(dayToMinuteDF.schema.head.dataType === DayTimeIntervalType(DAY, MINUTE))
     val dayToHourDF = spark.sql("SELECT INTERVAL '0 15' DAY TO HOUR")
-    assert(dayToHourDF.schema.head.dataType === DayTimeIntervalType(0, 1))
+    assert(dayToHourDF.schema.head.dataType === DayTimeIntervalType(DAY, HOUR))
+    val dayDF = spark.sql("SELECT INTERVAL '23' DAY")
+    assert(dayDF.schema.head.dataType === DayTimeIntervalType(DAY))
     val hourToSecDF = spark.sql("SELECT INTERVAL '00:21:02.03' HOUR TO SECOND")
-    assert(hourToSecDF.schema.head.dataType === DayTimeIntervalType(1, 3))
+    assert(hourToSecDF.schema.head.dataType === DayTimeIntervalType(HOUR, SECOND))
     val hourToMinuteDF = spark.sql("SELECT INTERVAL '01:02' HOUR TO MINUTE")
-    assert(hourToMinuteDF.schema.head.dataType === DayTimeIntervalType(1, 2))
+    assert(hourToMinuteDF.schema.head.dataType === DayTimeIntervalType(HOUR, MINUTE))
+    val hourDF1 = spark.sql("SELECT INTERVAL '17' HOUR")
+    assert(hourDF1.schema.head.dataType === DayTimeIntervalType(HOUR))
     val minuteToSecDF = spark.sql("SELECT INTERVAL '10:03.775808000' MINUTE TO SECOND")
-    assert(minuteToSecDF.schema.head.dataType === DayTimeIntervalType(2, 3))
+    assert(minuteToSecDF.schema.head.dataType === DayTimeIntervalType(MINUTE, SECOND))
+    val minuteDF1 = spark.sql("SELECT INTERVAL '03' MINUTE")
+    assert(minuteDF1.schema.head.dataType === DayTimeIntervalType(MINUTE))
+    val secondDF1 = spark.sql("SELECT INTERVAL '11' SECOND")
+    assert(secondDF1.schema.head.dataType === DayTimeIntervalType(SECOND))
+
+    // Seconds greater than 1 minute
+    val secondDF2 = spark.sql("SELECT INTERVAL '75' SECONDS")
+    assert(secondDF2.schema.head.dataType === DayTimeIntervalType(SECOND))
+
+    // Minutes and seconds greater than 1 hour
+    val minuteDF2 = spark.sql("SELECT INTERVAL '68' MINUTES")
+    assert(minuteDF2.schema.head.dataType === DayTimeIntervalType(MINUTE))
+    val secondDF3 = spark.sql("SELECT INTERVAL '11112' SECONDS")
+    assert(secondDF3.schema.head.dataType === DayTimeIntervalType(SECOND))
+
+    // Hours, minutes and seconds greater than 1 day
+    val hourDF2 = spark.sql("SELECT INTERVAL '27' HOURS")
+    assert(hourDF2.schema.head.dataType === DayTimeIntervalType(HOUR))
+    val minuteDF3 = spark.sql("SELECT INTERVAL '2883' MINUTES")
+    assert(minuteDF3.schema.head.dataType === DayTimeIntervalType(MINUTE))
+    val secondDF4 = spark.sql("SELECT INTERVAL '266582' SECONDS")
+    assert(secondDF4.schema.head.dataType === DayTimeIntervalType(SECOND))
   }
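In the hunk above, the numeric field arguments are replaced by named constants; the replaced pairs imply the byte values `DAY = 0`, `HOUR = 1`, `MINUTE = 2`, `SECOND = 3` for `DayTimeIntervalType`'s start/end fields. A minimal sketch of that correspondence (values read off the asserts in this hunk, not a Spark API):

```python
# Field byte constants implied by the replaced assert pairs above:
# DayTimeIntervalType(0, 3) became DayTimeIntervalType(DAY, SECOND), etc.
DAY, HOUR, MINUTE, SECOND = 0, 1, 2, 3

# Each (old numeric args) -> (new named args) pair from the hunk.
equivalences = [
    ((0, 3), (DAY, SECOND)),
    ((0, 2), (DAY, MINUTE)),
    ((0, 1), (DAY, HOUR)),
    ((1, 3), (HOUR, SECOND)),
    ((1, 2), (HOUR, MINUTE)),
    ((2, 3), (MINUTE, SECOND)),
]
for old_args, new_args in equivalences:
    assert old_args == new_args
```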
+  test("SPARK-35773: Parse year-month interval literals to tightest types") {
+    import YearMonthIntervalType._
+    val yearToMonthDF = spark.sql("SELECT INTERVAL '2021-06' YEAR TO MONTH")
+    assert(yearToMonthDF.schema.head.dataType === YearMonthIntervalType(YEAR, MONTH))
+    val yearDF = spark.sql("SELECT INTERVAL '2022' YEAR")
+    assert(yearDF.schema.head.dataType === YearMonthIntervalType(YEAR))
+    val monthDF1 = spark.sql("SELECT INTERVAL '08' MONTH")
+    assert(monthDF1.schema.head.dataType === YearMonthIntervalType(MONTH))
+    // Months greater than 1 year
+    val monthDF2 = spark.sql("SELECT INTERVAL '25' MONTHS")
+    assert(monthDF2.schema.head.dataType === YearMonthIntervalType(MONTH))
+  }
+
+  test("SPARK-35749: Parse multiple unit fields interval literals as day-time interval types") {
+    def evalAsSecond(query: String): Long = {
+      spark.sql(query).map(_.getAs[Duration](0)).collect.head.getSeconds
+    }
+
+    Seq(
+      ("SELECT INTERVAL '7' DAY", 604800),
+      ("SELECT INTERVAL '5' HOUR", 18000),
+      ("SELECT INTERVAL '2' MINUTE", 120),
+      ("SELECT INTERVAL '30' SECOND", 30),
+      ("SELECT INTERVAL '10' DAY '20' HOUR '30' MINUTE '40' SECOND", 937840),
+      // Units end with 's'
+      ("SELECT INTERVAL '2' DAYS '18' HOURS '34' MINUTES '53' SECONDS", 239693),
+      // A unit occurs more than one time
+      ("SELECT INTERVAL '1' DAY '23' HOURS '3' DAYS '70' MINUTES " +
+        "'5' SECONDS '24' HOURS '10' MINUTES '80' SECONDS", 519685)
+    ).foreach { case (query, expect) =>
+      assert(evalAsSecond(query) === expect)
+      // Units are lower case
+      assert(evalAsSecond(query.toLowerCase(Locale.ROOT)) === expect)
+    }
+  }
+
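The expected values in the day-time test above are plain unit arithmetic: each unit's occurrences are summed, then everything is converted to seconds (what `Duration.getSeconds` returns). A sketch reproducing the expected totals from the test table (pure arithmetic, no Spark involved):

```python
def seconds(days=0, hours=0, minutes=0, secs=0):
    """Total seconds of a day-time unit list, as Duration.getSeconds reports it."""
    return days * 86400 + hours * 3600 + minutes * 60 + secs

# INTERVAL '10' DAY '20' HOUR '30' MINUTE '40' SECOND
assert seconds(10, 20, 30, 40) == 937840
# INTERVAL '2' DAYS '18' HOURS '34' MINUTES '53' SECONDS
assert seconds(2, 18, 34, 53) == 239693
# Repeated units are summed per unit first:
# '1' DAY '23' HOURS '3' DAYS '70' MINUTES '5' SECONDS '24' HOURS '10' MINUTES '80' SECONDS
assert seconds(days=1 + 3, hours=23 + 24, minutes=70 + 10, secs=5 + 80) == 519685
```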
+  test("SPARK-35749: Parse multiple unit fields interval literals as year-month interval types") {
+    def evalAsYearAndMonth(query: String): (Int, Int) = {
+      val result = spark.sql(query).map(_.getAs[Period](0)).collect.head
+      (result.getYears, result.getMonths)
+    }
+
+    Seq(
+      ("SELECT INTERVAL '10' YEAR", (10, 0)),
+      ("SELECT INTERVAL '7' MONTH", (0, 7)),
+      ("SELECT INTERVAL '8' YEAR '3' MONTH", (8, 3)),
+      // Units end with 's'
+      ("SELECT INTERVAL '5' YEARS '10' MONTHS", (5, 10)),
+      // A unit appears more than one time
+      ("SELECT INTERVAL '3' YEARS '5' MONTHS '1' YEAR '8' MONTHS", (5, 1))
+    ).foreach { case (query, expect) =>
+      assert(evalAsYearAndMonth(query) === expect)
+      // Units are lower case
+      assert(evalAsYearAndMonth(query.toLowerCase(Locale.ROOT)) === expect)
+    }
+  }
+
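The year-month expectations follow the same pattern: year and month values are summed per unit and the month overflow is carried into years, matching what the test reads back via `Period.getYears`/`Period.getMonths`. A small sketch of that normalization:

```python
def year_month(years=0, months=0):
    """Collapse a year-month unit list to (years, months), carrying month overflow."""
    total_months = years * 12 + months
    return total_months // 12, total_months % 12

assert year_month(10, 0) == (10, 0)
assert year_month(8, 3) == (8, 3)
# Repeated units: '3' YEARS '5' MONTHS '1' YEAR '8' MONTHS -> 4 years 13 months
assert year_month(3 + 1, 5 + 8) == (5, 1)
```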
   test("SPARK-35937: Extract date field from timestamp should work in ANSI mode") {
@@ -25,6 +25,7 @@ import org.apache.spark.sql.catalyst.plans.logical.{AppendData, LogicalPlan}
 import org.apache.spark.sql.connector.catalog.{Identifier, InMemoryTableCatalog}
 import org.apache.spark.sql.execution.QueryExecution
 import org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation
+import org.apache.spark.sql.internal.SQLConf
 import org.apache.spark.sql.types.StructType
 import org.apache.spark.sql.util.QueryExecutionListener
@@ -171,20 +172,22 @@ class DataSourceV2DataFrameSuite
   }

   test("Cannot write data with intervals to v2") {
-    withTable("testcat.table_name") {
-      val testCatalog = spark.sessionState.catalogManager.catalog("testcat").asTableCatalog
-      testCatalog.createTable(
-        Identifier.of(Array(), "table_name"),
-        new StructType().add("i", "interval"),
-        Array.empty, Collections.emptyMap[String, String])
-      val df = sql("select interval 1 day as i")
-      val v2Writer = df.writeTo("testcat.table_name")
-      val e1 = intercept[AnalysisException](v2Writer.append())
-      assert(e1.getMessage.contains(s"Cannot use interval type in the table schema."))
-      val e2 = intercept[AnalysisException](v2Writer.overwrite(df("i")))
-      assert(e2.getMessage.contains(s"Cannot use interval type in the table schema."))
-      val e3 = intercept[AnalysisException](v2Writer.overwritePartitions())
-      assert(e3.getMessage.contains(s"Cannot use interval type in the table schema."))
+    withSQLConf(SQLConf.LEGACY_INTERVAL_ENABLED.key -> "true") {
+      withTable("testcat.table_name") {
+        val testCatalog = spark.sessionState.catalogManager.catalog("testcat").asTableCatalog
+        testCatalog.createTable(
+          Identifier.of(Array(), "table_name"),
+          new StructType().add("i", "interval"),
+          Array.empty, Collections.emptyMap[String, String])
+        val df = sql(s"select interval 1 millisecond as i")
+        val v2Writer = df.writeTo("testcat.table_name")
+        val e1 = intercept[AnalysisException](v2Writer.append())
+        assert(e1.getMessage.contains(s"Cannot use interval type in the table schema."))
+        val e2 = intercept[AnalysisException](v2Writer.overwrite(df("i")))
+        assert(e2.getMessage.contains(s"Cannot use interval type in the table schema."))
+        val e3 = intercept[AnalysisException](v2Writer.overwritePartitions())
+        assert(e3.getMessage.contains(s"Cannot use interval type in the table schema."))
+      }
     }
   }
@@ -663,11 +663,34 @@ class HiveThriftBinaryServerSuite extends HiveThriftServer2Test {

   test("Support interval type") {
     withJdbcStatement() { statement =>
-      val rs = statement.executeQuery("SELECT interval 3 months 1 hours")
+      val rs = statement.executeQuery("SELECT interval 5 years 7 months")
       assert(rs.next())
-      assert(rs.getString(1) === "3 months 1 hours")
+      assert(rs.getString(1) === "5-7")
     }
+    withJdbcStatement() { statement =>
+      val rs = statement.executeQuery("SELECT interval 8 days 10 hours 5 minutes 10 seconds")
+      assert(rs.next())
+      assert(rs.getString(1) === "8 10:05:10.000000000")
+    }
+    withJdbcStatement() { statement =>
+      val rs = statement.executeQuery("SELECT interval 3 days 1 hours")
+      assert(rs.next())
+      assert(rs.getString(1) === "3 01:00:00.000000000")
+    }
+
     // Invalid interval value
+    withJdbcStatement() { statement =>
+      val e = intercept[SQLException] {
+        statement.executeQuery("SELECT interval 5 yea 7 months")
+      }
+      assert(e.getMessage.contains("org.apache.spark.sql.catalyst.parser.ParseException"))
+    }
+    withJdbcStatement() { statement =>
+      val e = intercept[SQLException] {
+        statement.executeQuery("SELECT interval 8 days 10 hours 5 minutes 10 secon")
+      }
+      assert(e.getMessage.contains("org.apache.spark.sql.catalyst.parser.ParseException"))
+    }
     withJdbcStatement() { statement =>
       val e = intercept[SQLException] {
         statement.executeQuery("SELECT interval 3 months 1 hou")
@@ -715,8 +738,8 @@ class HiveThriftBinaryServerSuite extends HiveThriftServer2Test {
           | AND v1.c = v2.c
           |""".stripMargin)
       while (rs.next()) {
-        assert(rs.getString("a1") === "1 days")
-        assert(rs.getString("a2") === "1 days")
+        assert(rs.getString("a1") === "1 00:00:00.000000000")
+        assert(rs.getString("a2") === "1 00:00:00.000000000")
         assert(rs.getString("b1") === "2-1")
         assert(rs.getString("b2") === "2-1")
         assert(rs.getString("c1") === "3 01:01:01.000000000")
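The JDBC strings expected above switch from the legacy `CalendarInterval` rendering (`"3 months 1 hours"`, `"1 days"`) to the ANSI interval string forms: `Y-M` for year-month and `D HH:MM:SS.NNNNNNNNN` for day-time. A sketch of that layout as implied by the asserts (an assumed re-derivation, not Spark's actual formatter):

```python
def ansi_year_month(years, months):
    """ANSI YEAR TO MONTH string form, e.g. interval 5 years 7 months -> '5-7'."""
    return f"{years}-{months}"

def ansi_day_time(days, hours, minutes, secs, nanos=0):
    """ANSI DAY TO SECOND string form with nanosecond precision."""
    return f"{days} {hours:02d}:{minutes:02d}:{secs:02d}.{nanos:09d}"

assert ansi_year_month(5, 7) == "5-7"
assert ansi_day_time(8, 10, 5, 10) == "8 10:05:10.000000000"
# The legacy "1 days" rendering becomes:
assert ansi_day_time(1, 0, 0, 0) == "1 00:00:00.000000000"
```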
@@ -24,6 +24,7 @@ import org.apache.hive.service.cli.HiveSQLException

 import org.apache.spark.SPARK_VERSION
 import org.apache.spark.sql.catalyst.analysis.FunctionRegistry
+import org.apache.spark.sql.internal.SQLConf
 import org.apache.spark.sql.types._
 import org.apache.spark.util.VersionUtils
@@ -355,6 +356,7 @@ class SparkMetadataOperationSuite extends HiveThriftServer2TestBase {
     val ddl = s"CREATE GLOBAL TEMP VIEW $viewName as select interval 1 day as i"

     withJdbcStatement(viewName) { statement =>
+      statement.execute(s"SET ${SQLConf.LEGACY_INTERVAL_ENABLED.key}=true")
       statement.execute(ddl)
       val data = statement.getConnection.getMetaData
       val rowSet = data.getColumns("", "global_temp", viewName, null)
@@ -362,14 +362,26 @@ class SparkThriftServerProtocolVersionsSuite extends HiveThriftServer2TestBase {
     }

     test(s"$version get interval type") {
-      testExecuteStatementWithProtocolVersion(version, "SELECT interval '1' year '2' day") { rs =>
+      testExecuteStatementWithProtocolVersion(version,
+        "SELECT interval '1' year '2' month") { rs =>
         assert(rs.next())
-        assert(rs.getString(1) === "1 years 2 days")
+        assert(rs.getString(1) === "1-2")
         val metaData = rs.getMetaData
-        assert(metaData.getColumnName(1) === "INTERVAL '1 years 2 days'")
-        assert(metaData.getColumnTypeName(1) === "string")
-        assert(metaData.getColumnType(1) === java.sql.Types.VARCHAR)
-        assert(metaData.getPrecision(1) === Int.MaxValue)
+        assert(metaData.getColumnName(1) === "INTERVAL '1-2' YEAR TO MONTH")
+        assert(metaData.getColumnTypeName(1) === "interval_year_month")
+        assert(metaData.getColumnType(1) === java.sql.Types.OTHER)
+        assert(metaData.getPrecision(1) === 11)
         assert(metaData.getScale(1) === 0)
       }
+      testExecuteStatementWithProtocolVersion(version,
+        "SELECT interval '1' day '2' hour '3' minute '4.005006' second") { rs =>
+        assert(rs.next())
+        assert(rs.getString(1) === "1 02:03:04.005006000")
+        val metaData = rs.getMetaData
+        assert(metaData.getColumnName(1) === "INTERVAL '1 02:03:04.005006' DAY TO SECOND")
+        assert(metaData.getColumnTypeName(1) === "interval_day_time")
+        assert(metaData.getColumnType(1) === java.sql.Types.OTHER)
+        assert(metaData.getPrecision(1) === 29)
+        assert(metaData.getScale(1) === 0)
+      }
     }