[SPARK-39272][SQL] Increase the start position of query context by 1
### What changes were proposed in this pull request?
Increase the start position of the query context by 1.
### Why are the changes needed?
Currently, the line number starts from 1, while the start position starts from 0. Increasing the start position by 1 makes both values 1-based, for consistency.
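As an illustration (a hypothetical sketch, not code from this patch): for the first query in the `date.sql.out` diffs, the highlighted fragment begins at character offset 7 when counting from 0, so under the new 1-based convention the reported start position becomes 8, matching how line numbers are already counted:

```python
# Illustrative sketch of the convention change; the query string is taken
# from the golden-file diffs in this PR.
query = 'select next_day("xx", "Mon")'
fragment = 'next_day("xx", "Mon")'

zero_based = query.index(fragment)  # old convention: 0-based offset -> 7
one_based = zero_based + 1          # new convention: 1-based, like line numbers -> 8

print(f"position {zero_based} (0-based) becomes {one_based} (1-based)")
```

This is why every `== SQL(line 1, position 7) ==` in the golden files below changes to `== SQL(line 1, position 8) ==`.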
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
Unit tests.
Closes #36651 from gengliangwang/increase1.
Authored-by: Gengliang Wang <[email protected]>
Signed-off-by: Gengliang Wang <[email protected]>
`sql/core/src/test/resources/sql-tests/results/ansi/date.sql.out` (3 additions, 3 deletions)

```diff
@@ -233,7 +233,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkDateTimeException
 [CAST_INVALID_INPUT] The value 'xx' of the type "STRING" cannot be cast to "DATE" because it is malformed. Correct the value as per the syntax, or change its target type. To return NULL instead, use `try_cast`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select next_day("xx", "Mon")
        ^^^^^^^^^^^^^^^^^^^^^
@@ -328,7 +328,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkNumberFormatException
 [CAST_INVALID_INPUT] The value '1.2' of the type "STRING" cannot be cast to "INT" because it is malformed. Correct the value as per the syntax, or change its target type. To return NULL instead, use `try_cast`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select date_add('2011-11-11', '1.2')
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -439,7 +439,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkNumberFormatException
 [CAST_INVALID_INPUT] The value '1.2' of the type "STRING" cannot be cast to "INT" because it is malformed. Correct the value as per the syntax, or change its target type. To return NULL instead, use `try_cast`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
```
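The caret underline in these outputs starts at the reported position. A small sketch (illustrative only, not Spark code) of mapping the new 1-based position back to a 0-based string index to recover the underlined fragment:

```python
# Query and position taken from the date.sql.out diff above.
query = "select date_add('2011-11-11', '1.2')"
position = 8            # 1-based start position reported after this change

start = position - 1    # convert back to a 0-based string index
fragment = query[start:]
print(fragment)         # the underlined fragment: date_add('2011-11-11', '1.2')
```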
`sql/core/src/test/resources/sql-tests/results/ansi/datetime-parsing-invalid.sql.out` (2 additions, 2 deletions)

```diff
@@ -251,7 +251,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkDateTimeException
 [CAST_INVALID_INPUT] The value 'Unparseable' of the type "STRING" cannot be cast to "TIMESTAMP" because it is malformed. Correct the value as per the syntax, or change its target type. To return NULL instead, use `try_cast`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select cast("Unparseable" as timestamp)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -263,6 +263,6 @@ struct<>
 -- !query output
 org.apache.spark.SparkDateTimeException
 [CAST_INVALID_INPUT] The value 'Unparseable' of the type "STRING" cannot be cast to "DATE" because it is malformed. Correct the value as per the syntax, or change its target type. To return NULL instead, use `try_cast`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
```
`sql/core/src/test/resources/sql-tests/results/ansi/decimalArithmeticOperations.sql.out` (10 additions, 10 deletions)

```diff
@@ -77,7 +77,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkArithmeticException
 [CANNOT_CHANGE_DECIMAL_PRECISION] Decimal(expanded, 10000000000000000000000000000000000000.1, 39, 1) cannot be represented as Decimal(38, 1). If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select (5e36BD + 0.1) + 5e36BD
        ^^^^^^^^^^^^^^^^^^^^^^^
@@ -89,7 +89,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkArithmeticException
 [CANNOT_CHANGE_DECIMAL_PRECISION] Decimal(expanded, -11000000000000000000000000000000000000.1, 39, 1) cannot be represented as Decimal(38, 1). If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select (-4e36BD - 0.1) - 7e36BD
        ^^^^^^^^^^^^^^^^^^^^^^^^
@@ -101,7 +101,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkArithmeticException
 [CANNOT_CHANGE_DECIMAL_PRECISION] Decimal(expanded, 152415787532388367501905199875019052100, 39, 0) cannot be represented as Decimal(38, 2). If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
 [CANNOT_CHANGE_DECIMAL_PRECISION] Decimal(expanded, 1000000000000000000000000000000000000.00000000000000000000000000000000000000, 75, 38) cannot be represented as Decimal(38, 6). If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select 1e35BD / 0.1
        ^^^^^^^^^^^^
@@ -149,7 +149,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkArithmeticException
 [CANNOT_CHANGE_DECIMAL_PRECISION] Decimal(expanded, 10123456789012345678901234567890123456.00000000000000000000000000000000000000, 76, 38) cannot be represented as Decimal(38, 6). If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
 [CANNOT_CHANGE_DECIMAL_PRECISION] Decimal(expanded, 101234567890123456789012345678901234.56000000000000000000000000000000000000, 74, 38) cannot be represented as Decimal(38, 6). If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
 [CANNOT_CHANGE_DECIMAL_PRECISION] Decimal(expanded, 10123456789012345678901234567890123.45600000000000000000000000000000000000, 73, 38) cannot be represented as Decimal(38, 6). If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
 [CANNOT_CHANGE_DECIMAL_PRECISION] Decimal(expanded, 1012345678901234567890123456789012.34560000000000000000000000000000000000, 72, 38) cannot be represented as Decimal(38, 6). If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
 [CANNOT_CHANGE_DECIMAL_PRECISION] Decimal(expanded, 101234567890123456789012345678901.23456000000000000000000000000000000000, 71, 38) cannot be represented as Decimal(38, 6). If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
 [CANNOT_CHANGE_DECIMAL_PRECISION] Decimal(expanded, 101234567890123456789012345678901.23456000000000000000000000000000000000, 71, 38) cannot be represented as Decimal(38, 6). If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
```
`sql/core/src/test/resources/sql-tests/results/ansi/interval.sql.out` (17 additions, 17 deletions)

```diff
@@ -123,7 +123,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkNumberFormatException
 [CAST_INVALID_INPUT] The value 'a' of the type "STRING" cannot be cast to "DOUBLE" because it is malformed. Correct the value as per the syntax, or change its target type. To return NULL instead, use `try_cast`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select interval 2 second * 'a'
        ^^^^^^^^^^^^^^^^^^^^^^^
@@ -135,7 +135,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkNumberFormatException
 [CAST_INVALID_INPUT] The value 'a' of the type "STRING" cannot be cast to "DOUBLE" because it is malformed. Correct the value as per the syntax, or change its target type. To return NULL instead, use `try_cast`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select interval 2 second / 'a'
        ^^^^^^^^^^^^^^^^^^^^^^^
@@ -147,7 +147,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkNumberFormatException
 [CAST_INVALID_INPUT] The value 'a' of the type "STRING" cannot be cast to "DOUBLE" because it is malformed. Correct the value as per the syntax, or change its target type. To return NULL instead, use `try_cast`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select interval 2 year * 'a'
        ^^^^^^^^^^^^^^^^^^^^^
@@ -159,7 +159,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkNumberFormatException
 [CAST_INVALID_INPUT] The value 'a' of the type "STRING" cannot be cast to "DOUBLE" because it is malformed. Correct the value as per the syntax, or change its target type. To return NULL instead, use `try_cast`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select interval 2 year / 'a'
        ^^^^^^^^^^^^^^^^^^^^^
@@ -187,7 +187,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkNumberFormatException
 [CAST_INVALID_INPUT] The value 'a' of the type "STRING" cannot be cast to "DOUBLE" because it is malformed. Correct the value as per the syntax, or change its target type. To return NULL instead, use `try_cast`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select 'a' * interval 2 second
        ^^^^^^^^^^^^^^^^^^^^^^^
@@ -199,7 +199,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkNumberFormatException
 [CAST_INVALID_INPUT] The value 'a' of the type "STRING" cannot be cast to "DOUBLE" because it is malformed. Correct the value as per the syntax, or change its target type. To return NULL instead, use `try_cast`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select 'a' * interval 2 year
        ^^^^^^^^^^^^^^^^^^^^^
@@ -229,7 +229,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkArithmeticException
 [DIVIDE_BY_ZERO] Division by zero. To return NULL instead, use `try_divide`. If necessary set "spark.sql.ansi.enabled" to "false" (except for ANSI interval type) to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select interval '2 seconds' / 0
        ^^^^^^^^^^^^^^^^^^^^^^^^
@@ -265,7 +265,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkArithmeticException
 [DIVIDE_BY_ZERO] Division by zero. To return NULL instead, use `try_divide`. If necessary set "spark.sql.ansi.enabled" to "false" (except for ANSI interval type) to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select interval '2' year / 0
        ^^^^^^^^^^^^^^^^^^^^^
@@ -665,7 +665,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkArithmeticException
 [CANNOT_CHANGE_DECIMAL_PRECISION] Decimal(expanded, 1234567890123456789, 20, 0) cannot be represented as Decimal(18, 6). If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
 [CAST_INVALID_INPUT] The value '4 11:11' of the type "STRING" cannot be cast to "TIMESTAMP" because it is malformed. Correct the value as per the syntax, or change its target type. To return NULL instead, use `try_cast`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select '4 11:11' - interval '4 22:12' day to minute
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -1529,7 +1529,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkDateTimeException
 [CAST_INVALID_INPUT] The value '4 12:12:12' of the type "STRING" cannot be cast to "TIMESTAMP" because it is malformed. Correct the value as per the syntax, or change its target type. To return NULL instead, use `try_cast`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select '4 12:12:12' + interval '4 22:12' day to minute
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -1567,7 +1567,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkDateTimeException
 [CAST_INVALID_INPUT] The value '1' of the type "STRING" cannot be cast to "TIMESTAMP" because it is malformed. Correct the value as per the syntax, or change its target type. To return NULL instead, use `try_cast`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select str - interval '4 22:12' day to minute from interval_view
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -1579,7 +1579,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkDateTimeException
 [CAST_INVALID_INPUT] The value '1' of the type "STRING" cannot be cast to "TIMESTAMP" because it is malformed. Correct the value as per the syntax, or change its target type. To return NULL instead, use `try_cast`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select str + interval '4 22:12' day to minute from interval_view
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -2037,7 +2037,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkArithmeticException
 [ARITHMETIC_OVERFLOW] Overflow in integral divide. To return NULL instead, use 'try_divide'. If necessary set spark.sql.ansi.enabled to "false" (except for ANSI interval type) to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 SELECT (INTERVAL '-178956970-8' YEAR TO MONTH) / -1
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -2049,7 +2049,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkArithmeticException
 [ARITHMETIC_OVERFLOW] Overflow in integral divide. To return NULL instead, use 'try_divide'. If necessary set spark.sql.ansi.enabled to "false" (except for ANSI interval type) to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 SELECT (INTERVAL '-178956970-8' YEAR TO MONTH) / -1L
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -2095,7 +2095,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkArithmeticException
 [ARITHMETIC_OVERFLOW] Overflow in integral divide. To return NULL instead, use 'try_divide'. If necessary set spark.sql.ansi.enabled to "false" (except for ANSI interval type) to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 SELECT (INTERVAL '-106751991 04:00:54.775808' DAY TO SECOND) / -1
 [ARITHMETIC_OVERFLOW] Overflow in integral divide. To return NULL instead, use 'try_divide'. If necessary set spark.sql.ansi.enabled to "false" (except for ANSI interval type) to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 SELECT (INTERVAL '-106751991 04:00:54.775808' DAY TO SECOND) / -1L
```
`sql/core/src/test/resources/sql-tests/results/ansi/map.sql.out` (4 additions, 4 deletions)

```diff
@@ -9,7 +9,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkNoSuchElementException
 [MAP_KEY_DOES_NOT_EXIST] Key 5 does not exist. To return NULL instead, use `try_element_at`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select element_at(map(1, 'a', 2, 'b'), 5)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -21,7 +21,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkNoSuchElementException
 [MAP_KEY_DOES_NOT_EXIST] Key 5 does not exist. To return NULL instead, use `try_element_at`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select map(1, 'a', 2, 'b')[5]
        ^^^^^^^^^^^^^^^^^^^^^^
@@ -115,7 +115,7 @@ struct<>
 -- !query output
 org.apache.spark.SparkNoSuchElementException
 [MAP_KEY_DOES_NOT_EXIST] Key 5 does not exist. To return NULL instead, use `try_element_at`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
-== SQL(line 1, position 7) ==
+== SQL(line 1, position 8) ==
 select element_at(map(1, 'a', 2, 'b'), 5)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -127,6 +127,6 @@ struct<>
 -- !query output
 org.apache.spark.SparkNoSuchElementException
 [MAP_KEY_DOES_NOT_EXIST] Key 'c' does not exist. To return NULL instead, use `try_element_at`. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
```