Commit 7120e6b

[SPARK-45905][SQL] Least common type between decimal types should retain integral digits first
### What changes were proposed in this pull request?

This is kind of a follow-up of #20023. It is simply wrong to cut the decimal precision to 38 when a wider decimal type exceeds the max precision, as that drops integral digits and makes the decimal value very likely to overflow. In #20023, we fixed this issue for arithmetic operations, but many other operations suffer from the same issue: Union, binary comparison, in subquery, create_array, coalesce, etc. This PR fixes all the remaining operators, without the min scale limitation, which should be applied to division and multiplication only according to the SQL Server doc: https://learn.microsoft.com/en-us/sql/t-sql/data-types/precision-scale-and-length-transact-sql?view=sql-server-ver15

### Why are the changes needed?

To produce a reasonable wider decimal type.

### Does this PR introduce _any_ user-facing change?

Yes. The final data type of these operators changes if it is a decimal type whose precision exceeds the max and whose scale is not 0.

### How was this patch tested?

Updated tests.

### Was this patch authored or co-authored using generative AI tooling?

No

Closes #43781 from cloud-fan/decimal.

Lead-authored-by: Wenchen Fan <[email protected]>
Co-authored-by: Wenchen Fan <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
1 parent 6c8bd94 commit 7120e6b

File tree

12 files changed: +191 -214 lines changed

docs/sql-ref-ansi-compliance.md (+19)

@@ -240,6 +240,25 @@ The least common type resolution is used to:
 - Derive the result type for expressions such as the case expression.
 - Derive the element, key, or value types for array and map constructors.
 Special rules are applied if the least common type resolves to FLOAT. With float type values, if any of the types is INT, BIGINT, or DECIMAL the least common type is pushed to DOUBLE to avoid potential loss of digits.
+
+Decimal type is a bit more complicated here, as it is not a simple type but has parameters: precision and scale.
+A `decimal(precision, scale)` means the value can have at most `precision - scale` digits in the integral part and `scale` digits in the fractional part.
+A least common type between decimal types should have enough digits in both the integral and fractional parts to represent all values.
+More precisely, a least common type between `decimal(p1, s1)` and `decimal(p2, s2)` has the scale of `max(s1, s2)` and precision of `max(s1, s2) + max(p1 - s1, p2 - s2)`.
+However, decimal types in Spark have a maximum precision: 38. If the final decimal type needs more precision, we must do truncation.
+Since the digits in the integral part are more significant, Spark truncates the digits in the fractional part first. For example, `decimal(48, 20)` will be reduced to `decimal(38, 10)`.
+
+Note that arithmetic operations have special rules to calculate the least common type for decimal inputs:
+
+| Operation | Result precision                         | Result scale        |
+|-----------|------------------------------------------|---------------------|
+| e1 + e2   | max(s1, s2) + max(p1 - s1, p2 - s2) + 1  | max(s1, s2)         |
+| e1 - e2   | max(s1, s2) + max(p1 - s1, p2 - s2) + 1  | max(s1, s2)         |
+| e1 * e2   | p1 + p2 + 1                              | s1 + s2             |
+| e1 / e2   | p1 - s1 + s2 + max(6, s1 + p2 + 1)       | max(6, s1 + p2 + 1) |
+| e1 % e2   | min(p1 - s1, p2 - s2) + max(s1, s2)      | max(s1, s2)         |
+
+The truncation rule is also different for arithmetic operations: they retain at least 6 digits in the fractional part, which means the scale can only be reduced down to 6. Overflow may happen in this case.
 
 ```sql
 -- The coalesce function accepts any set of argument types as long as they share a least common type.
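For readers who want to check the documented rule by hand, here is a minimal, self-contained sketch of the widening-and-truncation arithmetic in plain Scala. It mirrors the formulas above and is not Spark's actual `DecimalType` API; the input pair is a hypothetical one whose ideal type is the `decimal(48, 20)` from the docs.

```scala
// Least common type between decimal(p1, s1) and decimal(p2, s2), following the
// documented rule: widen first, then drop fractional digits if precision > 38.
val MAX_PRECISION = 38

def leastCommonDecimal(p1: Int, s1: Int, p2: Int, s2: Int): (Int, Int) = {
  val scale = math.max(s1, s2)                       // fractional digits needed
  val precision = scale + math.max(p1 - s1, p2 - s2) // plus integral digits needed
  if (precision <= MAX_PRECISION) (precision, scale)
  else (MAX_PRECISION, math.max(0, scale - (precision - MAX_PRECISION)))
}

// Hypothetical inputs whose ideal type is the docs' decimal(48, 20):
assert(leastCommonDecimal(28, 0, 40, 20) == (38, 10)) // truncated as documented
```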

sql/api/src/main/scala/org/apache/spark/sql/types/DecimalType.scala (+12)

@@ -146,6 +146,18 @@ object DecimalType extends AbstractDataType {
     DecimalType(min(precision, MAX_PRECISION), min(scale, MAX_SCALE))
   }
 
+  private[sql] def boundedPreferIntegralDigits(precision: Int, scale: Int): DecimalType = {
+    if (precision <= MAX_PRECISION) {
+      DecimalType(precision, scale)
+    } else {
+      // If we have to reduce the precision, we should retain the digits in the integral part first,
+      // as they are more significant to the value. Here we reduce the scale as well to drop the
+      // digits in the fractional part.
+      val diff = precision - MAX_PRECISION
+      DecimalType(MAX_PRECISION, math.max(0, scale - diff))
+    }
+  }
+
   private[sql] def checkNegativeScale(scale: Int): Unit = {
     if (scale < 0 && !SqlApiConf.get.allowNegativeScaleOfDecimalEnabled) {
       throw DataTypeErrors.negativeScaleNotAllowedError(scale)
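A few illustrative data points for the new helper. It is `private[sql]`, so these calls would only compile inside Spark's own `sql` packages; the commented results follow from the arithmetic above rather than from running this commit.

```scala
// boundedPreferIntegralDigits(precision, scale), illustrated:
DecimalType.boundedPreferIntegralDigits(30, 10) // decimal(30, 10): within bounds, unchanged
DecimalType.boundedPreferIntegralDigits(48, 20) // decimal(38, 10): 10 fractional digits dropped
DecimalType.boundedPreferIntegralDigits(80, 20) // decimal(38, 0):  scale floors at 0, so the
                                                // integral part may still overflow
```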

sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/DecimalPrecision.scala (+6 -1)

@@ -20,6 +20,7 @@ package org.apache.spark.sql.catalyst.analysis
 import org.apache.spark.sql.catalyst.expressions._
 import org.apache.spark.sql.catalyst.expressions.Literal._
 import org.apache.spark.sql.catalyst.types.DataTypeUtils
+import org.apache.spark.sql.internal.SQLConf
 import org.apache.spark.sql.types._
 
 
@@ -64,7 +65,11 @@ object DecimalPrecision extends TypeCoercionRule {
   def widerDecimalType(p1: Int, s1: Int, p2: Int, s2: Int): DecimalType = {
     val scale = max(s1, s2)
     val range = max(p1 - s1, p2 - s2)
-    DecimalType.bounded(range + scale, scale)
+    if (conf.getConf(SQLConf.LEGACY_RETAIN_FRACTION_DIGITS_FIRST)) {
+      DecimalType.bounded(range + scale, scale)
+    } else {
+      DecimalType.boundedPreferIntegralDigits(range + scale, scale)
+    }
   }
 
   override def transform: PartialFunction[Expression, Expression] = {
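The effect of the switch is easiest to see on the pair that drives most of the test updates below, `decimal(36, 0)` versus `decimal(36, 35)`; a worked comparison (comments only, using the definitions above):

```scala
// widerDecimalType(p1 = 36, s1 = 0, p2 = 36, s2 = 35):
//   scale = max(0, 35)           = 35
//   range = max(36 - 0, 36 - 35) = 36
//   ideal = decimal(range + scale, scale) = decimal(71, 35), which exceeds 38.
//
// Legacy path,  DecimalType.bounded(71, 35):                     decimal(38, 35)
//   -> only 3 integral digits survive; large values overflow.
// Default path, DecimalType.boundedPreferIntegralDigits(71, 35): decimal(38, 2)
//   -> all 36 integral digits survive; 33 fractional digits are dropped.
```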

sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala (+11 -1)

@@ -4550,6 +4550,16 @@ object SQLConf {
       .booleanConf
       .createWithDefault(false)
 
+  val LEGACY_RETAIN_FRACTION_DIGITS_FIRST =
+    buildConf("spark.sql.legacy.decimal.retainFractionDigitsOnTruncate")
+      .internal()
+      .doc("When set to true, we will try to retain the fraction digits first rather than " +
+        "integral digits as prior Spark 4.0, when getting a least common type between decimal " +
+        "types, and the result decimal precision exceeds the max precision.")
+      .version("4.0.0")
+      .booleanConf
+      .createWithDefault(false)
+
   /**
    * Holds information about keys that have been deprecated.
    *
@@ -5434,7 +5444,7 @@ class SQLConf extends Serializable with Logging with SqlApiConf {
   }
 
   def legacyRaiseErrorWithoutErrorClass: Boolean =
-      getConf(SQLConf.LEGACY_RAISE_ERROR_WITHOUT_ERROR_CLASS)
+    getConf(SQLConf.LEGACY_RAISE_ERROR_WITHOUT_ERROR_CLASS)
 
   /** ********************** SQLConf functionality methods ************ */
 
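Anyone depending on the old coercion result can flip the legacy flag defined above; a sketch, assuming an active `SparkSession` named `spark`:

```scala
// Restore the pre-4.0 behavior (retain fraction digits when truncating).
// The key is internal and copied verbatim from the conf definition above.
spark.conf.set("spark.sql.legacy.decimal.retainFractionDigitsOnTruncate", "true")
```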

sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/AnsiTypeCoercionSuite.scala (+5 -5)

@@ -481,9 +481,9 @@ class AnsiTypeCoercionSuite extends TypeCoercionSuiteBase {
         :: Literal.create(null, DecimalType(22, 10))
         :: Literal.create(null, DecimalType(38, 38))
         :: Nil),
-      CreateArray(Literal.create(null, DecimalType(5, 3)).cast(DecimalType(38, 38))
-        :: Literal.create(null, DecimalType(22, 10)).cast(DecimalType(38, 38))
-        :: Literal.create(null, DecimalType(38, 38))
+      CreateArray(Literal.create(null, DecimalType(5, 3)).cast(DecimalType(38, 26))
+        :: Literal.create(null, DecimalType(22, 10)).cast(DecimalType(38, 26))
+        :: Literal.create(null, DecimalType(38, 38)).cast(DecimalType(38, 26))
         :: Nil))
   }
 
@@ -530,9 +530,9 @@ class AnsiTypeCoercionSuite extends TypeCoercionSuiteBase {
         :: Literal.create(null, DecimalType(38, 38))
         :: Nil),
       CreateMap(Literal(1)
-        :: Literal.create(null, DecimalType(38, 0)).cast(DecimalType(38, 38))
+        :: Literal.create(null, DecimalType(38, 0))
         :: Literal(2)
-        :: Literal.create(null, DecimalType(38, 38))
+        :: Literal.create(null, DecimalType(38, 38)).cast(DecimalType(38, 0))
         :: Nil))
     // type coercion for both map keys and values
     ruleTest(AnsiTypeCoercion.FunctionArgumentConversion,
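The new expected cast target `decimal(38, 26)` in the array test follows from folding the widening rule across the three element types; a worked check (comments only):

```scala
// CreateArray(decimal(5, 3), decimal(22, 10), decimal(38, 38)):
//   scale    = max(3, 10, 38)               = 38
//   integral = max(5 - 3, 22 - 10, 38 - 38) = 12
//   ideal    = decimal(12 + 38, 38) = decimal(50, 38), over the max by 12,
//   so 12 fractional digits are dropped: decimal(38, 26).
```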

sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercionSuite.scala (+9 -9)

@@ -782,7 +782,7 @@ class TypeCoercionSuite extends TypeCoercionSuiteBase {
     widenTestWithStringPromotion(
       ArrayType(DecimalType(36, 0), containsNull = false),
       ArrayType(DecimalType(36, 35), containsNull = false),
-      Some(ArrayType(DecimalType(38, 35), containsNull = true)))
+      Some(ArrayType(DecimalType(38, 2), containsNull = false)))
 
     // MapType
     widenTestWithStringPromotion(
@@ -808,15 +808,15 @@ class TypeCoercionSuite extends TypeCoercionSuiteBase {
     widenTestWithStringPromotion(
       MapType(StringType, DecimalType(36, 0), valueContainsNull = false),
       MapType(StringType, DecimalType(36, 35), valueContainsNull = false),
-      Some(MapType(StringType, DecimalType(38, 35), valueContainsNull = true)))
+      Some(MapType(StringType, DecimalType(38, 2), valueContainsNull = false)))
     widenTestWithStringPromotion(
       MapType(IntegerType, StringType, valueContainsNull = false),
       MapType(DecimalType.IntDecimal, StringType, valueContainsNull = false),
       Some(MapType(DecimalType.IntDecimal, StringType, valueContainsNull = false)))
     widenTestWithStringPromotion(
       MapType(DecimalType(36, 0), StringType, valueContainsNull = false),
       MapType(DecimalType(36, 35), StringType, valueContainsNull = false),
-      None)
+      Some(MapType(DecimalType(38, 2), StringType, valueContainsNull = false)))
 
     // StructType
     widenTestWithStringPromotion(
@@ -847,7 +847,7 @@ class TypeCoercionSuite extends TypeCoercionSuiteBase {
     widenTestWithStringPromotion(
       new StructType().add("num", DecimalType(36, 0), nullable = false),
       new StructType().add("num", DecimalType(36, 35), nullable = false),
-      Some(new StructType().add("num", DecimalType(38, 35), nullable = true)))
+      Some(new StructType().add("num", DecimalType(38, 2), nullable = false)))
 
     widenTestWithStringPromotion(
       new StructType().add("num", IntegerType),
@@ -1046,9 +1046,9 @@ class TypeCoercionSuite extends TypeCoercionSuiteBase {
         :: Literal.create(null, DecimalType(22, 10))
         :: Literal.create(null, DecimalType(38, 38))
         :: Nil),
-      CreateArray(Literal.create(null, DecimalType(5, 3)).cast(DecimalType(38, 38))
-        :: Literal.create(null, DecimalType(22, 10)).cast(DecimalType(38, 38))
-        :: Literal.create(null, DecimalType(38, 38))
+      CreateArray(Literal.create(null, DecimalType(5, 3)).cast(DecimalType(38, 26))
+        :: Literal.create(null, DecimalType(22, 10)).cast(DecimalType(38, 26))
+        :: Literal.create(null, DecimalType(38, 38)).cast(DecimalType(38, 26))
         :: Nil))
   }
 
@@ -1095,9 +1095,9 @@ class TypeCoercionSuite extends TypeCoercionSuiteBase {
         :: Literal.create(null, DecimalType(38, 38))
         :: Nil),
       CreateMap(Literal(1)
-        :: Literal.create(null, DecimalType(38, 0)).cast(DecimalType(38, 38))
+        :: Literal.create(null, DecimalType(38, 0))
         :: Literal(2)
-        :: Literal.create(null, DecimalType(38, 38))
+        :: Literal.create(null, DecimalType(38, 38)).cast(DecimalType(38, 0))
         :: Nil))
     // type coercion for both map keys and values
     ruleTest(TypeCoercion.FunctionArgumentConversion,
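The map-key expectation works the same way: `decimal(38, 0)` widened with `decimal(38, 38)` has the ideal type `decimal(76, 38)`, and the excess digits are all taken from the scale:

```scala
// CreateMap keys: widen decimal(38, 0) with decimal(38, 38):
//   scale = 38, integral = max(38, 0) = 38, ideal = decimal(76, 38).
//   Excess over 38 is 38, so the scale drops to 0: result decimal(38, 0).
//   Hence the first key keeps its type and the second is cast to decimal(38, 0).
```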

sql/core/src/test/resources/sql-tests/analyzer-results/typeCoercion/native/mapZipWith.sql.out (+14 -36)

@@ -128,24 +128,13 @@ Project [map_zip_with(double_map#x, cast(float_map#x as map<double,float>), lamb
 SELECT map_zip_with(decimal_map1, decimal_map2, (k, v1, v2) -> struct(k, v1, v2)) m
 FROM various_maps
 -- !query analysis
-org.apache.spark.sql.catalyst.ExtendedAnalysisException
-{
-  "errorClass" : "DATATYPE_MISMATCH.MAP_ZIP_WITH_DIFF_TYPES",
-  "sqlState" : "42K09",
-  "messageParameters" : {
-    "functionName" : "`map_zip_with`",
-    "leftType" : "\"DECIMAL(36,0)\"",
-    "rightType" : "\"DECIMAL(36,35)\"",
-    "sqlExpr" : "\"map_zip_with(decimal_map1, decimal_map2, lambdafunction(struct(k, v1, v2), k, v1, v2))\""
-  },
-  "queryContext" : [ {
-    "objectType" : "",
-    "objectName" : "",
-    "startIndex" : 8,
-    "stopIndex" : 81,
-    "fragment" : "map_zip_with(decimal_map1, decimal_map2, (k, v1, v2) -> struct(k, v1, v2))"
-  } ]
-}
+Project [map_zip_with(cast(decimal_map1#x as map<decimal(38,2),decimal(36,0)>), cast(decimal_map2#x as map<decimal(38,2),decimal(36,35)>), lambdafunction(struct(k, lambda k#x, v1, lambda v1#x, v2, lambda v2#x), lambda k#x, lambda v1#x, lambda v2#x, false)) AS m#x]
++- SubqueryAlias various_maps
+   +- View (`various_maps`, [boolean_map#x,tinyint_map#x,smallint_map#x,int_map#x,bigint_map#x,decimal_map1#x,decimal_map2#x,double_map#x,float_map#x,date_map#x,timestamp_map#x,string_map1#x,string_map2#x,string_map3#x,string_map4#x,array_map1#x,array_map2#x,struct_map1#x,struct_map2#x])
+      +- Project [cast(boolean_map#x as map<boolean,boolean>) AS boolean_map#x, cast(tinyint_map#x as map<tinyint,tinyint>) AS tinyint_map#x, cast(smallint_map#x as map<smallint,smallint>) AS smallint_map#x, cast(int_map#x as map<int,int>) AS int_map#x, cast(bigint_map#x as map<bigint,bigint>) AS bigint_map#x, cast(decimal_map1#x as map<decimal(36,0),decimal(36,0)>) AS decimal_map1#x, cast(decimal_map2#x as map<decimal(36,35),decimal(36,35)>) AS decimal_map2#x, cast(double_map#x as map<double,double>) AS double_map#x, cast(float_map#x as map<float,float>) AS float_map#x, cast(date_map#x as map<date,date>) AS date_map#x, cast(timestamp_map#x as map<timestamp,timestamp>) AS timestamp_map#x, cast(string_map1#x as map<string,string>) AS string_map1#x, cast(string_map2#x as map<string,string>) AS string_map2#x, cast(string_map3#x as map<string,string>) AS string_map3#x, cast(string_map4#x as map<string,string>) AS string_map4#x, cast(array_map1#x as map<array<bigint>,array<bigint>>) AS array_map1#x, cast(array_map2#x as map<array<int>,array<int>>) AS array_map2#x, cast(struct_map1#x as map<struct<col1:smallint,col2:bigint>,struct<col1:smallint,col2:bigint>>) AS struct_map1#x, cast(struct_map2#x as map<struct<col1:int,col2:int>,struct<col1:int,col2:int>>) AS struct_map2#x]
+         +- Project [boolean_map#x, tinyint_map#x, smallint_map#x, int_map#x, bigint_map#x, decimal_map1#x, decimal_map2#x, double_map#x, float_map#x, date_map#x, timestamp_map#x, string_map1#x, string_map2#x, string_map3#x, string_map4#x, array_map1#x, array_map2#x, struct_map1#x, struct_map2#x]
+            +- SubqueryAlias various_maps
+               +- LocalRelation [boolean_map#x, tinyint_map#x, smallint_map#x, int_map#x, bigint_map#x, decimal_map1#x, decimal_map2#x, double_map#x, float_map#x, date_map#x, timestamp_map#x, string_map1#x, string_map2#x, string_map3#x, string_map4#x, array_map1#x, array_map2#x, struct_map1#x, struct_map2#x]
 
 
 -- !query
@@ -178,24 +167,13 @@ Project [map_zip_with(cast(decimal_map1#x as map<double,decimal(36,0)>), double_
 SELECT map_zip_with(decimal_map2, int_map, (k, v1, v2) -> struct(k, v1, v2)) m
 FROM various_maps
 -- !query analysis
-org.apache.spark.sql.catalyst.ExtendedAnalysisException
-{
-  "errorClass" : "DATATYPE_MISMATCH.MAP_ZIP_WITH_DIFF_TYPES",
-  "sqlState" : "42K09",
-  "messageParameters" : {
-    "functionName" : "`map_zip_with`",
-    "leftType" : "\"DECIMAL(36,35)\"",
-    "rightType" : "\"INT\"",
-    "sqlExpr" : "\"map_zip_with(decimal_map2, int_map, lambdafunction(struct(k, v1, v2), k, v1, v2))\""
-  },
-  "queryContext" : [ {
-    "objectType" : "",
-    "objectName" : "",
-    "startIndex" : 8,
-    "stopIndex" : 76,
-    "fragment" : "map_zip_with(decimal_map2, int_map, (k, v1, v2) -> struct(k, v1, v2))"
-  } ]
-}
+Project [map_zip_with(cast(decimal_map2#x as map<decimal(38,28),decimal(36,35)>), cast(int_map#x as map<decimal(38,28),int>), lambdafunction(struct(k, lambda k#x, v1, lambda v1#x, v2, lambda v2#x), lambda k#x, lambda v1#x, lambda v2#x, false)) AS m#x]
++- SubqueryAlias various_maps
+   +- View (`various_maps`, [boolean_map#x,tinyint_map#x,smallint_map#x,int_map#x,bigint_map#x,decimal_map1#x,decimal_map2#x,double_map#x,float_map#x,date_map#x,timestamp_map#x,string_map1#x,string_map2#x,string_map3#x,string_map4#x,array_map1#x,array_map2#x,struct_map1#x,struct_map2#x])
+      +- Project [cast(boolean_map#x as map<boolean,boolean>) AS boolean_map#x, cast(tinyint_map#x as map<tinyint,tinyint>) AS tinyint_map#x, cast(smallint_map#x as map<smallint,smallint>) AS smallint_map#x, cast(int_map#x as map<int,int>) AS int_map#x, cast(bigint_map#x as map<bigint,bigint>) AS bigint_map#x, cast(decimal_map1#x as map<decimal(36,0),decimal(36,0)>) AS decimal_map1#x, cast(decimal_map2#x as map<decimal(36,35),decimal(36,35)>) AS decimal_map2#x, cast(double_map#x as map<double,double>) AS double_map#x, cast(float_map#x as map<float,float>) AS float_map#x, cast(date_map#x as map<date,date>) AS date_map#x, cast(timestamp_map#x as map<timestamp,timestamp>) AS timestamp_map#x, cast(string_map1#x as map<string,string>) AS string_map1#x, cast(string_map2#x as map<string,string>) AS string_map2#x, cast(string_map3#x as map<string,string>) AS string_map3#x, cast(string_map4#x as map<string,string>) AS string_map4#x, cast(array_map1#x as map<array<bigint>,array<bigint>>) AS array_map1#x, cast(array_map2#x as map<array<int>,array<int>>) AS array_map2#x, cast(struct_map1#x as map<struct<col1:smallint,col2:bigint>,struct<col1:smallint,col2:bigint>>) AS struct_map1#x, cast(struct_map2#x as map<struct<col1:int,col2:int>,struct<col1:int,col2:int>>) AS struct_map2#x]
+         +- Project [boolean_map#x, tinyint_map#x, smallint_map#x, int_map#x, bigint_map#x, decimal_map1#x, decimal_map2#x, double_map#x, float_map#x, date_map#x, timestamp_map#x, string_map1#x, string_map2#x, string_map3#x, string_map4#x, array_map1#x, array_map2#x, struct_map1#x, struct_map2#x]
+            +- SubqueryAlias various_maps
+               +- LocalRelation [boolean_map#x, tinyint_map#x, smallint_map#x, int_map#x, bigint_map#x, decimal_map1#x, decimal_map2#x, double_map#x, float_map#x, date_map#x, timestamp_map#x, string_map1#x, string_map2#x, string_map3#x, string_map4#x, array_map1#x, array_map2#x, struct_map1#x, struct_map2#x]
 
 
 -- !query

sql/core/src/test/resources/sql-tests/results/typeCoercion/native/mapZipWith.sql.out (+4 -38)

@@ -79,26 +79,9 @@ struct<m:map<double,struct<k:double,v1:double,v2:float>>>
 SELECT map_zip_with(decimal_map1, decimal_map2, (k, v1, v2) -> struct(k, v1, v2)) m
 FROM various_maps
 -- !query schema
-struct<>
+struct<m:map<decimal(38,2),struct<k:decimal(38,2),v1:decimal(36,0),v2:decimal(36,35)>>>
 -- !query output
-org.apache.spark.sql.catalyst.ExtendedAnalysisException
-{
-  "errorClass" : "DATATYPE_MISMATCH.MAP_ZIP_WITH_DIFF_TYPES",
-  "sqlState" : "42K09",
-  "messageParameters" : {
-    "functionName" : "`map_zip_with`",
-    "leftType" : "\"DECIMAL(36,0)\"",
-    "rightType" : "\"DECIMAL(36,35)\"",
-    "sqlExpr" : "\"map_zip_with(decimal_map1, decimal_map2, lambdafunction(struct(k, v1, v2), k, v1, v2))\""
-  },
-  "queryContext" : [ {
-    "objectType" : "",
-    "objectName" : "",
-    "startIndex" : 8,
-    "stopIndex" : 81,
-    "fragment" : "map_zip_with(decimal_map1, decimal_map2, (k, v1, v2) -> struct(k, v1, v2))"
-  } ]
-}
+{9.22:{"k":9.22,"v1":null,"v2":9.22337203685477897945456575809789456},922337203685477897945456575809789456.00:{"k":922337203685477897945456575809789456.00,"v1":922337203685477897945456575809789456,"v2":null}}
 
 
 -- !query
@@ -123,26 +106,9 @@ struct<m:map<double,struct<k:double,v1:decimal(36,0),v2:double>>>
 SELECT map_zip_with(decimal_map2, int_map, (k, v1, v2) -> struct(k, v1, v2)) m
 FROM various_maps
 -- !query schema
-struct<>
+struct<m:map<decimal(38,28),struct<k:decimal(38,28),v1:decimal(36,35),v2:int>>>
 -- !query output
-org.apache.spark.sql.catalyst.ExtendedAnalysisException
-{
-  "errorClass" : "DATATYPE_MISMATCH.MAP_ZIP_WITH_DIFF_TYPES",
-  "sqlState" : "42K09",
-  "messageParameters" : {
-    "functionName" : "`map_zip_with`",
-    "leftType" : "\"DECIMAL(36,35)\"",
-    "rightType" : "\"INT\"",
-    "sqlExpr" : "\"map_zip_with(decimal_map2, int_map, lambdafunction(struct(k, v1, v2), k, v1, v2))\""
-  },
-  "queryContext" : [ {
-    "objectType" : "",
-    "objectName" : "",
-    "startIndex" : 8,
-    "stopIndex" : 76,
-    "fragment" : "map_zip_with(decimal_map2, int_map, (k, v1, v2) -> struct(k, v1, v2))"
-  } ]
-}
+{2.0000000000000000000000000000:{"k":2.0000000000000000000000000000,"v1":null,"v2":1},9.2233720368547789794545657581:{"k":9.2233720368547789794545657581,"v1":9.22337203685477897945456575809789456,"v2":null}}
 
 
 -- !query
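The `decimal(38, 28)` key type in the second query can be derived the same way, remembering that `INT` widens to `decimal(10, 0)`:

```scala
// map_zip_with keys: widen decimal(36, 35) with decimal(10, 0) (INT's decimal form):
//   scale = max(35, 0) = 35, integral = max(36 - 35, 10 - 0) = 10,
//   ideal = decimal(45, 35). Excess over 38 is 7, dropped from the scale:
//   result decimal(38, 28) -- matching the schema above.
```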
