[SPARK-38211][SQL][DOCS] Add SQL migration guide on restoring loose upcast from string to other types

### What changes were proposed in this pull request?
Add a note on restoring loose upcast from string to other types (the behavior before 2.4.1) to the SQL migration guide.

### Why are the changes needed?
After [SPARK-24586](https://issues.apache.org/jira/browse/SPARK-24586), loose upcasting from string to other types is not allowed by default. Users can still set `spark.sql.legacy.looseUpcast=true` to restore the old behavior, but it is not documented.
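
As a quick illustration (not part of the patch), here is a minimal sketch of enabling the legacy flag. The config key is the one quoted in this change; the app name, master, and object name are placeholders, and it is assumed the flag behaves like a regular SQL conf.

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch: restoring the pre-2.4.1 loose upcast behavior.
// Config key taken from this doc change; app name and master are placeholders.
object LooseUpcastDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("loose-upcast-demo")
      .master("local[*]")
      .config("spark.sql.legacy.looseUpcast", "true") // set at session creation
      .getOrCreate()

    // Assuming it is a runtime SQL conf, it can also be toggled later:
    spark.conf.set("spark.sql.legacy.looseUpcast", "true")

    spark.stop()
  }
}
```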

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
Documentation change only.

Closes #35519 from manuzhang/spark-38211.

Authored-by: tianlzhang <tianlzhang@ebay.com>
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
(cherry picked from commit 78514e3)
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
manuzhang authored and HyukjinKwon committed Feb 15, 2022
1 parent 4098d26 commit d1ca91c
Showing 1 changed file with 1 addition and 1 deletion.
docs/sql-migration-guide.md: 1 addition & 1 deletion
@@ -299,7 +299,7 @@ license: |
need to specify a value with units like "30s" now, to avoid being interpreted as milliseconds; otherwise,
the extremely short interval that results will likely cause applications to fail.

- When turning a Dataset to another Dataset, Spark will up cast the fields in the original Dataset to the type of corresponding fields in the target DataSet. In version 2.4 and earlier, this up cast is not very strict, e.g. `Seq("str").toDS.as[Int]` fails, but `Seq("str").toDS.as[Boolean]` works and throw NPE during execution. In Spark 3.0, the up cast is stricter and turning String into something else is not allowed, i.e. `Seq("str").toDS.as[Boolean]` will fail during analysis.
- When turning a Dataset to another Dataset, Spark will up cast the fields in the original Dataset to the type of corresponding fields in the target DataSet. In version 2.4 and earlier, this up cast is not very strict, e.g. `Seq("str").toDS.as[Int]` fails, but `Seq("str").toDS.as[Boolean]` works and throw NPE during execution. In Spark 3.0, the up cast is stricter and turning String into something else is not allowed, i.e. `Seq("str").toDS.as[Boolean]` will fail during analysis. To restore the behavior before 2.4.1, set `spark.sql.legacy.looseUpcast` to `true`.

## Upgrading from Spark SQL 2.3 to 2.4

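A minimal sketch of the behavior described in the diff above, assuming a Spark 3.x `spark-shell` session where `spark` is already in scope; the config key is the one quoted in this change, not verified against the current codebase.

```scala
import spark.implicits._

// Default Spark 3.0+ behavior: upcasting String to Boolean is rejected at analysis time.
// Seq("str").toDS.as[Boolean]   // throws an AnalysisException

// With the legacy flag enabled, the pre-2.4.1 loose upcast is restored:
// analysis succeeds, but execution may still fail when the string
// cannot actually be converted.
spark.conf.set("spark.sql.legacy.looseUpcast", "true")
val ds = Seq("str").toDS.as[Boolean]
// ds.collect()                  // may throw NullPointerException at execution time
```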
