[SPARK-26060][SQL][FOLLOW-UP] Rename the config name.
## What changes were proposed in this pull request?

This is a follow-up of #23031 to rename the config name to `spark.sql.legacy.setCommandRejectsSparkCoreConfs`.

## How was this patch tested?

Existing tests.

Closes #23245 from ueshin/issues/SPARK-26060/rename_config.

Authored-by: Takuya UESHIN <ueshin@databricks.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
ueshin authored and dongjoon-hyun committed Dec 7, 2018
1 parent 4772265 commit 1ab3d3e
Showing 3 changed files with 6 additions and 5 deletions.
2 changes: 1 addition & 1 deletion docs/sql-migration-guide-upgrade.md
```diff
@@ -31,7 +31,7 @@ displayTitle: Spark SQL Upgrading Guide
 
 - In Spark version 2.4 and earlier, partition column value is converted as null if it can't be casted to corresponding user provided schema. Since 3.0, partition column value is validated with user provided schema. An exception is thrown if the validation fails. You can disable such validation by setting `spark.sql.sources.validatePartitionColumns` to `false`.
 
-- In Spark version 2.4 and earlier, the `SET` command works without any warnings even if the specified key is for `SparkConf` entries and it has no effect because the command does not update `SparkConf`, but the behavior might confuse users. Since 3.0, the command fails if a `SparkConf` key is used. You can disable such a check by setting `spark.sql.legacy.execution.setCommandRejectsSparkConfs` to `false`.
+- In Spark version 2.4 and earlier, the `SET` command works without any warnings even if the specified key is for `SparkConf` entries and it has no effect because the command does not update `SparkConf`, but the behavior might confuse users. Since 3.0, the command fails if a `SparkConf` key is used. You can disable such a check by setting `spark.sql.legacy.setCommandRejectsSparkCoreConfs` to `false`.
 
 - Spark applications which are built with Spark version 2.4 and prior, and call methods of `UserDefinedFunction`, need to be re-compiled with Spark 3.0, as they are not binary compatible with Spark 3.0.
```
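For illustration, a minimal spark-shell sketch of the behavior this migration note describes; the session setup and the choice of `spark.executor.memory` as the demo key are assumptions, not part of the patch:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("set-command-demo")
  .getOrCreate()

// Since 3.0, SET fails for keys registered as SparkConf entries:
// spark.sql("SET spark.executor.memory=2g")  // throws AnalysisException

// The renamed legacy flag restores the silent 2.4 behavior.
spark.sql("SET spark.sql.legacy.setCommandRejectsSparkCoreConfs=false")
spark.sql("SET spark.executor.memory=2g")  // accepted, but still has no effect on SparkConf
```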
7 changes: 4 additions & 3 deletions sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala

```diff
@@ -1621,8 +1621,8 @@ object SQLConf {
       .intConf
       .createWithDefault(25)
 
-  val SET_COMMAND_REJECTS_SPARK_CONFS =
-    buildConf("spark.sql.legacy.execution.setCommandRejectsSparkConfs")
+  val SET_COMMAND_REJECTS_SPARK_CORE_CONFS =
+    buildConf("spark.sql.legacy.setCommandRejectsSparkCoreConfs")
       .internal()
       .doc("If it is set to true, SET command will fail when the key is registered as " +
         "a SparkConf entry.")
@@ -2057,7 +2057,8 @@ class SQLConf extends Serializable with Logging {
 
   def maxToStringFields: Int = getConf(SQLConf.MAX_TO_STRING_FIELDS)
 
-  def setCommandRejectsSparkConfs: Boolean = getConf(SQLConf.SET_COMMAND_REJECTS_SPARK_CONFS)
+  def setCommandRejectsSparkCoreConfs: Boolean =
+    getConf(SQLConf.SET_COMMAND_REJECTS_SPARK_CORE_CONFS)
 
   def legacyTimeParserEnabled: Boolean = getConf(SQLConf.LEGACY_TIME_PARSER_ENABLED)
```
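As a sanity check, the renamed entry can be read back by key through the public conf API. A minimal sketch, assuming the entry's default is `true` (the `.booleanConf.createWithDefault(...)` lines are elided from the hunk above):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("sqlconf-rename-demo")
  .getOrCreate()

// The old key is no longer registered, so reading it fails; the new key resolves.
// spark.conf.get("spark.sql.legacy.execution.setCommandRejectsSparkConfs")  // throws
println(spark.conf.get("spark.sql.legacy.setCommandRejectsSparkCoreConfs"))  // "true"

// Internal entries remain settable at runtime, e.g. to opt back into 2.4 behavior.
spark.conf.set("spark.sql.legacy.setCommandRejectsSparkCoreConfs", "false")
```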
2 changes: 1 addition & 1 deletion sql/core/src/main/scala/org/apache/spark/sql/RuntimeConfig.scala

```diff
@@ -153,7 +153,7 @@ class RuntimeConfig private[sql](sqlConf: SQLConf = new SQLConf) {
     if (SQLConf.staticConfKeys.contains(key)) {
       throw new AnalysisException(s"Cannot modify the value of a static config: $key")
     }
-    if (sqlConf.setCommandRejectsSparkConfs &&
+    if (sqlConf.setCommandRejectsSparkCoreConfs &&
         ConfigEntry.findEntry(key) != null && !SQLConf.sqlConfEntries.containsKey(key)) {
       throw new AnalysisException(s"Cannot modify the value of a Spark config: $key")
     }
```
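And a sketch of the guard itself: a key that `ConfigEntry.findEntry` resolves to a registered core entry but that is absent from `SQLConf.sqlConfEntries` is rejected, while SQL confs pass. The key names are real Spark confs; the try/catch framing is just for the demo:

```scala
import org.apache.spark.sql.{AnalysisException, SparkSession}

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("runtime-conf-guard-demo")
  .getOrCreate()

// A SQL conf: registered in SQLConf.sqlConfEntries, so the guard lets it through.
spark.conf.set("spark.sql.shuffle.partitions", "10")

// A core conf: ConfigEntry.findEntry returns its entry, but it is not a SQL conf,
// so RuntimeConfig.set throws while the renamed flag is at its default.
try {
  spark.conf.set("spark.executor.instances", "4")
} catch {
  case e: AnalysisException => println(s"Rejected as expected: ${e.getMessage}")
}
```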
