[HUDI-1493] Fixed schema compatibility check for fields. #2350
@@ -296,7 +296,7 @@ public MessageType convertAvroSchemaToParquet(Schema schema) {
   public static boolean isSchemaCompatible(Schema oldSchema, Schema newSchema) {
     if (oldSchema.getType() == newSchema.getType() && newSchema.getType() == Schema.Type.RECORD) {
       // record names must match:
-      if (!SchemaCompatibility.schemaNameEquals(oldSchema, newSchema)) {
+      if (!SchemaCompatibility.schemaNameEquals(newSchema, oldSchema)) {
         return false;
       }

Review comment on the change above:
Hi @nsivabalan @prashantwason, I have tested writing an "int" type to a "long" type. It fails in the schema validation.
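A minimal, standalone sketch (not part of this PR; the record names and alias are invented) of why the argument order matters for this check: Avro's SchemaCompatibility.schemaNameEquals(reader, writer) only consults the reader schema's aliases, so the evolved newSchema must be passed as the reader for an alias on the old record name to be honored.

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.SchemaCompatibility;

public class SchemaNameEqualsOrderExample {
  public static void main(String[] args) {
    // Schema the existing records were written with.
    Schema oldSchema = SchemaBuilder.record("OldRecord").fields()
        .requiredInt("id")
        .endRecord();

    // Renamed schema that declares the old name as an alias so it can still read old records.
    Schema newSchema = SchemaBuilder.record("NewRecord").aliases("OldRecord").fields()
        .requiredInt("id")
        .endRecord();

    // Aliases are resolved only on the reader (first) argument, so the order matters:
    System.out.println(SchemaCompatibility.schemaNameEquals(newSchema, oldSchema)); // true
    System.out.println(SchemaCompatibility.schemaNameEquals(oldSchema, newSchema)); // false
  }
}
```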
@@ -329,9 +329,11 @@ public static boolean isSchemaCompatible(Schema oldSchema, Schema newSchema) {
       // All fields in the newSchema record can be populated from the oldSchema record
       return true;
     } else {
-      // Use the checks implemented by
+      // Use the checks implemented by Avro
+      // newSchema is the schema which will be used to read the records written earlier using oldSchema. Hence, in the
+      // check below, use newSchema as the reader schema and oldSchema as the writer schema.
       org.apache.avro.SchemaCompatibility.SchemaPairCompatibility compatResult =
-          org.apache.avro.SchemaCompatibility.checkReaderWriterCompatibility(oldSchema, newSchema);
+          org.apache.avro.SchemaCompatibility.checkReaderWriterCompatibility(newSchema, oldSchema);
       return compatResult.getType() == org.apache.avro.SchemaCompatibility.SchemaCompatibilityType.COMPATIBLE;
     }
   }

Review comments on the change above:
@prashantwason: Doesn't line 299 need fixing too? Basically, any call to SchemaCompatibility.* would need a fix with respect to the reader and writer schema arguments, right?

And if you agree on this, can we have a test to cover that scenario (which should fail if not for this patch)?

Done.
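To make the reader/writer ordering concrete, here is a small standalone sketch (not from this PR; the record and field names are invented) showing how Avro treats the int-to-long promotion mentioned in the first comment, depending on which schema is passed as the reader:

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.SchemaCompatibility;
import org.apache.avro.SchemaCompatibility.SchemaCompatibilityType;

public class ReaderWriterOrderExample {
  public static void main(String[] args) {
    // Records were written with an int field...
    Schema oldSchema = SchemaBuilder.record("Rec").fields().requiredInt("value").endRecord();
    // ...and the evolved schema promotes that field to long.
    Schema newSchema = SchemaBuilder.record("Rec").fields().requiredLong("value").endRecord();

    // Reader = newSchema, writer = oldSchema: int -> long promotion is allowed, so COMPATIBLE.
    SchemaCompatibilityType newReadsOld = SchemaCompatibility
        .checkReaderWriterCompatibility(newSchema, oldSchema).getType();

    // Reader = oldSchema, writer = newSchema: an int reader cannot read long data, so INCOMPATIBLE.
    SchemaCompatibilityType oldReadsNew = SchemaCompatibility
        .checkReaderWriterCompatibility(oldSchema, newSchema).getType();

    System.out.println(newReadsOld); // COMPATIBLE
    System.out.println(oldReadsNew); // INCOMPATIBLE
  }
}
```

With the old argument order, the evolved schema ended up on the writer side of the check, which is why a legal int-to-long evolution was being rejected.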
Review comment:
Can we also add the vice versa case, i.e. long to int, and similar evolutions, and ensure the schema compatibility check returns false?

Done.
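A rough sketch of what such a test could look like (the test class name and helpers are invented here; the isSchemaCompatible stand-in below only mirrors the corrected Avro call from this patch, not the full Hudi method):

```java
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.junit.Test;

public class TestSchemaCompatibilityEvolution {

  // Stand-in with the corrected argument order from this PR: newSchema is the reader,
  // oldSchema is the writer.
  private static boolean isSchemaCompatible(Schema oldSchema, Schema newSchema) {
    return org.apache.avro.SchemaCompatibility.checkReaderWriterCompatibility(newSchema, oldSchema)
        .getType() == org.apache.avro.SchemaCompatibility.SchemaCompatibilityType.COMPATIBLE;
  }

  // Builds a single-field record schema with the given field type.
  private static Schema recordWith(Schema.Type fieldType) {
    return SchemaBuilder.record("Rec").fields()
        .name("value").type(Schema.create(fieldType)).noDefault()
        .endRecord();
  }

  @Test
  public void intToLongEvolutionIsCompatible() {
    // Widening the field from int to long should be accepted.
    assertTrue(isSchemaCompatible(recordWith(Schema.Type.INT), recordWith(Schema.Type.LONG)));
  }

  @Test
  public void longToIntEvolutionIsRejected() {
    // Narrowing the field from long to int should be rejected.
    assertFalse(isSchemaCompatible(recordWith(Schema.Type.LONG), recordWith(Schema.Type.INT)));
  }
}
```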