
Revert PARQUET-1414 because it causes empty pages to be written in Spark #37

Merged · robert3005 merged 4 commits into palantir:master on Feb 13, 2019

Conversation

@mccheah commented Feb 13, 2019

PARQUET-1414 isn't compatible with the Spark Parquet Data Source, as Spark can inadvertently cause empty pages to be written.

The problem is that in Spark's ParquetWriteSupport, Spark may choose not to write values for a record, particularly when optional columns are empty, while still signaling to the Parquet file writer that the record was processed. When Parquet receives that signal, it may flush the current page to the column chunk, because the row count checked against the page row limit introduced in PARQUET-1414 tracks only the number of "record processed" signals, not the number of values actually buffered in the page. Suppose the page row limit is N: if Spark encounters N consecutive null cells for a column, the Parquet writer believes N records fill the page when in fact it contains zero values, and it writes out an empty page.
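
To make the Spark side of this concrete, here is a hypothetical sketch of the write pattern described above, using Parquet's RecordConsumer API. It is not the actual ParquetWriteSupport code, and the column name and row type are made up: a null optional value produces no field calls at all, yet endMessage() still tells the writer a record was completed.

```java
import org.apache.parquet.io.api.Binary;
import org.apache.parquet.io.api.RecordConsumer;

// Hypothetical sketch of the write pattern: skip the field entirely when the
// optional value is null, but still call endMessage() for the record.
class SparseWriteSketch {
    void writeRow(RecordConsumer consumer, String maybeNull) {
        consumer.startMessage();
        if (maybeNull != null) {               // field written only when non-null
            consumer.startField("col", 0);
            consumer.addBinary(Binary.fromString(maybeNull));
            consumer.endField("col", 0);
        }
        consumer.endMessage();                 // still signals "record processed"
    }
}
```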

This is most likely to occur in Parquet writes where a column is sparsely populated.
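
And here is a minimal, hypothetical sketch of the Parquet side, showing how a flush decision driven purely by a signaled-row counter can emit an empty page. It is not the parquet-mr implementation; PAGE_ROW_COUNT_LIMIT, rowsInPage, and valuesInPage are stand-ins for the column writer's internal state.

```java
// Hypothetical sketch: a page flush triggered by counting "record processed"
// signals rather than buffered values.
class PageFlushSketch {
    static final int PAGE_ROW_COUNT_LIMIT = 20_000; // assumed page row limit

    private int rowsInPage = 0;   // bumped on every "record processed" signal
    private int valuesInPage = 0; // bumped only when a value is actually buffered

    void writeValue(Object value) {
        valuesInPage++; // ...buffer the value for the current page...
    }

    // Called once per record, even when no value was written for this column
    // (e.g. the write support skipped a null optional field entirely).
    void endRecord() {
        rowsInPage++;
        if (rowsInPage >= PAGE_ROW_COUNT_LIMIT) {
            flushPage(); // after N skipped rows, valuesInPage is still 0: empty page
        }
    }

    private void flushPage() {
        System.out.println("flushing page: rows=" + rowsInPage + ", values=" + valuesInPage);
        rowsInPage = 0;
        valuesInPage = 0;
    }
}
```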

We revert it here for now, with the intent to patch Spark in the near future and then re-apply PARQUET-1414.

@vinooganesh

@mccheah can you add this to the FORK.md file so that we remember to bring it back? Also, it seems like there are 2 commits here; was that intentional?

@mccheah (Author) commented Feb 13, 2019

Yeah, notice both of them apply to PARQUET-1414. I think the second commit was an improvement on the first.

@mccheah closed this Feb 13, 2019
@mccheah reopened this Feb 13, 2019
@mccheah (Author) commented Feb 13, 2019

Updated the FORK.md file.

@vinooganesh

lgtm


@robert3005 left a comment


Can this just be a git revert for clarity?

@vinooganesh

@robert3005 if we squash and merge, we can just revert this PR in the future to get these changes back, right?

@robert3005

Yeah, mostly concerned about the title, to maintain a clean history.

@robert3005 merged commit 22a26ec into palantir:master Feb 13, 2019
@mccheah (Author) commented Feb 14, 2019

Tracking upstream Spark at https://issues.apache.org/jira/browse/SPARK-26874
