Special submodules for different Spark versions #12

Open
benedeki opened this issue Dec 14, 2021 · 0 comments

@benedeki (Contributor)

To separate the behavior for Spark 2.x and Spark 3.x at compile time, dedicated sub-modules implementing a common trait can be created. The appropriate sub-module would then be selected at compile time based on the version matrix.
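A minimal sketch of what this could look like, assuming Scala; the trait and object names (`VersionedSparkOps`, `Spark2Ops`, `Spark3Ops`) are illustrative placeholders, not existing code in this repository:

```scala
// Common module: the trait shared by all Spark-specific sub-modules.
trait VersionedSparkOps {
  def sparkMajorVersion: Int
  def describe(): String = s"Compiled against Spark $sparkMajorVersion.x APIs"
}

// Spark 2.x sub-module: implementation built against Spark 2 dependencies.
object Spark2Ops extends VersionedSparkOps {
  override val sparkMajorVersion: Int = 2
}

// Spark 3.x sub-module: implementation built against Spark 3 dependencies.
object Spark3Ops extends VersionedSparkOps {
  override val sparkMajorVersion: Int = 3
}

// Call sites depend only on the common trait.
object Demo extends App {
  val ops: VersionedSparkOps = Spark3Ops // in the real setup, chosen by the build, not in code
  println(ops.describe())
}
```

In the actual layout, each Spark-specific sub-module would presumably expose its implementation under the same fully-qualified name, and the build (e.g. an sbt or Maven profile driven by the version matrix) would put exactly one of them on the compile classpath, so callers never reference a version-specific class directly.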
