
Create script action to copy the azure-cosmosdb-spark JARs to all HDI head and worker nodes #79

Closed
dennyglee opened this issue Jul 20, 2017 · 1 comment

Comments

@dennyglee (Contributor)

Create a script action similar to the one documented in "Use Script Action to install external Python packages for Jupyter notebooks in Apache Spark clusters on HDInsight"; specifically, the TensorFlow install script action.
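
For reference, a minimal sketch of what such a script action might look like, run as a bash script on each head and worker node. The connector version, Scala and Spark versions, Maven download URL, and the Spark jars target directory below are all assumptions and would need to be adjusted for the actual cluster and the azure-cosmosdb-spark release being installed:

```bash
#!/usr/bin/env bash
# Hypothetical HDInsight script action sketch: download the azure-cosmosdb-spark
# uber JAR and place it on the node's Spark classpath. Versions, the Maven URL,
# and the target directory are assumptions, not confirmed values.
set -euo pipefail

JAR_VERSION="1.1.0"        # assumed connector version
SCALA_VERSION="2.11"       # assumed Scala version on the cluster
SPARK_VERSION="2.2.0"      # assumed Spark version on the cluster

JAR_NAME="azure-cosmosdb-spark_${SPARK_VERSION}_${SCALA_VERSION}-${JAR_VERSION}-uber.jar"
JAR_URL="https://repo1.maven.org/maven2/com/microsoft/azure/azure-cosmosdb-spark_${SPARK_VERSION}_${SCALA_VERSION}/${JAR_VERSION}/${JAR_NAME}"
TARGET_DIR="/usr/hdp/current/spark2-client/jars"   # assumed Spark jars directory on HDI nodes

# Download the JAR and copy it where Spark's classpath will pick it up.
wget -q -O "/tmp/${JAR_NAME}" "${JAR_URL}"
cp "/tmp/${JAR_NAME}" "${TARGET_DIR}/"
chmod 644 "${TARGET_DIR}/${JAR_NAME}"
```

Applying such a script action to both the head and worker node types at cluster creation (or via the portal on a running cluster) would run it on every node, which is what this issue asks for.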

@AnisTss commented Oct 19, 2017

I recommend this!
I spent two days trying to configure an HDInsight Spark cluster to use the connector, with no result.
I need a more detailed guide or a script action until I know the Spark environment as well as I should.

@nomiero closed this as completed Feb 5, 2019
3 participants