bug docker image path #916

Closed
cometta opened this issue Mar 5, 2024 · 7 comments · Fixed by #919

Comments

@cometta
Contributor

cometta commented Mar 5, 2024

I specified "spark.kubernetes.container.image" in LIGHTER_SESSION_DEFAULT_CONF, but the Jupyter notebook can't pick it up. To make it work, I have to set this key via %%configure in the Jupyter notebook (a sketch of that workaround is shown below). I also tested with the latest Docker images, 0.1.0-spark3.5.0 and 0.0.50-spark3.5.0. Does the LIGHTER_SESSION_DEFAULT_CONF key still work on the latest version of Lighter?
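For context, the notebook-side workaround looks roughly like this with sparkmagic's %%configure magic; the image name is only a placeholder:

    %%configure -f
    { "conf": { "spark.kubernetes.container.image": "something/something:latest" } }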

@pdambrauskas
Collaborator

Just checked locally; it seems to work for me. Can you provide more details on how you are trying to set the LIGHTER_SESSION_DEFAULT_CONF variable? Maybe a snippet of your Kubernetes deployment configuration?

I've tested it with this value: {"spark.kubernetes.container.image":"test"}
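For reference, in a Kubernetes deployment manifest I would expect the variable to be set roughly like this (the value here is just the test placeholder from above):

    env:
      - name: LIGHTER_SESSION_DEFAULT_CONF
        value: '{"spark.kubernetes.container.image": "test"}'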

@cometta
Contributor Author

cometta commented Mar 5, 2024

On 0.0.50-spark3.5.0:

env:
  - name: LIGHTER_SESSION_DEFAULT_CONF
    value: '{ "spark.kubernetes.container.image": "something/something:latest",  }'

My value had an extra comma at the end; after removing it, the issue was solved (the corrected value is shown below).
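For completeness, the working value without the trailing comma:

    value: '{ "spark.kubernetes.container.image": "something/something:latest" }'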

On 0.1.0-spark3.5.0 I still can't get it to work after removing the comma. Can you verify this?

@pdambrauskas
Collaborator

Are you sure you did not have an extra comma when testing on 0.1.0? It seems to work locally for me.

@cometta
Contributor Author

cometta commented Mar 6, 2024

I reran it and checked; there is no extra comma. I used the same deployment.yaml: it runs fine on 0.0.50-spark3.5.0 but not on tag 0.1.0-spark3.5.0, where I get the error below in the Jupyter notebook. The session pod spins up and dies immediately, so there isn't enough time to see any error message.

The code failed because of a fatal error:
	Invalid status code '200' from http://spark-something:8080/lighter/api/sessions/c134a609-836e-4bfd-aa02-5e7d1b1f7ccd/statements with error payload: {"id":"866268c5-a2aa-4338-9a40-61a36cede9de","code":"spark","output":null,"state":"waiting","createdAt":"2024-03-06T01:53:20.102609"}.

Some things to try:
a) Make sure Spark has enough available resources for Jupyter to create a Spark context.
b) Contact your Jupyter administrator to make sure the Spark magics library is configured correctly.
c) Restart the kernel.

Did you test this on k8s?

@pdambrauskas
Collaborator

Thanks for the detailed response. I see that the statement API returns HTTP status 200, while sparkmagic expects 201.

It is not related to LIGHTER_SESSION_DEFAULT_CONF; it was probably broken by one of the new features introduced in 0.1.0.

I will try to fix it and release it as soon as possible.
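For anyone curious why the session fails with that message: the notebook-side client rejects the response based on the status code alone. A hypothetical, simplified sketch of that kind of check (not sparkmagic's actual code; the host and session id are just the placeholders from the log above):

    import requests

    # Placeholder values taken from the error payload above.
    LIGHTER_URL = "http://spark-something:8080/lighter/api"
    SESSION_ID = "c134a609-836e-4bfd-aa02-5e7d1b1f7ccd"

    # Submit a statement; a Livy-style client expects 201 Created here,
    # so a 200 OK is treated as a failure even though the body is valid.
    resp = requests.post(f"{LIGHTER_URL}/sessions/{SESSION_ID}/statements", json={"code": "spark"})
    if resp.status_code != 201:
        raise RuntimeError(
            f"Invalid status code '{resp.status_code}' from {resp.url} with error payload: {resp.text}"
        )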

@pdambrauskas
Collaborator

I've prepared an MR for the fix, and built an image for it: ghcr.io/exacaster/lighter:919-spark3.5.1
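If you want to verify the fix before a release, the deployment can be pointed at that tag, e.g.:

    image: ghcr.io/exacaster/lighter:919-spark3.5.1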

@cometta
Contributor Author

cometta commented Mar 8, 2024

Solved on ghcr.io/exacaster/lighter:919-spark3.5.1.

@cometta cometta closed this as completed Mar 8, 2024