Refactor Spark session creation in notebook and JavaScript model
- Renamed `create_spark_dev` to `create_spark` in `startup.py` to simplify the function name.
- Made `create_spark` available in IPython's global namespace for easier access.
- Removed the default Spark instance creation to allow for manual session management.
- Updated `SparkModel.js` to use the new `create_spark` function for initializing Spark sessions, enhancing integration with the backend API.
xuwenyihust committed Dec 10, 2024
1 parent ce98b64 commit a2d176d
Showing 2 changed files with 12 additions and 9 deletions.
11 changes: 9 additions & 2 deletions docker/notebook/startup.py
@@ -106,7 +106,7 @@ def _repr_html_(self):
         </div>
         """

-def create_spark_dev():
+def create_spark():
     logger.info("Creating Spark session")
     try:
         config_json = requests.get("http://server:5002/spark_app/config").json()
@@ -133,4 +133,11 @@ def create_spark_dev():

     return spark

-spark = create_spark_dev()
+# Make create_spark available in IPython's global namespace
+ip = get_ipython()
+if ip is not None:
+    # Add to global namespace
+    ip.user_global_ns['create_spark'] = create_spark
+
+# Don't create a spark instance by default
+# Remove or comment out: spark = create_spark_dev()
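The guard around `get_ipython()` matters because the same module may be imported outside an IPython kernel. A minimal runnable sketch of the injection pattern, with a stub `create_spark` and a fallback `get_ipython` as stand-ins (assumptions, not the real startup.py bodies):

```python
# Sketch of the namespace-injection pattern from startup.py.
# Inside a Jupyter kernel, get_ipython() returns the running shell;
# elsewhere the fallback below returns None, keeping the module
# importable from plain Python. The create_spark body is a stub.
try:
    from IPython import get_ipython
except ImportError:              # plain Python: IPython not installed
    def get_ipython():
        return None

def create_spark():
    """Stub standing in for the real Spark session factory."""
    return "spark-session"

ip = get_ipython()
if ip is not None:
    # Expose the helper so notebook cells can call create_spark() directly
    ip.user_global_ns['create_spark'] = create_spark
```

Outside a kernel the `if` branch is simply skipped, so importing the module has no side effects.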
10 changes: 3 additions & 7 deletions webapp/src/models/SparkModel.js
@@ -60,13 +60,9 @@ class SparkModel {
    // Generate a unique spark app ID
    const sparkAppId = `spark-${Date.now()}`;

-    // Create a cell with Spark initialization code that uses the config
-    const sparkInitCode = `
-from startup import create_spark_dev
-spark = create_spark_dev()
-spark
-    `;
+    // Create a cell with Spark initialization code that uses the pre-loaded create_spark function
+    const sparkInitCode = `spark = create_spark()
+spark`;

    // Create the Spark session with this config
    const response = await fetch(`${config.serverBaseUrl}/spark_app/${sparkAppId}`, {
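The simplified init code no longer imports anything; it works only because startup.py has already injected `create_spark` into the kernel's global namespace. A small sketch of that coupling, with a plain dict standing in for `user_global_ns` and a stub factory (both hypothetical):

```python
# Sketch: why the shorter init code in SparkModel.js still resolves
# create_spark. The kernel's global namespace is simulated by a dict,
# and exec() plays the role of running the generated notebook cell.

def create_spark():
    """Stub standing in for the real factory in startup.py."""
    return "SparkSession(stub)"

user_global_ns = {}                             # the kernel's globals
user_global_ns['create_spark'] = create_spark   # what startup.py now does

spark_init_code = "spark = create_spark()"      # cell source from SparkModel.js
exec(spark_init_code, user_global_ns)           # run the cell

print(user_global_ns['spark'])                  # → SparkSession(stub)
```

If the injection step were skipped, the `exec` call would raise `NameError`, which is exactly the failure mode the startup.py change guards against.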
