Support for DML Statements #68
Comments
I would also like support for executing any kind of SQL statement, not just DML; for example, EXEC for stored procedures. This was possible with the previous version of the driver (with com.microsoft.azure:azure-sqldb-spark:1.0.2 I could just call sqlContext.sqlDBQuery, and the Config could contain any string as queryCustom). Maybe we are just missing examples for that.
I am also really looking forward to this!
MERGE functionality would be super helpful and would avoid maintaining multiple code bases in SQL and Databricks. Looking forward to MERGE functionality supporting either Delta tables or Parquet files.
Hi all, as mentioned in #21: while the older Azure SQL DB Spark connector did include functionality to execute DML and DDL statements, this connector is based on the Spark DataSource APIs, so that functionality is out of scope. It is provided by libraries like pyodbc. B4PJS has also described another workaround in #21.
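To make the pyodbc suggestion concrete, here is a minimal sketch of running a MERGE (or any other DML/DDL statement) against Azure SQL from Python, outside the Spark connector. The table names, column names, and connection string below are hypothetical placeholders, not anything defined by this connector:

```python
def build_merge(target, source, key, cols):
    """Compose a T-SQL MERGE that upserts `cols` from `source` into `target` on `key`."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    insert_cols = ", ".join([key] + cols)
    insert_vals = ", ".join(f"s.{c}" for c in [key] + cols)
    return (
        f"MERGE {target} AS t "
        f"USING {source} AS s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals});"
    )


def run_dml(conn_str, sql):
    """Execute an arbitrary DML/DDL statement via pyodbc (requires an ODBC driver for SQL Server)."""
    import pyodbc  # imported lazily; not needed just to build the SQL text

    with pyodbc.connect(conn_str) as conn:
        conn.cursor().execute(sql)
        conn.commit()


# Hypothetical usage: stage rows with the Spark connector's normal write path,
# then upsert from the staging table into the target with a single MERGE.
merge_sql = build_merge("dbo.Target", "dbo.Staging", "id", ["name", "qty"])
# run_dml("Driver={ODBC Driver 18 for SQL Server};Server=...;Database=...;UID=...;PWD=...", merge_sql)
```

A common pattern with this connector is exactly that split: use the DataSource API to append the DataFrame into a staging table, then issue the MERGE through pyodbc (or any JDBC/ODBC client) as a separate step.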
Is there support for any DML statements, e.g. UPDATE, DELETE, MERGE, on the datasets? I could not find them in the examples.
If not, what are the suggested options?