support nebula source for flink sql connector #58

as title

Comments
/assign
This issue is created to better track the OSPP 2022 project.

- Connector Options
- Data Type Mapping

More information coming soon.
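For context, a Flink SQL table backed by a Nebula source might eventually be declared with a WITH clause along the lines of the sketch below. This only follows the conventions of existing Flink SQL connectors; the connector identifier 'nebula' and option names such as 'meta-address', 'graph-space', and 'label-name' are placeholders, not the connector's actual option set.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class NebulaSourceSqlSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Hypothetical DDL: the connector name and all option keys/values here
        // are illustrative only, pending the actual design of this issue.
        tEnv.executeSql(
                "CREATE TABLE friend (\n"
                        + "  src STRING,\n"
                        + "  dst STRING,\n"
                        + "  degree INT\n"
                        + ") WITH (\n"
                        + "  'connector' = 'nebula',\n"
                        + "  'meta-address' = '127.0.0.1:9559',\n"
                        + "  'graph-space' = 'flinkSource',\n"
                        + "  'label-name' = 'friend'\n"
                        + ")");

        // Reading from the table would then be plain Flink SQL.
        tEnv.executeSql("SELECT * FROM friend").print();
    }
}
```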
The PR has been submitted as #67.
Hello, @spike-liu. I am new to Flink and Nebula, and your work (#57) inspired me. I have a few questions about the implementation of Flink SQL as a sink. For example:

1. Should we add a parameter in the WITH clause? At the same time, I noticed the listTables function in NebulaCatalog.
2. Should we customize type conversions instead of using the internal RowData-to-Row conversion? (A rough sketch of that built-in conversion follows below.)

Looking forward to your reply, thank you.
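To illustrate what question 2 refers to, here is a minimal sketch of the built-in RowData-to-Row path using Flink's RowRowConverter, as opposed to hand-written per-type conversions. The schema and values are made up, and whether the connector should rely on this converter is exactly the open question.

```java
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.data.GenericRowData;
import org.apache.flink.table.data.RowData;
import org.apache.flink.table.data.StringData;
import org.apache.flink.table.data.conversion.RowRowConverter;
import org.apache.flink.table.types.DataType;
import org.apache.flink.types.Row;

public class RowDataToRowSketch {
    public static void main(String[] args) {
        // Row schema of the sink table; field names and types are illustrative only.
        DataType rowType = DataTypes.ROW(
                DataTypes.FIELD("src", DataTypes.STRING()),
                DataTypes.FIELD("dst", DataTypes.STRING()),
                DataTypes.FIELD("degree", DataTypes.INT()));

        RowRowConverter converter = RowRowConverter.create(rowType);
        converter.open(Thread.currentThread().getContextClassLoader());

        // Internal representation as produced by the Flink SQL runtime.
        RowData internal = GenericRowData.of(
                StringData.fromString("Tom"), StringData.fromString("Jerry"), 3);

        // External Row that a Row-based sink could consume without
        // custom per-field conversion code.
        Row external = converter.toExternal(internal);
        System.out.println(external);
    }
}
```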
Oops, this is awkward. It seems there is duplicated work here. Anyway, first come, first served. #67 has been closed. Go ahead, @liuxiaocs7.
Hello, @liuxiaocs7. As a matter of fact, we have also been using Flink in our project recently and are glad to discuss these details with you.

For question 1: I agree with you. Would you please create an issue to track this enhancement?

For the data type cast issue, Flink's casting rules are documented here: https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/dev/table/types/#casting. If those are not enough, I think there are two recipes to resolve the cast issue (recipe 1 is sketched below):

1. Put the casts in Flink SQL.
2. Put the casts in nebula-flink-connector.

If these casts are specific to Nebula, I would prefer recipe 2. Otherwise, recipe 1 would be better because it is shared among all connectors, like MySQL, Kafka, HBase, etc.
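As an illustration of recipe 1, the cast can be expressed directly in the SQL statement, keeping it connector-agnostic. The sketch below uses the built-in datagen and print connectors purely as stand-ins, since the actual Nebula table options were not part of this discussion; the table names and the INT-to-BIGINT cast are made up for the example.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class CastInFlinkSqlSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Stand-in source; in a real job this would be whatever feeds the Nebula sink.
        tEnv.executeSql(
                "CREATE TABLE src (name STRING, age INT) WITH ("
                        + "'connector' = 'datagen', 'number-of-rows' = '5')");

        // Stand-in sink; 'print' is used only so the example runs without a Nebula cluster.
        tEnv.executeSql(
                "CREATE TABLE sink (name STRING, age BIGINT) WITH ('connector' = 'print')");

        // Recipe 1: write the INT -> BIGINT cast in the SQL statement itself,
        // so it works the same way for any connector (MySQL, Kafka, HBase, Nebula, ...).
        tEnv.executeSql("INSERT INTO sink SELECT name, CAST(age AS BIGINT) FROM src");
    }
}
```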
@liuxiaocs7, just a friendly suggestion: how about creating separate issues for discussion in the future? I think your questions are valuable to us.
Thanks for your suggestion. I'm going to create a new issue to discuss this question, and develop the habit of discussing only one question per issue.
I'll try my best to get it done. Thanks a lot for your help.
@spike-liu, sorry for the late reply. We can now discuss question 1 in #70; welcome to join the discussion there.