

support ML function #450

Conversation

@lizhou1111 (Contributor) commented Dec 27, 2023

Support the ML functions stochastic_linear_regression_state and stochastic_logistic_regression_state.
For example:

DROP STREAM IF EXISTS train_data;
CREATE STREAM IF NOT EXISTS train_data
(
    param1 int,
    param2 int,
    target int
) ENGINE = Memory;
INSERT INTO train_data(param1, param2, target) VALUES (1, 1, 1);
INSERT INTO train_data(param1, param2, target) VALUES (2, 2, 2);

-- train the model
DROP STREAM IF EXISTS your_model;
CREATE STREAM IF NOT EXISTS your_model ENGINE = Memory AS
SELECT stochastic_linear_regression_state(0.01, 0.0, 10, 'Adam', target, param1, param2) AS state
FROM train_data;

DROP STREAM IF EXISTS test_data;
CREATE STREAM IF NOT EXISTS test_data
(
    param1 int,
    param2 int
);

-- streaming inference
WITH (SELECT state FROM your_model) AS model
SELECT eval_ml_method(model, param1, param2) FROM test_data;

-- insert data into test_data; the streaming query above will output the predicted values
INSERT INTO test_data(param1, param2) VALUES (3, 3);
INSERT INTO test_data(param1, param2) VALUES (4, 4);

For details on the ML method arguments, refer to the stochastic_linear_regression documentation. All four parameters must be provided: learning rate, L2 regularization coefficient, mini-batch size, and update method.
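The logistic variant follows the same pattern. A minimal sketch, assuming a binary label column and illustrative stream names (labeled_data, binary_model are not part of this PR):

-- sketch: train a binary classifier with the same four hyperparameters
DROP STREAM IF EXISTS binary_model;
CREATE STREAM IF NOT EXISTS binary_model ENGINE = Memory AS
SELECT stochastic_logistic_regression_state(0.01, 0.0, 10, 'Adam', label, param1, param2) AS state
FROM labeled_data;

-- inference works the same way via eval_ml_method
WITH (SELECT state FROM binary_model) AS model
SELECT eval_ml_method(model, param1, param2) FROM test_data;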

This closes #430

@lizhou1111 lizhou1111 self-assigned this Dec 27, 2023
@lizhou1111 lizhou1111 merged commit 942f10e into develop Dec 28, 2023
21 checks passed
@lizhou1111 lizhou1111 deleted the feature/issue-430-support-ML-function-with-streaming-inference branch December 28, 2023 01:37
Linked issue: Support ML function with streaming inference