
Adding inference to Linear Regression model #6672

Closed
kavyasrinet wants to merge 5 commits from flui_inference

Conversation


@kavyasrinet commented Dec 15, 2017

This adds inference to the fluid fit_a_line model, which will help us when we rewrite the models for the book.
It will also help @kexinzhao and me understand how inference works before we start working on the inference wrapper for Mobile.

We have a few questions:

  1. Is the way we perform inference in this example correct?
  2. How do we determine the program that gets run inside exe.run during inference? (Should we clone fluid.default_main_program() before we start training, or is the method I used here correct?)

print("Now performing inference...")
fluid.io.load_persistables(exe, "./fit_a_line.model/")
for data in test_reader():
out, y_pred, y_label = exe.run(fluid.default_main_program(),
Contributor

You must clone the main_program first.
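
A minimal sketch of the ordering this suggests, assuming the standard fit_a_line network; the layer calls below follow a later paddle.fluid API and are not taken from this PR:

import paddle.fluid as fluid

# Forward pass for fit_a_line: a single fully connected layer.
x = fluid.layers.data(name='x', shape=[13], dtype='float32')
y = fluid.layers.data(name='y', shape=[1], dtype='float32')
y_predict = fluid.layers.fc(input=x, size=1, act=None)
cost = fluid.layers.square_error_cost(input=y_predict, label=y)
avg_cost = fluid.layers.mean(cost)

# Clone here, before the optimizer appends backward and update ops, so the
# cloned program holds only the forward (inference) graph.
inference_program = fluid.default_main_program().clone()

sgd_optimizer = fluid.optimizer.SGD(learning_rate=0.001)
sgd_optimizer.minimize(avg_cost)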


@dzhwinter
Contributor

How do we determine the program that gets run inside exe.run during inference? (Should we clone fluid.default_main_program() before we start training, or is the method I used here correct?)


inference_program = fluid.default_main_program().clone()

After discussing with @reyoung and @QiJune, we believe that we should provide a general train/test interface, which means that for most users the framework will do the clone internally.
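
A sketch of how the cloned program could then be used at test time; exe and test_reader come from the snippet above, while feeder, avg_cost, y_predict, and y are assumed from the training setup and are not shown in this thread:

# Run the forward-only clone instead of fluid.default_main_program().
for data in test_reader():
    out, y_pred, y_label = exe.run(inference_program,
                                   feed=feeder.feed(data),
                                   fetch_list=[avg_cost, y_predict, y])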

@kavyasrinet
Author

Thanks for pointing that out, @dzhwinter and @QiJune.
For now I have added an inference_program for the inference part, and it is just a clone of the main_program, since in this example I am not fetching the accuracy or other attributes. Is my understanding correct here?
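
If only the prediction is fetched, a sketch of the reduced call (names again assumed from the fit_a_line example) would look like:

for data in test_reader():
    y_pred, = exe.run(inference_program,
                      feed=feeder.feed(data),
                      fetch_list=[y_predict])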

@kavyasrinet closed this Feb 8, 2018
@kavyasrinet deleted the flui_inference branch February 8, 2018 18:41