
fix bug in default GPU memory allocation policy #6268

Merged 1 commit into PaddlePaddle:develop on Dec 5, 2017

Conversation

@QiJune (Member) commented on Dec 5, 2017

Fixes #6265.
Since we have to reserve 5% of GPU memory, the default GPU memory allocation fraction should be a little less than 0.95. This PR sets it to 0.92.
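The arithmetic behind the new default can be sketched as follows. This is an illustrative example, not PaddlePaddle's actual source: only the flag name fraction_of_gpu_memory_to_use comes from this PR, while the helper DefaultGpuAllocSize and the explicit 5% reservation are assumptions made for the sketch.

```cpp
#include <cstddef>
#include <cstdio>

// Fraction of device memory the allocator claims up front.
// With 5% reserved, any value above 0.95 can fail; 0.92 leaves headroom.
static double fraction_of_gpu_memory_to_use = 0.92;

// Hypothetical helper: bytes to allocate on a device with `total` bytes,
// keeping a 5% reservation untouched.
size_t DefaultGpuAllocSize(size_t total) {
  const size_t reserved = static_cast<size_t>(total * 0.05);  // kept free
  const size_t usable = total - reserved;                     // <= 0.95 * total
  const size_t requested =
      static_cast<size_t>(total * fraction_of_gpu_memory_to_use);
  // At 0.92 the request always fits inside the usable region;
  // at 0.95 (the old default) it could exceed it and fail to allocate.
  return requested < usable ? requested : usable;
}

int main() {
  const size_t total = 12ULL << 30;  // e.g. a 12 GB Tesla K40
  std::printf("would allocate %zu of %zu bytes\n",
              DefaultGpuAllocSize(total), total);
  return 0;
}
```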

@qingqing01 (Contributor) left a comment

I tested the default fraction_of_gpu_memory_to_use=0.92 on a Tesla K40; it works now.

@QiJune merged commit 96a5f96 into PaddlePaddle:develop on Dec 5, 2017
Merging this pull request closes: There is still a bug in memory allocation by the default arguments.