
Large-scale data causes the server to go down #66

Closed
kkyyhh96 opened this issue Jul 12, 2019 · 1 comment

Comments

@kkyyhh96

When the dataset has about 3,000 polygons/points with more than 20 attributes, the program brings the server down. My server was shut down several times because of this. I was running the code on a Linux server with Python 3.6, mgwr 2.0.0, more than 72 CPU cores, and 300+ GB of memory.

@Ziqi-Li
Member

Ziqi-Li commented Jul 15, 2019

Hi @kkyyhh96.

We have applied some optimizations recently in #52. Could you please install the latest version we currently have on the GitHub master branch (v2.1.0)?

For a problem of the size you describe (3,000 observations with 20 features), the current master branch on GitHub should be able to handle it. To improve performance, I would also suggest installing numba via pip. And if you are running into memory issues, try increasing the n_chunks parameter in MGWR.fit() to get past the memory problem.
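The upgrade path suggested above can be sketched as shell commands. This is a sketch, assuming the repository URL (pysal/mgwr, the project this issue belongs to) and that pip is on the PATH:

```shell
# Sketch: install mgwr from the GitHub master branch
# (v2.1.0 at the time of this comment), which includes the #52 optimizations.
pip install git+https://github.com/pysal/mgwr.git

# Optional: numba JIT-compiles hot loops and can speed up the fit considerably.
pip install numba
```

If memory remains the bottleneck, `MGWR.fit()` accepts an `n_chunks` argument (e.g. `model.fit(n_chunks=4)`) that splits the computation into smaller pieces, trading some speed for lower peak memory use.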

I am closing this issue; if you are still having this problem, please let us know.

@Ziqi-Li Ziqi-Li closed this as completed Jul 15, 2019