I'm testing out the new GreenGenes! 💚

The `taxonomy-from-table` command is currently being killed once it reaches about 50% memory usage on a 16 GB VM. The introduction post warned about this:

> NOTE: Just like filter-features, this command right now will require around 8-10GB of memory.
It's possible my system is overly aggressive with memory management, but either way I'm interested in tracking this issue for all the folks with potatoes.
Ya, at the moment the method is quite burdensome, even with the shortcuts we already implemented. Its original implementation was much worse on memory... What I'm considering is representing the taxonomic data in a SQLite3 database, which would avoid the resident memory overhead and likely would not greatly impact performance. My hope is to have this in the next release; I'm currently working on the upstream pieces for it.
Out of curiosity, is this something you'd have time and interest in working on?
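For anyone curious what that might look like, here is a minimal sketch of the general SQLite3 idea, not the plugin's actual implementation: stream the taxonomy TSV into an on-disk database once, then look lineages up per feature instead of holding the whole mapping resident in memory. The file layout, table schema, and function names here are assumptions for illustration only.

```python
import sqlite3


def build_taxonomy_db(taxonomy_tsv, db_path="taxonomy.db"):
    """Stream a two-column (feature_id<TAB>lineage) TSV into SQLite.

    Hypothetical helper; a real implementation would also skip the
    header row and validate the input.
    """
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS taxonomy ("
        "feature_id TEXT PRIMARY KEY, lineage TEXT)"
    )
    with open(taxonomy_tsv) as fh:
        # Generator keeps only one line in memory at a time.
        rows = (line.rstrip("\n").split("\t", 1)
                for line in fh if "\t" in line)
        conn.executemany(
            "INSERT OR REPLACE INTO taxonomy VALUES (?, ?)", rows
        )
    conn.commit()
    return conn


def lineages_for(conn, feature_ids):
    """Yield (feature_id, lineage) pairs for the requested features only."""
    query = "SELECT feature_id, lineage FROM taxonomy WHERE feature_id = ?"
    for fid in feature_ids:
        hit = conn.execute(query, (fid,)).fetchone()
        if hit is not None:
            yield hit


# Usage sketch:
#   conn = build_taxonomy_db("gg2_taxonomy.tsv")
#   for fid, lineage in lineages_for(conn, table_feature_ids):
#       ...
```

The trade-off is a bit of per-lookup latency in exchange for a memory footprint that stays roughly constant regardless of taxonomy size, which matches the resident-overhead concern described above.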