ESPRESSO test not working with 112 threads #56
Here's the line for that error: https://github.com/Xinglab/espresso/blob/v1.4.0/src/ESPRESSO_C.pl#L1732

The code checks that each split is at least 0.01 of the total file size, and that check essentially limits the number of threads to 100. I'm not sure why that check was put in the code. I think it could probably be removed in a future version.
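To illustrate why a minimum-split-size check caps the usable thread count, here is a small Python sketch of the arithmetic (the function name and the exact file sizes are illustrative, not from ESPRESSO's source; the 0.01 fraction is the one described above):

```python
def split_passes_check(total_size, threads, min_fraction=0.01):
    """Sketch of the check at ESPRESSO_C.pl line 1732 (as described
    above): each per-thread split must be at least min_fraction of
    the total file size. With min_fraction = 0.01 this can only hold
    for up to 1 / 0.01 = 100 threads."""
    split_size = total_size / threads
    return split_size >= min_fraction * total_size

# With 100 threads each split is exactly 1% of the file, so it passes.
print(split_passes_check(10_000_000, 100))  # True
# With 112 threads each split is ~0.89% of the file, so it fails.
print(split_passes_check(10_000_000, 112))  # False
```

This matches the behavior reported below: 100 (and fewer) threads work, while 112 does not.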
Hi @EricKutschera
By default ESPRESSO creates 1 C step job per input file, and within each C step job the threads work on different read groups. ESPRESSO defines the read groups by looking for alignments with overlapping coordinates. It sounds like you have 1 input file with 3 million reads for the same gene (FLNC). In that case all 3 million reads are being worked on by the same Perl thread. If you're seeing 2 threads being used, it could be because ESPRESSO runs

The snakemake workflow for ESPRESSO includes a parameter

You could use the snakemake, or manually use the scripts that the snakemake uses to split up the C step into more jobs:

Another option is to split up your reads into multiple input files so that each input file will be a separate C job. You can give each split file the same sample name in your
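The last suggestion above can be sketched in Python. This is a hypothetical illustration, not ESPRESSO code: the round-robin splitter, the `part*.sam` file names, the sample name, and the assumed two-column TSV layout (file path, then sample name) are all assumptions for the example.

```python
def split_reads(records, n_chunks):
    """Round-robin split of read records into n_chunks lists.
    (A sketch: real inputs would be SAM/FASTQ files split at
    record boundaries, not an in-memory list.)"""
    chunks = [[] for _ in range(n_chunks)]
    for i, rec in enumerate(records):
        chunks[i % n_chunks].append(rec)
    return chunks

def write_samples_tsv(paths, sample_name):
    """Give each split file its own row but the SAME sample name,
    so the results are merged as one sample while each file becomes
    a separate C step job (assumed layout: path<TAB>sample)."""
    return "\n".join(f"{p}\t{sample_name}" for p in paths)

chunks = split_reads([f"read{i}" for i in range(10)], 3)
print([len(c) for c in chunks])  # [4, 3, 3]
print(write_samples_tsv(["part0.sam", "part1.sam"], "FLNC_sample"))
```

The point of the sketch is just that the per-file parallelism is under your control: more input files with a shared sample name means more C step jobs.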
Thanks Eric,
I've installed ESPRESSO and while running the test I found out that it cannot run with 112 threads, but 100, 99, and 48 threads do work for some reason. I am running with 100 threads at the moment (and it works), so it is not a huge issue, but for your information I will attach the logs.

I am running this on a cluster with Slurm, on a 112-thread node.
Script: