Modified CalculateDepth to get coverage on whole alignment adam files #1010
Conversation
Test FAILed. Build result: FAILURE
GitHub pull request #1010 of commit 4d0510d automatically merged.
Notifying endpoint 'HTTP:https://webhooks.gitter.im/e/ac8bb6e9f53357bc8aa8'
[EnvInject] - Loading node environment variables.
Building remotely on amp-jenkins-worker-05 (centos spark-test) in workspace /home/jenkins/workspace/ADAM-prb
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/bigdatagenomics/adam.git # timeout=10
Fetching upstream changes from https://github.com/bigdatagenomics/adam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/bigdatagenomics/adam.git +refs/pull/*:refs/remotes/origin/pr/*
 > git rev-parse origin/pr/1010/merge^{commit} # timeout=10
 > git branch -a --contains 31569d7f0f819af91bf84f92e278e563c9fc5944 # timeout=10
 > git rev-parse remotes/origin/pr/1010/merge^{commit} # timeout=10
Checking out Revision 31569d7f0f819af91bf84f92e278e563c9fc5944 (origin/pr/1010/merge)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 31569d7f0f819af91bf84f92e278e563c9fc5944
First time build. Skipping changelog.
Triggering ADAM-prb ? 2.6.0,2.10,1.5.2,centos
Triggering ADAM-prb ? 2.6.0,2.11,1.5.2,centos
Touchstone configurations resulted in FAILURE, so aborting...
Notifying endpoint 'HTTP:https://webhooks.gitter.im/e/ac8bb6e9f53357bc8aa8'
Test FAILed.
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.10</artifactId>
  <version>${spark.version}</version>
Delete here.
Since this is a major modification of
val sqlContext = new org.apache.spark.sql.SQLContext(sc)

val frequencies = sqlContext.createDataFrame(depths.map(r => Coverage(r._1.referenceName, r._1.start, r._2)))
frequencies.write.parquet(args.outputPath)
Let's save this as a feature.
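For context, the diff above takes the per-position depths computed by CalculateDepth, wraps them in Coverage records, and writes them out as Parquet through Spark SQL, which is why the spark-sql_2.10 dependency appears in the pom snippet earlier in this review. Below is a minimal, self-contained sketch of that pattern; the Coverage case class, the tuple shape of the depths RDD, and the hard-coded sample data are illustrative assumptions, not the actual ADAM code.

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical coverage record mirroring the constructor call in the diff:
// (reference name, start position, observed depth). The real ADAM class may differ.
case class Coverage(referenceName: String, start: Long, depth: Double)

object CoverageToParquetSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("coverage-to-parquet").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)

    // Stand-in for the depths computed from alignment records; in the PR,
    // r._1 is a genomic position and r._2 is the depth observed at that position.
    val depths = sc.parallelize(Seq(
      (("chr1", 100L), 7.0),
      (("chr1", 101L), 9.0)))

    // Convert each (position, depth) pair into a Coverage record, let Spark SQL
    // infer the schema from the case class, and write the result as Parquet.
    val frequencies = sqlContext.createDataFrame(
      depths.map { case ((referenceName, start), depth) => Coverage(referenceName, start, depth) })
    frequencies.write.parquet(args(0))

    sc.stop()
  }
}

Run with spark-submit against a Spark 1.5 / Scala 2.10 build (one of the configurations the Jenkins matrix above tests), this writes the coverage records as Parquet part files under the given output directory.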
@akmorrow13 I'm going to close this in favor of bigdatagenomics/quinine#14. LMK if you object.
No description provided.