Allow spark tools to write bcf files #4303

Closed
lbergelson opened this issue Jan 29, 2018 · 0 comments

Comments

lbergelson (Member) commented Jan 29, 2018

Currently we can't write BCF files from VariantSparkSink; this may require a fix in Hadoop-BAM.
We should also enable writing g.bcf and bcf.gz outputs.

@lbergelson lbergelson changed the title Allow spark to write g.bcf files Allow spark tools to write g.bcf files Jan 29, 2018
@droazen droazen modified the milestones: Engine-4.1, Engine-1Q2018 Feb 5, 2018
@droazen droazen modified the milestones: Engine-1Q2018, Engine-2Q2018 Apr 6, 2018
@lbergelson lbergelson changed the title Allow spark tools to write g.bcf files Allow spark tools to write bcf files May 10, 2018
lbergelson pushed a commit that referenced this issue May 10, 2018
* Support g.vcf.gz files in Spark tools
* fixes #4274 
* upgrade hadoop-bam 7.9.1 -> 7.10.0
* Remove bcf files from Spark tests since spark currently can't write bcf files correctly 
   * this is tracked by #4303 
   * a file named .bcf is produced, but it is actually encoded as a vcf
   * updated tests to verify that the file extension matches the actual datatype in the file (see the sketch below)
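The last bullet describes the kind of check the updated tests need: the Spark sink was producing a file named `.bcf` whose contents were plain-text VCF. As a rough illustration only (not the actual GATK test code; the class and method names here are hypothetical), one way to spot that mismatch is to sniff the leading bytes of the output, since BCF 2.x output is BGZF (gzip-framed) while an uncompressed VCF begins with the literal header text `##fileformat=VCF`:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Arrays;

// Hypothetical helper, not part of GATK: sniff the leading bytes of an output
// file to see whether it looks like BGZF-compressed data (as a real BCF would be)
// or like plain-text VCF (the bug described in this issue).
public final class OutputFormatCheck {

    // BCF 2.x (and vcf.gz) output is BGZF, i.e. gzip-framed, so the raw file
    // starts with the gzip magic bytes 0x1f 0x8b.
    public static boolean startsWithGzipMagic(final Path file) throws IOException {
        try (InputStream in = Files.newInputStream(file)) {
            return in.read() == 0x1f && in.read() == 0x8b;
        }
    }

    // An uncompressed VCF starts with the literal "##fileformat=VCF" header prefix.
    public static boolean startsWithVcfHeader(final Path file) throws IOException {
        final byte[] expected = "##fileformat=VCF".getBytes(StandardCharsets.US_ASCII);
        try (InputStream in = Files.newInputStream(file)) {
            final byte[] actual = new byte[expected.length];
            return in.read(actual) == expected.length && Arrays.equals(expected, actual);
        }
    }

    public static void main(final String[] args) throws IOException {
        final Path out = Paths.get(args[0]); // e.g. the ".bcf" written by the Spark sink
        if (startsWithVcfHeader(out)) {
            System.out.println(out + ": named like BCF, but the content is plain-text VCF");
        } else if (startsWithGzipMagic(out)) {
            System.out.println(out + ": BGZF/gzip-framed, consistent with a real BCF");
        } else {
            System.out.println(out + ": unrecognized leading bytes");
        }
    }
}
```

Note that a `.vcf.gz` also starts with the gzip magic, so a check like this can only rule formats out; the real tests presumably go further (for example by decoding the stream), but it captures the mismatch reported above: a file named `.bcf` whose first bytes are VCF header text.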
@droazen droazen removed this from the Engine-2Q2018 milestone Oct 4, 2018
zaneChou1 added a commit to zaneChou1/gatk that referenced this issue Nov 12, 2020