
Problem with read IDs repeated in the same BAM file #50

Open
gymreklab opened this issue Jan 23, 2018 · 6 comments
@gymreklab

We get the following error if there is more than one read pair in a given region with the same read ID:

"ERROR: Failed to extract passes filters tag from BAM alignment"

These reads were from supplementary alignments. Ignoring alignments marked as supplementary would likely solve this issue. This is similar to a previous issue in which read IDs were repeated across BAM files (rather than within one).

CC @shubhamsaini

@tfwillems tfwillems self-assigned this Jan 24, 2018
@tfwillems tfwillems added the bug label Jan 24, 2018
@tfwillems (Owner)

@gymreklab @shubhamsaini
Could you send me a BAM file that on its own can reproduce this issue?

@gymreklab (Author)

Using HipSTR version v0.6.1

./HipSTR/HipSTR --version
HipSTR version v0.6.1

Attached are the files for the command below:
chr9.testregions.bed.zip
hipstr_error_reads.bam.zip

Command:

HipSTR \
    --bams hipstr_error_reads.bam \
    --fasta /storage/resources/dbase/human/GRCh38/GRCh38_full_analysis_set_plus_decoy_hla.fa \
    --regions chr9.testregions.bed \
    --output-gls \
    --min-reads 1 \
    --def-stutter-model \
    --str-vcf test.vcf.gz

@kkapuria3

I am getting the same error:

"ERROR: Failed to extract passes filters tag from BAM alignment"

I don't even know the reason why. I have 136 samples; how do I know which one is busted?

@shadrinams

Any updates / recommendations on the error? Thank you.

@douym

douym commented Feb 12, 2022

We get the following error if there is more than one read pair in a given region with the same read ID:

"ERROR: Failed to extract passes filters tag from BAM alignment"

These reads were from supplementary alignments. Ignoring alignments marked as supplementary would likely solve this issue. This is similar to a previous issue when read ids were repeated across BAM files (rather than within).

CC @shubhamsaini

Same problem here. To "ignore alignments marked as supplementary", do we need to change the BAM file, or is there a HipSTR parameter that does it? Thanks!

@CyrusWang57

CyrusWang57 commented Oct 2, 2024

For me, this error was not caused by supplementary reads: when I removed supplementary reads from my BAM file, the error was still there. I don't know what caused it, even after checking the reads that triggered the error (they all look fine). As a workaround, I modified the source code to skip the error. The following steps are for reference.

  1. Locate the passes_filters function in the bam_processor.cpp source file.
  2. Modify the function as follows:
void BamProcessor::passes_filters(BamAlignment& aln, std::vector<bool>& region_passes) {
    assert(region_passes.empty());
    std::string passes;
    if (!aln.GetStringTag("PF", passes)) {
        // PF tag not found; add one with the value "0" instead of aborting
        passes = "0";
        if (!aln.AddStringTag("PF", passes))
            printErrorAndDie("Failed to add PF tag with value '0' to BAM alignment");
        std::cout << "Added PF tag with value '0' to alignment" << std::endl;
    }
    // The PF tag holds one '0'/'1' character per region
    for (size_t i = 0; i < passes.size(); i++)
        region_passes.push_back(passes[i] == '1');
}
  3. Recompile the project with `make`.

This process should address the problem.

6 participants