Hi iVar team,
I’m currently developing a tiled amplicon scheme for HEV (ONT sequencing) and using iVar both for primer trimming and for generating the consensus sequence. So far, this works well for my data.
I have a question about how iVar handles primer-derived bases during consensus generation:
When primers are removed (via iVar primer trimming, resulting in soft-clipped/trimmed primer regions), are those primer bases ever reflected in the final consensus sequence, or are they always excluded from the consensus?
Put differently: does ivar consensus operate strictly on the post-trimming alignments (excluding primer bases), or can primer sequence still influence the consensus at primer-binding sites depending on the workflow/settings?
I will also verify this empirically by comparing consensus sequences generated from primer-trimmed vs. untrimmed alignments, but I’d really appreciate understanding the intended/default behavior in iVar (and whether there are any caveats for ONT data).
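For reference, this is roughly the pipeline I'm running (a sketch of my current setup; the specific flags and file names here are my assumptions, not confirmed iVar defaults):

```shell
#!/bin/sh
# Sketch: primer trimming + consensus with iVar (flags are assumptions).
# Skip gracefully if the tools are not installed on this machine.
command -v ivar >/dev/null 2>&1 && command -v samtools >/dev/null 2>&1 || {
  echo "ivar/samtools not available; skipping pipeline sketch"
  exit 0
}

# 1) Trim primers from an aligned, sorted BAM using the primer BED file.
#    -e keeps reads that lack a matched primer (relevant for ONT amplicon data).
ivar trim -i aligned.sorted.bam -b primers.bed -p trimmed -e

# 2) Sort and index the trimmed BAM before generating the pileup.
samtools sort -o trimmed.sorted.bam trimmed.bam
samtools index trimmed.sorted.bam

# 3) Consensus from the primer-trimmed alignment.
#    -Q 0 disables the base-quality cutoff in mpileup (often used for ONT).
samtools mpileup -aa -A -d 0 -Q 0 trimmed.sorted.bam \
  | ivar consensus -p consensus_trimmed -t 0.5 -m 10

# 4) For the comparison: consensus from the UNTRIMMED alignment.
samtools mpileup -aa -A -d 0 -Q 0 aligned.sorted.bam \
  | ivar consensus -p consensus_untrimmed -t 0.5 -m 10
```

Comparing `consensus_trimmed.fa` against `consensus_untrimmed.fa` at the primer-binding sites is how I plan to check the behavior empirically.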
Thanks a lot for your help!
Best regards,
Mona