BCC2020 has ended
Registration closed July 15.

BCC2020 is online, global, and affordable. The meeting and training are now done, and the CoFest is under way.

The 2020 Bioinformatics Community Conference brings together the Bioinformatics Open Source Conference (BOSC) and the Galaxy Community Conference into a single event featuring training, a meeting, and a CollaborationFest. Events run from July 17 through July 25 and are held in both the eastern and western hemispheres.

Monday, July 20 • 00:15 - 00:20
A comprehensive benchmarking of WGS-based structural variant callers 🍐


The presenter(s) will be available for live Q&A in this session (BCC East).

Varuni Sarwal 1,2, Sebastian Niehus 3,4, Ram Ayyala 1, Serghei Mangul 5

1 University of California, Los Angeles, CA 90095, USA. Email: sarwal8@gmail.com
2 Indian Institute of Technology Delhi, Hauz Khas, New Delhi, Delhi 110016, India
3 Berlin Institute of Health (BIH), Anna-Louisa-Karsch-Str. 2, 10178 Berlin, Germany
4 Charité-Universitätsmedizin Berlin, corporate member of Freie Universität Berlin,
Humboldt-Universität zu Berlin, and Berlin Institute of Health, Charitéplatz 1, 10117 Berlin, Germany
5 University of Southern California, Los Angeles, CA 90089, USA

Project Website: https://github.com/Mangul-Lab-USC/benchmarking_SV_publication
Source Code: https://github.com/Mangul-Lab-USC/benchmarking_SV_publication
License: MIT License

Structural variants (SVs) are genomic regions that contain an altered DNA sequence due to deletion, duplication, insertion, or inversion, and they vary widely in pathogenicity. Detecting SVs from whole genome sequencing (WGS) data presents a number of challenges, and a plethora of SV-detection methods have been developed; currently, however, there is a paucity of evidence that investigators can use to select an appropriate tool. We evaluated the performance of 15 SV-detection tools on their ability to detect deletions from aligned WGS reads, using a comprehensive PCR-confirmed gold standard set of SVs, in order to find methods with a good balance between sensitivity and precision.

While the number of true deletions is 3,710, the number of deletions detected by the tools ranged from 899 to 82,225; 53% of the methods reported fewer deletions than are known to be present in the sample. The length distribution of detected deletions varied across tools and was substantially different from the distribution of true deletions: 53% of tools underestimated the true size of SVs, and the deletions detected by BreakDancer were the closest to the true median deletion length.

We allowed the coordinates of detected deletions to deviate from the coordinates of the true deletions, with thresholds ranging from 0 to 10,000 bp. Manta achieved the highest F-score at all thresholds. Methods with high specificity also tended to have significantly higher F-scores and precision; CLEVER achieved the highest sensitivity, while the most precise method was PopDel.

We assessed the performance of SV callers at coverages from 32x down to 0.1x, generated by down-sampling the original WGS data. DELLY showed the highest F-score at coverage below 4x, while Manta was the best-performing tool from 8x to 32x. We also assessed the effect of deletion length on detection accuracy: Manta and CREST were the only tools with high specificity for deletions shorter than 500 bp, and LUMPY was the only method able to deliver an F-score above 30% across all categories.

Overall, Manta and LUMPY were the best-performing tools for general applications. Our recommendations can help researchers choose the best SV-detection software, as well as inform the developer community of the challenges of SV detection.
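The evaluation scheme described above — counting a detected deletion as correct when its breakpoints fall within a chosen deviation threshold of a true deletion, then computing precision, sensitivity, and F-score — can be sketched as follows. This is an illustrative reimplementation of the matching criterion as stated in the abstract, not the authors' pipeline; the function names and example intervals are hypothetical.

```python
def match_deletions(detected, truth, max_dev):
    """Greedily match detected (start, end) deletions to true deletions.

    A detected call is a true positive if both of its breakpoints lie
    within max_dev bp of an as-yet-unmatched true deletion's breakpoints.
    Returns the number of true positives.
    """
    unmatched_truth = set(range(len(truth)))
    tp = 0
    for ds, de in detected:
        for i in sorted(unmatched_truth):
            ts, te = truth[i]
            if abs(ds - ts) <= max_dev and abs(de - te) <= max_dev:
                tp += 1
                unmatched_truth.remove(i)
                break
    return tp


def precision_sensitivity_f(detected, truth, max_dev):
    """Compute precision, sensitivity, and F-score at one deviation threshold."""
    tp = match_deletions(detected, truth, max_dev)
    precision = tp / len(detected) if detected else 0.0
    sensitivity = tp / len(truth) if truth else 0.0
    f = (2 * precision * sensitivity / (precision + sensitivity)
         if precision + sensitivity else 0.0)
    return precision, sensitivity, f


# Hypothetical example: three true deletions, three calls, 50 bp tolerance.
truth = [(1000, 2000), (5000, 5600), (9000, 9400)]
detected = [(1010, 1990), (5005, 5610), (7000, 7300)]
print(precision_sensitivity_f(detected, truth, max_dev=50))
# → (0.6666666666666666, 0.6666666666666666, 0.6666666666666666)
```

Sweeping `max_dev` from 0 to 10,000 bp, as in the study, would trace how each caller's F-score responds to stricter or looser breakpoint-accuracy requirements.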


Varuni Sarwal

Undergraduate student, UC Los Angeles

Monday July 20, 2020 00:15 - 00:20 EDT