PrecisionFDA
Truth Challenge

Engage with our community challenges and improve DNA test results

Explore HG002 comparison results
Use this interactive explorer to filter all results across submission entries and multiple dimensions.
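The per-row metrics follow the standard variant-benchmarking definitions (recall over truth calls, precision over query calls, F-score as their harmonic mean). A minimal sketch of those formulas, using hypothetical counts rather than values taken from any specific row below:

```python
# Standard variant-benchmarking metrics, as reported by comparison tools
# in this style of table. The counts here are hypothetical illustrations.

def metrics(truth_tp: int, truth_fn: int, query_tp: int, query_fp: int):
    """Return (recall, precision, f_score) as percentages."""
    recall = truth_tp / (truth_tp + truth_fn)        # fraction of truth variants recovered
    precision = query_tp / (query_tp + query_fp)     # fraction of query calls that are correct
    f_score = 2 * precision * recall / (precision + recall)  # harmonic mean of P and R
    return 100 * recall, 100 * precision, 100 * f_score

r, p, f = metrics(truth_tp=10, truth_fn=2, query_tp=10, query_fp=1)
print(f"{f:.4f} {r:.4f} {p:.4f}")  # → 86.9565 83.3333 90.9091
```

Note that Truth TP and Query TP are counted separately because the same variant can be represented differently in the truth and query call sets, so the two totals need not be equal.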
Entry | Type | Subtype | Subset | Genotype | F-score | Recall | Precision | Frac_NA | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma
Rows 52601–52650 of 86044
gduggal-bwafb | INDEL | D16_PLUS | segdup | hetalt
66.6667
100.0000
63000
gduggal-bwafb | INDEL | D16_PLUS | segdup | homalt
86.9565
83.3333
90.9091
94.1489
1021011
100.0000
gduggal-bwafb | INDEL | D16_PLUS | segdupwithalt | *
0.0000
100.0000
00000
gduggal-bwafb | INDEL | D16_PLUS | segdupwithalt | het
0.0000
100.0000
00000
gduggal-bwafb | INDEL | D16_PLUS | segdupwithalt | hetalt
0.0000
100.0000
00000
gduggal-bwafb | INDEL | D16_PLUS | segdupwithalt | homalt
0.0000
100.0000
00000
gduggal-bwafb | INDEL | D16_PLUS | tech_badpromoters | *
85.7143
75.0000
100.0000
40.0000
31300
gduggal-bwafb | INDEL | D16_PLUS | tech_badpromoters | het
85.7143
75.0000
100.0000
25.0000
31300
gduggal-bwafb | INDEL | D16_PLUS | tech_badpromoters | hetalt
0.0000
0.0000
0.0000
00000
gduggal-bwafb | INDEL | D16_PLUS | tech_badpromoters | homalt
0.0000
100.0000
00000
gduggal-bwafb | INDEL | D1_5 | * | *
98.5809
98.0538
99.1138
59.5290
14388928561447271294903
69.7836
gduggal-bwafb | INDEL | D1_5 | * | het
98.9590
98.4961
99.4263
56.5657
86257131792550534170
31.8352
gduggal-bwafb | INDEL | D1_5 | * | hetalt
93.2065
88.0527
99.0011
79.2540
9021122435683636
100.0000
gduggal-bwafb | INDEL | D1_5 | * | homalt
98.9426
99.3562
98.5324
61.7935
4861131548609724697
96.2707
gduggal-bwafb | INDEL | D1_5 | HG002complexvar | *
98.0343
97.0289
99.0607
56.8977
3174397231851302219
72.5166
gduggal-bwafb | INDEL | D1_5 | HG002complexvar | het
98.1510
96.8794
99.4564
54.4051
201176482085611444
38.5965
gduggal-bwafb | INDEL | D1_5 | HG002complexvar | hetalt
91.5873
87.6479
95.8974
82.7281
11851675612424
100.0000
gduggal-bwafb | INDEL | D1_5 | HG002complexvar | homalt
98.4856
98.5186
98.4525
57.9745
1044115710434164151
92.0732
gduggal-bwafb | INDEL | D1_5 | HG002compoundhet | *
91.0728
88.7045
93.5709
64.7882
10853138211658801716
89.3883
gduggal-bwafb | INDEL | D1_5 | HG002compoundhet | het
94.6747
91.3773
98.2190
47.8412
1579149783114274
52.1127
gduggal-bwafb | INDEL | D1_5 | HG002compoundhet | hetalt
93.3119
88.0482
99.2450
76.1026
8995122135492727
100.0000
gduggal-bwafb | INDEL | D1_5 | HG002compoundhet | homalt
46.3350
95.8763
30.5495
82.2716
27912278632615
97.3101
gduggal-bwafb | INDEL | D1_5 | decoy | *
100.0000
100.0000
100.0000
99.9676
40300
gduggal-bwafb | INDEL | D1_5 | decoy | het
100.0000
100.0000
100.0000
99.9726
20200
gduggal-bwafb | INDEL | D1_5 | decoy | hetalt
100.0000
100.0000
10000
gduggal-bwafb | INDEL | D1_5 | decoy | homalt
100.0000
100.0000
100.0000
99.9426
10100
gduggal-bwafb | INDEL | D1_5 | func_cds | *
98.7421
98.7421
98.7421
37.1542
157215721
50.0000
gduggal-bwafb | INDEL | D1_5 | func_cds | het
98.8372
100.0000
97.7011
43.1373
8508521
50.0000
gduggal-bwafb | INDEL | D1_5 | func_cds | hetalt
0.0000
100.0000
00000
gduggal-bwafb | INDEL | D1_5 | func_cds | homalt
98.6301
97.2973
100.0000
27.2727
7227200
gduggal-bwafb | INDEL | D1_5 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | *
86.8762
81.9779
92.3970
60.2527
32667183342275210
76.3636
gduggal-bwafb | INDEL | D1_5 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | het
89.0929
83.6170
95.3363
63.5012
117923119429538
40.0000
gduggal-bwafb | INDEL | D1_5 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | hetalt
84.7595
74.5335
98.2379
37.2350
13584646691212
100.0000
gduggal-bwafb | INDEL | D1_5 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | homalt
88.4419
96.9415
81.3126
63.0649
72923731168160
95.2381
gduggal-bwafb | INDEL | D1_5 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | *
95.2381
90.9091
100.0000
99.3857
1011000
gduggal-bwafb | INDEL | D1_5 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | het
93.3333
87.5000
100.0000
99.2467
71800
gduggal-bwafb | INDEL | D1_5 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | hetalt
100.0000
100.0000
10000
gduggal-bwafb | INDEL | D1_5 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | homalt
100.0000
100.0000
100.0000
99.6296
20200
gduggal-bwafb | INDEL | D1_5 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | *
97.7831
96.8990
98.6834
73.3610
62433199863262844717
84.9526
gduggal-bwafb | INDEL | D1_5 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | het
98.3040
97.2229
99.4094
71.0102
3231392338374228113
49.5614
gduggal-bwafb | INDEL | D1_5 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | hetalt
94.5232
90.1786
99.3077
80.7697
838391331562222
100.0000
gduggal-bwafb | INDEL | D1_5 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | homalt
98.2904
99.2602
97.3394
75.4562
2173716221732594582
97.9798
gduggal-bwafb | INDEL | D1_5 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | *
94.7146
92.7089
96.8091
59.8061
29118229029975988774
78.3401
gduggal-bwafb | INDEL | D1_5 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | het
95.6799
93.0581
98.4538
59.4166
1280295519039299108
36.1204
gduggal-bwafb | INDEL | D1_5 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | hetalt
92.7620
87.2242
99.0507
45.1542
8104118727132626
100.0000
gduggal-bwafb | INDEL | D1_5 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | homalt
95.2994
98.2297
92.5388
63.5670
82121488223663640
96.5309
gduggal-bwafb | INDEL | D1_5 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | *
95.3524
95.1444
95.5614
69.5669
725377323414
41.1765
gduggal-bwafb | INDEL | D1_5 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | het
95.0905
95.4023
94.7808
71.1793
41520454255
20.0000
gduggal-bwafb | INDEL | D1_5 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | hetalt
87.3162
82.6087
92.5926
73.5294
57122522
100.0000
gduggal-bwafb | INDEL | D1_5 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | homalt
97.6834
98.0620
97.3077
65.4714
253525377
100.0000