PrecisionFDA
Truth Challenge

Engage and improve DNA test results with our community challenges

Explore HG002 comparison results
Use this interactive explorer to filter the full set of results across submission entries and multiple dimensions (variant type, subtype, and genomic subset).
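The same kind of filtering the explorer performs can be sketched offline with pandas. This is a minimal illustration, assuming the results are exported with column names matching the table header below; the two rows are transcribed from this page.

```python
import pandas as pd

# Two rows transcribed from this results page (astatham-gatk, INDEL).
# Column names follow the explorer's table header.
results = pd.DataFrame(
    {
        "Entry": ["astatham-gatk", "astatham-gatk"],
        "Type": ["INDEL", "INDEL"],
        "Subset": ["map_l100_m0_e0", "segdup"],
        "F-score": [96.7251, 98.7115],
        "Recall": [96.2892, 98.8654],
        "Precision": [97.1649, 98.5581],
    }
)

# Filter on any combination of dimensions, e.g. mappability ("map_")
# subsets whose F-score exceeds a threshold.
mappability = results[
    results["Subset"].str.startswith("map_") & (results["F-score"] > 96.0)
]
print(mappability["Subset"].tolist())  # ['map_l100_m0_e0']
```

A real export would of course be loaded with `pd.read_csv` instead of an inline DataFrame; the filtering expression is the same.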
Rows 651–700 of 86044 results. All rows on this page share Entry = astatham-gatk, Type = INDEL, Genotype = *. Cells left blank were empty on the original page (e.g. % FP ma is undefined when Query FP = 0).

| Subtype | Subset | F-score | Recall | Precision | Frac_NA | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma |
|---|---|---|---|---|---|---|---|---|---|---|---|
| * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_51to200bp_gt95identity_merged | 94.0371 | 93.4243 | 94.6580 | 64.2218 | 5683 | 400 | 5564 | 314 | 298 | 94.9045 |
| * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_gt200bp_gt95identity_merged | 100.0000 | 100.0000 | 100.0000 | 97.7612 | 3 | 0 | 3 | 0 | 0 | |
| * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | 97.6838 | 97.4223 | 97.9467 | 56.1751 | 36925 | 977 | 36731 | 770 | 739 | 95.9740 |
| * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt51bp_gt95identity_merged | 98.3493 | 98.1589 | 98.5404 | 55.2323 | 31403 | 589 | 31326 | 464 | 446 | 96.1207 |
| * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_gt95identity_merged | 97.8510 | 97.6024 | 98.1008 | 61.8066 | 42296 | 1039 | 42098 | 815 | 759 | 93.1288 |
| * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_merged | 98.2824 | 98.0398 | 98.5261 | 67.2886 | 64070 | 1281 | 63840 | 955 | 849 | 88.9005 |
| * | lowcmp_SimpleRepeat_diTR_11to50 | 98.5463 | 98.3740 | 98.7193 | 52.5147 | 35997 | 595 | 35921 | 466 | 435 | 93.3476 |
| * | lowcmp_SimpleRepeat_diTR_51to200 | 85.8405 | 84.9119 | 86.7896 | 57.5804 | 1784 | 317 | 1695 | 258 | 252 | 97.6744 |
| * | lowcmp_SimpleRepeat_homopolymer_6to10 | 99.8087 | 99.7134 | 99.9043 | 58.1701 | 28179 | 81 | 28181 | 27 | 15 | 55.5556 |
| * | lowcmp_SimpleRepeat_homopolymer_gt10 | 94.7563 | 95.1613 | 94.3548 | 99.9176 | 118 | 6 | 117 | 7 | 0 | 0.0000 |
| * | lowcmp_SimpleRepeat_quadTR_11to50 | 99.5288 | 99.4462 | 99.6115 | 59.5139 | 19752 | 110 | 19745 | 77 | 49 | 63.6364 |
| * | lowcmp_SimpleRepeat_quadTR_51to200 | 97.6175 | 97.2128 | 98.0256 | 69.1545 | 2581 | 74 | 2532 | 51 | 39 | 76.4706 |
| * | lowcmp_SimpleRepeat_quadTR_gt200 | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| * | lowcmp_SimpleRepeat_triTR_11to50 | 99.7622 | 99.6881 | 99.8364 | 49.1224 | 6712 | 21 | 6714 | 11 | 6 | 54.5455 |
| * | lowcmp_SimpleRepeat_triTR_51to200 | 96.1232 | 95.4955 | 96.7593 | 64.9351 | 212 | 10 | 209 | 7 | 4 | 57.1429 |
| * | map_l100_m0_e0 | 96.7251 | 96.2892 | 97.1649 | 87.6728 | 1505 | 58 | 1508 | 44 | 9 | 20.4545 |
| * | map_l100_m1_e0 | 96.5907 | 95.1478 | 98.0780 | 85.9214 | 3412 | 174 | 3419 | 67 | 17 | 25.3731 |
| * | map_l100_m2_e0 | 96.5801 | 95.1530 | 98.0507 | 86.7138 | 3514 | 179 | 3521 | 70 | 18 | 25.7143 |
| * | map_l100_m2_e1 | 96.5544 | 95.0745 | 98.0811 | 86.7793 | 3571 | 185 | 3578 | 70 | 18 | 25.7143 |
| * | map_l125_m0_e0 | 96.6572 | 96.5986 | 96.7157 | 90.5095 | 852 | 30 | 854 | 29 | 6 | 20.6897 |
| * | map_l125_m1_e0 | 96.6598 | 95.3963 | 97.9572 | 88.3361 | 2010 | 97 | 2014 | 42 | 9 | 21.4286 |
| * | map_l125_m2_e0 | 96.5138 | 95.1275 | 97.9410 | 89.1008 | 2089 | 107 | 2093 | 44 | 9 | 20.4545 |
| * | map_l125_m2_e1 | 96.4891 | 95.0562 | 97.9658 | 89.1866 | 2115 | 110 | 2119 | 44 | 9 | 20.4545 |
| * | map_l150_m0_e0 | 96.2251 | 96.4981 | 95.9538 | 92.9541 | 496 | 18 | 498 | 21 | 4 | 19.0476 |
| * | map_l150_m1_e0 | 96.6569 | 96.0389 | 97.2830 | 90.5512 | 1285 | 53 | 1289 | 36 | 7 | 19.4444 |
| * | map_l150_m2_e0 | 96.6049 | 95.8807 | 97.3400 | 91.1929 | 1350 | 58 | 1354 | 37 | 7 | 18.9189 |
| * | map_l150_m2_e1 | 96.4999 | 95.6915 | 97.3221 | 91.2120 | 1377 | 62 | 1381 | 38 | 8 | 21.0526 |
| * | map_l250_m0_e0 | 90.3614 | 96.1538 | 85.2273 | 97.7873 | 75 | 3 | 75 | 13 | 2 | 15.3846 |
| * | map_l250_m1_e0 | 95.1613 | 96.7213 | 93.6508 | 96.0377 | 295 | 10 | 295 | 20 | 4 | 20.0000 |
| * | map_l250_m2_e0 | 95.3800 | 96.6767 | 94.1176 | 96.2801 | 320 | 11 | 320 | 20 | 4 | 20.0000 |
| * | map_l250_m2_e1 | 95.4074 | 96.6967 | 94.1520 | 96.3590 | 322 | 11 | 322 | 20 | 4 | 20.0000 |
| * | map_siren | 97.4708 | 96.1673 | 98.8100 | 83.5327 | 7126 | 284 | 7141 | 86 | 20 | 23.2558 |
| * | segdup | 98.7115 | 98.8654 | 98.5581 | 94.6872 | 2527 | 29 | 2529 | 37 | 10 | 27.0270 |
| * | segdupwithalt | 100.0000 | 100.0000 | 100.0000 | 99.9974 | 1 | 0 | 1 | 0 | 0 | |
| * | tech_badpromoters | 99.3377 | 98.6842 | 100.0000 | 55.0898 | 75 | 1 | 75 | 0 | 0 | |
| C16_PLUS | * | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| C16_PLUS | HG002complexvar | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| C16_PLUS | HG002compoundhet | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| C16_PLUS | decoy | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| C16_PLUS | func_cds | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| C16_PLUS | lowcmp_AllRepeats_51to200bp_gt95identity_merged | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| C16_PLUS | lowcmp_AllRepeats_gt200bp_gt95identity_merged | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| C16_PLUS | lowcmp_AllRepeats_lt51bp_gt95identity_merged | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| C16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| C16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| C16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_gt200bp_gt95identity_merged | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| C16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt101bp_gt95identity_merged | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| C16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt51bp_gt95identity_merged | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| C16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_51to200bp_gt95identity_merged | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| C16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_gt200bp_gt95identity_merged | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |