PrecisionFDA
Truth Challenge

Engage and improve DNA test results with our community challenges

Explore HG002 comparison results
Use this interactive explorer to filter results from all submission entries across multiple dimensions, including variant type, genomic subset, genotype, and performance metrics.
Showing entries 51251-51300 of 86044. All rows below: Entry gduggal-bwafb, Type INDEL, Subtype *.

Subset | Genotype | F-score | Recall | Precision | Frac_NA | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma
lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_gt200bp_gt95identity_merged | het | 66.6667 | 50.0000 | 100.0000 | 98.5075 | 1 | 1 | 1 | 0 | 0 | -
lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_gt200bp_gt95identity_merged | hetalt | 0.0000 | - | - | 100.0000 | 0 | 0 | 0 | 0 | 0 | -
lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_gt200bp_gt95identity_merged | homalt | 0.0000 | - | - | 100.0000 | 0 | 1 | 0 | 0 | 0 | -
lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | * | 92.2149 | 89.1589 | 95.4879 | 47.4893 | 33793 | 4109 | 39553 | 1869 | 1772 | 94.8101
lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | het | 93.1643 | 88.4244 | 98.4410 | 44.1529 | 12925 | 1692 | 27215 | 431 | 341 | 79.1183
lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | hetalt | 89.8585 | 83.7475 | 96.9315 | 44.2062 | 11527 | 2237 | 3001 | 95 | 94 | 98.9474
lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | homalt | 92.4596 | 98.1094 | 87.4251 | 55.1844 | 9341 | 180 | 9337 | 1343 | 1337 | 99.5532
lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt51bp_gt95identity_merged | * | 93.8883 | 91.5760 | 96.3203 | 46.9726 | 29297 | 2695 | 34055 | 1301 | 1239 | 95.2344
lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt51bp_gt95identity_merged | het | 94.6807 | 90.9033 | 98.7857 | 42.8578 | 11382 | 1139 | 23592 | 290 | 233 | 80.3448
lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt51bp_gt95identity_merged | hetalt | 91.7011 | 87.0959 | 96.8206 | 45.2421 | 9672 | 1433 | 2223 | 73 | 72 | 98.6301
lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt51bp_gt95identity_merged | homalt | 93.9516 | 98.5298 | 89.7799 | 55.6361 | 8243 | 123 | 8240 | 938 | 934 | 99.5736
lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_gt95identity_merged | * | 91.7140 | 88.5981 | 95.0570 | 53.8425 | 38394 | 4941 | 44230 | 2300 | 2147 | 93.3478
lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_gt95identity_merged | het | 92.7218 | 88.4618 | 97.4128 | 51.1450 | 15763 | 2056 | 30197 | 802 | 660 | 82.2943
lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_gt95identity_merged | hetalt | 89.5071 | 83.4165 | 96.5571 | 49.6329 | 11705 | 2327 | 3113 | 111 | 109 | 98.1982
lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_gt95identity_merged | homalt | 91.8238 | 95.1411 | 88.7300 | 60.2423 | 10926 | 558 | 10920 | 1387 | 1378 | 99.3511
lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_merged | * | 91.1270 | 87.3774 | 95.2128 | 60.8815 | 57102 | 8249 | 63963 | 3216 | 2822 | 87.7488
lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_merged | het | 91.6563 | 86.8570 | 97.0171 | 59.4271 | 26474 | 4006 | 42997 | 1322 | 971 | 73.4493
lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_merged | hetalt | 87.8267 | 81.6103 | 95.0682 | 57.7013 | 13633 | 3072 | 3971 | 206 | 202 | 98.0583
lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_merged | homalt | 92.2413 | 93.5539 | 90.9650 | 64.4972 | 16995 | 1171 | 16995 | 1688 | 1649 | 97.6896
lowcmp_SimpleRepeat_diTR_11to50 | * | 94.4932 | 92.6377 | 96.4246 | 45.4241 | 33898 | 2694 | 37756 | 1400 | 1310 | 93.5714
lowcmp_SimpleRepeat_diTR_11to50 | het | 95.7841 | 93.1472 | 98.5746 | 46.7673 | 14680 | 1080 | 25103 | 363 | 284 | 78.2369
lowcmp_SimpleRepeat_diTR_11to50 | hetalt | 91.3561 | 86.7685 | 96.4558 | 48.1730 | 9089 | 1386 | 2531 | 93 | 92 | 98.9247
lowcmp_SimpleRepeat_diTR_11to50 | homalt | 94.5282 | 97.7986 | 91.4694 | 41.2757 | 10129 | 228 | 10122 | 944 | 934 | 98.9407
lowcmp_SimpleRepeat_diTR_51to200 | * | 73.4367 | 65.0167 | 84.3621 | 45.9065 | 1366 | 735 | 1845 | 342 | 338 | 98.8304
lowcmp_SimpleRepeat_diTR_51to200 | het | 75.3138 | 61.2245 | 97.8261 | 47.4389 | 300 | 190 | 1305 | 29 | 26 | 89.6552
lowcmp_SimpleRepeat_diTR_51to200 | hetalt | 72.8203 | 57.4281 | 99.4845 | 34.4595 | 719 | 533 | 193 | 1 | 1 | 100.0000
lowcmp_SimpleRepeat_diTR_51to200 | homalt | 68.1729 | 96.6574 | 52.6555 | 45.4921 | 347 | 12 | 347 | 312 | 311 | 99.6795
lowcmp_SimpleRepeat_homopolymer_6to10 | * | 99.2808 | 98.8889 | 99.6759 | 57.1851 | 27946 | 314 | 27984 | 91 | 76 | 83.5165
lowcmp_SimpleRepeat_homopolymer_6to10 | het | 99.2884 | 98.8435 | 99.7372 | 57.4242 | 16239 | 190 | 16321 | 43 | 29 | 67.4419
lowcmp_SimpleRepeat_homopolymer_6to10 | hetalt | 92.8240 | 87.8505 | 98.3945 | 74.4282 | 470 | 65 | 429 | 7 | 7 | 100.0000
lowcmp_SimpleRepeat_homopolymer_6to10 | homalt | 99.5570 | 99.4777 | 99.6364 | 55.6678 | 11237 | 59 | 11234 | 41 | 40 | 97.5610
lowcmp_SimpleRepeat_homopolymer_gt10 | * | 72.5395 | 63.7097 | 84.2105 | 99.9318 | 79 | 45 | 80 | 15 | 12 | 80.0000
lowcmp_SimpleRepeat_homopolymer_gt10 | het | 70.2459 | 57.4713 | 90.3226 | 99.9104 | 50 | 37 | 56 | 6 | 3 | 50.0000
lowcmp_SimpleRepeat_homopolymer_gt10 | hetalt | 66.6667 | 50.0000 | 100.0000 | 99.9681 | 8 | 8 | 4 | 0 | 0 | -
lowcmp_SimpleRepeat_homopolymer_gt10 | homalt | 81.6327 | 100.0000 | 68.9655 | 99.9497 | 21 | 0 | 20 | 9 | 9 | 100.0000
lowcmp_SimpleRepeat_quadTR_11to50 | * | 93.9385 | 90.7361 | 97.3752 | 52.7808 | 18022 | 1840 | 18957 | 511 | 394 | 77.1037
lowcmp_SimpleRepeat_quadTR_11to50 | het | 93.7221 | 89.8362 | 97.9593 | 52.4760 | 9979 | 1129 | 12193 | 254 | 153 | 60.2362
lowcmp_SimpleRepeat_quadTR_11to50 | hetalt | 88.7892 | 84.1909 | 93.9189 | 55.0542 | 2258 | 424 | 973 | 63 | 62 | 98.4127
lowcmp_SimpleRepeat_quadTR_11to50 | homalt | 96.0102 | 95.2734 | 96.7586 | 52.9962 | 5785 | 287 | 5791 | 194 | 179 | 92.2680
lowcmp_SimpleRepeat_quadTR_51to200 | * | 80.9919 | 72.4294 | 91.8503 | 55.9362 | 1923 | 732 | 2209 | 196 | 151 | 77.0408
lowcmp_SimpleRepeat_quadTR_51to200 | het | 76.8190 | 64.3595 | 95.2607 | 61.2844 | 623 | 345 | 1206 | 60 | 23 | 38.3333
lowcmp_SimpleRepeat_quadTR_51to200 | hetalt | 81.5240 | 69.7071 | 98.1651 | 30.1282 | 833 | 362 | 535 | 10 | 10 | 100.0000
lowcmp_SimpleRepeat_quadTR_51to200 | homalt | 86.1043 | 94.9187 | 78.7879 | 57.8125 | 467 | 25 | 468 | 126 | 118 | 93.6508
lowcmp_SimpleRepeat_quadTR_gt200 | * | 0.0000 | - | - | 100.0000 | 0 | 0 | 0 | 0 | 0 | -
lowcmp_SimpleRepeat_quadTR_gt200 | het | 0.0000 | - | - | 100.0000 | 0 | 0 | 0 | 0 | 0 | -
lowcmp_SimpleRepeat_quadTR_gt200 | hetalt | 0.0000 | 0.0000 | 0.0000 | - | 0 | 0 | 0 | 0 | 0 | -
lowcmp_SimpleRepeat_quadTR_gt200 | homalt | 0.0000 | - | - | 100.0000 | 0 | 0 | 0 | 0 | 0 | -
lowcmp_SimpleRepeat_triTR_11to50 | * | 97.3102 | 95.9453 | 98.7144 | 44.6603 | 6460 | 273 | 6834 | 89 | 78 | 87.6404
lowcmp_SimpleRepeat_triTR_11to50 | het | 96.9324 | 95.1886 | 98.7412 | 44.6522 | 3482 | 176 | 4471 | 57 | 46 | 80.7018
lowcmp_SimpleRepeat_triTR_11to50 | hetalt | 95.1872 | 94.4745 | 95.9108 | 49.8134 | 872 | 51 | 258 | 11 | 11 | 100.0000
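The rate and count columns in each row are tied together by the standard benchmarking relations: Recall = Truth TP / (Truth TP + Truth FN), Precision = Query TP / (Query TP + Query FP), F-score is the harmonic mean of the two, and % FP ma = FP gt / Query FP. A minimal sketch, assuming hap.py-style definitions (the `row_metrics` helper below is illustrative, not part of the PrecisionFDA toolchain):

```python
# Illustrative helper (not an official PrecisionFDA script): recompute a
# row's rate columns from its count columns, as percentages.

def row_metrics(truth_tp, truth_fn, query_tp, query_fp, fp_gt):
    """Return (F-score, Recall, Precision, % FP ma) as percentages."""
    recall = truth_tp / (truth_tp + truth_fn)
    precision = query_tp / (query_tp + query_fp)
    f_score = 2 * precision * recall / (precision + recall)
    # % FP ma is undefined when there are no query false positives.
    pct_fp_ma = 100 * fp_gt / query_fp if query_fp else None
    return 100 * f_score, 100 * recall, 100 * precision, pct_fp_ma

# Counts from the "lowcmp_SimpleRepeat_triTR_11to50 / het" row above:
f, r, p, m = row_metrics(3482, 176, 4471, 57, 46)
print(f"{f:.4f} {r:.4f} {p:.4f} {m:.4f}")
```

For that row this reproduces the tabulated F-score 96.9324, Recall 95.1886, Precision 98.7412, and % FP ma 80.7018 to four decimal places.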