PrecisionFDA
Truth Challenge

Engage with our community challenges to improve DNA test results

Explore HG002 comparison results
Use this interactive explorer to filter all results across submission entries and multiple dimensions.
All rows shown here belong to entry gduggal-bwafb (Type: INDEL, Subtype: I1_5); a blank cell means no value was reported for that row. Showing results 35001-35050 of 86044.

| Subset | Genotype | F-score | Recall | Precision | Frac_NA | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma |
|---|---|---|---|---|---|---|---|---|---|---|---|
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_51to200bp_gt95identity_merged | * | 81.0884 | 72.8543 | 91.4209 | 60.4663 | 730 | 272 | 682 | 64 | 49 | 76.5625 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_51to200bp_gt95identity_merged | het | 83.1329 | 78.1250 | 88.8268 | 67.2461 | 225 | 63 | 318 | 40 | 25 | 62.5000 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_51to200bp_gt95identity_merged | hetalt | 81.0877 | 69.6203 | 97.0779 | 42.3221 | 440 | 192 | 299 | 9 | 9 | 100.0000 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_51to200bp_gt95identity_merged | homalt | 80.2469 | 79.2683 | 81.2500 | 69.2308 | 65 | 17 | 65 | 15 | 15 | 100.0000 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_gt200bp_gt95identity_merged | * | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_gt200bp_gt95identity_merged | het | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_gt200bp_gt95identity_merged | hetalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_gt200bp_gt95identity_merged | homalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | * | 90.9106 | 86.5682 | 95.7115 | 51.1635 | 5981 | 928 | 4419 | 198 | 174 | 87.8788 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | het | 90.9017 | 87.1048 | 95.0448 | 58.1886 | 1405 | 208 | 2014 | 105 | 83 | 79.0476 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | hetalt | 91.4107 | 85.7327 | 97.8942 | 35.5826 | 3984 | 663 | 1813 | 39 | 39 | 100.0000 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | homalt | 91.4286 | 91.2173 | 91.6409 | 57.2469 | 592 | 57 | 592 | 54 | 52 | 96.2963 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt51bp_gt95identity_merged | * | 92.4416 | 88.7728 | 96.4267 | 49.8323 | 5266 | 666 | 3751 | 139 | 127 | 91.3669 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt51bp_gt95identity_merged | het | 92.3526 | 88.8889 | 96.0973 | 56.8254 | 1184 | 148 | 1699 | 69 | 59 | 85.5072 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt51bp_gt95identity_merged | hetalt | 92.8504 | 88.1608 | 98.0670 | 34.4871 | 3552 | 477 | 1522 | 30 | 30 | 100.0000 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt51bp_gt95identity_merged | homalt | 92.9010 | 92.8196 | 92.9825 | 55.8140 | 530 | 41 | 530 | 40 | 38 | 95.0000 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_gt95identity_merged | * | 91.6223 | 87.8582 | 95.7234 | 63.6163 | 7388 | 1021 | 5842 | 261 | 219 | 83.9080 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_gt95identity_merged | het | 92.6625 | 90.2806 | 95.1736 | 68.0131 | 2220 | 239 | 2879 | 146 | 107 | 73.2877 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_gt95identity_merged | hetalt | 91.1540 | 85.5407 | 97.5559 | 42.8699 | 4082 | 690 | 1876 | 47 | 46 | 97.8723 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_gt95identity_merged | homalt | 93.1414 | 92.1902 | 94.1126 | 70.7669 | 1086 | 92 | 1087 | 68 | 66 | 97.0588 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_merged | * | 91.7223 | 87.9766 | 95.8011 | 68.6010 | 14144 | 1933 | 12617 | 553 | 435 | 78.6618 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_merged | het | 92.2978 | 89.3146 | 95.4870 | 71.2322 | 5734 | 686 | 6813 | 322 | 214 | 66.4596 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_merged | hetalt | 89.6510 | 84.0909 | 95.9984 | 54.3693 | 4958 | 938 | 2351 | 98 | 96 | 97.9592 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_merged | homalt | 93.9836 | 91.7841 | 96.2911 | 69.5456 | 3452 | 309 | 3453 | 133 | 125 | 93.9850 |
| lowcmp_SimpleRepeat_diTR_11to50 | * | 93.1275 | 89.5499 | 97.0028 | 61.5094 | 5133 | 599 | 3463 | 107 | 92 | 85.9813 |
| lowcmp_SimpleRepeat_diTR_11to50 | het | 94.4657 | 93.1034 | 95.8683 | 72.1692 | 1269 | 94 | 1369 | 59 | 47 | 79.6610 |
| lowcmp_SimpleRepeat_diTR_11to50 | hetalt | 92.7440 | 88.1997 | 97.7819 | 39.9778 | 3356 | 449 | 1587 | 36 | 36 | 100.0000 |
| lowcmp_SimpleRepeat_diTR_11to50 | homalt | 93.7249 | 90.0709 | 97.6879 | 63.9583 | 508 | 56 | 507 | 12 | 9 | 75.0000 |
| lowcmp_SimpleRepeat_diTR_51to200 | * | 63.6988 | 47.5728 | 96.3636 | 56.6929 | 98 | 108 | 53 | 2 | 1 | 50.0000 |
| lowcmp_SimpleRepeat_diTR_51to200 | het | 57.1429 | 66.6667 | 50.0000 | 95.6522 | 18 | 9 | 1 | 1 | 0 | 0.0000 |
| lowcmp_SimpleRepeat_diTR_51to200 | hetalt | 61.7761 | 44.6927 | 100.0000 | 30.6667 | 80 | 99 | 52 | 0 | 0 | |
| lowcmp_SimpleRepeat_diTR_51to200 | homalt | 0.0000 | | 0.0000 | 83.3333 | 0 | 0 | 0 | 1 | 1 | 100.0000 |
| lowcmp_SimpleRepeat_homopolymer_6to10 | * | 97.8863 | 96.6892 | 99.1133 | 68.8334 | 2570 | 88 | 2571 | 23 | 22 | 95.6522 |
| lowcmp_SimpleRepeat_homopolymer_6to10 | het | 98.1652 | 97.0888 | 99.2658 | 70.8788 | 1334 | 40 | 1352 | 10 | 10 | 100.0000 |
| lowcmp_SimpleRepeat_homopolymer_6to10 | hetalt | 94.0050 | 89.8734 | 98.5348 | 73.8506 | 284 | 32 | 269 | 4 | 4 | 100.0000 |
| lowcmp_SimpleRepeat_homopolymer_6to10 | homalt | 98.7030 | 98.3471 | 99.0615 | 63.1437 | 952 | 16 | 950 | 9 | 8 | 88.8889 |
| lowcmp_SimpleRepeat_homopolymer_gt10 | * | | 100.0000 | | 100.0000 | 1 | 0 | 0 | 0 | 0 | |
| lowcmp_SimpleRepeat_homopolymer_gt10 | het | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_SimpleRepeat_homopolymer_gt10 | hetalt | | 100.0000 | | 100.0000 | 1 | 0 | 0 | 0 | 0 | |
| lowcmp_SimpleRepeat_homopolymer_gt10 | homalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_SimpleRepeat_quadTR_11to50 | * | 92.5216 | 89.4845 | 95.7722 | 64.1176 | 3489 | 410 | 3330 | 147 | 104 | 70.7483 |
| lowcmp_SimpleRepeat_quadTR_11to50 | het | 92.4658 | 89.4891 | 95.6473 | 67.6067 | 1524 | 179 | 1692 | 77 | 41 | 53.2468 |
| lowcmp_SimpleRepeat_quadTR_11to50 | hetalt | 90.6525 | 87.0027 | 94.6221 | 51.2057 | 984 | 147 | 651 | 37 | 36 | 97.2973 |
| lowcmp_SimpleRepeat_quadTR_11to50 | homalt | 94.3814 | 92.1127 | 96.7647 | 63.8170 | 981 | 84 | 987 | 33 | 27 | 81.8182 |
| lowcmp_SimpleRepeat_quadTR_51to200 | * | 79.1856 | 69.5167 | 91.9786 | 71.9640 | 187 | 82 | 172 | 15 | 8 | 53.3333 |
| lowcmp_SimpleRepeat_quadTR_51to200 | het | 69.3333 | 66.6667 | 72.2222 | 89.6254 | 34 | 17 | 26 | 10 | 3 | 30.0000 |
| lowcmp_SimpleRepeat_quadTR_51to200 | hetalt | 82.2369 | 70.5069 | 98.6486 | 35.9307 | 153 | 64 | 146 | 2 | 2 | 100.0000 |
| lowcmp_SimpleRepeat_quadTR_51to200 | homalt | 0.0000 | 0.0000 | 0.0000 | 96.6292 | 0 | 1 | 0 | 3 | 3 | 100.0000 |
| lowcmp_SimpleRepeat_quadTR_gt200 | * | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_SimpleRepeat_quadTR_gt200 | het | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
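The percentage metrics in these results follow the usual benchmarking definitions: recall is computed over the truth counts, precision over the query counts, and the F-score is their harmonic mean. As a sanity check, a minimal Python sketch using the counts from the first row (51to200bp subset, genotype `*`) reproduces the reported figures:

```python
# Recompute benchmarking metrics from the raw comparison counts.
# Counts below are from the first table row (51to200bp subset, genotype *).
truth_tp, truth_fn = 730, 272   # truth variants matched / missed by the query
query_tp, query_fp = 682, 64    # query variants confirmed / not in the truth set

recall = truth_tp / (truth_tp + truth_fn)                # sensitivity vs. truth set
precision = query_tp / (query_tp + query_fp)             # positive predictive value
f_score = 2 * precision * recall / (precision + recall)  # harmonic mean of P and R

print(f"Recall    {recall * 100:.4f}")     # 72.8543
print(f"Precision {precision * 100:.4f}")  # 91.4209
print(f"F-score   {f_score * 100:.4f}")    # 81.0884
```

Note that Frac_NA (the fraction of calls falling in non-assessed regions) and the false-positive breakdown columns cannot be rederived from the counts shown here; they depend on region and match-type annotations not included in this view.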