PrecisionFDA
Truth Challenge

Engage with our community challenges and help improve DNA test results

Explore HG002 comparison results
Use this interactive explorer to filter the full set of results by submission entry and across multiple dimensions (variant type, subtype, subset, and genotype).
Rows 53151–53200 of 86044

| Entry | Type | Subtype | Subset | Genotype | F-score | Recall | Precision | Frac_NA | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| gduggal-bwafb | INDEL | I1_5 | segdupwithalt | hetalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwafb | INDEL | I1_5 | segdupwithalt | homalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwafb | INDEL | I1_5 | tech_badpromoters | * | 100.0000 | 100.0000 | 100.0000 | 53.1915 | 22 | 0 | 22 | 0 | 0 | |
| gduggal-bwafb | INDEL | I1_5 | tech_badpromoters | het | 100.0000 | 100.0000 | 100.0000 | 33.3333 | 8 | 0 | 8 | 0 | 0 | |
| gduggal-bwafb | INDEL | I1_5 | tech_badpromoters | hetalt | 100.0000 | 100.0000 | 100.0000 | 75.0000 | 1 | 0 | 1 | 0 | 0 | |
| gduggal-bwafb | INDEL | I1_5 | tech_badpromoters | homalt | 100.0000 | 100.0000 | 100.0000 | 58.0645 | 13 | 0 | 13 | 0 | 0 | |
| gduggal-bwafb | INDEL | I6_15 | * | * | 87.8626 | 81.3399 | 95.5225 | 40.4474 | 20191 | 4632 | 21206 | 994 | 970 | 97.5855 |
| gduggal-bwafb | INDEL | I6_15 | * | het | 88.0827 | 80.5342 | 97.1926 | 38.4775 | 8080 | 1953 | 13848 | 400 | 379 | 94.7500 |
| gduggal-bwafb | INDEL | I6_15 | * | hetalt | 81.7221 | 72.8336 | 93.0818 | 54.0993 | 6228 | 2323 | 1480 | 110 | 109 | 99.0909 |
| gduggal-bwafb | INDEL | I6_15 | * | homalt | 93.3335 | 94.2940 | 92.3923 | 40.2909 | 5883 | 356 | 5878 | 484 | 482 | 99.5868 |
| gduggal-bwafb | INDEL | I6_15 | HG002complexvar | * | 87.6466 | 81.2187 | 95.1793 | 49.2161 | 3892 | 900 | 4008 | 203 | 196 | 96.5517 |
| gduggal-bwafb | INDEL | I6_15 | HG002complexvar | het | 87.7628 | 80.4671 | 96.5134 | 49.2661 | 1895 | 460 | 2602 | 94 | 89 | 94.6809 |
| gduggal-bwafb | INDEL | I6_15 | HG002complexvar | hetalt | 81.3403 | 72.5266 | 92.5926 | 62.9291 | 887 | 336 | 300 | 24 | 23 | 95.8333 |
| gduggal-bwafb | INDEL | I6_15 | HG002complexvar | homalt | 92.1427 | 91.4333 | 92.8631 | 43.3935 | 1110 | 104 | 1106 | 85 | 84 | 98.8235 |
| gduggal-bwafb | INDEL | I6_15 | HG002compoundhet | * | 80.9656 | 72.6869 | 91.3725 | 27.9246 | 6379 | 2397 | 7403 | 699 | 688 | 98.4263 |
| gduggal-bwafb | INDEL | I6_15 | HG002compoundhet | het | 78.5751 | 65.8654 | 97.3628 | 22.2080 | 137 | 71 | 5907 | 160 | 152 | 95.0000 |
| gduggal-bwafb | INDEL | I6_15 | HG002compoundhet | hetalt | 82.7582 | 72.8242 | 95.8306 | 40.7336 | 6217 | 2320 | 1471 | 64 | 63 | 98.4375 |
| gduggal-bwafb | INDEL | I6_15 | HG002compoundhet | homalt | 9.4162 | 80.6452 | 5.0000 | 41.3146 | 25 | 6 | 25 | 475 | 473 | 99.5789 |
| gduggal-bwafb | INDEL | I6_15 | decoy | * | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwafb | INDEL | I6_15 | decoy | het | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwafb | INDEL | I6_15 | decoy | hetalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwafb | INDEL | I6_15 | decoy | homalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwafb | INDEL | I6_15 | func_cds | * | 90.0000 | 83.7209 | 97.2973 | 35.0877 | 36 | 7 | 36 | 1 | 1 | 100.0000 |
| gduggal-bwafb | INDEL | I6_15 | func_cds | het | 90.9091 | 83.3333 | 100.0000 | 40.0000 | 20 | 4 | 21 | 0 | 0 | |
| gduggal-bwafb | INDEL | I6_15 | func_cds | hetalt | 50.0000 | 50.0000 | 50.0000 | 33.3333 | 2 | 2 | 1 | 1 | 1 | 100.0000 |
| gduggal-bwafb | INDEL | I6_15 | func_cds | homalt | 96.5517 | 93.3333 | 100.0000 | 26.3158 | 14 | 1 | 14 | 0 | 0 | |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | * | 65.0432 | 50.9569 | 89.8925 | 58.2960 | 426 | 410 | 418 | 47 | 47 | 100.0000 |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | het | 67.1080 | 54.2510 | 87.9518 | 61.8098 | 134 | 113 | 219 | 30 | 30 | 100.0000 |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | hetalt | 63.0031 | 47.3585 | 94.0828 | 47.0219 | 251 | 279 | 159 | 10 | 10 | 100.0000 |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | homalt | 76.5104 | 69.4915 | 85.1064 | 67.3611 | 41 | 18 | 40 | 7 | 7 | 100.0000 |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | * | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | het | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | hetalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | homalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | * | 81.8219 | 73.9368 | 91.5896 | 61.6810 | 3929 | 1385 | 2973 | 273 | 264 | 96.7033 |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | het | 81.8829 | 73.3939 | 92.5926 | 62.2269 | 1131 | 410 | 1825 | 146 | 139 | 95.2055 |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | hetalt | 80.3666 | 72.7564 | 89.7547 | 59.7561 | 2270 | 850 | 622 | 71 | 71 | 100.0000 |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | homalt | 85.3531 | 80.8576 | 90.3780 | 61.9856 | 528 | 125 | 526 | 56 | 54 | 96.4286 |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | * | 78.6020 | 69.2502 | 90.8738 | 58.9733 | 4378 | 1944 | 3515 | 353 | 342 | 96.8839 |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | het | 77.8141 | 67.6306 | 91.6078 | 59.9900 | 1333 | 638 | 2205 | 202 | 194 | 96.0396 |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | hetalt | 77.2155 | 68.0091 | 89.3048 | 56.7130 | 2398 | 1128 | 668 | 80 | 79 | 98.7500 |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | homalt | 83.8326 | 78.4242 | 90.0421 | 57.6603 | 647 | 178 | 642 | 71 | 69 | 97.1831 |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | * | 75.6619 | 66.8539 | 87.1429 | 65.7702 | 119 | 59 | 122 | 18 | 18 | 100.0000 |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | het | 80.4374 | 75.3086 | 86.3158 | 64.6840 | 61 | 20 | 82 | 13 | 13 | 100.0000 |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | hetalt | 65.6064 | 54.0984 | 83.3333 | 66.0377 | 33 | 28 | 15 | 3 | 3 | 100.0000 |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | homalt | 79.3651 | 69.4444 | 92.5926 | 68.9655 | 25 | 11 | 25 | 2 | 2 | 100.0000 |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_gt200bp_gt95identity_merged | * | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_gt200bp_gt95identity_merged | het | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_gt200bp_gt95identity_merged | hetalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwafb | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_gt200bp_gt95identity_merged | homalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
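The F-score, Recall, and Precision columns are not independent of the count columns: they follow the standard GA4GH/hap.py small-variant benchmarking definitions, where recall is computed against the truth counts (Truth TP, Truth FN) and precision against the query counts (Query TP, Query FP). A minimal sketch, using the I6_15 all-subsets/all-genotypes row as a worked example (the helper name `indel_metrics` is illustrative, not part of any tool's API):

```python
def indel_metrics(truth_tp: int, truth_fn: int, query_tp: int, query_fp: int):
    """Return (recall, precision, f_score) as fractions in [0, 1],
    per the GA4GH/hap.py definitions: recall from truth-side counts,
    precision from query-side counts, F-score as their harmonic mean."""
    recall = truth_tp / (truth_tp + truth_fn)
    precision = query_tp / (query_tp + query_fp)
    f_score = 2 * precision * recall / (precision + recall)
    return recall, precision, f_score

# Counts from the I6_15 / * / * row of the table.
recall, precision, f_score = indel_metrics(
    truth_tp=20191, truth_fn=4632, query_tp=21206, query_fp=994
)
print(f"Recall    {recall * 100:.4f}")     # 81.3399, as in the table
print(f"Precision {precision * 100:.4f}")  # 95.5225, as in the table
print(f"F-score   {f_score * 100:.4f}")    # ~87.8626
```

The same arithmetic also explains the final percentage column for that row: 970 genotype-mismatch false positives out of 994 query false positives gives 97.5855%.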