PrecisionFDA
Truth Challenge

Engage with our community challenges and help improve DNA test results

Explore HG002 comparison results
Use this interactive explorer to filter the full set of results by submission entry and across multiple dimensions.
Rows 37251–37300 of 86044:

| Entry | Type | Subtype | Subset | Genotype | F-score | Recall | Precision | Frac_NA | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| gduggal-bwaplat | INDEL | I1_5 | segdupwithalt | het | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I1_5 | segdupwithalt | hetalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I1_5 | segdupwithalt | homalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I1_5 | tech_badpromoters | * | 87.1795 | 77.2727 | 100.0000 | 73.4375 | 17 | 5 | 17 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I1_5 | tech_badpromoters | het | 100.0000 | 100.0000 | 100.0000 | 75.7576 | 8 | 0 | 8 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I1_5 | tech_badpromoters | hetalt | 100.0000 | 100.0000 | 100.0000 | 75.0000 | 1 | 0 | 1 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I1_5 | tech_badpromoters | homalt | 76.1905 | 61.5385 | 100.0000 | 70.3704 | 8 | 5 | 8 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I6_15 | * | * | 86.3949 | 77.4322 | 97.7041 | 60.0483 | 19221 | 5602 | 19235 | 452 | 296 | 65.4867 |
| gduggal-bwaplat | INDEL | I6_15 | * | het | 84.1633 | 73.9659 | 97.6222 | 67.9845 | 7421 | 2612 | 7431 | 181 | 55 | 30.3867 |
| gduggal-bwaplat | INDEL | I6_15 | * | hetalt | 85.2525 | 75.5818 | 97.7610 | 47.7759 | 6463 | 2088 | 6462 | 148 | 138 | 93.2432 |
| gduggal-bwaplat | INDEL | I6_15 | * | homalt | 91.2395 | 85.5426 | 97.7493 | 57.4509 | 5337 | 902 | 5342 | 123 | 103 | 83.7398 |
| gduggal-bwaplat | INDEL | I6_15 | HG002complexvar | * | 86.2963 | 77.2538 | 97.7362 | 64.0519 | 3702 | 1090 | 3713 | 86 | 50 | 58.1395 |
| gduggal-bwaplat | INDEL | I6_15 | HG002complexvar | het | 84.5433 | 74.6497 | 97.4600 | 66.9103 | 1758 | 597 | 1765 | 46 | 16 | 34.7826 |
| gduggal-bwaplat | INDEL | I6_15 | HG002complexvar | hetalt | 84.1254 | 73.6713 | 98.0371 | 62.0604 | 901 | 322 | 899 | 18 | 17 | 94.4444 |
| gduggal-bwaplat | INDEL | I6_15 | HG002complexvar | homalt | 91.5364 | 85.9143 | 97.9458 | 60.0075 | 1043 | 171 | 1049 | 22 | 17 | 77.2727 |
| gduggal-bwaplat | INDEL | I6_15 | HG002compoundhet | * | 84.8012 | 75.3988 | 96.8828 | 44.4562 | 6617 | 2159 | 6620 | 213 | 132 | 61.9718 |
| gduggal-bwaplat | INDEL | I6_15 | HG002compoundhet | het | 65.8766 | 65.8654 | 65.8879 | 86.4385 | 137 | 71 | 141 | 73 | 16 | 21.9178 |
| gduggal-bwaplat | INDEL | I6_15 | HG002compoundhet | hetalt | 85.9482 | 75.6238 | 99.5374 | 35.2988 | 6456 | 2081 | 6455 | 30 | 20 | 66.6667 |
| gduggal-bwaplat | INDEL | I6_15 | HG002compoundhet | homalt | 29.0909 | 77.4194 | 17.9104 | 80.8845 | 24 | 7 | 24 | 110 | 96 | 87.2727 |
| gduggal-bwaplat | INDEL | I6_15 | decoy | * | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I6_15 | decoy | het | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I6_15 | decoy | hetalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I6_15 | decoy | homalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I6_15 | func_cds | * | 87.1795 | 79.0698 | 97.1429 | 43.5484 | 34 | 9 | 34 | 1 | 1 | 100.0000 |
| gduggal-bwaplat | INDEL | I6_15 | func_cds | het | 85.7143 | 75.0000 | 100.0000 | 48.5714 | 18 | 6 | 18 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I6_15 | func_cds | hetalt | 85.7143 | 75.0000 | 100.0000 | 25.0000 | 3 | 1 | 3 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I6_15 | func_cds | homalt | 89.6552 | 86.6667 | 92.8571 | 39.1304 | 13 | 2 | 13 | 1 | 1 | 100.0000 |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | * | 64.8475 | 48.3254 | 98.5366 | 81.1754 | 404 | 432 | 404 | 6 | 3 | 50.0000 |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | het | 64.1348 | 47.7733 | 97.5410 | 89.6698 | 118 | 129 | 119 | 3 | 0 | 0.0000 |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | hetalt | 63.8458 | 46.9811 | 99.5984 | 65.4167 | 249 | 281 | 248 | 1 | 1 | 100.0000 |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | homalt | 75.5102 | 62.7119 | 94.8718 | 85.9206 | 37 | 22 | 37 | 2 | 2 | 100.0000 |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | * | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | het | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | hetalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | homalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | * | 76.6225 | 63.6808 | 96.1659 | 75.3638 | 3384 | 1930 | 3386 | 135 | 45 | 33.3333 |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | het | 65.6044 | 51.5250 | 90.2715 | 87.2106 | 794 | 747 | 798 | 86 | 3 | 3.4884 |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | hetalt | 82.8931 | 71.4423 | 98.7151 | 55.3864 | 2229 | 891 | 2228 | 29 | 26 | 89.6552 |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | homalt | 69.8222 | 55.2833 | 94.7368 | 83.6277 | 361 | 292 | 360 | 20 | 16 | 80.0000 |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | * | 72.7559 | 58.4625 | 96.3002 | 76.5890 | 3696 | 2626 | 3696 | 142 | 55 | 38.7324 |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | het | 61.6323 | 46.5246 | 91.2698 | 88.3076 | 917 | 1054 | 920 | 88 | 9 | 10.2273 |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | hetalt | 79.5052 | 66.5627 | 98.6958 | 53.7549 | 2347 | 1179 | 2346 | 31 | 28 | 90.3226 |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | homalt | 67.4944 | 52.3636 | 94.9227 | 82.7953 | 432 | 393 | 430 | 23 | 18 | 78.2609 |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | * | 79.3333 | 66.8539 | 97.5410 | 81.8452 | 119 | 59 | 119 | 3 | 1 | 33.3333 |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | het | 77.9624 | 65.4321 | 96.4286 | 87.1854 | 53 | 28 | 54 | 2 | 0 | 0.0000 |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | hetalt | 80.3922 | 67.2131 | 100.0000 | 65.8120 | 41 | 20 | 40 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | homalt | 80.6452 | 69.4444 | 96.1538 | 77.9661 | 25 | 11 | 25 | 1 | 1 | 100.0000 |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_gt200bp_gt95identity_merged | * | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_gt200bp_gt95identity_merged | het | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| gduggal-bwaplat | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_gt200bp_gt95identity_merged | hetalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |