PrecisionFDA
Truth Challenge

Engage with our community challenges to improve DNA test results

Explore HG002 comparison results
Use this interactive explorer to filter all results across submission entries and multiple dimensions.
Rows 72501–72550 of 86,044

| Entry | Type | Subtype | Subset | Genotype | F-score | Recall | Precision | Frac_NA | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| ltrigg-rtg2 | INDEL | D6_15 | segdup | het | 97.7502 | 97.8261 | 97.6744 | 92.4495 | 90 | 2 | 84 | 2 | 0 | 0.0000 |
| ltrigg-rtg2 | INDEL | D6_15 | segdup | hetalt | 96.8421 | 93.8776 | 100.0000 | 90.1468 | 46 | 3 | 47 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | D6_15 | segdup | homalt | 98.9899 | 98.0000 | 100.0000 | 89.5966 | 49 | 1 | 49 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | D6_15 | segdupwithalt | * | 0.0000 | – | – | 100.0000 | 0 | 0 | 0 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | D6_15 | segdupwithalt | het | 0.0000 | – | – | 100.0000 | 0 | 0 | 0 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | D6_15 | segdupwithalt | hetalt | 0.0000 | – | – | 100.0000 | 0 | 0 | 0 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | D6_15 | segdupwithalt | homalt | 0.0000 | – | – | 100.0000 | 0 | 0 | 0 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | D6_15 | tech_badpromoters | * | 100.0000 | 100.0000 | 100.0000 | 50.0000 | 17 | 0 | 17 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | D6_15 | tech_badpromoters | het | 100.0000 | 100.0000 | 100.0000 | 44.4444 | 10 | 0 | 10 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | D6_15 | tech_badpromoters | hetalt | 100.0000 | 100.0000 | 100.0000 | 66.6667 | 1 | 0 | 1 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | D6_15 | tech_badpromoters | homalt | 100.0000 | 100.0000 | 100.0000 | 53.8462 | 6 | 0 | 6 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | I16_PLUS | * | * | 92.6024 | 87.3765 | 98.4933 | 47.8680 | 5572 | 805 | 5491 | 84 | 71 | 84.5238 |
| ltrigg-rtg2 | INDEL | I16_PLUS | * | het | 93.4104 | 88.2634 | 99.1949 | 47.8107 | 2399 | 319 | 2341 | 19 | 7 | 36.8421 |
| ltrigg-rtg2 | INDEL | I16_PLUS | * | hetalt | 90.2131 | 82.6025 | 99.3685 | 50.0717 | 1733 | 365 | 1731 | 11 | 11 | 100.0000 |
| ltrigg-rtg2 | INDEL | I16_PLUS | * | homalt | 94.2470 | 92.2486 | 96.3340 | 45.0988 | 1440 | 121 | 1419 | 54 | 53 | 98.1481 |
| ltrigg-rtg2 | INDEL | I16_PLUS | HG002complexvar | * | 92.2852 | 87.0130 | 98.2375 | 52.4691 | 1139 | 170 | 1059 | 19 | 16 | 84.2105 |
| ltrigg-rtg2 | INDEL | I16_PLUS | HG002complexvar | het | 91.9911 | 86.1654 | 98.6616 | 46.5235 | 573 | 92 | 516 | 7 | 4 | 57.1429 |
| ltrigg-rtg2 | INDEL | I16_PLUS | HG002complexvar | hetalt | 89.8660 | 82.0896 | 99.2701 | 60.2899 | 275 | 60 | 272 | 2 | 2 | 100.0000 |
| ltrigg-rtg2 | INDEL | I16_PLUS | HG002complexvar | homalt | 95.2945 | 94.1748 | 96.4413 | 53.1667 | 291 | 18 | 271 | 10 | 10 | 100.0000 |
| ltrigg-rtg2 | INDEL | I16_PLUS | HG002compoundhet | * | 88.6244 | 82.1745 | 96.1730 | 41.5559 | 1761 | 382 | 1734 | 69 | 67 | 97.1014 |
| ltrigg-rtg2 | INDEL | I16_PLUS | HG002compoundhet | het | 69.4745 | 61.7021 | 79.4872 | 79.6875 | 29 | 18 | 31 | 8 | 7 | 87.5000 |
| ltrigg-rtg2 | INDEL | I16_PLUS | HG002compoundhet | hetalt | 90.2840 | 82.6087 | 99.5316 | 37.2520 | 1729 | 364 | 1700 | 8 | 8 | 100.0000 |
| ltrigg-rtg2 | INDEL | I16_PLUS | HG002compoundhet | homalt | 10.1695 | 100.0000 | 5.3571 | 67.2515 | 3 | 0 | 3 | 53 | 52 | 98.1132 |
| ltrigg-rtg2 | INDEL | I16_PLUS | decoy | * | 0.0000 | – | – | 100.0000 | 0 | 0 | 0 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | I16_PLUS | decoy | het | 0.0000 | – | – | 100.0000 | 0 | 0 | 0 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | I16_PLUS | decoy | hetalt | 0.0000 | – | – | 100.0000 | 0 | 0 | 0 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | I16_PLUS | decoy | homalt | 0.0000 | – | – | 100.0000 | 0 | 0 | 0 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | I16_PLUS | func_cds | * | 90.9091 | 83.3333 | 100.0000 | 50.0000 | 10 | 2 | 10 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | I16_PLUS | func_cds | het | 94.1176 | 88.8889 | 100.0000 | 42.8571 | 8 | 1 | 8 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | I16_PLUS | func_cds | hetalt | 0.0000 | 0.0000 | 0.0000 | – | 0 | 1 | 0 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | I16_PLUS | func_cds | homalt | 100.0000 | 100.0000 | 100.0000 | 66.6667 | 2 | 0 | 2 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_AllRepeats_51to200bp_gt95identity_merged | * | 77.5746 | 65.5340 | 95.0355 | 74.5946 | 135 | 71 | 134 | 7 | 5 | 71.4286 |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_AllRepeats_51to200bp_gt95identity_merged | het | 75.5891 | 62.1053 | 96.5517 | 74.3363 | 59 | 36 | 56 | 2 | 0 | 0.0000 |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_AllRepeats_51to200bp_gt95identity_merged | hetalt | 81.4336 | 70.9302 | 95.5882 | 72.4696 | 61 | 25 | 65 | 3 | 3 | 100.0000 |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_AllRepeats_51to200bp_gt95identity_merged | homalt | 70.9091 | 60.0000 | 86.6667 | 81.7073 | 15 | 10 | 13 | 2 | 2 | 100.0000 |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_AllRepeats_gt200bp_gt95identity_merged | * | 0.0000 | – | – | 100.0000 | 0 | 0 | 0 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_AllRepeats_gt200bp_gt95identity_merged | het | 0.0000 | – | – | 100.0000 | 0 | 0 | 0 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_AllRepeats_gt200bp_gt95identity_merged | hetalt | 0.0000 | – | – | 100.0000 | 0 | 0 | 0 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_AllRepeats_gt200bp_gt95identity_merged | homalt | 0.0000 | – | – | 100.0000 | 0 | 0 | 0 | 0 | 0 | – |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_AllRepeats_lt51bp_gt95identity_merged | * | 91.3456 | 85.6266 | 97.8831 | 68.3472 | 977 | 164 | 971 | 21 | 20 | 95.2381 |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_AllRepeats_lt51bp_gt95identity_merged | het | 91.4566 | 84.8624 | 99.1620 | 69.5837 | 370 | 66 | 355 | 3 | 2 | 66.6667 |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_AllRepeats_lt51bp_gt95identity_merged | hetalt | 90.9728 | 84.5626 | 98.4344 | 65.7965 | 493 | 90 | 503 | 8 | 8 | 100.0000 |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_AllRepeats_lt51bp_gt95identity_merged | homalt | 92.6496 | 93.4426 | 91.8699 | 73.4341 | 114 | 8 | 113 | 10 | 10 | 100.0000 |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | * | 87.8760 | 79.8693 | 97.6669 | 68.5714 | 1222 | 308 | 1214 | 29 | 21 | 72.4138 |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | het | 86.4541 | 77.3273 | 98.0237 | 68.7461 | 515 | 151 | 496 | 10 | 2 | 20.0000 |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | hetalt | 90.2206 | 83.1563 | 98.5965 | 64.1509 | 548 | 111 | 562 | 8 | 8 | 100.0000 |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | homalt | 84.7522 | 77.5610 | 93.4132 | 77.6139 | 159 | 46 | 156 | 11 | 11 | 100.0000 |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | * | 75.2215 | 62.0370 | 95.5224 | 69.6833 | 67 | 41 | 64 | 3 | 2 | 66.6667 |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | het | 74.7296 | 60.6061 | 97.4359 | 66.9492 | 40 | 26 | 38 | 1 | 0 | 0.0000 |
| ltrigg-rtg2 | INDEL | I16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | hetalt | 77.6119 | 66.6667 | 92.8571 | 66.6667 | 12 | 6 | 13 | 1 | 1 | 100.0000 |
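The summary metrics in each row can be recomputed from the four raw counts. A minimal sketch, assuming the standard recall/precision/F-measure definitions used by variant-comparison tools (the function name `metrics` is illustrative, not part of the explorer):

```python
def metrics(truth_tp, truth_fn, query_tp, query_fp):
    """Recompute F-score, Recall, and Precision (as percentages)
    from the raw comparison counts shown in the table.

    Recall is measured against truth-set calls, precision against
    query-set calls, and F-score is their harmonic mean.
    """
    recall = 100.0 * truth_tp / (truth_tp + truth_fn)
    precision = 100.0 * query_tp / (query_tp + query_fp)
    f_score = 2 * precision * recall / (precision + recall)
    return round(f_score, 4), round(recall, 4), round(precision, 4)

# First row above: Truth TP=90, Truth FN=2, Query TP=84, Query FP=2
print(metrics(90, 2, 84, 2))  # → (97.7502, 97.8261, 97.6744)
```

Note that truth and query true-positive counts can differ (e.g. 90 vs. 84 in the first row) because one truth variant may be represented by a different number of query records; recall and precision are therefore computed from separate TP tallies.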