PrecisionFDA Truth Challenge

Engage and improve DNA test results with our community challenges

Explore HG002 comparison results
Use this interactive explorer to filter all results across submission entries and multiple dimensions.
Rows 13551-13600 of 86044 are shown below; a worked example of how the metric columns are computed from the count columns follows the table.

Entry | Type | Subtype | Subset | Genotype | F-score (%) | Recall (%) | Precision (%) | Frac_NA (%) | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma
ltrigg-rtg2 | INDEL | D1_5 | segdupwithalt | * | 100.0000 | 100.0000 | 100.0000 | 99.9917 | 1 | 0 | 1 | 0 | 0 |
ltrigg-rtg2 | INDEL | D1_5 | segdupwithalt | het | 100.0000 | 100.0000 | 100.0000 | 99.9858 | 1 | 0 | 1 | 0 | 0 |
ltrigg-rtg2 | INDEL | D1_5 | segdupwithalt | hetalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 |
ltrigg-rtg2 | INDEL | D1_5 | segdupwithalt | homalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 |
ltrigg-rtg2 | INDEL | D1_5 | tech_badpromoters | * | 97.2973 | 94.7368 | 100.0000 | 33.3333 | 18 | 1 | 18 | 0 | 0 |
ltrigg-rtg2 | INDEL | D1_5 | tech_badpromoters | het | 100.0000 | 100.0000 | 100.0000 | 38.4615 | 8 | 0 | 8 | 0 | 0 |
ltrigg-rtg2 | INDEL | D1_5 | tech_badpromoters | hetalt | 100.0000 | 100.0000 | 100.0000 | 33.3333 | 2 | 0 | 2 | 0 | 0 |
ltrigg-rtg2 | INDEL | D1_5 | tech_badpromoters | homalt | 94.1176 | 88.8889 | 100.0000 | 27.2727 | 8 | 1 | 8 | 0 | 0 |
ltrigg-rtg2 | INDEL | D6_15 | * | * | 98.6807 | 97.9036 | 99.4704 | 46.9068 | 25545 | 547 | 25354 | 135 | 83 | 61.4815
ltrigg-rtg2 | INDEL | D6_15 | * | het | 99.2141 | 99.0942 | 99.3342 | 52.7094 | 11487 | 105 | 11339 | 76 | 26 | 34.2105
ltrigg-rtg2 | INDEL | D6_15 | * | hetalt | 97.5461 | 95.7915 | 99.3661 | 40.2288 | 7830 | 344 | 7838 | 50 | 50 | 100.0000
ltrigg-rtg2 | INDEL | D6_15 | * | homalt | 99.1477 | 98.4508 | 99.8545 | 42.0407 | 6228 | 98 | 6177 | 9 | 7 | 77.7778
ltrigg-rtg2 | INDEL | D6_15 | HG002complexvar | * | 97.7900 | 96.9257 | 98.6698 | 51.5393 | 5139 | 163 | 4970 | 67 | 50 | 74.6269
ltrigg-rtg2 | INDEL | D6_15 | HG002complexvar | het | 98.5457 | 97.9487 | 99.1499 | 50.4465 | 3056 | 64 | 2916 | 25 | 8 | 32.0000
ltrigg-rtg2 | INDEL | D6_15 | HG002complexvar | hetalt | 94.0012 | 92.3001 | 95.7661 | 55.0113 | 935 | 78 | 950 | 42 | 42 | 100.0000
ltrigg-rtg2 | INDEL | D6_15 | HG002complexvar | homalt | 99.0937 | 98.2036 | 100.0000 | 51.0204 | 1148 | 21 | 1104 | 0 | 0 |
ltrigg-rtg2 | INDEL | D6_15 | HG002compoundhet | * | 97.5434 | 95.9362 | 99.2054 | 30.5780 | 8664 | 367 | 8615 | 69 | 63 | 91.3043
ltrigg-rtg2 | INDEL | D6_15 | HG002compoundhet | het | 97.1219 | 96.8458 | 97.3995 | 54.6381 | 829 | 27 | 824 | 22 | 16 | 72.7273
ltrigg-rtg2 | INDEL | D6_15 | HG002compoundhet | hetalt | 97.6303 | 95.8410 | 99.4878 | 26.1770 | 7812 | 339 | 7769 | 40 | 40 | 100.0000
ltrigg-rtg2 | INDEL | D6_15 | HG002compoundhet | homalt | 84.6862 | 95.8333 | 75.8621 | 56.0606 | 23 | 1 | 22 | 7 | 7 | 100.0000
ltrigg-rtg2 | INDEL | D6_15 | decoy | * | 100.0000 | 100.0000 | 100.0000 | 99.8573 | 1 | 0 | 1 | 0 | 0 |
ltrigg-rtg2 | INDEL | D6_15 | decoy | het | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 |
ltrigg-rtg2 | INDEL | D6_15 | decoy | hetalt | 100.0000 | 100.0000 | 100.0000 | 99.0385 | 1 | 0 | 1 | 0 | 0 |
ltrigg-rtg2 | INDEL | D6_15 | decoy | homalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 |
ltrigg-rtg2 | INDEL | D6_15 | func_cds | * | 98.8235 | 97.6744 | 100.0000 | 48.1481 | 42 | 1 | 42 | 0 | 0 |
ltrigg-rtg2 | INDEL | D6_15 | func_cds | het | 98.2456 | 96.5517 | 100.0000 | 44.0000 | 28 | 1 | 28 | 0 | 0 |
ltrigg-rtg2 | INDEL | D6_15 | func_cds | hetalt | 100.0000 | 100.0000 | 100.0000 | 66.6667 | 2 | 0 | 2 | 0 | 0 |
ltrigg-rtg2 | INDEL | D6_15 | func_cds | homalt | 100.0000 | 100.0000 | 100.0000 | 52.0000 | 12 | 0 | 12 | 0 | 0 |
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | * | 95.4464 | 92.3899 | 98.7120 | 46.6442 | 3630 | 299 | 3602 | 47 | 34 | 72.3404
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | het | 96.6005 | 95.5255 | 97.6999 | 63.9558 | 918 | 43 | 892 | 21 | 10 | 47.6190
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | hetalt | 94.9326 | 91.0397 | 99.1732 | 34.1898 | 2154 | 212 | 2159 | 18 | 18 | 100.0000
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | homalt | 95.5396 | 92.6910 | 98.5689 | 43.9880 | 558 | 44 | 551 | 8 | 6 | 75.0000
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | * | 100.0000 | 100.0000 | 100.0000 | 96.2500 | 6 | 0 | 6 | 0 | 0 |
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | het | 100.0000 | 100.0000 | 100.0000 | 96.9072 | 3 | 0 | 3 | 0 | 0 |
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | hetalt | 100.0000 | 100.0000 | 100.0000 | 92.0000 | 2 | 0 | 2 | 0 | 0 |
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | homalt | 100.0000 | 100.0000 | 100.0000 | 97.3684 | 1 | 0 | 1 | 0 | 0 |
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | * | 98.7873 | 98.1180 | 99.4657 | 52.4428 | 15745 | 302 | 15637 | 84 | 58 | 69.0476
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | het | 99.1175 | 98.9995 | 99.2358 | 63.1487 | 5541 | 56 | 5454 | 42 | 17 | 40.4762
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | hetalt | 98.3080 | 97.2255 | 99.4149 | 40.8780 | 6623 | 189 | 6627 | 39 | 39 | 100.0000
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | homalt | 99.1689 | 98.4332 | 99.9157 | 48.1800 | 3581 | 57 | 3556 | 3 | 2 | 66.6667
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | * | 98.1456 | 97.0290 | 99.2882 | 49.3199 | 17146 | 525 | 17018 | 122 | 80 | 65.5738
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | het | 98.6532 | 98.4263 | 98.8811 | 62.2592 | 5754 | 92 | 5656 | 64 | 24 | 37.5000
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | hetalt | 97.5228 | 95.7439 | 99.3691 | 35.4149 | 7716 | 343 | 7718 | 49 | 49 | 100.0000
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | homalt | 98.6703 | 97.6102 | 99.7536 | 44.9684 | 3676 | 90 | 3644 | 9 | 7 | 77.7778
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | * | 98.0861 | 98.0861 | 98.0861 | 68.2853 | 205 | 4 | 205 | 4 | 0 | 0.0000
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | het | 98.0423 | 99.2126 | 96.8992 | 68.9904 | 126 | 1 | 125 | 4 | 0 | 0.0000
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | hetalt | 96.0000 | 92.3077 | 100.0000 | 63.4615 | 36 | 3 | 38 | 0 | 0 |
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | homalt | 100.0000 | 100.0000 | 100.0000 | 69.7842 | 43 | 0 | 42 | 0 | 0 |
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_gt200bp_gt95identity_merged | * | 100.0000 | 100.0000 | 100.0000 | 97.3684 | 4 | 0 | 4 | 0 | 0 |
ltrigg-rtg2 | INDEL | D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_gt200bp_gt95identity_merged | het | 100.0000 | 100.0000 | 100.0000 | 97.8723 | 2 | 0 | 2 | 0 | 0 |
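
The metric columns can be reproduced from the count columns with the standard benchmarking definitions: Recall = Truth TP / (Truth TP + Truth FN), Precision = Query TP / (Query TP + Query FP), and F-score as their harmonic mean. The sketch below (Python; the helper name is mine, not part of the precisionFDA app) checks this against the INDEL / D6_15 / * / * row above, and also notes that, for the rows shown on this page, the "% FP ma" value matches FP gt expressed as a percentage of Query FP.

    import math

    def benchmark_metrics(truth_tp, truth_fn, query_tp, query_fp):
        # Standard benchmarking metrics, returned as percentages.
        recall = truth_tp / (truth_tp + truth_fn)
        precision = query_tp / (query_tp + query_fp)
        f_score = 2 * precision * recall / (precision + recall)
        return 100 * f_score, 100 * recall, 100 * precision

    # Counts from the INDEL / D6_15 / * / * row:
    # Truth TP = 25545, Truth FN = 547, Query TP = 25354, Query FP = 135, FP gt = 83.
    f, r, p = benchmark_metrics(25545, 547, 25354, 135)
    assert math.isclose(r, 97.9036, abs_tol=1e-3)  # Recall column
    assert math.isclose(p, 99.4704, abs_tol=1e-3)  # Precision column
    assert math.isclose(f, 98.6807, abs_tol=1e-3)  # F-score column (within display rounding)

    # "% FP ma" appears to equal FP gt as a share of Query FP in these rows: 83/135 -> 61.4815.
    assert math.isclose(100 * 83 / 135, 61.4815, abs_tol=1e-3)

Frac_NA is not derivable from these five counts; it reflects the fraction of calls excluded from comparison as non-assessed.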