PrecisionFDA Truth Challenge

Engage and improve DNA test results with our community challenges

Explore HG002 comparison results
Use this interactive explorer to filter all results across submission entries and multiple dimensions.
Showing results 72001-72050 of 86044.

| Entry | Type | Subtype | Subset | Genotype | F-score | Recall | Precision | Frac_NA | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma |
| ltrigg-rtg2 | SNP | tv | map_l250_m2_e1 | hetalt | 75.0000 | 60.0000 | 100.0000 | 86.9565 | 3 | 2 | 3 | 0 | 0 | |
| ltrigg-rtg2 | SNP | tv | map_l250_m2_e1 | homalt | 99.6819 | 99.3658 | 100.0000 | 85.4444 | 940 | 6 | 940 | 0 | 0 | |
| ltrigg-rtg2 | SNP | tv | map_siren | * | 99.4071 | 99.1030 | 99.7130 | 49.8109 | 45518 | 412 | 45517 | 131 | 9 | 6.8702 |
| ltrigg-rtg2 | SNP | tv | map_siren | het | 99.1276 | 98.6962 | 99.5628 | 47.7162 | 28236 | 373 | 28241 | 124 | 4 | 3.2258 |
| ltrigg-rtg2 | SNP | tv | map_siren | hetalt | 97.5000 | 96.2963 | 98.7342 | 67.4897 | 78 | 3 | 78 | 1 | 1 | 100.0000 |
| ltrigg-rtg2 | SNP | tv | map_siren | homalt | 99.8781 | 99.7912 | 99.9651 | 52.8102 | 17204 | 36 | 17198 | 6 | 4 | 66.6667 |
| ltrigg-rtg2 | SNP | tv | segdup | * | 98.9872 | 99.6132 | 98.3690 | 88.4663 | 8499 | 33 | 8504 | 141 | 21 | 14.8936 |
| ltrigg-rtg2 | SNP | tv | segdup | het | 98.5667 | 99.4137 | 97.7340 | 87.6779 | 5256 | 31 | 5262 | 122 | 2 | 1.6393 |
| ltrigg-rtg2 | SNP | tv | segdup | hetalt | 100.0000 | 100.0000 | 100.0000 | 96.8037 | 7 | 0 | 7 | 0 | 0 | |
| ltrigg-rtg2 | SNP | tv | segdup | homalt | 99.6765 | 99.9382 | 99.4161 | 89.5171 | 3236 | 2 | 3235 | 19 | 19 | 100.0000 |
| ltrigg-rtg2 | SNP | tv | segdupwithalt | * | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| ltrigg-rtg2 | SNP | tv | segdupwithalt | het | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| ltrigg-rtg2 | SNP | tv | segdupwithalt | hetalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| ltrigg-rtg2 | SNP | tv | segdupwithalt | homalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| ltrigg-rtg2 | SNP | tv | tech_badpromoters | * | 98.6301 | 100.0000 | 97.2973 | 60.0000 | 72 | 0 | 72 | 2 | 0 | 0.0000 |
| ltrigg-rtg2 | SNP | tv | tech_badpromoters | het | 97.0588 | 100.0000 | 94.2857 | 65.6863 | 33 | 0 | 33 | 2 | 0 | 0.0000 |
| ltrigg-rtg2 | SNP | tv | tech_badpromoters | hetalt | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| ltrigg-rtg2 | SNP | tv | tech_badpromoters | homalt | 100.0000 | 100.0000 | 100.0000 | 53.0120 | 39 | 0 | 39 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | I6_15 | * | homalt | 98.7906 | 98.4933 | 99.0897 | 41.6927 | 6145 | 94 | 6096 | 56 | 49 | 87.5000 |
| ltrigg-rtg2 | INDEL | I6_15 | HG002complexvar | * | 98.0543 | 96.8698 | 99.2681 | 50.7658 | 4642 | 150 | 4340 | 32 | 18 | 56.2500 |
| ltrigg-rtg2 | INDEL | I6_15 | HG002complexvar | het | 98.2137 | 97.4098 | 99.0310 | 49.5355 | 2294 | 61 | 2044 | 20 | 9 | 45.0000 |
| ltrigg-rtg2 | INDEL | I6_15 | HG002complexvar | hetalt | 96.8057 | 94.1946 | 99.5656 | 57.9927 | 1152 | 71 | 1146 | 5 | 5 | 100.0000 |
| ltrigg-rtg2 | INDEL | I6_15 | HG002complexvar | homalt | 98.9542 | 98.5173 | 99.3950 | 43.5610 | 1196 | 18 | 1150 | 7 | 4 | 57.1429 |
| ltrigg-rtg2 | INDEL | I6_15 | HG002compoundhet | * | 96.9852 | 94.9521 | 99.1073 | 33.0128 | 8333 | 443 | 8215 | 74 | 67 | 90.5405 |
| ltrigg-rtg2 | INDEL | I6_15 | HG002compoundhet | het | 93.1013 | 92.3077 | 93.9086 | 73.1973 | 192 | 16 | 185 | 12 | 6 | 50.0000 |
| ltrigg-rtg2 | INDEL | I6_15 | HG002compoundhet | hetalt | 97.3702 | 95.0217 | 99.8378 | 29.8617 | 8112 | 425 | 8001 | 13 | 12 | 92.3077 |
| ltrigg-rtg2 | INDEL | I6_15 | HG002compoundhet | homalt | 53.2110 | 93.5484 | 37.1795 | 63.3803 | 29 | 2 | 29 | 49 | 49 | 100.0000 |
| ltrigg-rtg2 | INDEL | I6_15 | decoy | * | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | I6_15 | decoy | het | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | I6_15 | decoy | hetalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | I6_15 | decoy | homalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | I6_15 | func_cds | * | 98.8235 | 97.6744 | 100.0000 | 33.3333 | 42 | 1 | 42 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | I6_15 | func_cds | het | 100.0000 | 100.0000 | 100.0000 | 34.2857 | 24 | 0 | 23 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | I6_15 | func_cds | hetalt | 85.7143 | 75.0000 | 100.0000 | 42.8571 | 3 | 1 | 4 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | I6_15 | func_cds | homalt | 100.0000 | 100.0000 | 100.0000 | 28.5714 | 15 | 0 | 15 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | I6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | * | 95.5049 | 92.7033 | 98.4810 | 62.0192 | 775 | 61 | 778 | 12 | 8 | 66.6667 |
| ltrigg-rtg2 | INDEL | I6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | het | 94.5345 | 91.9028 | 97.3214 | 70.5650 | 227 | 20 | 218 | 6 | 2 | 33.3333 |
| ltrigg-rtg2 | INDEL | I6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | hetalt | 95.9057 | 92.6415 | 99.4083 | 55.1724 | 491 | 39 | 504 | 3 | 3 | 100.0000 |
| ltrigg-rtg2 | INDEL | I6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | homalt | 95.7552 | 96.6102 | 94.9153 | 68.6170 | 57 | 2 | 56 | 3 | 3 | 100.0000 |
| ltrigg-rtg2 | INDEL | I6_15 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | * | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | I6_15 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | het | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | I6_15 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | hetalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | I6_15 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | homalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | I6_15 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | * | 98.3739 | 97.4595 | 99.3056 | 60.7094 | 5179 | 135 | 5148 | 36 | 21 | 58.3333 |
| ltrigg-rtg2 | INDEL | I6_15 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | het | 98.2438 | 97.5990 | 98.8973 | 68.4428 | 1504 | 37 | 1435 | 16 | 3 | 18.7500 |
| ltrigg-rtg2 | INDEL | I6_15 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | hetalt | 98.3982 | 97.2756 | 99.5471 | 54.5040 | 3035 | 85 | 3077 | 14 | 13 | 92.8571 |
| ltrigg-rtg2 | INDEL | I6_15 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | homalt | 98.5345 | 98.0092 | 99.0654 | 64.3729 | 640 | 13 | 636 | 6 | 5 | 83.3333 |
| ltrigg-rtg2 | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | * | 97.7424 | 96.3145 | 99.2132 | 57.7581 | 6089 | 233 | 6053 | 48 | 28 | 58.3333 |
| ltrigg-rtg2 | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | het | 97.2208 | 95.9411 | 98.5350 | 66.0589 | 1891 | 80 | 1816 | 27 | 9 | 33.3333 |
| ltrigg-rtg2 | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | hetalt | 97.9550 | 96.3415 | 99.6234 | 50.0434 | 3397 | 129 | 3439 | 13 | 12 | 92.3077 |
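For reference, the summary metrics in the table follow the standard benchmarking definitions and can be reproduced from the count columns: recall is computed from the truth counts, precision from the query counts, and the F-score is their harmonic mean. In the rows shown, % FP ma also matches FP gt expressed as a percentage of Query FP. The snippet below is a minimal illustrative sketch in plain Python, not part of the precisionFDA comparison pipeline; the example row values are taken from the SNP / tv / map_siren / * row above.

```python
def metrics(truth_tp, truth_fn, query_tp, query_fp):
    """Recall from truth counts, precision from query counts, F-score as their harmonic mean (all in %)."""
    recall = truth_tp / (truth_tp + truth_fn)
    precision = query_tp / (query_tp + query_fp)
    f_score = 2 * precision * recall / (precision + recall)
    return 100 * recall, 100 * precision, 100 * f_score

# Example: SNP / tv / map_siren / * row
# (Truth TP=45518, Truth FN=412, Query TP=45517, Query FP=131)
recall, precision, f_score = metrics(45518, 412, 45517, 131)
print(f"recall={recall:.4f} precision={precision:.4f} f_score={f_score:.4f}")
# -> recall=99.1030 precision=99.7130 f_score=99.4071
```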