PrecisionFDA
Truth Challenge


Explore HG002 comparison results
Use this interactive explorer to filter the full set of results by submission entry and across multiple dimensions (variant type, subtype, genomic subset, and genotype).
Showing rows 79451–79500 of 86044.

| Entry | Type | Subtype | Subset | Genotype | F-score | Recall | Precision | Frac_NA | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| ltrigg-rtg1 | SNP | tv | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt51bp_gt95identity_merged | hetalt | 100.0000 | 100.0000 | 100.0000 | 66.6667 | 1 | 0 | 1 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt51bp_gt95identity_merged | homalt | 100.0000 | 100.0000 | 100.0000 | 68.7717 | 525 | 0 | 539 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_51to200bp_gt95identity_merged | homalt | 92.1053 | 85.3659 | 100.0000 | 90.6977 | 35 | 6 | 32 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | hetalt | 100.0000 | 100.0000 | 100.0000 | 70.0000 | 2 | 0 | 3 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | homalt | 99.0040 | 98.0276 | 100.0000 | 80.7123 | 497 | 10 | 482 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt51bp_gt95identity_merged | hetalt | 100.0000 | 100.0000 | 100.0000 | 70.0000 | 2 | 0 | 3 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt51bp_gt95identity_merged | homalt | 99.5717 | 99.1471 | 100.0000 | 79.3245 | 465 | 4 | 453 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_gt95identity_merged | hetalt | 100.0000 | 100.0000 | 100.0000 | 90.9091 | 3 | 0 | 4 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_gt95identity_merged | homalt | 99.5979 | 99.1989 | 100.0000 | 79.0898 | 1486 | 12 | 1507 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | lowcmp_SimpleRepeat_diTR_11to50 | hetalt | 100.0000 | 100.0000 | 100.0000 | 93.3333 | 1 | 0 | 1 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | lowcmp_SimpleRepeat_diTR_51to200 | hetalt | 0.0000 | 0.0000 | 100.0000 | 0.0000 | 0 | 0 | 1 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | lowcmp_SimpleRepeat_diTR_51to200 | homalt | 87.5000 | 77.7778 | 100.0000 | 92.3913 | 7 | 2 | 7 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | lowcmp_SimpleRepeat_homopolymer_6to10 | hetalt | 100.0000 | 100.0000 | 100.0000 | 66.6667 | 5 | 0 | 5 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | lowcmp_SimpleRepeat_homopolymer_6to10 | homalt | 99.9099 | 99.8199 | 100.0000 | 58.1282 | 3880 | 7 | 3879 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | lowcmp_SimpleRepeat_quadTR_11to50 | hetalt | 100.0000 | 100.0000 | 100.0000 | 72.2222 | 5 | 0 | 5 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | lowcmp_SimpleRepeat_quadTR_51to200 | homalt | 100.0000 | 100.0000 | 100.0000 | 92.9577 | 6 | 0 | 5 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | lowcmp_SimpleRepeat_triTR_51to200 | * | 100.0000 | 100.0000 | 100.0000 | 97.8723 | 1 | 0 | 1 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | lowcmp_SimpleRepeat_triTR_51to200 | het | 100.0000 | 100.0000 | 100.0000 | 97.5610 | 1 | 0 | 1 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | map_l100_m0_e0 | hetalt | 100.0000 | 100.0000 | 100.0000 | 63.6364 | 16 | 0 | 16 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | map_l100_m1_e0 | hetalt | 100.0000 | 100.0000 | 100.0000 | 64.0351 | 41 | 0 | 41 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | map_l100_m2_e0 | hetalt | 100.0000 | 100.0000 | 100.0000 | 65.8537 | 42 | 0 | 42 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | map_l100_m2_e1 | hetalt | 100.0000 | 100.0000 | 100.0000 | 65.3226 | 43 | 0 | 43 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | map_l125_m0_e0 | hetalt | 100.0000 | 100.0000 | 100.0000 | 71.8750 | 9 | 0 | 9 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | map_l125_m1_e0 | hetalt | 100.0000 | 100.0000 | 100.0000 | 63.4146 | 30 | 0 | 30 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | map_l125_m2_e0 | hetalt | 100.0000 | 100.0000 | 100.0000 | 67.3913 | 30 | 0 | 30 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | map_l125_m2_e1 | hetalt | 100.0000 | 100.0000 | 100.0000 | 67.3913 | 30 | 0 | 30 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | map_l150_m0_e0 | hetalt | 100.0000 | 100.0000 | 100.0000 | 86.3636 | 3 | 0 | 3 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | map_l150_m1_e0 | hetalt | 100.0000 | 100.0000 | 100.0000 | 68.7500 | 20 | 0 | 20 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | map_l150_m2_e0 | hetalt | 100.0000 | 100.0000 | 100.0000 | 70.5882 | 20 | 0 | 20 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | map_l150_m2_e1 | hetalt | 100.0000 | 100.0000 | 100.0000 | 71.0145 | 20 | 0 | 20 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | map_l250_m0_e0 | homalt | 99.4792 | 98.9637 | 100.0000 | 92.6482 | 191 | 2 | 191 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | map_l250_m1_e0 | hetalt | 100.0000 | 100.0000 | 100.0000 | 85.1852 | 4 | 0 | 4 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | map_l250_m2_e0 | hetalt | 100.0000 | 100.0000 | 100.0000 | 83.3333 | 5 | 0 | 5 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | map_l250_m2_e1 | hetalt | 100.0000 | 100.0000 | 100.0000 | 83.3333 | 5 | 0 | 5 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | segdup | hetalt | 100.0000 | 100.0000 | 100.0000 | 96.9027 | 7 | 0 | 7 | 0 | 0 | |
| ltrigg-rtg1 | SNP | tv | tech_badpromoters | homalt | 98.7013 | 97.4359 | 100.0000 | 54.7619 | 38 | 1 | 38 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | * | decoy | * | 100.0000 | 100.0000 | 100.0000 | 99.8898 | 10 | 0 | 12 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | * | decoy | het | 100.0000 | 100.0000 | 100.0000 | 99.8803 | 6 | 0 | 8 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | * | decoy | hetalt | 100.0000 | 100.0000 | 100.0000 | 99.8503 | 1 | 0 | 1 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | * | decoy | homalt | 100.0000 | 100.0000 | 100.0000 | 99.9119 | 3 | 0 | 3 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | * | func_cds | hetalt | 88.8889 | 80.0000 | 100.0000 | 75.0000 | 4 | 1 | 5 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | * | func_cds | homalt | 99.7783 | 99.5575 | 100.0000 | 29.6875 | 225 | 1 | 225 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | * | lowcmp_AllRepeats_gt200bp_gt95identity_merged | hetalt | 100.0000 | 100.0000 | 100.0000 | 98.0519 | 3 | 0 | 3 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | * | lowcmp_AllRepeats_gt200bp_gt95identity_merged | homalt | 75.0000 | 60.0000 | 100.0000 | 99.6099 | 3 | 2 | 3 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_gt200bp_gt95identity_merged | hetalt | 100.0000 | 100.0000 | 100.0000 | 97.9592 | 3 | 0 | 3 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_gt200bp_gt95identity_merged | homalt | 66.6667 | 50.0000 | 100.0000 | 99.7379 | 2 | 2 | 2 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt51bp_gt95identity_merged | homalt | 99.7778 | 99.5565 | 100.0000 | 68.6843 | 1347 | 6 | 1340 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_gt200bp_gt95identity_merged | * | 100.0000 | 100.0000 | 100.0000 | 94.0000 | 3 | 0 | 3 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_gt200bp_gt95identity_merged | het | 100.0000 | 100.0000 | 100.0000 | 94.4444 | 2 | 0 | 2 | 0 | 0 | |
| ltrigg-rtg2 | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_gt200bp_gt95identity_merged | homalt | 100.0000 | 100.0000 | 100.0000 | 83.3333 | 1 | 0 | 1 | 0 | 0 | |
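The summary metrics in each row follow directly from the count columns. A minimal sketch of that relationship, assuming the usual hap.py-style definitions (recall computed over truth calls, precision over query calls, F-score as their harmonic mean); the inputs below are taken from the ltrigg-rtg1 SNP row for the TRlt7_51to200bp homalt subset (Truth TP = 35, Truth FN = 6, Query TP = 32, Query FP = 0):

```python
def metrics(truth_tp, truth_fn, query_tp, query_fp):
    """Recall over truth calls, precision over query calls, harmonic-mean F-score."""
    recall = truth_tp / (truth_tp + truth_fn)        # fraction of truth variants found
    precision = query_tp / (query_tp + query_fp)     # fraction of query calls that are correct
    f_score = 2 * precision * recall / (precision + recall)
    return recall, precision, f_score

recall, precision, f_score = metrics(truth_tp=35, truth_fn=6, query_tp=32, query_fp=0)
print(f"{100 * recall:.4f} {100 * precision:.4f} {100 * f_score:.4f}")
```

This reproduces the 85.3659 / 100.0000 / 92.1053 values reported for that row. Note that Truth TP and Query TP can differ (here 35 vs. 32) because the same matched variants may be represented differently on the truth and query sides.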