PrecisionFDA
Truth Challenge

Engage and improve DNA test results with our community challenges

Explore HG002 comparison results
Use this interactive explorer to filter the complete comparison results across submission entries and multiple dimensions (variant type, subset, genotype, and metric).
Rows 2701-2750 of 86,044. Every row shown is from entry raldana-dualsentieon, variant type SNP, subtype `*`; blank cells were blank in the explorer.

| Subset | Genotype | F-score | Recall | Precision | Frac_NA | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma |
|---|---|---|---|---|---|---|---|---|---|---|---|
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt51bp_gt95identity_merged | homalt | 99.5294 | 99.2958 | 99.7642 | 84.2672 | 846 | 6 | 846 | 2 | 2 | 100.0000 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_gt95identity_merged | * | 98.2637 | 96.9355 | 99.6289 | 78.1445 | 10470 | 331 | 10470 | 39 | 11 | 28.2051 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_gt95identity_merged | het | 97.3823 | 95.3945 | 99.4547 | 79.4362 | 6566 | 317 | 6566 | 36 | 9 | 25.0000 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_gt95identity_merged | hetalt | 100.0000 | 100.0000 | 100.0000 | 92.3077 | 3 | 0 | 3 | 0 | 0 | |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_gt95identity_merged | homalt | 99.7826 | 99.6424 | 99.9232 | 75.5082 | 3901 | 14 | 3901 | 3 | 2 | 66.6667 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_merged | * | 98.9892 | 98.4640 | 99.5201 | 68.3963 | 45000 | 702 | 45000 | 217 | 24 | 11.0599 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_merged | het | 98.4907 | 97.7292 | 99.2642 | 69.7423 | 28060 | 652 | 28060 | 208 | 16 | 7.6923 |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_merged | hetalt | 100.0000 | 100.0000 | 100.0000 | 85.1485 | 15 | 0 | 15 | 0 | 0 | |
| lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_merged | homalt | 99.8260 | 99.7054 | 99.9469 | 65.8244 | 16925 | 50 | 16925 | 9 | 8 | 88.8889 |
| lowcmp_SimpleRepeat_diTR_11to50 | * | 98.5310 | 97.2348 | 99.8622 | 66.0735 | 9424 | 268 | 9424 | 13 | 11 | 84.6154 |
| lowcmp_SimpleRepeat_diTR_11to50 | het | 97.7912 | 95.8467 | 99.8163 | 67.9237 | 5977 | 259 | 5977 | 11 | 9 | 81.8182 |
| lowcmp_SimpleRepeat_diTR_11to50 | hetalt | 100.0000 | 100.0000 | 100.0000 | 95.2381 | 1 | 0 | 1 | 0 | 0 | |
| lowcmp_SimpleRepeat_diTR_11to50 | homalt | 99.8406 | 99.7395 | 99.9420 | 62.2220 | 3446 | 9 | 3446 | 2 | 2 | 100.0000 |
| lowcmp_SimpleRepeat_diTR_51to200 | * | 86.4865 | 76.1905 | 100.0000 | 97.2996 | 32 | 10 | 32 | 0 | 0 | |
| lowcmp_SimpleRepeat_diTR_51to200 | het | 80.0000 | 66.6667 | 100.0000 | 97.8261 | 18 | 9 | 18 | 0 | 0 | |
| lowcmp_SimpleRepeat_diTR_51to200 | hetalt | | 0.0000 | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_SimpleRepeat_diTR_51to200 | homalt | 96.5517 | 93.3333 | 100.0000 | 96.0563 | 14 | 1 | 14 | 0 | 0 | |
| lowcmp_SimpleRepeat_homopolymer_6to10 | * | 99.8251 | 99.7031 | 99.9475 | 55.2981 | 17126 | 51 | 17123 | 9 | 6 | 66.6667 |
| lowcmp_SimpleRepeat_homopolymer_6to10 | het | 99.7559 | 99.5579 | 99.9547 | 55.7919 | 11035 | 49 | 11032 | 5 | 2 | 40.0000 |
| lowcmp_SimpleRepeat_homopolymer_6to10 | hetalt | 100.0000 | 100.0000 | 100.0000 | 66.6667 | 5 | 0 | 5 | 0 | 0 | |
| lowcmp_SimpleRepeat_homopolymer_6to10 | homalt | 99.9507 | 99.9671 | 99.9343 | 54.3615 | 6086 | 2 | 6086 | 4 | 4 | 100.0000 |
| lowcmp_SimpleRepeat_homopolymer_gt10 | * | | 0.0000 | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_SimpleRepeat_homopolymer_gt10 | het | | 0.0000 | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_SimpleRepeat_homopolymer_gt10 | hetalt | | 0.0000 | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_SimpleRepeat_homopolymer_gt10 | homalt | | 0.0000 | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_SimpleRepeat_quadTR_11to50 | * | 99.3283 | 99.2191 | 99.4377 | 37.1644 | 18041 | 142 | 18038 | 102 | 2 | 1.9608 |
| lowcmp_SimpleRepeat_quadTR_11to50 | het | 98.9397 | 98.7667 | 99.1133 | 38.6843 | 11292 | 141 | 11289 | 101 | 1 | 0.9901 |
| lowcmp_SimpleRepeat_quadTR_11to50 | hetalt | 100.0000 | 100.0000 | 100.0000 | 64.2857 | 5 | 0 | 5 | 0 | 0 | |
| lowcmp_SimpleRepeat_quadTR_11to50 | homalt | 99.9852 | 99.9852 | 99.9852 | 34.3808 | 6744 | 1 | 6744 | 1 | 1 | 100.0000 |
| lowcmp_SimpleRepeat_quadTR_51to200 | * | 87.1595 | 78.3217 | 98.2456 | 92.8750 | 112 | 31 | 112 | 2 | 1 | 50.0000 |
| lowcmp_SimpleRepeat_quadTR_51to200 | het | 83.4286 | 71.5686 | 100.0000 | 93.4470 | 73 | 29 | 73 | 0 | 0 | |
| lowcmp_SimpleRepeat_quadTR_51to200 | hetalt | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_SimpleRepeat_quadTR_51to200 | homalt | 95.1220 | 95.1220 | 95.1220 | 91.5638 | 39 | 2 | 39 | 2 | 1 | 50.0000 |
| lowcmp_SimpleRepeat_quadTR_gt200 | * | | 0.0000 | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_SimpleRepeat_quadTR_gt200 | het | | 0.0000 | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_SimpleRepeat_quadTR_gt200 | hetalt | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_SimpleRepeat_quadTR_gt200 | homalt | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_SimpleRepeat_triTR_11to50 | * | 99.5910 | 99.3338 | 99.8496 | 31.5262 | 7306 | 49 | 7302 | 11 | 3 | 27.2727 |
| lowcmp_SimpleRepeat_triTR_11to50 | het | 99.3909 | 99.0035 | 99.7815 | 31.9351 | 4570 | 46 | 4566 | 10 | 2 | 20.0000 |
| lowcmp_SimpleRepeat_triTR_11to50 | hetalt | | 0.0000 | | 100.0000 | 0 | 1 | 0 | 0 | 0 | |
| lowcmp_SimpleRepeat_triTR_11to50 | homalt | 99.9452 | 99.9270 | 99.9635 | 30.8140 | 2736 | 2 | 2736 | 1 | 1 | 100.0000 |
| lowcmp_SimpleRepeat_triTR_51to200 | * | 94.1176 | 88.8889 | 100.0000 | 95.6522 | 8 | 1 | 8 | 0 | 0 | |
| lowcmp_SimpleRepeat_triTR_51to200 | het | 92.3077 | 85.7143 | 100.0000 | 95.4198 | 6 | 1 | 6 | 0 | 0 | |
| lowcmp_SimpleRepeat_triTR_51to200 | hetalt | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| lowcmp_SimpleRepeat_triTR_51to200 | homalt | 100.0000 | 100.0000 | 100.0000 | 96.2264 | 2 | 0 | 2 | 0 | 0 | |
| map_l100_m0_e0 | * | 99.0621 | 99.0682 | 99.0561 | 67.1734 | 32535 | 306 | 32531 | 310 | 12 | 3.8710 |
| map_l100_m0_e0 | het | 98.7044 | 98.8116 | 98.5974 | 70.4890 | 20953 | 252 | 20949 | 298 | 3 | 1.0067 |
| map_l100_m0_e0 | hetalt | 96.9697 | 100.0000 | 94.1176 | 64.5833 | 16 | 0 | 16 | 1 | 1 | 100.0000 |
| map_l100_m0_e0 | homalt | 99.7198 | 99.5353 | 99.9050 | 58.6521 | 11566 | 54 | 11566 | 11 | 8 | 72.7273 |
| map_l100_m1_e0 | * | 99.3579 | 99.3895 | 99.3263 | 63.6563 | 71961 | 442 | 71950 | 488 | 23 | 4.7131 |
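The percentage columns are derived from the count columns. A minimal sketch of that relationship, assuming the standard recall/precision/F-score definitions used by hap.py-style benchmarking comparisons (the `metrics` helper is illustrative, not part of the explorer):

```python
# Recompute a row's summary metrics from its raw count columns.
# Assumed definitions (standard for hap.py-style comparisons):
#   Recall    = Truth TP / (Truth TP + Truth FN)
#   Precision = Query TP / (Query TP + Query FP)
#   F-score   = harmonic mean of precision and recall
def metrics(truth_tp, truth_fn, query_tp, query_fp):
    recall = truth_tp / (truth_tp + truth_fn)
    precision = query_tp / (query_tp + query_fp)
    f_score = 2 * precision * recall / (precision + recall)
    return 100 * f_score, 100 * recall, 100 * precision

# The `*` row of lowcmp_..._all_gt95identity_merged above:
# Truth TP=10470, Truth FN=331, Query TP=10470, Query FP=39
f, r, p = metrics(10470, 331, 10470, 39)
print(f"F-score {f:.4f}  Recall {r:.4f}  Precision {p:.4f}")
# → F-score 98.2637  Recall 96.9355  Precision 99.6289
# matching that row's displayed values; likewise % FP ma for the
# row is 100 * FP gt / Query FP = 100 * 11 / 39 ≈ 28.2051.
```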