PrecisionFDA Truth Challenge

Engage and improve DNA test results with our community challenges

Explore HG002 comparison results
Use this interactive explorer to filter the full set of comparison results by submission entry and across multiple dimensions (variant type, subtype, region subset, and genotype).
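The drill-down the explorer performs can be sketched offline. The snippet below is a minimal pandas sketch over a small hand-copied extract of the rows shown on this page; the DataFrame construction and column subset are illustrative assumptions, not an official export format.

```python
import pandas as pd

# Hypothetical mini-extract of the explorer's rows (the real result set
# has 86044 rows and more columns; values copied from the visible page).
results = pd.DataFrame(
    [
        ("ltrigg-rtg1",     "INDEL", "homalt", 99.7738),
        ("cchapple-custom", "INDEL", "homalt", 99.6602),
        ("mlin-fermikit",   "INDEL", "hetalt", 67.7419),
        ("rpoplin-dv42",    "SNP",   "homalt", 99.2090),
    ],
    columns=["Entry", "Type", "Genotype", "F-score"],
)

# Filter along two of the table's dimensions, then rank by F-score --
# the same drill-down the interactive explorer supports.
homalt_indels = results[(results["Type"] == "INDEL") & (results["Genotype"] == "homalt")]
ranked = homalt_indels.sort_values("F-score", ascending=False)
print(ranked["Entry"].tolist())  # ['ltrigg-rtg1', 'cchapple-custom']
```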
Rows 60801-60850 of 86044 are shown.

| Entry | Type | Subtype | Subset | Genotype | F-score | Recall | Precision | Frac_NA | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| jmaeng-gatk | INDEL | D16_PLUS | HG002complexvar | hetalt | 92.1538 | 87.8543 | 96.8958 | 47.4971 | 217 | 30 | 437 | 14 | 14 | 100.0000 |
| ltrigg-rtg1 | INDEL | D6_15 | lowcmp_SimpleRepeat_triTR_11to50 | homalt | 99.7738 | 99.5485 | 100.0000 | 27.1667 | 441 | 2 | 437 | 0 | 0 | |
| jli-custom | SNP | ti | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | homalt | 99.3197 | 98.6486 | 100.0000 | 84.6046 | 438 | 6 | 438 | 0 | 0 | |
| mlin-fermikit | INDEL | * | func_cds | * | 98.6486 | 98.4270 | 98.8713 | 35.8900 | 438 | 7 | 438 | 5 | 3 | 60.0000 |
| mlin-fermikit | INDEL | D16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_51to200bp_gt95identity_merged | hetalt | 67.7419 | 51.2195 | 100.0000 | 45.5901 | 357 | 340 | 438 | 0 | 0 | |
| qzeng-custom | INDEL | * | func_cds | * | 95.4248 | 98.4270 | 92.6004 | 43.9573 | 438 | 7 | 438 | 35 | 4 | 11.4286 |
| gduggal-snapfb | INDEL | * | map_l150_m1_e0 | homalt | 96.1581 | 94.8052 | 97.5501 | 91.6231 | 438 | 24 | 438 | 11 | 8 | 72.7273 |
| gduggal-bwafb | INDEL | D1_5 | lowcmp_SimpleRepeat_diTR_51to200 | * | 70.1122 | 60.5452 | 83.2700 | 54.5769 | 422 | 275 | 438 | 88 | 86 | 97.7273 |
| ckim-gatk | SNP | ti | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | homalt | 99.3197 | 98.6486 | 100.0000 | 84.5612 | 438 | 6 | 438 | 0 | 0 | |
| ckim-isaac | INDEL | D16_PLUS | HG002complexvar | hetalt | 71.9738 | 58.7045 | 92.9936 | 55.5660 | 145 | 102 | 438 | 33 | 26 | 78.7879 |
| ckim-vqsr | SNP | ti | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | homalt | 99.3197 | 98.6486 | 100.0000 | 84.5612 | 438 | 6 | 438 | 0 | 0 | |
| dgrover-gatk | SNP | ti | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | homalt | 99.3197 | 98.6486 | 100.0000 | 84.6316 | 438 | 6 | 438 | 0 | 0 | |
| gduggal-snapplat | INDEL | I1_5 | map_l100_m2_e0 | homalt | 87.5010 | 81.5443 | 94.3966 | 88.3212 | 433 | 98 | 438 | 26 | 1 | 3.8462 |
| gduggal-snapvard | INDEL | I1_5 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_51to200bp_gt95identity_merged | * | 38.9035 | 36.0279 | 42.2780 | 67.7158 | 361 | 641 | 438 | 598 | 312 | 52.1739 |
| asubramanian-gatk | SNP | * | map_l250_m2_e0 | homalt | 28.0410 | 16.3068 | 100.0000 | 97.5677 | 438 | 2248 | 438 | 0 | 0 | |
| astatham-gatk | INDEL | I1_5 | map_l125_m1_e0 | het | 93.9683 | 89.7119 | 98.6486 | 88.3311 | 436 | 50 | 438 | 6 | 0 | 0.0000 |
| bgallagher-sentieon | SNP | ti | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | homalt | 99.2072 | 98.6486 | 99.7722 | 84.5803 | 438 | 6 | 438 | 1 | 1 | 100.0000 |
| anovak-vg | INDEL | D1_5 | lowcmp_SimpleRepeat_quadTR_51to200 | * | 31.7791 | 28.9773 | 35.1807 | 45.7989 | 357 | 875 | 438 | 807 | 617 | 76.4560 |
| rpoplin-dv42 | INDEL | D6_15 | lowcmp_SimpleRepeat_triTR_11to50 | homalt | 99.4325 | 98.8713 | 100.0000 | 31.8818 | 438 | 5 | 438 | 0 | 0 | |
| rpoplin-dv42 | SNP | ti | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | homalt | 99.2090 | 98.8739 | 99.5465 | 84.4225 | 439 | 5 | 439 | 2 | 1 | 50.0000 |
| raldana-dualsentieon | SNP | ti | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | homalt | 99.3213 | 98.8739 | 99.7727 | 84.2237 | 439 | 5 | 439 | 1 | 1 | 100.0000 |
| ciseli-custom | INDEL | D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt51bp_gt95identity_merged | het | 85.2750 | 89.1170 | 81.7505 | 62.2628 | 434 | 53 | 439 | 98 | 30 | 30.6122 |
| cchapple-custom | INDEL | D6_15 | lowcmp_SimpleRepeat_triTR_11to50 | homalt | 99.6602 | 99.7743 | 99.5465 | 28.6408 | 442 | 1 | 439 | 2 | 2 | 100.0000 |
| ndellapenna-hhga | INDEL | D6_15 | lowcmp_SimpleRepeat_triTR_11to50 | homalt | 98.3190 | 98.8713 | 97.7728 | 36.2216 | 438 | 5 | 439 | 10 | 4 | 40.0000 |
| jlack-gatk | INDEL | I16_PLUS | lowcmp_SimpleRepeat_diTR_11to50 | * | 93.2955 | 91.2500 | 95.4348 | 82.0942 | 438 | 42 | 439 | 21 | 10 | 47.6190 |
| hfeng-pmm2 | SNP | ti | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | homalt | 99.4337 | 98.8739 | 100.0000 | 85.4007 | 439 | 5 | 439 | 0 | 0 | |
| hfeng-pmm1 | SNP | ti | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | homalt | 99.4337 | 98.8739 | 100.0000 | 84.5259 | 439 | 5 | 439 | 0 | 0 | |
| jmaeng-gatk | SNP | ti | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | homalt | 99.4337 | 98.8739 | 100.0000 | 84.6019 | 439 | 5 | 439 | 0 | 0 | |
| gduggal-bwavard | INDEL | I1_5 | segdup | homalt | 96.3005 | 93.6575 | 99.0971 | 89.4222 | 443 | 30 | 439 | 4 | 4 | 100.0000 |
| ghariani-varprowl | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | homalt | 62.9334 | 53.3333 | 76.7483 | 71.4713 | 440 | 385 | 439 | 133 | 119 | 89.4737 |
| ghariani-varprowl | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_merged | homalt | 62.9334 | 53.3333 | 76.7483 | 71.4713 | 440 | 385 | 439 | 133 | 119 | 89.4737 |
| gduggal-snapplat | INDEL | I1_5 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt51bp_gt95identity_merged | * | 45.2707 | 43.8743 | 46.7588 | 85.7511 | 419 | 536 | 440 | 501 | 11 | 2.1956 |
| rpoplin-dv42 | INDEL | * | func_cds | * | 98.8729 | 98.4270 | 99.3228 | 93.2232 | 438 | 7 | 440 | 3 | 3 | 100.0000 |
| egarrison-hhga | INDEL | D6_15 | lowcmp_SimpleRepeat_triTR_11to50 | homalt | 99.2098 | 99.0971 | 99.3228 | 35.3285 | 439 | 4 | 440 | 3 | 2 | 66.6667 |
| asubramanian-gatk | INDEL | D6_15 | lowcmp_SimpleRepeat_triTR_11to50 | homalt | 99.3228 | 99.3228 | 99.3228 | 34.5643 | 440 | 3 | 440 | 3 | 3 | 100.0000 |
| asubramanian-gatk | INDEL | * | func_cds | * | 98.8729 | 98.4270 | 99.3228 | 86.9360 | 438 | 7 | 440 | 3 | 1 | 33.3333 |
| asubramanian-gatk | INDEL | * | map_l150_m2_e0 | homalt | 95.2388 | 91.4761 | 99.3243 | 90.2332 | 440 | 41 | 441 | 3 | 1 | 33.3333 |
| anovak-vg | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_51to200bp_gt95identity_merged | homalt | 37.6387 | 31.0491 | 47.7790 | 58.0455 | 367 | 815 | 441 | 482 | 356 | 73.8589 |
| anovak-vg | INDEL | D16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt51bp_gt95identity_merged | * | 92.6984 | 90.4665 | 95.0431 | 49.8920 | 446 | 47 | 441 | 23 | 12 | 52.1739 |
| anovak-vg | INDEL | D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt51bp_gt95identity_merged | het | 89.1839 | 87.6797 | 90.7407 | 59.6010 | 427 | 60 | 441 | 45 | 26 | 57.7778 |
| ghariani-varprowl | SNP | ti | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | homalt | 82.1159 | 98.1982 | 70.5600 | 84.9325 | 436 | 8 | 441 | 184 | 107 | 58.1522 |
| jpowers-varprowl | SNP | ti | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | homalt | 78.5332 | 98.1982 | 65.4303 | 85.7535 | 436 | 8 | 441 | 233 | 107 | 45.9227 |
| ltrigg-rtg1 | INDEL | * | func_cds | * | 99.3231 | 98.8764 | 99.7738 | 35.4745 | 440 | 5 | 441 | 1 | 0 | 0.0000 |
| qzeng-custom | INDEL | I1_5 | map_l150_m1_e0 | * | 75.9428 | 62.2530 | 97.3510 | 93.4867 | 315 | 191 | 441 | 12 | 8 | 66.6667 |
| mlin-fermikit | INDEL | * | map_l150_m2_e1 | het | 63.3120 | 47.6190 | 94.4325 | 85.0560 | 440 | 484 | 441 | 26 | 12 | 46.1538 |
| qzeng-custom | SNP | ti | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | homalt | 97.7758 | 98.4234 | 97.1366 | 86.2132 | 437 | 7 | 441 | 13 | 9 | 69.2308 |
| mlin-fermikit | INDEL | I1_5 | map_l125_m1_e0 | * | 66.7171 | 53.1325 | 89.6341 | 77.6871 | 441 | 389 | 441 | 51 | 46 | 90.1961 |
| jlack-gatk | INDEL | D6_15 | lowcmp_SimpleRepeat_triTR_11to50 | homalt | 99.2126 | 99.5485 | 98.8789 | 32.5265 | 441 | 2 | 441 | 5 | 4 | 80.0000 |
| gduggal-bwafb | INDEL | D6_15 | lowcmp_SimpleRepeat_triTR_11to50 | homalt | 99.1004 | 99.3228 | 98.8789 | 34.2183 | 440 | 3 | 441 | 5 | 5 | 100.0000 |
| raldana-dualsentieon | INDEL | I16_PLUS | lowcmp_SimpleRepeat_diTR_11to50 | * | 94.5246 | 91.6667 | 97.5664 | 80.0265 | 440 | 40 | 441 | 11 | 9 | 81.8182 |

(The "% FP ma" column is blank for rows with zero query false positives.)
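Each row's summary percentages are derived from the four raw counts to its right. Recomputing one row (jmaeng-gatk, INDEL, D16_PLUS / HG002complexvar / hetalt) shows the relationship; the formulas below are the standard precision/recall/F-score definitions used by benchmarking tools such as hap.py, which is an assumption about this page's exact tooling, though the arithmetic reproduces the displayed values. Truth TP (217) and Query TP (437) can legitimately differ because complex variants may be represented differently on the truth and query sides.

```python
# Raw counts from one table row.
truth_tp, truth_fn = 217, 30   # true variants matched / missed
query_tp, query_fp = 437, 14   # query calls confirmed / spurious

# Standard definitions: recall over the truth set, precision over the
# query set, F-score as their harmonic mean.
recall = 100 * truth_tp / (truth_tp + truth_fn)
precision = 100 * query_tp / (query_tp + query_fp)
f_score = 2 * precision * recall / (precision + recall)

print(round(recall, 4), round(precision, 4), round(f_score, 4))
# 87.8543 96.8958 92.1538
```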