PrecisionFDA
Truth Challenge

Engage with the community and improve DNA test results through our challenges

Explore HG002 comparison results
Use this interactive explorer to filter all results across submission entries and multiple dimensions.
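The explorer's filtering can be mimicked offline once the results are exported. The sketch below (a hedged stand-in, not the site's own code) applies combined filters across dimensions to two rows copied from the table; the dict keys mirror the table's column names, and this tiny list substitutes for the full 86,044-row export.

```python
# Two rows copied from the results table below; the full export would have
# one dict per submission/stratification row.
rows = [
    {"Entry": "asubramanian-gatk", "Type": "INDEL", "Subtype": "D1_5",
     "Genotype": "hetalt", "F-score": 96.1607},
    {"Entry": "raldana-dualsentieon", "Type": "SNP", "Subtype": "ti",
     "Genotype": "homalt", "F-score": 99.9875},
]

# Combine filters across dimensions, e.g. SNP entries with F-score above 99.
hits = [r["Entry"] for r in rows if r["Type"] == "SNP" and r["F-score"] > 99.0]
print(hits)  # ['raldana-dualsentieon']
```

The same pattern extends to any of the table's dimensions (Subset, Genotype, Frac_NA ranges, and so on) by adding conditions to the comprehension.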
Showing entries 22351–22400 of 86,044.

Entry | Type | Subtype | Subset | Genotype | F-score (%) | Recall (%) | Precision (%) | Frac_NA (%) | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma
gduggal-snapfb | INDEL | D1_5 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_51to200bp_gt95identity_merged | hetalt | 68.0898 | 55.0736 | 89.1626 | 33.2237 | 711 | 580 | 362 | 44 | 24 | 54.5455
asubramanian-gatk | INDEL | D1_5 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_merged | hetalt | 96.1607 | 93.4776 | 99.0024 | 33.2299 | 8685 | 606 | 8733 | 88 | 82 | 93.1818
asubramanian-gatk | INDEL | D1_5 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | hetalt | 96.1607 | 93.4776 | 99.0024 | 33.2299 | 8685 | 606 | 8733 | 88 | 82 | 93.1818
ckim-gatk | INDEL | D16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | hetalt | 96.5196 | 93.9394 | 99.2455 | 33.2429 | 1519 | 98 | 1710 | 13 | 13 | 100.0000
ckim-vqsr | INDEL | D16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | hetalt | 96.5196 | 93.9394 | 99.2455 | 33.2429 | 1519 | 98 | 1710 | 13 | 13 | 100.0000
ltrigg-rtg1 | INDEL | D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_51to200bp_gt95identity_merged | hetalt | 92.0444 | 86.0070 | 98.9933 | 33.2437 | 1469 | 239 | 1475 | 15 | 15 | 100.0000
raldana-dualsentieon | SNP | ti | lowcmp_SimpleRepeat_quadTR_11to50 | homalt | 99.9875 | 100.0000 | 99.9749 | 33.2441 | 3987 | 0 | 3987 | 1 | 1 | 100.0000
ckim-isaac | INDEL | D6_15 | * | hetalt | 90.6237 | 83.9246 | 98.4851 | 33.2460 | 6860 | 1314 | 7281 | 112 | 99 | 88.3929
hfeng-pmm3 | INDEL | * | lowcmp_SimpleRepeat_diTR_11to50 | hetalt | 97.4355 | 95.0263 | 99.9701 | 33.2469 | 9954 | 521 | 10036 | 3 | 2 | 66.6667
ckim-isaac | INDEL | I6_15 | lowcmp_SimpleRepeat_diTR_11to50 | hetalt | 83.6839 | 72.6708 | 98.6312 | 33.2487 | 1287 | 484 | 1297 | 18 | 13 | 72.2222
asubramanian-gatk | INDEL | I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | hetalt | 95.2667 | 91.9260 | 98.8593 | 33.2487 | 2334 | 205 | 2860 | 33 | 31 | 93.9394
ckim-dragen | SNP | ti | lowcmp_SimpleRepeat_triTR_11to50 | het | 99.8389 | 99.9596 | 99.7185 | 33.2528 | 2477 | 1 | 2480 | 7 | 3 | 42.8571
hfeng-pmm2 | SNP | * | lowcmp_SimpleRepeat_triTR_11to50 | homalt | 99.9452 | 99.9635 | 99.9270 | 33.2602 | 2737 | 1 | 2737 | 2 | 1 | 50.0000
raldana-dualsentieon | INDEL | D6_15 | lowcmp_SimpleRepeat_triTR_11to50 | * | 99.3616 | 98.9595 | 99.7670 | 33.2686 | 1712 | 18 | 1713 | 4 | 3 | 75.0000
jlack-gatk | INDEL | * | lowcmp_SimpleRepeat_quadTR_51to200 | hetalt | 95.9073 | 92.8033 | 99.2261 | 33.2760 | 1109 | 86 | 1154 | 9 | 8 | 88.8889
gduggal-snapvard | INDEL | D6_15 | HG002compoundhet | * | 59.7955 | 51.4782 | 71.3183 | 33.2912 | 4649 | 4382 | 4896 | 1969 | 1719 | 87.3032
hfeng-pmm3 | INDEL | D16_PLUS | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | hetalt | 96.5563 | 93.4447 | 99.8823 | 33.2941 | 1511 | 106 | 1697 | 2 | 2 | 100.0000
gduggal-bwavard | INDEL | I1_5 | * | homalt | 95.1435 | 90.7626 | 99.9687 | 33.2973 | 54846 | 5582 | 54343 | 17 | 11 | 64.7059
dgrover-gatk | INDEL | * | lowcmp_SimpleRepeat_diTR_11to50 | hetalt | 97.5627 | 95.6563 | 99.5467 | 33.3070 | 10020 | 455 | 10102 | 46 | 45 | 97.8261
ltrigg-rtg2 | SNP | ti | HG002compoundhet | * | 99.3879 | 98.9873 | 99.7918 | 33.3089 | 17301 | 177 | 17255 | 36 | 13 | 36.1111
gduggal-snapvard | SNP | * | func_cds | het | 99.2494 | 99.0323 | 99.4675 | 33.3113 | 11053 | 108 | 11020 | 59 | 21 | 35.5932
ckim-isaac | INDEL | I16_PLUS | HG002compoundhet | hetalt | 65.1901 | 48.5428 | 99.2149 | 33.3115 | 1016 | 1077 | 1011 | 8 | 7 | 87.5000
ckim-isaac | INDEL | D1_5 | lowcmp_SimpleRepeat_diTR_11to50 | * | 93.9157 | 91.6952 | 96.2464 | 33.3267 | 22502 | 2038 | 22436 | 875 | 753 | 86.0571
eyeh-varpipe | INDEL | D1_5 | lowcmp_SimpleRepeat_diTR_11to50 | * | 84.4279 | 81.5933 | 87.4666 | 33.3286 | 20023 | 4517 | 20601 | 2952 | 2912 | 98.6450
egarrison-hhga | INDEL | D16_PLUS | tech_badpromoters | * | 100.0000 | 100.0000 | 100.0000 | 33.3333 | 4 | 0 | 4 | 0 | 0 |
dgrover-gatk | INDEL | I1_5 | func_cds | hetalt | 100.0000 | 100.0000 | 100.0000 | 33.3333 | 2 | 0 | 2 | 0 | 0 |
ckim-isaac | INDEL | I6_15 | func_cds | * | 91.1392 | 83.7209 | 100.0000 | 33.3333 | 36 | 7 | 36 | 0 | 0 |
ckim-isaac | INDEL | I6_15 | func_cds | hetalt | 66.6667 | 50.0000 | 100.0000 | 33.3333 | 2 | 2 | 2 | 0 | 0 |
gduggal-snapvard | INDEL | I6_15 | lowcmp_SimpleRepeat_triTR_11to50 | homalt | 21.2121 | 11.8644 | 100.0000 | 33.3333 | 7 | 52 | 12 | 0 | 0 |
ghariani-varprowl | INDEL | D16_PLUS | tech_badpromoters | * | 100.0000 | 100.0000 | 100.0000 | 33.3333 | 4 | 0 | 4 | 0 | 0 |
gduggal-snapfb | INDEL | I6_15 | func_cds | hetalt | 50.0000 | 50.0000 | 50.0000 | 33.3333 | 2 | 2 | 1 | 1 | 1 | 100.0000
gduggal-snapfb | INDEL | I6_15 | tech_badpromoters | hetalt | 80.0000 | 66.6667 | 100.0000 | 33.3333 | 2 | 1 | 2 | 0 | 0 |
hfeng-pmm1 | INDEL | D16_PLUS | tech_badpromoters | * | 100.0000 | 100.0000 | 100.0000 | 33.3333 | 4 | 0 | 4 | 0 | 0 |
ghariani-varprowl | INDEL | I6_15 | func_cds | homalt | 88.8889 | 80.0000 | 100.0000 | 33.3333 | 12 | 3 | 12 | 0 | 0 |
hfeng-pmm3 | INDEL | D16_PLUS | tech_badpromoters | * | 100.0000 | 100.0000 | 100.0000 | 33.3333 | 4 | 0 | 4 | 0 | 0 |
hfeng-pmm3 | INDEL | I1_5 | func_cds | hetalt | 100.0000 | 100.0000 | 100.0000 | 33.3333 | 2 | 0 | 2 | 0 | 0 |
jlack-gatk | INDEL | I1_5 | func_cds | hetalt | 100.0000 | 100.0000 | 100.0000 | 33.3333 | 2 | 0 | 2 | 0 | 0 |
hfeng-pmm1 | INDEL | I1_5 | func_cds | hetalt | 100.0000 | 100.0000 | 100.0000 | 33.3333 | 2 | 0 | 2 | 0 | 0 |
hfeng-pmm2 | INDEL | D16_PLUS | tech_badpromoters | * | 100.0000 | 100.0000 | 100.0000 | 33.3333 | 4 | 0 | 4 | 0 | 0 |
hfeng-pmm2 | INDEL | I1_5 | func_cds | hetalt | 100.0000 | 100.0000 | 100.0000 | 33.3333 | 2 | 0 | 2 | 0 | 0 |
hfeng-pmm2 | INDEL | I6_15 | lowcmp_SimpleRepeat_triTR_51to200 | hetalt | 95.6522 | 91.6667 | 100.0000 | 33.3333 | 11 | 1 | 12 | 0 | 0 |
ckim-dragen | INDEL | I1_5 | func_cds | hetalt | 100.0000 | 100.0000 | 100.0000 | 33.3333 | 2 | 0 | 2 | 0 | 0 |
ckim-dragen | INDEL | I6_15 | lowcmp_SimpleRepeat_triTR_51to200 | hetalt | 95.6522 | 91.6667 | 100.0000 | 33.3333 | 11 | 1 | 12 | 0 | 0 |
cchapple-custom | INDEL | D16_PLUS | tech_badpromoters | * | 100.0000 | 100.0000 | 100.0000 | 33.3333 | 4 | 0 | 4 | 0 | 0 |
cchapple-custom | INDEL | I6_15 | func_cds | homalt | 96.7742 | 100.0000 | 93.7500 | 33.3333 | 15 | 0 | 15 | 1 | 1 | 100.0000
ciseli-custom | INDEL | D1_5 | tech_badpromoters | het | 60.0000 | 75.0000 | 50.0000 | 33.3333 | 6 | 2 | 6 | 6 | 2 | 33.3333
gduggal-snapfb | INDEL | C1_5 | lowcmp_SimpleRepeat_quadTR_51to200 | het | 0.0000 | | 0.0000 | 33.3333 | 0 | 0 | 0 | 2 | 0 | 0.0000
gduggal-bwaplat | INDEL | D16_PLUS | lowcmp_SimpleRepeat_triTR_51to200 | homalt | 86.9565 | 76.9231 | 100.0000 | 33.3333 | 10 | 3 | 10 | 0 | 0 |
gduggal-bwafb | INDEL | I16_PLUS | func_cds | het | 61.5385 | 44.4444 | 100.0000 | 33.3333 | 4 | 5 | 4 | 0 | 0 |
gduggal-bwafb | INDEL | I16_PLUS | tech_badpromoters | homalt | 100.0000 | 100.0000 | 100.0000 | 33.3333 | 2 | 0 | 2 | 0 | 0 |
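The metric columns are derivable from the count columns: recall is the fraction of truth calls recovered, precision the fraction of query calls that are correct, and F-score their harmonic mean. A minimal sketch, assuming the usual hap.py-style column semantics (an assumption about this page, not stated on it), reproducing the first row above (gduggal-snapfb, INDEL, D1_5, hetalt):

```python
# Recompute recall, precision, and F-score from the count columns of one
# row of the results table. Column semantics (Truth TP/FN counted against
# the truth set, Query TP/FP against the query set) are assumed.
def metrics(truth_tp, truth_fn, query_tp, query_fp):
    recall = truth_tp / (truth_tp + truth_fn)      # truth calls recovered
    precision = query_tp / (query_tp + query_fp)   # query calls that are right
    f_score = 2 * precision * recall / (precision + recall)  # harmonic mean
    return recall, precision, f_score

# Counts from the gduggal-snapfb D1_5 hetalt row.
r, p, f = metrics(711, 580, 362, 44)
print(round(100 * r, 4), round(100 * p, 4), round(100 * f, 4))
# -> 55.0736 89.1626 68.0898, matching the row's Recall, Precision, F-score
```

Note that Truth TP and Query TP can differ (711 vs. 362 here) because the same variants may be represented differently in the truth and query VCFs, which is why recall and precision use separate TP counts.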