PrecisionFDA
Truth Challenge

Engage with our community challenges to improve DNA test results

Explore HG002 comparison results
Use this interactive explorer to filter the full set of results by submission entry, variant type, subtype, region subset, and genotype.
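The kind of multi-dimension filtering the explorer performs can be sketched in plain Python. This is a hypothetical illustration, not the site's implementation: the row dicts, column keys, and the `filter_results` helper are all assumptions modeled on the table's columns.

```python
# Hypothetical sketch of filtering benchmark rows by column values.
# Each row is a dict keyed by (assumed) column names from the table.
rows = [
    {"entry": "qzeng-custom", "type": "SNP", "subtype": "ti",
     "subset": "func_cds", "genotype": "het", "f_score": 99.7296},
    {"entry": "raldana-dualsentieon", "type": "INDEL", "subtype": "*",
     "subset": "map_l100_m0_e0", "genotype": "het", "f_score": 97.3501},
]

def filter_results(rows, **criteria):
    """Keep only rows matching every given column=value criterion."""
    return [r for r in rows
            if all(r.get(col) == val for col, val in criteria.items())]

# e.g. restrict to SNP entries, or to INDEL rows for the het genotype
snp_rows = filter_results(rows, type="SNP")
indel_het = filter_results(rows, type="INDEL", genotype="het")
```

Each keyword argument narrows the result set, mirroring how stacking filters in the explorer's dimension controls narrows the visible rows.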
Rows 52151–52200 of 86044.

| Entry | Type | Subtype | Subset | Genotype | F-score | Recall | Precision | Frac_NA | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| qzeng-custom | SNP | * | lowcmp_SimpleRepeat_triTR_51to200 | * | 70.5882 | 66.6667 | 75.0000 | 98.1043 | 6 | 3 | 6 | 2 | 1 | 50.0000 |
| qzeng-custom | SNP | * | lowcmp_SimpleRepeat_triTR_51to200 | homalt |  | 0.0000 | 0.0000 | 96.9697 | 0 | 2 | 0 | 2 | 1 | 50.0000 |
| qzeng-custom | SNP | * | tech_badpromoters | * | 96.8273 | 98.0892 | 95.5975 | 47.8689 | 154 | 3 | 152 | 7 | 1 | 14.2857 |
| qzeng-custom | SNP | * | tech_badpromoters | homalt | 98.0970 | 97.5000 | 98.7013 | 46.1538 | 78 | 2 | 76 | 1 | 1 | 100.0000 |
| qzeng-custom | SNP | ti | func_cds | het | 99.7296 | 99.8589 | 99.6007 | 30.7386 | 8492 | 12 | 8481 | 34 | 1 | 2.9412 |
| qzeng-custom | SNP | ti | lowcmp_AllRepeats_lt51bp_gt95identity_merged | hetalt | 83.3333 | 83.3333 | 83.3333 | 85.7143 | 5 | 1 | 5 | 1 | 1 | 100.0000 |
| qzeng-custom | SNP | ti | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt51bp_gt95identity_merged | het | 99.2117 | 99.6203 | 98.8065 | 58.3355 | 1574 | 6 | 1573 | 19 | 1 | 5.2632 |
| qzeng-custom | SNP | ti | lowcmp_SimpleRepeat_homopolymer_6to10 | het | 99.6434 | 99.5326 | 99.7544 | 48.4618 | 4046 | 19 | 4061 | 10 | 1 | 10.0000 |
| qzeng-custom | SNP | ti | lowcmp_SimpleRepeat_homopolymer_6to10 | homalt | 99.7726 | 99.6365 | 99.9090 | 41.7859 | 2193 | 8 | 2195 | 2 | 1 | 50.0000 |
| qzeng-custom | SNP | ti | lowcmp_SimpleRepeat_quadTR_11to50 | hetalt | 66.6667 | 100.0000 | 50.0000 | 80.0000 | 1 | 0 | 1 | 1 | 1 | 100.0000 |
| qzeng-custom | SNP | ti | lowcmp_SimpleRepeat_triTR_11to50 | het | 99.4125 | 98.9911 | 99.8375 | 43.7900 | 2453 | 25 | 2458 | 4 | 1 | 25.0000 |
| qzeng-custom | SNP | ti | lowcmp_SimpleRepeat_triTR_51to200 | * | 66.6667 | 62.5000 | 71.4286 | 97.8261 | 5 | 3 | 5 | 2 | 1 | 50.0000 |
| qzeng-custom | SNP | ti | lowcmp_SimpleRepeat_triTR_51to200 | homalt |  | 0.0000 | 0.0000 | 95.9184 | 0 | 2 | 0 | 2 | 1 | 50.0000 |
| qzeng-custom | SNP | ti | map_l250_m0_e0 | homalt | 67.7742 | 51.3761 | 99.5475 | 94.9738 | 224 | 212 | 220 | 1 | 1 | 100.0000 |
| qzeng-custom | SNP | tv | HG002complexvar | hetalt | 97.3511 | 95.1613 | 99.6441 | 38.9130 | 295 | 15 | 280 | 1 | 1 | 100.0000 |
| qzeng-custom | SNP | tv | HG002compoundhet | hetalt | 98.5292 | 97.2158 | 99.8786 | 21.9697 | 838 | 24 | 823 | 1 | 1 | 100.0000 |
| qzeng-custom | SNP | tv | lowcmp_AllRepeats_lt51bp_gt95identity_merged | hetalt | 92.3077 | 92.3077 | 92.3077 | 82.1918 | 12 | 1 | 12 | 1 | 1 | 100.0000 |
| qzeng-custom | SNP | tv | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | homalt | 99.6872 | 99.5842 | 99.7904 | 53.5992 | 479 | 2 | 476 | 1 | 1 | 100.0000 |
| qzeng-custom | SNP | tv | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt101bp_gt95identity_merged | * | 98.9647 | 99.8160 | 98.1279 | 69.9341 | 2170 | 4 | 2149 | 41 | 1 | 2.4390 |
| qzeng-custom | SNP | tv | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt101bp_gt95identity_merged | het | 98.3855 | 99.7116 | 97.0943 | 71.3328 | 1383 | 4 | 1370 | 41 | 1 | 2.4390 |
| qzeng-custom | SNP | tv | lowcmp_SimpleRepeat_diTR_51to200 | * | 80.0000 | 76.9231 | 83.3333 | 97.2758 | 20 | 6 | 20 | 4 | 1 | 25.0000 |
| qzeng-custom | SNP | tv | lowcmp_SimpleRepeat_diTR_51to200 | het | 72.7273 | 70.5882 | 75.0000 | 97.7654 | 12 | 5 | 12 | 4 | 1 | 25.0000 |
| qzeng-custom | SNP | tv | lowcmp_SimpleRepeat_quadTR_11to50 | hetalt | 90.9091 | 100.0000 | 83.3333 | 68.4211 | 5 | 0 | 5 | 1 | 1 | 100.0000 |
| qzeng-custom | SNP | tv | lowcmp_SimpleRepeat_quadTR_51to200 | homalt | 90.9091 | 100.0000 | 83.3333 | 94.2857 | 6 | 0 | 5 | 1 | 1 | 100.0000 |
| qzeng-custom | SNP | tv | lowcmp_SimpleRepeat_triTR_11to50 | het | 99.1334 | 99.1581 | 99.1088 | 47.3320 | 2120 | 18 | 2113 | 19 | 1 | 5.2632 |
| qzeng-custom | SNP | tv | tech_badpromoters | * | 93.8212 | 95.8333 | 91.8919 | 51.6340 | 69 | 3 | 68 | 6 | 1 | 16.6667 |
| qzeng-custom | SNP | tv | tech_badpromoters | homalt | 96.0692 | 94.8718 | 97.2973 | 51.3158 | 37 | 2 | 36 | 1 | 1 | 100.0000 |
| raldana-dualsentieon | INDEL | * | lowcmp_AllRepeats_lt51bp_gt95identity_merged | hetalt | 96.8972 | 93.9872 | 99.9932 | 58.6098 | 14490 | 927 | 14606 | 1 | 1 | 100.0000 |
| raldana-dualsentieon | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt51bp_gt95identity_merged | het | 99.5541 | 99.2049 | 99.9057 | 74.5256 | 2121 | 17 | 2119 | 2 | 1 | 50.0000 |
| raldana-dualsentieon | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt51bp_gt95identity_merged | hetalt | 96.8228 | 93.8496 | 99.9905 | 28.4361 | 10422 | 683 | 10501 | 1 | 1 | 100.0000 |
| raldana-dualsentieon | INDEL | * | lowcmp_SimpleRepeat_diTR_11to50 | hetalt | 96.7068 | 93.6325 | 99.9899 | 31.2313 | 9808 | 667 | 9890 | 1 | 1 | 100.0000 |
| raldana-dualsentieon | INDEL | * | lowcmp_SimpleRepeat_quadTR_51to200 | hetalt | 96.6754 | 93.6402 | 99.9140 | 32.5015 | 1119 | 76 | 1162 | 1 | 1 | 100.0000 |
| raldana-dualsentieon | INDEL | * | map_l100_m0_e0 | het | 97.3501 | 97.0617 | 97.6401 | 84.2716 | 991 | 30 | 993 | 24 | 1 | 4.1667 |
| raldana-dualsentieon | INDEL | * | map_l250_m1_e0 | het | 93.2292 | 94.2105 | 92.2680 | 95.0218 | 179 | 11 | 179 | 15 | 1 | 6.6667 |
| raldana-dualsentieon | INDEL | * | map_l250_m1_e0 | homalt | 96.7136 | 94.4954 | 99.0385 | 93.9850 | 103 | 6 | 103 | 1 | 1 | 100.0000 |
| raldana-dualsentieon | INDEL | * | map_l250_m2_e0 | het | 93.8679 | 94.7619 | 92.9907 | 95.2339 | 199 | 11 | 199 | 15 | 1 | 6.6667 |
| raldana-dualsentieon | INDEL | * | map_l250_m2_e0 | homalt | 96.8889 | 94.7826 | 99.0909 | 94.5893 | 109 | 6 | 109 | 1 | 1 | 100.0000 |
| raldana-dualsentieon | INDEL | * | map_l250_m2_e1 | het | 93.8967 | 94.7867 | 93.0233 | 95.3524 | 200 | 11 | 200 | 15 | 1 | 6.6667 |
| raldana-dualsentieon | INDEL | * | map_l250_m2_e1 | homalt | 96.9163 | 94.8276 | 99.0991 | 94.6839 | 110 | 6 | 110 | 1 | 1 | 100.0000 |
| ndellapenna-hhga | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_gt200bp_gt95identity_merged | * | 87.6827 | 82.3529 | 93.7500 | 99.9627 | 14 | 3 | 15 | 1 | 1 | 100.0000 |
| ndellapenna-hhga | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_gt200bp_gt95identity_merged | het | 96.0000 | 100.0000 | 92.3077 | 99.3970 | 10 | 0 | 12 | 1 | 1 | 100.0000 |
| ndellapenna-hhga | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_gt200bp_gt95identity_merged | * | 66.6667 | 66.6667 | 66.6667 | 99.5739 | 2 | 1 | 2 | 1 | 1 | 100.0000 |
| ndellapenna-hhga | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_gt200bp_gt95identity_merged | het | 80.0000 | 100.0000 | 66.6667 | 96.9697 | 2 | 0 | 2 | 1 | 1 | 100.0000 |
| ndellapenna-hhga | INDEL | * | map_l250_m0_e0 | * | 93.7500 | 96.1538 | 91.4634 | 99.7895 | 75 | 3 | 75 | 7 | 1 | 14.2857 |
| ndellapenna-hhga | INDEL | * | map_l250_m0_e0 | het | 91.8919 | 96.2264 | 87.9310 | 97.3827 | 51 | 2 | 51 | 7 | 1 | 14.2857 |
| ndellapenna-hhga | INDEL | * | map_l250_m1_e0 | homalt | 97.6959 | 97.2477 | 98.1481 | 94.5066 | 106 | 3 | 106 | 2 | 1 | 50.0000 |
| ndellapenna-hhga | INDEL | * | map_l250_m2_e0 | homalt | 97.8166 | 97.3913 | 98.2456 | 95.1136 | 112 | 3 | 112 | 2 | 1 | 50.0000 |
| ndellapenna-hhga | INDEL | * | map_l250_m2_e1 | homalt | 97.8355 | 97.4138 | 98.2609 | 95.2243 | 113 | 3 | 113 | 2 | 1 | 50.0000 |
| ndellapenna-hhga | INDEL | * | segdup | hetalt | 85.5736 | 75.3846 | 98.9474 | 95.6039 | 98 | 32 | 94 | 1 | 1 | 100.0000 |
| ndellapenna-hhga | INDEL | * | tech_badpromoters | * | 97.3333 | 96.0526 | 98.6486 | 91.8051 | 73 | 3 | 73 | 1 | 1 | 100.0000 |