PrecisionFDA
Truth Challenge

Engage with our community challenges to improve DNA test results.

Explore HG002 comparison results
Use this interactive explorer to filter the full set of results by submission entry and across multiple dimensions (variant type, subtype, genomic subset, and genotype).
Showing results 80951–81000 of 86044.

| Entry | Type | Subtype | Subset | Genotype | F-score | Recall | Precision | Frac_NA | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| qzeng-custom | SNP | tv | lowcmp_SimpleRepeat_quadTR_gt200 | hetalt | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| qzeng-custom | SNP | tv | lowcmp_SimpleRepeat_quadTR_gt200 | homalt | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| qzeng-custom | SNP | tv | lowcmp_SimpleRepeat_triTR_11to50 | * | 99.3614 | 99.3623 | 99.3605 | 44.2283 | 3428 | 22 | 3418 | 22 | 3 | 13.6364 |
| qzeng-custom | SNP | tv | lowcmp_SimpleRepeat_triTR_11to50 | het | 99.1334 | 99.1581 | 99.1088 | 47.3320 | 2120 | 18 | 2113 | 19 | 1 | 5.2632 |
| qzeng-custom | SNP | tv | lowcmp_SimpleRepeat_triTR_11to50 | hetalt | 0.0000 | | | 100.0000 | 0 | 1 | 0 | 0 | 0 | |
| qzeng-custom | SNP | tv | lowcmp_SimpleRepeat_triTR_11to50 | homalt | 99.7709 | 99.7712 | 99.7706 | 38.2436 | 1308 | 3 | 1305 | 3 | 2 | 66.6667 |
| qzeng-custom | SNP | tv | lowcmp_SimpleRepeat_triTR_51to200 | * | 100.0000 | 100.0000 | 100.0000 | 99.0099 | 1 | 0 | 1 | 0 | 0 | |
| qzeng-custom | SNP | tv | lowcmp_SimpleRepeat_triTR_51to200 | het | 100.0000 | 100.0000 | 100.0000 | 98.8095 | 1 | 0 | 1 | 0 | 0 | |
| qzeng-custom | SNP | tv | lowcmp_SimpleRepeat_triTR_51to200 | hetalt | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| qzeng-custom | SNP | tv | lowcmp_SimpleRepeat_triTR_51to200 | homalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| raldana-dualsentieon | INDEL | * | * | * | 99.1095 | 98.7566 | 99.4648 | 57.7282 | 340258 | 4284 | 340120 | 1830 | 1626 | 88.8525 |
| raldana-dualsentieon | INDEL | * | * | het | 99.3035 | 99.0228 | 99.5858 | 57.9096 | 192236 | 1897 | 191867 | 798 | 611 | 76.5664 |
| raldana-dualsentieon | INDEL | * | * | hetalt | 95.3681 | 91.1677 | 99.9742 | 56.5406 | 23008 | 2229 | 23230 | 6 | 6 | 100.0000 |
| raldana-dualsentieon | INDEL | * | * | homalt | 99.5287 | 99.8738 | 99.1860 | 57.6626 | 125014 | 158 | 125023 | 1026 | 1009 | 98.3431 |
| raldana-dualsentieon | INDEL | * | HG002complexvar | * | 98.9594 | 98.2323 | 99.6974 | 57.2556 | 75578 | 1360 | 75441 | 229 | 196 | 85.5895 |
| raldana-dualsentieon | INDEL | * | HG002complexvar | het | 98.7829 | 97.8101 | 99.7752 | 56.2974 | 45200 | 1012 | 44831 | 101 | 72 | 71.2871 |
| raldana-dualsentieon | INDEL | * | HG002complexvar | hetalt | 95.3862 | 91.2949 | 99.8613 | 67.7087 | 3377 | 322 | 3600 | 5 | 5 | 100.0000 |
| raldana-dualsentieon | INDEL | * | HG002complexvar | homalt | 99.7249 | 99.9038 | 99.5467 | 56.9673 | 27001 | 26 | 27010 | 123 | 119 | 96.7480 |
| raldana-dualsentieon | INDEL | * | HG002compoundhet | * | 92.3506 | 90.1368 | 94.6759 | 61.0439 | 27005 | 2955 | 26887 | 1512 | 1502 | 99.3386 |
| raldana-dualsentieon | INDEL | * | HG002compoundhet | het | 83.9323 | 82.2912 | 85.6402 | 78.9688 | 3369 | 725 | 3137 | 526 | 520 | 98.8593 |
| raldana-dualsentieon | INDEL | * | HG002compoundhet | hetalt | 95.3715 | 91.1597 | 99.9913 | 50.4712 | 22954 | 2226 | 23068 | 2 | 2 | 100.0000 |
| raldana-dualsentieon | INDEL | * | HG002compoundhet | homalt | 57.9932 | 99.4169 | 40.9364 | 81.2893 | 682 | 4 | 682 | 984 | 980 | 99.5935 |
| raldana-dualsentieon | INDEL | * | decoy | * | 100.0000 | 100.0000 | 100.0000 | 99.9200 | 10 | 0 | 10 | 0 | 0 | |
| raldana-dualsentieon | INDEL | * | decoy | het | 100.0000 | 100.0000 | 100.0000 | 99.9259 | 6 | 0 | 6 | 0 | 0 | |
| raldana-dualsentieon | INDEL | * | decoy | hetalt | 100.0000 | 100.0000 | 100.0000 | 99.8020 | 1 | 0 | 1 | 0 | 0 | |
| raldana-dualsentieon | INDEL | * | decoy | homalt | 100.0000 | 100.0000 | 100.0000 | 99.9232 | 3 | 0 | 3 | 0 | 0 | |
| raldana-dualsentieon | INDEL | * | func_cds | * | 99.2118 | 98.8764 | 99.5495 | 41.6557 | 440 | 5 | 442 | 2 | 0 | 0.0000 |
| raldana-dualsentieon | INDEL | * | func_cds | het | 98.5959 | 98.1308 | 99.0654 | 45.9596 | 210 | 4 | 212 | 2 | 0 | 0.0000 |
| raldana-dualsentieon | INDEL | * | func_cds | hetalt | 88.8889 | 80.0000 | 100.0000 | 60.0000 | 4 | 1 | 4 | 0 | 0 | |
| raldana-dualsentieon | INDEL | * | func_cds | homalt | 100.0000 | 100.0000 | 100.0000 | 36.3380 | 226 | 0 | 226 | 0 | 0 | |
| raldana-dualsentieon | INDEL | * | lowcmp_AllRepeats_51to200bp_gt95identity_merged | * | 93.2233 | 91.0156 | 95.5407 | 66.1212 | 9168 | 905 | 9020 | 421 | 396 | 94.0618 |
| raldana-dualsentieon | INDEL | * | lowcmp_AllRepeats_51to200bp_gt95identity_merged | het | 93.1260 | 92.3762 | 93.8881 | 76.4157 | 3732 | 308 | 3441 | 224 | 202 | 90.1786 |
| raldana-dualsentieon | INDEL | * | lowcmp_AllRepeats_51to200bp_gt95identity_merged | hetalt | 91.8485 | 84.9895 | 99.9117 | 39.0086 | 3250 | 574 | 3393 | 3 | 3 | 100.0000 |
| raldana-dualsentieon | INDEL | * | lowcmp_AllRepeats_51to200bp_gt95identity_merged | homalt | 95.2713 | 98.9588 | 91.8487 | 64.7877 | 2186 | 23 | 2186 | 194 | 191 | 98.4536 |
| raldana-dualsentieon | INDEL | * | lowcmp_AllRepeats_gt200bp_gt95identity_merged | * | 92.3077 | 90.0000 | 94.7368 | 99.2146 | 18 | 2 | 18 | 1 | 0 | 0.0000 |
| raldana-dualsentieon | INDEL | * | lowcmp_AllRepeats_gt200bp_gt95identity_merged | het | 95.6522 | 91.6667 | 100.0000 | 99.2450 | 11 | 1 | 11 | 0 | 0 | |
| raldana-dualsentieon | INDEL | * | lowcmp_AllRepeats_gt200bp_gt95identity_merged | hetalt | 100.0000 | 100.0000 | 100.0000 | 96.2500 | 3 | 0 | 3 | 0 | 0 | |
| raldana-dualsentieon | INDEL | * | lowcmp_AllRepeats_gt200bp_gt95identity_merged | homalt | 80.0000 | 80.0000 | 80.0000 | 99.4331 | 4 | 1 | 4 | 1 | 0 | 0.0000 |
| raldana-dualsentieon | INDEL | * | lowcmp_AllRepeats_lt51bp_gt95identity_merged | * | 98.6413 | 97.9883 | 99.3029 | 72.0487 | 92549 | 1900 | 92457 | 649 | 588 | 90.6009 |
| raldana-dualsentieon | INDEL | * | lowcmp_AllRepeats_lt51bp_gt95identity_merged | het | 98.6423 | 98.0278 | 99.2646 | 73.6782 | 47319 | 952 | 47111 | 349 | 293 | 83.9542 |
| raldana-dualsentieon | INDEL | * | lowcmp_AllRepeats_lt51bp_gt95identity_merged | hetalt | 96.8972 | 93.9872 | 99.9932 | 58.6098 | 14490 | 927 | 14606 | 1 | 1 | 100.0000 |
| raldana-dualsentieon | INDEL | * | lowcmp_AllRepeats_lt51bp_gt95identity_merged | homalt | 99.4822 | 99.9317 | 99.0367 | 73.5845 | 30740 | 21 | 30740 | 299 | 294 | 98.3278 |
| raldana-dualsentieon | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | * | 97.3740 | 96.2418 | 98.5331 | 64.4408 | 62895 | 2456 | 62671 | 933 | 862 | 92.3901 |
| raldana-dualsentieon | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | het | 97.2987 | 96.2828 | 98.3362 | 70.4045 | 29347 | 1133 | 28902 | 489 | 430 | 87.9346 |
| raldana-dualsentieon | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | hetalt | 96.0368 | 92.3975 | 99.9745 | 38.5038 | 15435 | 1270 | 15656 | 4 | 4 | 100.0000 |
| raldana-dualsentieon | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | homalt | 98.6574 | 99.7082 | 97.6284 | 65.7023 | 18113 | 53 | 18113 | 440 | 428 | 97.2727 |
| raldana-dualsentieon | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | * | 97.3437 | 96.0575 | 98.6648 | 74.3831 | 2071 | 85 | 2069 | 28 | 19 | 67.8571 |
| raldana-dualsentieon | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | het | 97.2925 | 95.9395 | 98.6842 | 76.0063 | 1205 | 51 | 1200 | 16 | 7 | 43.7500 |
| raldana-dualsentieon | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | hetalt | 90.1316 | 82.0359 | 100.0000 | 68.2540 | 137 | 30 | 140 | 0 | 0 | |
| raldana-dualsentieon | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | homalt | 98.9145 | 99.4543 | 98.3806 | 72.3198 | 729 | 4 | 729 | 12 | 12 | 100.0000 |
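The metric columns in these results follow directly from the count columns: Recall = Truth TP / (Truth TP + Truth FN), Precision = Query TP / (Query TP + Query FP), and F-score is their harmonic mean; the % FP ma column is consistent with FP gt / Query FP. A minimal sketch in plain Python (the `metrics` helper is hypothetical, not precisionFDA code) reproducing one row:

```python
def metrics(truth_tp, truth_fn, query_tp, query_fp):
    """Recompute the percentage metrics shown in the explorer from raw counts."""
    recall = truth_tp / (truth_tp + truth_fn)      # fraction of truth variants recovered
    precision = query_tp / (query_tp + query_fp)   # fraction of query calls that are correct
    f_score = 2 * precision * recall / (precision + recall)  # harmonic mean
    return (round(f_score * 100, 4),
            round(recall * 100, 4),
            round(precision * 100, 4))

# Check against the qzeng-custom SNP/tv lowcmp_SimpleRepeat_triTR_11to50 "*" row:
# Truth TP=3428, Truth FN=22, Query TP=3418, Query FP=22
print(metrics(3428, 22, 3418, 22))  # → (99.3614, 99.3623, 99.3605)
```

Note that rows with zero counts leave some metrics undefined (division by zero), which is why several cells in the table are blank.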