PrecisionFDA
Truth Challenge

Engage with our community challenges and help improve DNA test results

Explore HG002 comparison results
Use this interactive explorer to filter the full set of comparison results by submission entry and across multiple dimensions (variant type, subtype, genomic subset, and genotype).
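The derived metric columns in the table below follow the standard benchmarking definitions: Recall is the fraction of truth variants recovered, Precision is the fraction of query calls that match truth, and F-score is their harmonic mean. As a sanity check (plain Python, not part of the explorer), the sketch below recomputes all three from the raw counts of the overall astatham-gatk INDEL row:

```python
# Recompute the derived metric columns from the raw count columns.
# Counts taken from the astatham-gatk INDEL overall (*/*/*) row below.
truth_tp, truth_fn = 341925, 2617   # truth variants matched / missed
query_tp, query_fp = 341788, 1909   # query calls matched / spurious

recall = truth_tp / (truth_tp + truth_fn)                # fraction of truth found
precision = query_tp / (query_tp + query_fp)             # fraction of calls correct
f_score = 2 * precision * recall / (precision + recall)  # harmonic mean

print(f"Recall    {100 * recall:.4f}")     # 99.2404
print(f"Precision {100 * precision:.4f}")  # 99.4446
print(f"F-score   {100 * f_score:.4f}")    # 99.3424
```

Note that Truth TP and Query TP can differ for the same row because the same matched variant may be represented differently in the truth and query call sets.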
Showing results 1951–2000 of 86,044.

| Entry | Type | Subtype | Subset | Genotype | F-score | Recall | Precision | Frac_NA | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| anovak-vg | INDEL | I6_15 | segdup | hetalt | 0.0000 | 20.0000 | 0.0000 | 0.0000 | 9 | 36 | 0 | 0 | 0 | |
| anovak-vg | INDEL | I6_15 | segdup | homalt | 59.8909 | 80.8511 | 47.5610 | 87.6506 | 38 | 9 | 39 | 43 | 42 | 97.6744 |
| anovak-vg | INDEL | I6_15 | segdupwithalt | * | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| anovak-vg | INDEL | I6_15 | segdupwithalt | het | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| anovak-vg | INDEL | I6_15 | segdupwithalt | hetalt | 0.0000 | 0.0000 | | 0.0000 | 0 | 0 | 0 | 0 | 0 | |
| anovak-vg | INDEL | I6_15 | segdupwithalt | homalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| anovak-vg | INDEL | I6_15 | tech_badpromoters | * | 57.1429 | 46.1538 | 75.0000 | 50.0000 | 6 | 7 | 6 | 2 | 2 | 100.0000 |
| anovak-vg | INDEL | I6_15 | tech_badpromoters | het | 25.0000 | 14.2857 | 100.0000 | 40.0000 | 1 | 6 | 3 | 0 | 0 | |
| anovak-vg | INDEL | I6_15 | tech_badpromoters | hetalt | 0.0000 | 66.6667 | 0.0000 | 0.0000 | 2 | 1 | 0 | 0 | 0 | |
| anovak-vg | INDEL | I6_15 | tech_badpromoters | homalt | 75.0000 | 100.0000 | 60.0000 | 54.5455 | 3 | 0 | 3 | 2 | 2 | 100.0000 |
| astatham-gatk | INDEL | * | * | * | 99.3424 | 99.2404 | 99.4446 | 59.9126 | 341925 | 2617 | 341788 | 1909 | 1550 | 81.1943 |
| astatham-gatk | INDEL | * | * | het | 99.5127 | 99.4973 | 99.5281 | 60.7701 | 193157 | 976 | 192779 | 914 | 573 | 62.6915 |
| astatham-gatk | INDEL | * | * | hetalt | 96.8001 | 94.0563 | 99.7088 | 58.1664 | 23737 | 1500 | 23968 | 70 | 69 | 98.5714 |
| astatham-gatk | INDEL | * | * | homalt | 99.5755 | 99.8874 | 99.2657 | 58.8575 | 125031 | 141 | 125041 | 925 | 908 | 98.1622 |
| astatham-gatk | INDEL | * | HG002complexvar | * | 99.4827 | 99.2916 | 99.6745 | 58.3541 | 76393 | 545 | 76257 | 249 | 212 | 85.1406 |
| astatham-gatk | INDEL | * | HG002complexvar | het | 99.5588 | 99.3119 | 99.8070 | 57.7885 | 45894 | 318 | 45516 | 88 | 55 | 62.5000 |
| astatham-gatk | INDEL | * | HG002complexvar | hetalt | 96.3890 | 94.5391 | 98.3127 | 68.6296 | 3497 | 202 | 3729 | 64 | 63 | 98.4375 |
| astatham-gatk | INDEL | * | HG002complexvar | homalt | 99.7747 | 99.9075 | 99.6422 | 57.3610 | 27002 | 25 | 27012 | 97 | 94 | 96.9072 |
| astatham-gatk | INDEL | * | HG002compoundhet | * | 95.0088 | 94.7931 | 95.2256 | 62.9264 | 28400 | 1560 | 28282 | 1418 | 1407 | 99.2243 |
| astatham-gatk | INDEL | * | HG002compoundhet | het | 93.2853 | 98.3879 | 88.6859 | 79.4067 | 4028 | 66 | 3786 | 483 | 475 | 98.3437 |
| astatham-gatk | INDEL | * | HG002compoundhet | hetalt | 96.8345 | 94.0747 | 99.7612 | 51.5025 | 23688 | 1492 | 23812 | 57 | 56 | 98.2456 |
| astatham-gatk | INDEL | * | HG002compoundhet | homalt | 60.8541 | 99.7085 | 43.7900 | 84.6320 | 684 | 2 | 684 | 878 | 876 | 99.7722 |
| astatham-gatk | INDEL | * | decoy | * | 100.0000 | 100.0000 | 100.0000 | 99.9347 | 10 | 0 | 10 | 0 | 0 | |
| astatham-gatk | INDEL | * | decoy | het | 100.0000 | 100.0000 | 100.0000 | 99.9426 | 6 | 0 | 6 | 0 | 0 | |
| astatham-gatk | INDEL | * | decoy | hetalt | 100.0000 | 100.0000 | 100.0000 | 99.8374 | 1 | 0 | 1 | 0 | 0 | |
| astatham-gatk | INDEL | * | decoy | homalt | 100.0000 | 100.0000 | 100.0000 | 99.9296 | 3 | 0 | 3 | 0 | 0 | |
| astatham-gatk | INDEL | * | func_cds | * | 99.5531 | 99.7753 | 99.3318 | 44.9080 | 444 | 1 | 446 | 3 | 1 | 33.3333 |
| astatham-gatk | INDEL | * | func_cds | het | 99.5392 | 100.0000 | 99.0826 | 50.0000 | 214 | 0 | 216 | 2 | 0 | 0.0000 |
| astatham-gatk | INDEL | * | func_cds | hetalt | 88.8889 | 80.0000 | 100.0000 | 60.0000 | 4 | 1 | 4 | 0 | 0 | |
| astatham-gatk | INDEL | * | func_cds | homalt | 99.7792 | 100.0000 | 99.5595 | 38.4824 | 226 | 0 | 226 | 1 | 1 | 100.0000 |
| astatham-gatk | INDEL | * | lowcmp_AllRepeats_51to200bp_gt95identity_merged | * | 94.9849 | 94.4009 | 95.5762 | 69.0899 | 9509 | 564 | 9355 | 433 | 393 | 90.7621 |
| astatham-gatk | INDEL | * | lowcmp_AllRepeats_51to200bp_gt95identity_merged | het | 95.4126 | 96.9059 | 93.9646 | 79.4691 | 3915 | 125 | 3612 | 232 | 195 | 84.0517 |
| astatham-gatk | INDEL | * | lowcmp_AllRepeats_51to200bp_gt95identity_merged | hetalt | 93.9775 | 89.0167 | 99.5238 | 40.0705 | 3404 | 420 | 3553 | 17 | 17 | 100.0000 |
| astatham-gatk | INDEL | * | lowcmp_AllRepeats_51to200bp_gt95identity_merged | homalt | 95.5706 | 99.1399 | 92.2494 | 66.0177 | 2190 | 19 | 2190 | 184 | 181 | 98.3696 |
| astatham-gatk | INDEL | * | lowcmp_AllRepeats_gt200bp_gt95identity_merged | * | 100.0000 | 100.0000 | 100.0000 | 99.3816 | 20 | 0 | 20 | 0 | 0 | |
| astatham-gatk | INDEL | * | lowcmp_AllRepeats_gt200bp_gt95identity_merged | het | 100.0000 | 100.0000 | 100.0000 | 99.4222 | 12 | 0 | 12 | 0 | 0 | |
| astatham-gatk | INDEL | * | lowcmp_AllRepeats_gt200bp_gt95identity_merged | hetalt | 100.0000 | 100.0000 | 100.0000 | 96.7391 | 3 | 0 | 3 | 0 | 0 | |
| astatham-gatk | INDEL | * | lowcmp_AllRepeats_gt200bp_gt95identity_merged | homalt | 100.0000 | 100.0000 | 100.0000 | 99.5305 | 5 | 0 | 5 | 0 | 0 | |
| astatham-gatk | INDEL | * | lowcmp_AllRepeats_lt51bp_gt95identity_merged | * | 99.1675 | 99.0429 | 99.2924 | 73.5960 | 93545 | 904 | 93452 | 666 | 582 | 87.3874 |
| astatham-gatk | INDEL | * | lowcmp_AllRepeats_lt51bp_gt95identity_merged | het | 99.3786 | 99.4717 | 99.2856 | 75.6401 | 48016 | 255 | 47805 | 344 | 267 | 77.6163 |
| astatham-gatk | INDEL | * | lowcmp_AllRepeats_lt51bp_gt95identity_merged | hetalt | 97.7545 | 95.9460 | 99.6325 | 60.1730 | 14792 | 625 | 14910 | 55 | 54 | 98.1818 |
| astatham-gatk | INDEL | * | lowcmp_AllRepeats_lt51bp_gt95identity_merged | homalt | 99.5289 | 99.9220 | 99.1388 | 74.4236 | 30737 | 24 | 30737 | 267 | 261 | 97.7528 |
| astatham-gatk | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | * | 98.2824 | 98.0398 | 98.5261 | 67.2886 | 64070 | 1281 | 63840 | 955 | 849 | 88.9005 |
| astatham-gatk | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | het | 98.6446 | 98.9272 | 98.3637 | 73.5579 | 30153 | 327 | 29696 | 494 | 401 | 81.1741 |
| astatham-gatk | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | hetalt | 97.0426 | 94.5944 | 99.6209 | 40.6185 | 15802 | 903 | 16029 | 61 | 60 | 98.3607 |
| astatham-gatk | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | homalt | 98.7705 | 99.7193 | 97.8396 | 67.4095 | 18115 | 51 | 18115 | 400 | 388 | 97.0000 |
| astatham-gatk | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | * | 97.8091 | 97.3562 | 98.2662 | 76.6111 | 2099 | 57 | 2097 | 37 | 20 | 54.0541 |
| astatham-gatk | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | het | 97.8006 | 97.5318 | 98.0707 | 78.8291 | 1225 | 31 | 1220 | 24 | 7 | 29.1667 |
| astatham-gatk | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | hetalt | 93.2907 | 87.4251 | 100.0000 | 68.6975 | 146 | 21 | 149 | 0 | 0 | |
| astatham-gatk | INDEL | * | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | homalt | 98.7788 | 99.3179 | 98.2456 | 73.2684 | 728 | 5 | 728 | 13 | 13 | 100.0000 |