PrecisionFDA
Truth Challenge

Engage with our community challenges and help improve DNA test results

Explore HG002 comparison results
Use this interactive explorer to filter the full result set by submission entry and across multiple stratification dimensions.
Showing results 801-850 of 86,044. All rows below share Entry raldana-dualsentieon, Type INDEL, and Genotype homalt. Empty cells correspond to metrics left blank for a subset (for example, recall when the subset contains no truth variants, or % FP ma when there are no query false positives).

| Subtype | Subset | F-score | Recall | Precision | Frac_NA | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma |
|---|---|---|---|---|---|---|---|---|---|---|---|
| I6_15 | map_l100_m0_e0 | 91.6667 | 91.6667 | 91.6667 | 86.0465 | 11 | 1 | 11 | 1 | 0 | 0.0000 |
| I6_15 | lowcmp_SimpleRepeat_triTR_51to200 | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| I6_15 | lowcmp_SimpleRepeat_triTR_11to50 | 100.0000 | 100.0000 | 100.0000 | 62.1795 | 59 | 0 | 59 | 0 | 0 | |
| I6_15 | lowcmp_SimpleRepeat_quadTR_gt200 | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| I6_15 | lowcmp_SimpleRepeat_quadTR_51to200 | 0.0000 | | 0.0000 | 95.0000 | 0 | 0 | 0 | 1 | 1 | 100.0000 |
| I6_15 | lowcmp_SimpleRepeat_quadTR_11to50 | 99.5392 | 100.0000 | 99.0826 | 66.3060 | 216 | 0 | 216 | 2 | 2 | 100.0000 |
| I6_15 | lowcmp_SimpleRepeat_homopolymer_gt10 | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| I6_15 | lowcmp_SimpleRepeat_homopolymer_6to10 | 100.0000 | 100.0000 | 100.0000 | 81.1538 | 49 | 0 | 49 | 0 | 0 | |
| I6_15 | lowcmp_SimpleRepeat_diTR_gt200 | 0.0000 | 0.0000 | 0.0000 | | 0 | 0 | 0 | 0 | 0 | |
| I6_15 | lowcmp_SimpleRepeat_diTR_51to200 | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| I6_15 | lowcmp_SimpleRepeat_diTR_11to50 | 96.8354 | 100.0000 | 93.8650 | 76.7806 | 153 | 0 | 153 | 10 | 9 | 90.0000 |
| I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_merged | 97.0484 | 99.6364 | 94.5915 | 70.0964 | 822 | 3 | 822 | 47 | 45 | 95.7447 |
| I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_gt95identity_merged | 95.0920 | 99.6785 | 90.9091 | 69.4991 | 310 | 1 | 310 | 31 | 30 | 96.7742 |
| I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt51bp_gt95identity_merged | 94.4724 | 100.0000 | 89.5238 | 67.0846 | 188 | 0 | 188 | 22 | 21 | 95.4545 |
| I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_lt101bp_gt95identity_merged | 93.5412 | 99.5261 | 88.2353 | 68.7254 | 210 | 1 | 210 | 28 | 27 | 96.4286 |
| I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_gt200bp_gt95identity_merged | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRlt7_51to200bp_gt95identity_merged | 86.7925 | 95.8333 | 79.3103 | 78.0303 | 23 | 1 | 23 | 6 | 6 | 100.0000 |
| I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt51bp_gt95identity_merged | 98.5507 | 100.0000 | 97.1429 | 69.4323 | 68 | 0 | 68 | 2 | 2 | 100.0000 |
| I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt101bp_gt95identity_merged | 98.3957 | 100.0000 | 96.8421 | 68.8525 | 92 | 0 | 92 | 3 | 3 | 100.0000 |
| I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_gt200bp_gt95identity_merged | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | 97.2973 | 100.0000 | 94.7368 | 68.5950 | 36 | 0 | 36 | 2 | 2 | 100.0000 |
| I6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | 97.0484 | 99.6364 | 94.5915 | 70.0964 | 822 | 3 | 822 | 47 | 45 | 95.7447 |
| I6_15 | lowcmp_AllRepeats_lt51bp_gt95identity_merged | 97.9010 | 100.0000 | 95.8884 | 72.2268 | 653 | 0 | 653 | 28 | 27 | 96.4286 |
| I6_15 | lowcmp_AllRepeats_gt200bp_gt95identity_merged | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| I6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | 92.0635 | 98.3051 | 86.5672 | 75.1852 | 58 | 1 | 58 | 9 | 9 | 100.0000 |
| I6_15 | func_cds | 100.0000 | 100.0000 | 100.0000 | 42.3077 | 15 | 0 | 15 | 0 | 0 | |
| I6_15 | decoy | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| I6_15 | HG002compoundhet | 16.3588 | 100.0000 | 8.9080 | 58.9623 | 31 | 0 | 31 | 317 | 315 | 99.3691 |
| I6_15 | HG002complexvar | 98.1789 | 99.9176 | 96.4996 | 55.2987 | 1213 | 1 | 1213 | 44 | 44 | 100.0000 |
| I6_15 | * | 97.3324 | 99.7115 | 95.0642 | 52.3622 | 6221 | 18 | 6221 | 323 | 318 | 98.4520 |
| I1_5 | tech_badpromoters | 100.0000 | 100.0000 | 100.0000 | 58.0645 | 13 | 0 | 13 | 0 | 0 | |
| I1_5 | segdupwithalt | 0.0000 | | | 100.0000 | 0 | 0 | 0 | 0 | 0 | |
| I1_5 | segdup | 99.4720 | 99.5772 | 99.3671 | 92.4798 | 471 | 2 | 471 | 3 | 3 | 100.0000 |
| I1_5 | map_siren | 99.6286 | 99.5050 | 99.7525 | 77.1148 | 1206 | 6 | 1209 | 3 | 2 | 66.6667 |
| I1_5 | map_l250_m2_e1 | 96.7033 | 95.6522 | 97.7778 | 94.5718 | 44 | 2 | 44 | 1 | 1 | 100.0000 |
| I1_5 | map_l250_m2_e0 | 96.6292 | 95.5556 | 97.7273 | 94.5342 | 43 | 2 | 43 | 1 | 1 | 100.0000 |
| I1_5 | map_l250_m1_e0 | 96.5517 | 95.4545 | 97.6744 | 93.4947 | 42 | 2 | 42 | 1 | 1 | 100.0000 |
| I1_5 | map_l250_m0_e0 | 100.0000 | 100.0000 | 100.0000 | 96.2185 | 9 | 0 | 9 | 0 | 0 | |
| I1_5 | map_l150_m2_e1 | 98.5222 | 98.0392 | 99.0099 | 87.7204 | 200 | 4 | 200 | 2 | 1 | 50.0000 |
| I1_5 | map_l150_m2_e0 | 98.5000 | 98.0100 | 98.9950 | 87.6012 | 197 | 4 | 197 | 2 | 1 | 50.0000 |
| I1_5 | map_l150_m1_e0 | 98.4772 | 97.9798 | 98.9796 | 85.7765 | 194 | 4 | 194 | 2 | 1 | 50.0000 |
| I1_5 | map_l150_m0_e0 | 98.5075 | 98.5075 | 98.5075 | 87.2624 | 66 | 1 | 66 | 1 | 1 | 100.0000 |
| I1_5 | map_l125_m2_e1 | 99.1228 | 98.8338 | 99.4135 | 83.7309 | 339 | 4 | 339 | 2 | 1 | 50.0000 |
| I1_5 | map_l125_m2_e0 | 99.1176 | 98.8270 | 99.4100 | 83.4553 | 337 | 4 | 337 | 2 | 1 | 50.0000 |
| I1_5 | map_l125_m1_e0 | 99.0798 | 98.7768 | 99.3846 | 81.8942 | 323 | 4 | 323 | 2 | 1 | 50.0000 |
| I1_5 | map_l125_m0_e0 | 98.6900 | 99.1228 | 98.2609 | 83.5479 | 113 | 1 | 113 | 2 | 1 | 50.0000 |
| I1_5 | map_l100_m2_e1 | 99.3513 | 99.2593 | 99.4434 | 80.9339 | 536 | 4 | 536 | 3 | 2 | 66.6667 |
| I1_5 | map_l100_m2_e0 | 99.3402 | 99.2467 | 99.4340 | 80.8110 | 527 | 4 | 527 | 3 | 2 | 66.6667 |
| I1_5 | map_l100_m1_e0 | 99.3237 | 99.2278 | 99.4197 | 79.1700 | 514 | 4 | 514 | 3 | 2 | 66.6667 |
| I1_5 | map_l100_m0_e0 | 99.2806 | 99.5192 | 99.0431 | 78.4758 | 207 | 1 | 207 | 2 | 1 | 50.0000 |
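The displayed summary metrics follow directly from the count columns: recall is Truth TP / (Truth TP + Truth FN), precision is Query TP / (Query TP + Query FP), and the F-score is their harmonic mean. A minimal sketch in Python (the function name is my own; the example counts are taken from the HG002compoundhet row above):

```python
def comparison_metrics(truth_tp: int, truth_fn: int, query_tp: int, query_fp: int):
    """Recompute F-score, recall, and precision (as percentages rounded to
    4 decimals, matching the explorer's display) from the raw counts."""
    recall = truth_tp / (truth_tp + truth_fn)        # fraction of truth variants recovered
    precision = query_tp / (query_tp + query_fp)     # fraction of query calls that are correct
    f_score = 2 * precision * recall / (precision + recall)  # harmonic mean of P and R
    return tuple(round(100 * m, 4) for m in (f_score, recall, precision))

# HG002compoundhet row: Truth TP=31, Truth FN=0, Query TP=31, Query FP=317
print(comparison_metrics(31, 0, 31, 317))  # (16.3588, 100.0, 8.908)
```

Note that recall is computed against the truth counts while precision is computed against the query counts, which is why rows such as map_siren can show Truth TP (1206) differing from Query TP (1209): complex representations can match several query records to one truth record.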