PrecisionFDA
Truth Challenge

Engage and improve DNA test results with our community challenges

Explore HG002 comparison results
Use this interactive explorer to filter the full result set across submission entries and multiple stratification dimensions (variant type, subtype, genomic subset, and genotype).
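The same filtering can be reproduced offline on rows exported from this page. A minimal sketch in plain Python, using three rows transcribed from the table below; the dictionary keys and the `filter_rows` helper are illustrative names, not part of the precisionFDA site:

```python
# A few rows transcribed from this page's table; keys mirror the column
# headers (Subtype, Subset, Genotype, F-score, Recall, Precision).
rows = [
    {"subtype": "D16_PLUS", "subset": "lowcmp_SimpleRepeat_triTR_11to50",
     "genotype": "het", "f_score": 98.3329, "recall": 98.3607, "precision": 98.3051},
    {"subtype": "D16_PLUS", "subset": "lowcmp_SimpleRepeat_triTR_51to200",
     "genotype": "het", "f_score": 78.9474, "recall": 75.0000, "precision": 83.3333},
    {"subtype": "D16_PLUS", "subset": "lowcmp_SimpleRepeat_triTR_51to200",
     "genotype": "homalt", "f_score": 96.2963, "recall": 100.0000, "precision": 92.8571},
]

def filter_rows(rows, **criteria):
    """Keep rows whose fields match every given criterion.

    A criterion value may be a plain value (exact match) or a callable
    predicate, e.g. f_score=lambda v: v < 90.
    """
    def matches(row):
        return all(
            crit(row[key]) if callable(crit) else row[key] == crit
            for key, crit in criteria.items()
        )
    return [r for r in rows if matches(r)]

het_rows = filter_rows(rows, genotype="het")            # 2 rows
low_f = filter_rows(rows, f_score=lambda v: v < 90.0)   # 1 row
```

Combining criteria narrows the selection the same way stacking filters does in the explorer, e.g. `filter_rows(rows, subtype="D16_PLUS", genotype="het")`.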
Rows 33451–33500 of 86044. All rows in this extract come from entry raldana-dualsentieon with variant type INDEL; the remaining columns are tabulated below.

| Subtype | Subset | Genotype | F-score | Recall | Precision | Frac_NA | Truth TP | Truth FN | Query TP | Query FP | FP gt | % FP ma |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| D16_PLUS | lowcmp_SimpleRepeat_triTR_11to50 | het | 98.3329 | 98.3607 | 98.3051 | 75.0000 | 60 | 1 | 58 | 1 | 1 | 100.0000 |
| D16_PLUS | lowcmp_SimpleRepeat_triTR_51to200 | het | 78.9474 | 75.0000 | 83.3333 | 88.4615 | 6 | 2 | 5 | 1 | 1 | 100.0000 |
| D16_PLUS | lowcmp_SimpleRepeat_triTR_51to200 | homalt | 96.2963 | 100.0000 | 92.8571 | 41.6667 | 13 | 0 | 13 | 1 | 1 | 100.0000 |
| D1_5 | * | hetalt | 96.8333 | 93.8702 | 99.9896 | 62.1760 | 9617 | 628 | 9660 | 1 | 1 | 100.0000 |
| D1_5 | HG002complexvar | hetalt | 97.5400 | 95.2663 | 99.9249 | 72.3364 | 1288 | 64 | 1331 | 1 | 1 | 100.0000 |
| D1_5 | HG002compoundhet | hetalt | 96.8240 | 93.8528 | 99.9896 | 57.7055 | 9588 | 628 | 9588 | 1 | 1 | 100.0000 |
| D1_5 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt101bp_gt95identity_merged | het | 99.2916 | 98.8069 | 99.7809 | 75.8402 | 911 | 11 | 911 | 2 | 1 | 50.0000 |
| D1_5 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt51bp_gt95identity_merged | * | 99.6226 | 99.3415 | 99.9054 | 80.1278 | 1056 | 7 | 1056 | 1 | 1 | 100.0000 |
| D1_5 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt51bp_gt95identity_merged | homalt | 99.8647 | 100.0000 | 99.7297 | 81.5645 | 369 | 0 | 369 | 1 | 1 | 100.0000 |
| D1_5 | map_l100_m0_e0 | het | 98.0506 | 97.8003 | 98.3022 | 82.9226 | 578 | 13 | 579 | 10 | 1 | 10.0000 |
| D1_5 | map_l150_m0_e0 | * | 97.4141 | 97.5779 | 97.2509 | 89.7535 | 282 | 7 | 283 | 8 | 1 | 12.5000 |
| D1_5 | map_l150_m0_e0 | homalt | 97.6190 | 96.4706 | 98.7952 | 89.0933 | 82 | 3 | 82 | 1 | 1 | 100.0000 |
| D1_5 | map_l150_m1_e0 | homalt | 98.4479 | 97.3684 | 99.5516 | 86.1491 | 222 | 6 | 222 | 1 | 1 | 100.0000 |
| D1_5 | map_l150_m2_e0 | homalt | 98.5386 | 97.5207 | 99.5781 | 86.9780 | 236 | 6 | 236 | 1 | 1 | 100.0000 |
| D1_5 | map_l250_m1_e0 | * | 96.1877 | 95.9064 | 96.4706 | 94.3428 | 164 | 7 | 164 | 6 | 1 | 16.6667 |
| D1_5 | map_l250_m1_e0 | het | 95.5357 | 96.3964 | 94.6903 | 94.5725 | 107 | 4 | 107 | 6 | 1 | 16.6667 |
| D1_5 | map_l250_m2_e0 | * | 96.4578 | 96.1957 | 96.7213 | 94.6460 | 177 | 7 | 177 | 6 | 1 | 16.6667 |
| D1_5 | map_l250_m2_e0 | het | 95.9016 | 96.6942 | 95.1220 | 94.7771 | 117 | 4 | 117 | 6 | 1 | 16.6667 |
| D1_5 | map_l250_m2_e1 | * | 96.4770 | 96.2162 | 96.7391 | 94.7489 | 178 | 7 | 178 | 6 | 1 | 16.6667 |
| D1_5 | map_l250_m2_e1 | het | 95.9350 | 96.7213 | 95.1613 | 94.8612 | 118 | 4 | 118 | 6 | 1 | 16.6667 |
| D1_5 | map_siren | het | 99.0760 | 98.8142 | 99.3392 | 78.2192 | 2250 | 27 | 2255 | 15 | 1 | 6.6667 |
| D6_15 | * | hetalt | 96.4666 | 93.1857 | 99.9870 | 32.1144 | 7617 | 557 | 7666 | 1 | 1 | 100.0000 |
| D6_15 | HG002complexvar | hetalt | 97.2668 | 94.7680 | 99.9010 | 47.1204 | 960 | 53 | 1009 | 1 | 1 | 100.0000 |
| D6_15 | lowcmp_AllRepeats_51to200bp_gt95identity_merged | hetalt | 92.1644 | 85.5030 | 99.9515 | 30.1491 | 2023 | 343 | 2061 | 1 | 1 | 100.0000 |
| D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331 | hetalt | 96.4144 | 93.0885 | 99.9868 | 28.9050 | 7502 | 557 | 7550 | 1 | 1 | 100.0000 |
| D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_51to200bp_gt95identity_merged | homalt | 98.8506 | 100.0000 | 97.7273 | 76.5957 | 43 | 0 | 43 | 1 | 1 | 100.0000 |
| D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt101bp_gt95identity_merged | homalt | 99.7333 | 100.0000 | 99.4681 | 59.3514 | 374 | 0 | 374 | 2 | 1 | 50.0000 |
| D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt51bp_gt95identity_merged | het | 99.4851 | 99.1786 | 99.7934 | 64.4901 | 483 | 4 | 483 | 1 | 1 | 100.0000 |
| D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_TRgt6_lt51bp_gt95identity_merged | homalt | 99.7093 | 100.0000 | 99.4203 | 57.3020 | 343 | 0 | 343 | 2 | 1 | 50.0000 |
| D6_15 | lowcmp_Human_Full_Genome_TRDB_hg19_150331_all_merged | hetalt | 96.4144 | 93.0885 | 99.9868 | 28.9050 | 7502 | 557 | 7550 | 1 | 1 | 100.0000 |
| D6_15 | lowcmp_SimpleRepeat_quadTR_51to200 | hetalt | 96.7499 | 93.8209 | 99.8677 | 23.9437 | 744 | 49 | 755 | 1 | 1 | 100.0000 |
| D6_15 | lowcmp_SimpleRepeat_triTR_11to50 | homalt | 99.8873 | 100.0000 | 99.7748 | 33.9286 | 443 | 0 | 443 | 1 | 1 | 100.0000 |
| D6_15 | lowcmp_SimpleRepeat_triTR_51to200 | * | 97.8571 | 96.4789 | 99.2754 | 41.7722 | 137 | 5 | 137 | 1 | 1 | 100.0000 |
| D6_15 | lowcmp_SimpleRepeat_triTR_51to200 | het | 95.8333 | 95.8333 | 95.8333 | 71.7647 | 23 | 1 | 23 | 1 | 1 | 100.0000 |
| D6_15 | map_l100_m1_e0 | het | 96.4143 | 96.0317 | 96.8000 | 86.3983 | 121 | 5 | 121 | 4 | 1 | 25.0000 |
| D6_15 | map_l100_m1_e0 | homalt | 98.4375 | 98.4375 | 98.4375 | 84.1975 | 63 | 1 | 63 | 1 | 1 | 100.0000 |
| D6_15 | map_l100_m2_e0 | het | 95.7529 | 94.6565 | 96.8750 | 86.7632 | 124 | 7 | 124 | 4 | 1 | 25.0000 |
| D6_15 | map_l100_m2_e0 | homalt | 98.4615 | 98.4615 | 98.4615 | 84.9188 | 64 | 1 | 64 | 1 | 1 | 100.0000 |
| D6_15 | map_l100_m2_e1 | het | 95.8801 | 94.8148 | 96.9697 | 86.5990 | 128 | 7 | 128 | 4 | 1 | 25.0000 |
| D6_15 | map_l100_m2_e1 | homalt | 98.5075 | 98.5075 | 98.5075 | 84.8073 | 66 | 1 | 66 | 1 | 1 | 100.0000 |
| D6_15 | map_l125_m1_e0 | * | 97.3913 | 95.7265 | 99.1150 | 87.6096 | 112 | 5 | 112 | 1 | 1 | 100.0000 |
| D6_15 | map_l125_m1_e0 | het | 96.8254 | 95.3125 | 98.3871 | 89.1986 | 61 | 3 | 61 | 1 | 1 | 100.0000 |
| D6_15 | map_l125_m2_e0 | * | 96.7480 | 94.4444 | 99.1667 | 87.9154 | 119 | 7 | 119 | 1 | 1 | 100.0000 |
| D6_15 | map_l125_m2_e0 | het | 95.6522 | 92.9577 | 98.5075 | 89.2456 | 66 | 5 | 66 | 1 | 1 | 100.0000 |
| D6_15 | map_l125_m2_e1 | * | 96.3855 | 93.7500 | 99.1736 | 88.0788 | 120 | 8 | 120 | 1 | 1 | 100.0000 |
| D6_15 | map_l125_m2_e1 | het | 95.6522 | 92.9577 | 98.5075 | 89.4155 | 66 | 5 | 66 | 1 | 1 | 100.0000 |
| D6_15 | map_siren | het | 97.4729 | 96.4286 | 98.5401 | 83.9390 | 270 | 10 | 270 | 4 | 1 | 25.0000 |
| D6_15 | map_siren | homalt | 98.8506 | 99.2308 | 98.4733 | 81.7803 | 129 | 1 | 129 | 2 | 1 | 50.0000 |
| I16_PLUS | * | hetalt | 94.2306 | 89.1325 | 99.9472 | 55.4588 | 1870 | 228 | 1892 | 1 | 1 | 100.0000 |
| I16_PLUS | HG002complexvar | het | 98.7816 | 97.7444 | 99.8410 | 62.2675 | 650 | 15 | 628 | 1 | 1 | 100.0000 |
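The percentage columns above are derivable from the count columns in the standard way for variant-comparison output: Recall = Truth TP / (Truth TP + Truth FN), Precision = Query TP / (Query TP + Query FP), and F-score is the harmonic mean of the two. A small sketch (the function name `happy_metrics` is illustrative), checked against the first D16_PLUS row, whose counts read Truth TP=60, Truth FN=1, Query TP=58, Query FP=1:

```python
def happy_metrics(truth_tp, truth_fn, query_tp, query_fp):
    """Recall, precision and F-score (as percentages, rounded to 4 places)
    from truth/query true- and false-call counts."""
    recall = truth_tp / (truth_tp + truth_fn)          # sensitivity vs. truth set
    precision = query_tp / (query_tp + query_fp)       # PPV vs. query set
    f_score = 2 * precision * recall / (precision + recall)  # harmonic mean
    return tuple(round(100 * m, 4) for m in (recall, precision, f_score))

# First D16_PLUS row of the table above:
print(happy_metrics(60, 1, 58, 1))  # → (98.3607, 98.3051, 98.3329)
```

The same relation reproduces every row, e.g. `happy_metrics(6, 2, 5, 1)` gives the 75.0000 / 83.3333 / 78.9474 of the second D16_PLUS row; Frac_NA is not derivable from these four counts, since it depends on the calls excluded from comparison.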