📊 Model Accuracy

How well do our edge predictions match actual outcomes? We grade ourselves so you don't have to.

A+

Brier Score: 0.066

Based on 79 resolved predictions

What is a Brier Score?

The Brier Score measures how close our probability estimates are to what actually happens. Lower is better: a perfect score is 0.000, and random guessing scores 0.250. For prediction markets, where you're comparing two imperfect probability sources, scores below 0.20 are considered strong.

It's the same metric used by election forecasters (FiveThirtyEight, The Economist) and professional weather services to grade prediction accuracy.
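For a binary market, the Brier Score is simply the mean squared difference between the forecast probability and the resolved outcome. A minimal sketch (an illustration of the metric itself, not EdgeScouts' actual scoring code):

```python
def brier_score(forecasts, outcomes):
    """Mean squared difference between forecast probabilities (0..1)
    and resolved outcomes (1 = it happened, 0 = it didn't)."""
    if len(forecasts) != len(outcomes):
        raise ValueError("forecasts and outcomes must have equal length")
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Confident, correct forecasts score near 0.000;
# always guessing 0.5 scores exactly 0.250.
print(round(brier_score([0.9, 0.8, 0.1], [1, 1, 0]), 3))  # 0.02
print(brier_score([0.5, 0.5], [1, 0]))                    # 0.25
```

Note that the score rewards calibration, not just picking winners: a well-calibrated 20% forecast that loses contributes less error than an overconfident 90% forecast that loses.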

A < 0.13 (Excellent)
B < 0.20 (Strong)
C < 0.25 (Competitive)
D < 0.30 (Needs Work)
F ≥ 0.30 (Poor)
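Those bands map a Brier score straight to a letter grade. A sketch of that mapping (the A+ cutoff isn't published on this page, so only A through F are covered):

```python
def letter_grade(brier):
    """Map a Brier score to the published grade bands (A+ cutoff unknown)."""
    for grade, cutoff in [("A", 0.13), ("B", 0.20), ("C", 0.25), ("D", 0.30)]:
        if brier < cutoff:
            return grade
    return "F"  # anything at or above 0.30

print(letter_grade(0.066))  # "A"
print(letter_grade(0.30))   # "F"
```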

๐Ÿท๏ธ By Category

๐Ÿ’ฐ Finance
A+
0.005
8 predictions ยท 87.5% WR
โšฝ Sports
A
0.103
13 predictions ยท 38.5% WR

⚽ By Sport

🥊 UFC
A+
0.076
5 predictions · 0.0% WR
🏀 NBA
A+
0.083
5 predictions · 60.0% WR

Our Methodology

EdgeScouts compares prices across prediction markets (Polymarket) against independent probability sources: Pinnacle sportsbook odds, weather forecasts, options-implied probabilities, and economic consensus data.

When our models detect a significant divergence (edge), we surface it on the dashboard. The Brier Score then measures how close our "fair value" estimates were to the actual outcome once each event resolves.
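The divergence check can be sketched as follows. This is a simplified illustration under assumptions: the example odds and prices are hypothetical, and the vig is removed by simple proportional normalization, which is one common method, not necessarily the one EdgeScouts uses.

```python
def implied_prob(decimal_odds):
    """Raw implied probability of one side from decimal odds."""
    return 1.0 / decimal_odds

def devig_two_way(odds_a, odds_b):
    """Strip the bookmaker's margin from a two-way market by
    normalizing the raw implied probabilities to sum to 1."""
    pa, pb = implied_prob(odds_a), implied_prob(odds_b)
    total = pa + pb  # > 1 because of the vig
    return pa / total, pb / total

def edge(market_price, fair_prob):
    """Divergence between a prediction-market price (0..1) and the
    devigged fair-value probability. Positive = market looks cheap."""
    return fair_prob - market_price

# Hypothetical numbers: a sportsbook quotes 1.80 / 2.10 on a two-way
# market while the prediction-market "yes" contract trades at 0.48.
fair_yes, fair_no = devig_two_way(1.80, 2.10)
print(round(fair_yes, 3), round(edge(0.48, fair_yes), 3))
```

In a setup like this, an alert would fire only when the computed edge clears some significance threshold, since small divergences are usually noise or fees.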

Scores recalculate daily. As our models improve, you'll see grades trend upward over time.

Last updated: 2026-03-18T14:14:51