Comparison
AllSides covers ~600 outlets reviewed by 5 editors. Web Jury covers thousands, rated by the crowd.

- **AllSides:** hand-curated bias bucket assignments by a small editorial team. High precision per outlet, limited scale.
- **Web Jury:** crowd-sourced bias and accuracy ratings, trust-weighted to dampen brigading. Open scale, full distribution visible.
| Feature | AllSides | Web Jury |
|---|---|---|
| Number of outlets rated | ~600 | Thousands (growing) |
| Bias methodology | 5-editor blind review | Trust-weighted crowd vote |
| Accuracy methodology | Not rated | Separate crowd vote per outlet |
| Vote distribution shown | No | Yes (full histogram, not just the median) |
| Per-article ratings | | Coming Q2 (outlets and individual articles) |
| Public API | Paid | Free + paid tiers |
| Browser extension | | |
| Coverage of YouTubers + Twitter accounts | No | Yes |
| Coverage of non-US outlets | Limited | Yes (UK, India, EU growing) |
| Pricing for readers | Free + subscriptions | Free |
| Open methodology | Documented | Documented + open-source planned |
AllSides does excellent work — but their methodology requires a 5-editor blind review for every outlet they classify. That's a hard cap on coverage. They've rated 600 outlets in 10 years. Meanwhile, tens of thousands of news sources, YouTubers, and creators have material political audiences. The long tail is invisible to an editor-led process.
Crowd-sourced ratings have an obvious failure mode: brigading. A coordinated group can push an outlet's bias score to the extreme. Our main defense is trust weighting: each rater's influence scales with their track record, so low-trust accounts barely move the score.
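As a minimal sketch of the trust-weighting idea (the function names, the -2..+2 bias scale, and the trust values are illustrative assumptions, not Web Jury's actual scoring code):

```python
from collections import Counter

def weighted_bias_score(votes):
    """Aggregate bias votes (-2 = far left ... +2 = far right) for one outlet.

    Each vote is (score, trust), where trust in [0, 1] reflects the rater's
    track record. Low-trust accounts (new or brigading ones) carry little
    weight, so a coordinated push barely moves the aggregate.
    """
    total = sum(trust for _, trust in votes)
    if total == 0:
        return None  # no trusted votes yet
    return sum(score * trust for score, trust in votes) / total

def weighted_histogram(votes):
    """Full weighted distribution, so readers see the spread, not just a point."""
    hist = Counter()
    for score, trust in votes:
        hist[score] += trust
    return dict(hist)

# Three established raters vs. two low-trust accounts brigading toward +2:
votes = [(-1, 0.9), (0, 0.8), (0, 0.7), (2, 0.05), (2, 0.05)]
print(round(weighted_bias_score(votes), 2))  # → -0.28
```

Because the brigade votes carry almost no trust, the score stays near where the established raters put it, and the histogram still exposes the suspicious cluster at +2.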
Neither model is perfect. We think the crowd, weighted intelligently, beats five editors at scale. Try it and decide for yourself.
Browse the bias map. Rate a source. See how the crowd compares to AllSides' editors.
Other comparisons