As of September 2025, only 53% of origins in Chrome’s real-user dataset pass all three Core Web Vitals — LCP, INP, and CLS — simultaneously. This article compiles verified data from the 2024 Web Almanac, Chrome UX Report, and Google’s developer documentation to show where Chrome Lighthouse scores stand heading into 2026, across performance, accessibility, and JavaScript audits.
Chrome Lighthouse Statistics: Key Numbers for 2026
- 53% of origins in CrUX passed all three Core Web Vitals as of September 2025.
- Mobile INP pass rates rose from 55% to 74% between 2022 and 2024.
- 71% of sites tested in the HTTP Archive still fail the color contrast accessibility check.
- 38% of mobile pages ship JavaScript that could be reduced with minification alone.
- The median mobile Lighthouse performance score for WordPress sites reached 38 out of 100 in 2024.
How the Chrome Lighthouse Performance Score Works
The Lighthouse performance score is a weighted average of five lab metrics, not a single measurement. Each metric maps to a log-normal distribution built from real HTTP Archive data.
| Metric | Score Weight | “Good” Threshold |
|---|---|---|
| Total Blocking Time (TBT) | 30% | Under 200ms |
| Largest Contentful Paint (LCP) | 25% | Under 2.5 seconds |
| Cumulative Layout Shift (CLS) | 25% | Under 0.1 |
| First Contentful Paint (FCP) | 10% | Under 1.8 seconds |
| Speed Index (SI) | 10% | Under 3.4 seconds |
Source: Chrome for Developers
TBT carries the most weight at 30%, yet it never appears in real-user field data; it serves as a lab proxy for INP, which replaced First Input Delay as a Core Web Vital in March 2024. Each metric's scoring curve is anchored to two control points drawn from HTTP Archive data: the median maps to a score of 50 and the 10th percentile to a score of 90, meaning only the top 10% of sites define what "90+" looks like for any given metric.
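To make the mechanics concrete, here is a minimal Python sketch of both steps, assuming curves anchored so the HTTP Archive median maps to a 0.5 score and the 10th percentile to 0.9; the LCP control points (4,000 ms median, 2,500 ms p10) and the per-metric scores in the usage line are illustrative values, not figures from a specific Lighthouse release.

```python
import math
from statistics import NormalDist

# Lighthouse 10 performance weights (from the table above)
WEIGHTS = {"TBT": 0.30, "LCP": 0.25, "CLS": 0.25, "FCP": 0.10, "SI": 0.10}

_STD_NORMAL = NormalDist()

def lognormal_score(value, median, p10):
    """Map a raw metric value to a 0-1 score on a log-normal curve
    anchored so the median scores 0.5 and the 10th percentile scores 0.9."""
    sigma = (math.log(p10) - math.log(median)) / _STD_NORMAL.inv_cdf(0.1)
    z = (math.log(value) - math.log(median)) / sigma
    return 1 - _STD_NORMAL.cdf(z)  # lower metric values -> higher scores

def performance_score(metric_scores):
    """Weighted average of per-metric 0-1 scores, scaled to 0-100."""
    return round(100 * sum(WEIGHTS[m] * s for m, s in metric_scores.items()))

# Illustrative control points for mobile LCP: median 4,000 ms, p10 2,500 ms
lcp_score = lognormal_score(2500, median=4000, p10=2500)  # lands at the 0.9 anchor
```

Under these assumed control points, an LCP of exactly 2.5 seconds lands on the 0.9 anchor, which is why the "good" threshold roughly corresponds to a metric score of 90.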
Chrome Lighthouse Mobile Performance Scores by CMS Platform
The 2024 Web Almanac tested nearly 17 million websites monthly using Lighthouse 12.0.0, drawing from the June 2024 HTTP Archive crawl. Every CMS platform with at least 50,000 sites in the dataset improved year over year — but only Duda and Wix cleared the 50-point mark on mobile, and no platform came close to the 90-point "good" band.
| CMS Platform | 2023 Median Score | 2024 Median Score | Year-over-Year Change |
|---|---|---|---|
| Duda | 56 | 59 | +3 |
| Wix | 50 | 55 | +5 |
| TYPO3 CMS | 42 | 47 | +5 |
| Drupal | 36 | 40 | +4 |
| Joomla | 35 | 39 | +4 |
| WordPress | 33 | 38 | +5 |
| 1C-Bitrix | 31 | 33 | +2 |
| Weebly | 32 | 33 | +1 |
| Squarespace | 28 | 30 | +2 |
Source: HTTP Archive Web Almanac 2024, CMS chapter
Duda and Wix lead the table partly because both are closed platforms — performance updates roll out centrally without requiring individual site owners to install plugins or updates. WordPress gained 5 points, matching Wix and TYPO3 for the largest single-year jump, but its median of 38 still sits deep in the red zone.
The desktop-to-mobile gap is consistent across all platforms. Wix’s median desktop score reached 85 in 2024 while its mobile score was 55 — a 30-point difference that mirrors the broader pattern seen across the web.
Chrome Lighthouse Accessibility Audit Pass Rates
The Lighthouse accessibility score is a weighted average of 57 individual audits powered by Deque’s axe-core engine. Each audit is binary (pass or fail, with no partial credit), and higher-impact audits carry more weight. The median accessibility score across the web reached 84% in 2024, up from 83% in 2022.
| Accessibility Audit | Pass Rate (2022) | Pass Rate (2024) | Change |
|---|---|---|---|
| aria-allowed-attr | 82% | 95% | +13pp |
| frame-title | 36% | 51% | +15pp |
| aria-input-field-name | 14% | 21% | +7pp |
| aria-progressbar-name | 3% | 14% | +11pp |
| color-contrast | 23% | 29% | +6pp |
Source: HTTP Archive Web Almanac 2024, Accessibility chapter
Frame-title, which checks whether iframes have descriptive titles for screen reader users, recorded the largest jump at 15 percentage points. Despite that gain, it still fails on nearly half of all tested pages.
Color contrast improved by just 6 percentage points, leaving 71% of sites in the HTTP Archive dataset failing this check. A separate analysis of 63,000-plus websites found that 88% were not compliant with current web accessibility standards, with an average score of around 60 out of 100.
The 2025 Web Almanac shows continued progress: the median Lighthouse accessibility score climbed to 85%, with the aria-tooltip-name audit jumping from 74% to 87% pass rate in a single year.
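The binary, weighted model described above can be sketched in a few lines. The audit ids below are real, but the weights and pass/fail results are hypothetical, chosen purely to illustrate the all-or-nothing scoring:

```python
# Hypothetical audit results: (audit id, passed, weight).
# Lighthouse assigns each axe-core audit a weight; the values
# here are made up for illustration, not taken from Lighthouse.
AUDITS = [
    ("aria-allowed-attr", True, 10),
    ("frame-title", True, 7),
    ("aria-input-field-name", False, 7),
    ("color-contrast", False, 7),
]

def accessibility_score(audits):
    """Weighted share of passing audits, scaled to 0-100.
    Each audit contributes all of its weight or none of it."""
    total = sum(weight for _, _, weight in audits)
    earned = sum(weight for _, passed, weight in audits if passed)
    return round(100 * earned / total)

accessibility_score(AUDITS)  # 17 of 31 weight points earned -> 55
```

Because there is no partial credit, a page that fixes nine of ten contrast violations scores exactly the same on that audit as a page that fixes none.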
Core Web Vitals Pass Rates in Chrome’s CrUX Dataset
Core Web Vitals rankings use CrUX field data — real measurements from Chrome browsers — not Lighthouse lab scores. Google requires 75% of page loads to meet the “good” threshold for a site to pass each metric.
| Core Web Vital | Good Threshold | Mobile Pass Rate (2022) | Mobile Pass Rate (2024) |
|---|---|---|---|
| INP (replaced FID, March 2024) | Under 200ms | 55% | 74% |
| LCP | Under 2.5s | — | ~52% |
| CLS | Under 0.1 | — | Higher than LCP |
Source: HTTP Archive Web Almanac 2024, Performance chapter; Chrome for Developers
INP’s mobile pass rate rising from 55% to 74% in two years is the most notable shift in the dataset. The Web Almanac attributes this to developer awareness following INP’s promotion to Core Web Vital status, hardware improvements on users’ devices, and optimizations within Chrome itself.
LCP on mobile sits at roughly 52% passing, making it the hardest individual metric to clear. Image optimization explains much of the variance: 48% of mobile pages served an LCP image of 100KB or less, while 8% exceeded 1,000KB — a 10x gap that shows how unevenly optimization is applied across the web.
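The 75% rule reduces to a simple share check over field samples. A sketch, where the 2,500 ms threshold is the documented "good" value for LCP and the per-load samples are hypothetical:

```python
GOOD_LCP_MS = 2500  # documented "good" threshold for LCP

def passes_core_web_vital(samples, good_threshold, required_share=0.75):
    """True when at least 75% of page loads meet the 'good' threshold —
    equivalently, when the 75th percentile of samples is within it."""
    good = sum(1 for value in samples if value <= good_threshold)
    return good / len(samples) >= required_share

# Hypothetical field samples (ms): 6 of 8 loads come in under 2.5 s
lcp_samples = [1800, 2100, 2400, 2600, 3900, 1500, 2300, 2450]
passes_core_web_vital(lcp_samples, GOOD_LCP_MS)  # True: exactly 75% are good
```

Note that the slowest loads are irrelevant once a site clears the bar: a handful of 10-second outliers cannot fail an origin whose 75th percentile is healthy.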
Chrome Lighthouse JavaScript Audit Results
One of Lighthouse’s most commonly triggered audits flags unminified JavaScript. In 2024, 62% of mobile pages scored between 0.9 and 1.0 on this check, meaning 38% of mobile pages still ship JavaScript that could be reduced without changing any logic.
The median mobile page in 2024 made 22 JavaScript requests, with the 90th percentile reaching 68 requests. Both figures increased compared to 2022 — by one and four requests, respectively. Every additional JavaScript request risks adding to TBT, which carries the largest single weight in the Lighthouse performance score at 30%.
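TBT itself has a precise definition: it sums the portion of each main-thread task beyond 50 ms, for tasks between First Contentful Paint and Time to Interactive. A sketch with hypothetical task durations:

```python
BLOCKING_THRESHOLD_MS = 50  # only time beyond 50 ms counts as blocking

def total_blocking_time(task_durations_ms):
    """Sum each task's duration in excess of 50 ms (the blocking portion).
    Assumes the inputs are main-thread tasks between FCP and TTI."""
    return sum(max(0, duration - BLOCKING_THRESHOLD_MS)
               for duration in task_durations_ms)

# Hypothetical tasks (ms): blocking portions are 0, 30, 200, 0 and 70
total_blocking_time([30, 80, 250, 45, 120])  # 300 ms, above the 200 ms "good" bar
```

This is why splitting one 250 ms task into five 50 ms chunks removes 200 ms of TBT outright, even though the total work is unchanged.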
FAQs
What is a good Chrome Lighthouse performance score?
Scores of 90 or above are considered good. Scores from 50 to 89 need improvement, and anything below 50 is categorized as poor. Only around 10% of sites in the HTTP Archive dataset score 90 or above on any individual metric.
How does Lighthouse differ from Core Web Vitals field data?
Lighthouse uses simulated lab conditions, while Core Web Vitals are measured using real Chrome user data via CrUX. Lab and field scores can differ substantially, especially on mobile, due to network and device variation.
Why do mobile Lighthouse scores lag so far behind desktop?
Lighthouse simulates a mid-tier Android device on a throttled connection for mobile tests. Wix, for example, scored 55 on mobile and 85 on desktop in 2024 — a 30-point gap consistent across most CMS platforms.
What replaced First Input Delay in Core Web Vitals?
Interaction to Next Paint (INP) replaced First Input Delay as a Core Web Vital in March 2024. The mobile INP pass rate rose from 55% in 2022 to 74% in 2024, the largest single metric improvement in the dataset.
Which accessibility audit fails on the most websites?
Color contrast fails on 71% of sites in the HTTP Archive dataset as of 2024, despite being one of the most impactful checks for user readability. It improved by only 6 percentage points between 2022 and 2024.
