Polyfill.io, One Year Later: How to Detect a Compromised Third-Party Script Before It Hits You

Yves Soete
6 min read · Apr 8, 2026

Apr 8, 2026 · Written by Yves Soete, Blacksight LLC — scan your scripts free at scanner.blacksight.io

In June 2024, the polyfill.io domain was acquired by a new owner, and shortly afterward the script it served began injecting malicious code on mobile devices. The script was embedded on an estimated 100,000-plus websites. Most site owners found out when Cloudflare, Google, and security researchers started raising alarms — not from their own monitoring. A year and a half later, the lesson has not been absorbed. Most sites still load third-party scripts their operators could not enumerate if asked, and most still have no mechanism to notice when one of those scripts changes. Here is what actually catches these incidents, and what failed in 2024.



What actually happened with polyfill.io



The original polyfill.io was a community project providing compatibility shims for older browsers. It was an unremarkable dependency: include a script tag, browsers with missing features get polyfilled, done. In early 2024, the domain and GitHub org were sold to a Chinese CDN company. Within months, the served script began detecting mobile User-Agents and redirecting to sports-betting and scam sites. On desktop it behaved normally, which is why most automated checks did not catch it.

The new code was not static. It loaded additional payloads from a separate domain, which meant even sites that pinned SRI hashes on the polyfill.io script itself were not fully protected. The downstream payload could change without triggering the pinned hash on the entry-point script. It was a well-designed attack for the post-SRI world.



What caught it early (and what did not)



Static build-time scanners caught nothing: they do not look at runtime behavior, and the package on npm was unchanged. SAST tools caught nothing either, because nothing in any affected site's repository had changed. What caught the incident was runtime behavioral analysis — browsers and security researchers observing the served JavaScript change on live sites — and third-party threat intel from companies watching CDN-served payloads across the broader web.

For individual site operators, three signals would have flagged it earlier. One, inventory drift: if you knew which scripts loaded on your site yesterday, you could notice when a new payload URL appeared today. Two, SRI mismatch on pinned scripts: the entry-point script itself changed enough that an SRI pin would have broken, loudly. Three, Content Security Policy violation reports: if your CSP allowed polyfill.io but not the new payload domain, the browser would have reported the blocked request.



Three practical controls worth implementing



First, inventory every third-party script on every public page of your site. Most operators have never actually listed these out. A quick way: load your homepage in Chrome DevTools, go to Network, filter by JS, and enumerate every external domain. Do the same for checkout, account, and any page that handles sensitive data. The list is usually longer than you expect. Put it in a file, date it, and commit it to your infra repo.
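The DevTools walk-through above can also be scripted. Here is a minimal sketch in Python, assuming you have already fetched the page HTML yourself (for example with curl); the function names are illustrative, not part of any particular tool:

```python
# Sketch: extract third-party script hosts from a saved HTML page.
from html.parser import HTMLParser
from urllib.parse import urlparse

class ScriptSrcParser(HTMLParser):
    """Collects the src attribute of every <script> tag on the page."""
    def __init__(self):
        super().__init__()
        self.srcs = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                self.srcs.append(src)

def external_script_hosts(html: str, own_host: str) -> set:
    """Return the set of third-party hosts serving scripts on this page.
    Relative src values (same-origin scripts) have no netloc and are skipped."""
    parser = ScriptSrcParser()
    parser.feed(html)
    hosts = {urlparse(s).netloc for s in parser.srcs if urlparse(s).netloc}
    return hosts - {own_host}

if __name__ == "__main__":
    page = """<html><head>
      <script src="https://cdn.polyfill.io/v3/polyfill.min.js"></script>
      <script src="/js/app.js"></script>
      <script src="https://js.stripe.com/v3/"></script>
    </head></html>"""
    print(sorted(external_script_hosts(page, "example.com")))
    # ['cdn.polyfill.io', 'js.stripe.com']
```

Run it against each saved page, merge the sets, and you have a dated inventory file to commit. Note this only sees scripts present in the initial HTML; scripts injected at runtime need a headless-browser scan, discussed below.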

Second, add SRI hashes to scripts you control the version of. The polyfill.io scripts were not versioned, which is part of what made them dangerous. Scripts from vendors like Stripe, Cloudflare, and major analytics providers often do publish integrity hashes or version-pinned URLs — use them. For vendors that refuse, decide whether the script is worth the trust you are extending.
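Computing an integrity value yourself takes a few lines. This sketch implements the standard SRI scheme (a sha384 digest of the script bytes, base64-encoded, prefixed with the algorithm name):

```python
# Sketch: compute a Subresource Integrity (SRI) value for a script you
# vendor or version-pin. Format: "<algo>-" + base64(digest(script_bytes)).
import base64
import hashlib

def sri_hash(script_bytes: bytes, algo: str = "sha384") -> str:
    digest = hashlib.new(algo, script_bytes).digest()
    return f"{algo}-{base64.b64encode(digest).decode()}"

if __name__ == "__main__":
    js = b'console.log("hello");'
    print(sri_hash(js))
    # The value goes in the integrity attribute, e.g.:
    # <script src="https://vendor.example/lib-1.2.3.js"
    #         integrity="sha384-..." crossorigin="anonymous"></script>
```

The browser refuses to execute the script if the fetched bytes no longer match the pinned hash. This only works for version-pinned URLs; a URL whose content legitimately changes (as polyfill.io's did per User-Agent) cannot be pinned this way.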

Third, set up Content Security Policy with script-src restrictions and set the report-uri so CSP violations come back to a log you read. Start in report-only mode to avoid breaking things. After two weeks of logging, promote to enforcing mode with the known-good allowlist. Most sites can reach a working enforcing CSP in a day of tuning.
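As a starting point, a report-only policy is a single response header. The fragment below uses nginx syntax with placeholder vendor domains; substitute your own allowlist, and note that report-uri is deprecated in favor of report-to in CSP Level 3 but remains the more widely supported directive:

```nginx
# Report-only: the browser logs violations to your endpoint
# but does not block anything yet.
add_header Content-Security-Policy-Report-Only
    "script-src 'self' https://js.stripe.com https://www.googletagmanager.com; report-uri /csp-reports"
    always;
```

Once two weeks of reports show no false positives, rename the header to Content-Security-Policy to start enforcing.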



Automating the watch



Inventory and SRI are both static artifacts that go stale. The real control is continuous monitoring — some system that scans your live site on a schedule, compares the current third-party script set against last week's, and alerts on any new domain or any hash change. Without that, your inventory is correct for about an hour after you wrote it.
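The comparison step is simple enough to sketch. Assuming each scan produces a map of script URL to content hash (how you collect it, whether headless browser or nightly cron, is up to you), the diff logic might look like this; the structure and names are illustrative:

```python
# Sketch: diff two scan snapshots of {script_url: content_hash} and flag
# new domains, hash drift on known URLs, and scripts that disappeared.
from urllib.parse import urlparse

def diff_scans(previous: dict, current: dict) -> dict:
    prev_domains = {urlparse(u).netloc for u in previous}
    alerts = {"new_domains": set(), "hash_changed": set(), "removed": set()}
    for url, digest in current.items():
        domain = urlparse(url).netloc
        if domain not in prev_domains:
            alerts["new_domains"].add(domain)       # brand-new vendor
        elif url in previous and previous[url] != digest:
            alerts["hash_changed"].add(url)         # same URL, new payload
    alerts["removed"] = set(previous) - set(current)
    return alerts

if __name__ == "__main__":
    last_week = {"https://cdn.polyfill.io/v3/polyfill.min.js": "abc123"}
    today = {
        "https://cdn.polyfill.io/v3/polyfill.min.js": "def456",  # drift
        "https://evil.example/payload.js": "f00d",               # new vendor
    }
    print(diff_scans(last_week, today))
```

Anything in new_domains or hash_changed is worth a human look the same day; in the polyfill.io case, the hash drift on the entry-point script would have fired well before public disclosure.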

We built BlackSight's supply-chain scanner to be exactly this watchdog. It runs on a schedule, produces a diff against the previous scan, and flags new vendors and hash drift. But you can build equivalent tooling with a nightly Puppeteer script and a diff tool if you prefer. The key is that the check happens on a real browser fetching your live site, not on your build output.



The deeper lesson



The polyfill.io incident was not a failure of any specific tool. It was a failure of treating third-party scripts as a solved problem. Every vendor you include is a trust relationship that extends indefinitely unless you put controls in place. "Include this tag and forget about it" is not a viable operating model for any site handling sensitive data in 2026.

Audit what is loading on your site this week. If you have not enumerated third-party scripts in the last 90 days, that is probably the single highest-leverage security task on your list.

Inventory your third-party scripts free at scanner.blacksight.io/supply-chain-security

Liked this article? Get notified when new articles drop — visit blacksight.io/blog to subscribe.
