What Is an SBOM and Why Every Website Needs One

Yves Soete
6 min read · Apr 21, 2026

Written by Yves Soete, Blacksight LLC — scan your vendor stack free at scanner.blacksight.io

SBOM — Software Bill of Materials — has become one of the most discussed concepts in enterprise security over the past two years. Executive orders reference it. Compliance frameworks require it. Vendors sell tooling around it. But most of the conversation focuses on code repositories and CI/CD pipelines, which is only half the story. If you operate a website that serves pages to real customers, the more relevant question is simpler and more uncomfortable: what third-party code is running on your live site right now, and do you have a complete inventory of it? For most operators, the honest answer is no. This post explains what an SBOM is in the web context, why the standard approach misses critical attack surface, and how to close the gap.



What an SBOM actually is



At its core, an SBOM is an inventory of every software component in a system. Think of it as a nutritional label for software — instead of listing calories and ingredients, it lists libraries, frameworks, versions, and origins. In the traditional development context, an SBOM catalogs your application's dependencies: every npm package, every Python library, every Go module, along with version numbers and known vulnerabilities.

In the web context, the definition expands. An SBOM for a live website means every JavaScript file that loads in the browser, every analytics pixel that fires, every CDN-hosted library pulled in via a script tag, every embedded widget from a third-party vendor, every tag manager script and the tags it injects, and every payment iframe or chat overlay that renders on the page. This is the runtime SBOM, and it differs fundamentally from a build-time SBOM because it captures what actually executes in a visitor's browser — not just what appears in your package.json or your repository's dependency tree.



Why build-time SBOMs miss half the picture



Tools like npm audit, Snyk, and Dependabot do valuable work scanning your repository for known vulnerabilities in declared dependencies. If you have a vulnerable version of lodash in your package-lock.json, these tools will find it. That is build-time scanning, and it covers the code you control and committed to version control.

But your live website loads code you never committed. Google Tag Manager injects scripts dynamically based on rules set in a marketing dashboard — your engineering team may not even know what is running. A/B testing platforms like Optimizely or VWO load JavaScript that modifies the DOM in real time. Chat widgets from Intercom, Drift, or Zendesk pull in their own dependency trees. Payment iframes from Stripe or Braintree load vendor-controlled code inside your page context. CDN-hosted libraries loaded via bare script tags — jQuery from cdnjs, fonts from Google, polyfills from a third-party CDN — exist only as URLs in your HTML, not as entries in any package manager.

None of this code appears in your repository. None of it shows up in npm audit. None of it is covered by Snyk or Dependabot. Build-time tools are scanning a map while the territory has changed beneath it.



The Polyfill.io wake-up call



In mid-2024, the domain polyfill.io was acquired by a new owner who began serving malicious code through the same CDN endpoint that over 100,000 websites had been loading via a simple script tag. The injected code redirected mobile users to scam sites, and it did so selectively — only triggering for certain user agents and geographies to avoid detection by automated scanners running from data centers.

The critical detail: not a single one of those 100,000+ affected sites had polyfill.io listed in their package.json. It was loaded via a script tag in HTML, completely invisible to every build-time dependency scanner on the market. There was no CVE to flag. There was no vulnerable package version to update. The attack vector was a runtime dependency that existed only in the browser, and the only way to detect it was to look at what the live site was actually loading.

A runtime SBOM would have caught this immediately. Any tool that inventoried the external scripts loading on the page would have flagged the polyfill.io domain, and any tool monitoring script content hashes would have detected the change the moment the malicious payload was deployed.
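The hash-monitoring idea is simple to sketch. The snippet below compares a freshly fetched script body against a previously recorded baseline hash; the URL and script bodies are illustrative stand-ins, and a real scanner would fetch the live script rather than use byte literals.

```python
import hashlib

def sha256_hex(script_body: bytes) -> str:
    return hashlib.sha256(script_body).hexdigest()

# Baseline recorded on a previous scan (illustrative body).
baseline = {
    "https://polyfill.io/v3/polyfill.min.js":
        sha256_hex(b"window.__polyfills = [];"),
}

def script_changed(url: str, fetched_body: bytes) -> bool:
    """True when the live script no longer matches the recorded baseline."""
    return sha256_hex(fetched_body) != baseline.get(url)

# Unchanged body -> no alert; a swapped payload -> alert.
assert not script_changed("https://polyfill.io/v3/polyfill.min.js",
                          b"window.__polyfills = [];")
assert script_changed("https://polyfill.io/v3/polyfill.min.js",
                      b"location.href = 'https://scam.example';")
```

The moment a CDN endpoint starts serving different bytes, the comparison fails, regardless of whether any CVE exists for the change.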



How to build a runtime SBOM for your website



The manual approach is straightforward and worth doing at least once. Open your site in Chrome, open DevTools, go to the Network tab, and filter by JS. Load your homepage, your checkout page, and any page that handles sensitive data. For every external domain you see loading scripts, record the full script URL, the domain it is served from, whether a Subresource Integrity hash is present on the script tag, the approximate file size, and when it was last modified if the server provides that header.
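The same audit can be run against saved HTML using only the standard library. This sketch extracts external script tags and checks whether each carries an `integrity` attribute; the sample HTML and the SRI hash value are illustrative.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ScriptAuditor(HTMLParser):
    """Collect external <script src=...> tags and note SRI presence."""
    def __init__(self):
        super().__init__()
        self.scripts = []

    def handle_starttag(self, tag, attrs):
        if tag != "script":
            return
        a = dict(attrs)
        src = a.get("src")
        if src and urlparse(src).netloc:  # external scripts only
            self.scripts.append({"url": src,
                                 "domain": urlparse(src).netloc,
                                 "has_sri": "integrity" in a})

html_doc = """
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.7.1/jquery.min.js"
        integrity="sha384-PLACEHOLDER" crossorigin="anonymous"></script>
<script src="https://cdn.polyfill.io/v3/polyfill.min.js"></script>
<script>console.log("inline, ignored");</script>
"""
auditor = ScriptAuditor()
auditor.feed(html_doc)
for s in auditor.scripts:
    print(s["domain"], "SRI" if s["has_sri"] else "NO SRI")
```

Running this on the sample prints `cdnjs.cloudflare.com SRI` and `cdn.polyfill.io NO SRI` — the second line is exactly the kind of unprotected dependency the audit is meant to surface. Note that parsing static HTML only sees what is in the source, not what tag managers inject later.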

Next, check your HTML source for any inline scripts that dynamically load additional resources. Tag managers are the biggest offender here — a single Google Tag Manager container can inject dozens of scripts that only appear after the initial page load, and they change whenever someone updates a tag in the GTM dashboard.

This manual process gives you a snapshot, which is better than nothing. The problems are that it is labor-intensive, it captures a single point in time, it misses scripts that load conditionally based on geography or device type, and it misses scripts injected after initial page load by other scripts. A snapshot taken on Monday may be inaccurate by Wednesday.
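Even with snapshots, the useful signal is the difference between them. A minimal diff over two point-in-time inventories (sets of script URLs, illustrative domains) looks like this:

```python
def diff_snapshots(monday: set[str], wednesday: set[str]) -> dict:
    """Compare two point-in-time script inventories."""
    return {"added": sorted(wednesday - monday),
            "removed": sorted(monday - wednesday)}

monday = {"https://a.example/app.js",
          "https://cdn.vendor.example/widget.js"}
wednesday = {"https://a.example/app.js",
             "https://cdn.vendor.example/widget.js",
             "https://tags.example/injected.js"}

print(diff_snapshots(monday, wednesday))
# {'added': ['https://tags.example/injected.js'], 'removed': []}
```

Anything in `added` is a script nobody reviewed on Monday, which is precisely what a change alert should flag.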



Automating the runtime SBOM



Automation turns a one-time audit into a living inventory. The approach is to visit the site programmatically using a real browser engine, wait for all scripts to load including dynamically injected ones, catalog every third-party script with its source domain and content hash, check each script tag for SRI compliance, cross-reference vendor domains against known breach databases and threat intelligence feeds, and repeat on a schedule.
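The cataloging step of that loop can be sketched as follows. In a real scanner the URLs would come from a headless browser's network log (for example, Playwright's `page.on("request")` handler); here the capture is stubbed with a static list so the grouping logic is visible, and all domains are hypothetical.

```python
from urllib.parse import urlparse
from collections import defaultdict

# Stub for the browser capture step -- a real implementation would record
# these from a headless browser session, including dynamically injected scripts.
captured_requests = [
    "https://www.example-shop.com/static/app.js",
    "https://www.googletagmanager.com/gtm.js?id=GTM-XXXX",
    "https://widget.example-chat.com/loader.js",
]

first_party = "www.example-shop.com"

def build_inventory(urls, own_domain):
    """Group third-party script URLs by the domain serving them."""
    inventory = defaultdict(list)
    for url in urls:
        domain = urlparse(url).netloc
        if domain and domain != own_domain:
            inventory[domain].append(url)
    return dict(inventory)

print(sorted(build_inventory(captured_requests, first_party)))
# ['widget.example-chat.com', 'www.googletagmanager.com']
```

From here, each third-party domain can be hashed, checked for SRI, and cross-referenced against threat feeds on every scheduled run.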

BlackSight Scanner automates exactly this workflow. It visits your site like a real browser, builds a complete runtime inventory of every third-party script, checks SRI compliance, and cross-references vendors against breach history. Continuous scanning on the Plus plan at $29/mo turns the manual audit into a persistent watch that alerts you when anything changes — a new script appears, an existing script's hash changes, or a vendor domain shows up in breach reports.



Who needs a runtime SBOM right now



If you process payments on a web page, you need one. PCI DSS 4.0 Requirement 6.4.3, which became mandatory in March 2025, explicitly requires an inventory of all scripts executing on payment pages, a mechanism to confirm each script is authorized, and integrity assurance for each script. A runtime SBOM is the foundation for all three requirements. Without one, you have a compliance gap that your QSA will find.
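The authorization check that 6.4.3 calls for reduces to comparing the observed inventory against an approved list. A minimal sketch, with a hypothetical allowlist and observed set:

```python
# Hypothetical allowlist of domains authorized on the payment page.
authorized = {"js.stripe.com", "www.example-shop.com"}

# Domains actually observed loading scripts on the live payment page.
observed = {"js.stripe.com", "www.example-shop.com",
            "cdn.unknown-vendor.example"}

unauthorized = sorted(observed - authorized)
print(unauthorized)  # ['cdn.unknown-vendor.example']
```

Every entry in `unauthorized` is either a script someone added without review or an injection — both demand investigation before the next assessment.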

If you handle personal data under GDPR, the accountability principle in Article 5(2) means you need to demonstrate that you know what code is processing visitor data. Third-party scripts that set cookies, fingerprint browsers, or exfiltrate form data without your knowledge are your liability, not the vendor's.

If your site loads more than a handful of third-party scripts — and nearly every production site does — you have runtime dependencies you are not tracking. E-commerce sites are the highest-priority target because they combine payment data, personal data, and heavy third-party script usage in a single page. But SaaS login pages, healthcare portals, financial dashboards, and any page that handles sensitive input carry the same risk.

The honest test: open DevTools on your most sensitive page and count the external script domains. If the number is higher than you expected, you need a runtime SBOM.



Closing the gap



The gap between what is in your repository and what runs in the browser is the gap most supply-chain attacks exploit. Build-time tools guard one side. Runtime SBOMs guard the other. Neither is sufficient alone, and right now, almost everyone is running only the build-time half.

Start with the manual audit. Open DevTools, list your external scripts, and be honest about how many you did not know were there. Then decide whether a spreadsheet you update monthly is realistic, or whether continuous automated scanning is worth the investment. Either way, the first step is the same: know what is running on your site.

Run a free supply-chain scan at scanner.blacksight.io/supply-chain-security

Liked this article? Get notified when new articles drop — visit blacksight.io/blog to subscribe.
