npm registry Security Audit: 14 Findings, Three of Which Are Just npm Packages Named admin, manage, and debug
https://registry.npmjs.org/react

About This API
The npm registry at registry.npmjs.org is the metadata API behind every npm install on the planet. It is operated by npm Inc., now part of GitHub (a Microsoft subsidiary since the 2020 acquisition), and it serves the world's largest software registry — over 3 million packages, with billions of monthly tarball downloads. The registry is a read-mostly CouchDB-shaped JSON API, fronted by Cloudflare for caching and DDoS protection.
The endpoint we scanned is https://registry.npmjs.org/react — the canonical metadata document for React. It returns a single, large JSON document containing the package's name, description, license, repository URL, every published version (2,791 of them at scan time), the dist-tags map (latest: 19.2.5, plus beta, rc, canary, experimental, next), publish timestamps for every version, the README, the maintainer list, and per-version dist objects with the tarball URL, SHA-1 shasum, SHA-512 integrity hash, file count, unpacked size, and a Sigstore signature.
The format is the same for every package. Whether you fetch /react, /lodash, or /some-package-no-one-has-heard-of, you get the same shape: name, versions, dist-tags, time, maintainers, readme. The endpoint is the contract that npm, yarn, pnpm, bun, and every CI system on Earth depend on.
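That shared shape can be sketched as a minimal check. The field names below are the ones listed above; the sample document is illustrative only (placeholder values, not a real registry response):

```javascript
// Minimal shape check for a registry metadata document.
// Field names match the response described above; sample values are placeholders.
function looksLikeRegistryDoc(doc) {
  return typeof doc.name === 'string'
    && typeof doc['dist-tags'] === 'object'
    && typeof doc.versions === 'object'
    && typeof doc.time === 'object';
}

const sample = {
  name: 'react',
  'dist-tags': { latest: '19.2.5' },
  versions: { '19.2.5': { dist: { tarball: 'https://…', integrity: 'sha512-…' } } },
  time: { '19.2.5': '…' },
  maintainers: [],
  readme: '…',
};

console.log(looksLikeRegistryDoc(sample)); // → true
```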
This audit is the latest in middleBrick's public-API case-study series. It's also the largest API we've scanned by traffic volume — every developer with a Node project, every CI run, every build of every JavaScript application on Earth, hits this surface multiple times a day. So it deserves a careful read. But it also deserves an honest one: most of what a black-box scanner finds here is not what's actually interesting about the registry's security posture.
Threat Model
Almost nothing about the npm registry's threat model is captured by a per-endpoint scan. The endpoint itself is read-only and Cloudflare-cached; an attacker compromising the read API yields no leverage that wasn't already available by running npm view react from any machine.
What matters about npm-registry security is upstream and downstream of this endpoint. Upstream: account compromise, maintainer takeover, malicious-publish, dependency-confusion attacks. Downstream: what a package manager does with the metadata we just read. Neither is in scope for a black-box probe of /react.
The trust boundary that does matter
The interesting question for this scan is what a downstream tool — npm CLI, yarn, a custom registry mirror, an SBOM scanner, a Renovate-style update bot — should and shouldn't trust from the response. The answer is: trust the integrity hashes and the Sigstore signatures, distrust everything else by default.
The dist.integrity field (e.g. sha512-llUJLzz1zTUBrskt...) is a content hash of the tarball. If the registry — or any mirror, or any MITM — serves a tarball whose SHA-512 doesn't match the integrity from the metadata, package managers refuse to install. The dist.signatures array carries Sigstore-style signatures linked to the registry's signing key (keyid: SHA256:DhQ8wR5APBvFHLF/+Tc+AYvPOdTpcIDqOhxsBHRwC7U), and modern npm clients verify these. Both of these survive any of the structural findings in this scan, because they are protected by cryptographic primitives independent of the API surface.
The fields that aren't cryptographically protected — maintainers, repository.url, homepage, readme, the embedded URL list — are exactly the fields a downstream tool should treat as advisory rather than trusted. A maintainer email in the metadata isn't a credential check. A repository.url pointing at https://github.com/facebook/react isn't a verified link to the source. The embedded URLs in the README aren't curated. Tools that elevate any of these fields to a security-decision input — for example, "alert me if a maintainer email changes" or "display the linked GitHub repo as canonical" — are inheriting a trust they don't have.
What this scan does not tell you
It does not tell you whether the registry's account-takeover protections are working. It does not tell you whether 2FA enforcement on publish is universal (it's been required for high-impact packages since 2022, expanded in 2023). It does not tell you whether typosquats are being caught. It does not tell you whether a malicious publish-script is propagating through dependencies right now. Those are the threat-model questions that actually matter for npm-registry security, and they aren't observable from a black-box probe of /react. Nothing in this writeup should be read as a verdict on registry-level security; it is a verdict on what you can and can't infer from the metadata API surface.
Methodology
middleBrick ran a black-box scan against https://registry.npmjs.org/react using HTTP GET. The scan executed twelve security checks across the OWASP API Top 10 categories: authentication, authorization (BOLA / BFLA / property-level), input validation, CORS, rate limiting, encryption, data exposure, inventory management, unsafe-consumption, SSRF, and an LLM probe. The scan was read-only — no credentials, no destructive methods, no payload mutation beyond standard path-suffix probes.
Fourteen findings resulted: one CRITICAL, five HIGH, five MEDIUM, and three LOW. Of the fourteen, three are scanner false positives caused by the registry's flat package namespace; the remaining eleven are accurate, but most are structural to a public read-only metadata API rather than actionable.
The BFLA probe — what happens when /admin is a real package
The scanner's BFLA (broken function-level authorization) check appends well-known privileged paths to the base URL and looks for 200 responses. On most APIs this surface — /admin, /manage, /config, /internal, /debug, /health — yields 401, 403, or 404. On the npm registry, /admin, /manage, and /debug all return 200 with a JSON document — because each of those is a real npm package whose metadata the registry served. The admin package is described as 'Drop-in Node.js admin endpoint to help you analyze production issues.' The debug package is the well-known logging library used by tens of thousands of projects. The scanner saw 200 + JSON and flagged HIGH-severity privileged-endpoint exposure; the reality is that the registry has a flat namespace and any single-segment path resolves to whatever package shares that name, or 404 if no such package exists.
This is worth flagging because it's the kind of false-positive that is identifiable only with domain knowledge. A scanner without a registry-shaped exception cannot tell the difference between 'admin endpoint exposed' and 'someone published a package called admin in 2010.' For this writeup, the relevant point is that the registry's flat namespace is the design, the BFLA probe finds packages instead of admin endpoints, and a downstream consumer should not infer privileged surface from the scan output.
Results Overview
The npm registry endpoint scored 75 out of 100 — a B grade — across fourteen findings.
One CRITICAL: no authentication required on GET. Structural. Reading public package metadata is the registry's purpose; gating it behind credentials would break npm install for every developer on Earth. The finding stays critical-by-default in the scoring engine so that on APIs where unauthenticated read does matter, the signal isn't suppressed.
Five HIGH: three are the false-positive BFLA-style findings (/admin, /manage, /debug — all real packages, see methodology). The other two are wildcard CORS (access-control-allow-origin: *, structural for a public read API consumed from arbitrary origins) and missing rate-limit headers — accurate, structural, mitigated by Cloudflare-layer rate-limiting that the scanner cannot observe.
Five MEDIUM: response exceeds 1MB (the /react document is 6.5 MB because every published version's metadata is included), excessively-large response (same observation, different heuristic), embedded email addresses in the response body (maintainer email [email protected] is published by design), verbose error information (the readme field includes structured error examples from React documentation, which the heuristic correctly flagged on shape but not on intent), and 9,440 external URLs across multiple hosts (the readme renders to a document linking to GitHub, react.dev, blog posts, and the apache.org license — all expected on a public package landing page).
Three LOW: missing security headers (HSTS, X-Content-Type-Options, X-Frame-Options absent on the response), 59 plain-HTTP URLs inside the HTTPS response (mostly old README content from package versions published a decade ago), and no URL versioning on the metadata path. The last finding is technically correct — the /react path has no /v1/ prefix — but the registry does have a versioned namespace at /-/v1/ for namespaced operations like search, which the scanner didn't probe.
The honest single-sentence summary: of fourteen findings, eleven are accurate and structural to a public read-only metadata API, three are false positives caused by package-namespace collision, and zero describe an actionable risk to the registry's security posture.
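One detail from the versioning finding is worth making concrete: the registry's namespaced surface at /-/v1/ is easy to probe by hand. The sketch below builds a query against /-/v1/search, which is a real registry endpoint; the text and size parameters are the commonly used ones:

```javascript
// Build a query against the registry's namespaced, versioned search surface.
// /-/v1/search is the endpoint the scanner didn't probe; text/size are the
// commonly used query parameters.
function searchUrl(text, size = 5) {
  const params = new URLSearchParams({ text, size: String(size) });
  return `https://registry.npmjs.org/-/v1/search?${params}`;
}

console.log(searchUrl('react'));
// → https://registry.npmjs.org/-/v1/search?text=react&size=5
```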
Detailed Findings
API accessible without authentication
The endpoint returned 200 without any authentication credentials.
Implement authentication (API key, OAuth 2.0, or JWT) for all API endpoints.
Privileged endpoint accessible: /admin
/admin returned 200 without authentication. This may expose admin functionality.
Restrict access to admin/management endpoints. Implement RBAC with proper role checks.
Privileged endpoint accessible: /manage
/manage returned 200 without authentication. This may expose admin functionality.
Restrict access to admin/management endpoints. Implement RBAC with proper role checks.
CORS allows all origins (wildcard *)
Access-Control-Allow-Origin is set to *, allowing any website to make requests.
Restrict CORS to specific trusted origins. Avoid wildcard in production.
Debug endpoint accessible: /debug
A debug/diagnostic endpoint is publicly accessible — may leak internal state.
Disable debug endpoints in production. Restrict access to internal networks only.
Missing rate limiting headers
Response contains no X-RateLimit-* or Retry-After headers. Without rate limiting, the API is vulnerable to resource exhaustion attacks (DoS, brute force, abuse).
Implement rate limiting (token bucket, sliding window) and return X-RateLimit-Limit, X-RateLimit-Remaining, and Retry-After headers.
Response exceeds 1MB
Response body is 6470KB — may indicate missing pagination.
Implement pagination for large collections. Set maximum response size limits.
Email addresses exposed in response body
Response body contains email addresses. This constitutes a PII/sensitive data leak (CWE-200).
Remove or mask sensitive data before returning to clients. Implement field-level access controls and output filtering.
Verbose error information in response
API response contains detailed error information that could aid attackers.
Return generic error messages to clients. Log detailed errors server-side only.
Excessively large API response
Response body is 6470KB — may indicate excessive data exposure.
Implement pagination, field selection, or response size limits.
Multiple external URLs in API response
Response references 9440 external URLs across 6 host(s): github.com, www.apache.org, facebook.github.io, reactjs.org, openpgpjs.org
Validate and sanitize all external URLs. Implement allowlists for trusted third-party services.
Missing security headers (3/4)
Missing: HSTS — protocol downgrade attacks; X-Content-Type-Options — MIME sniffing; X-Frame-Options — clickjacking.
Add the following headers to all API responses: strict-transport-security, x-content-type-options, x-frame-options.
HTTP URLs in HTTPS response
Response contains 59 plain HTTP URL(s) — potential mixed content issue.
Use HTTPS for all URLs in API responses.
No API versioning detected
The API URL doesn't include a version prefix (e.g., /v1/) and no version header is present.
Implement API versioning via URL path (/v1/), header (API-Version), or query parameter.
Attacker Perspective
An attacker compromising the metadata read endpoint gains essentially nothing. The data is public; you can curl https://registry.npmjs.org/react from anywhere and get the same response. The interesting attacker-perspective questions are upstream and downstream of the scanned surface, and they're worth walking through because they're the questions the scan does not answer.
Upstream: account takeover and malicious publish
The supply-chain attacks that have actually moved the needle in the npm ecosystem — event-stream (2018), ua-parser-js (2021), colors/faker sabotage (2022), the wave of typosquats and malicious lookalikes throughout the last six years — all happened on the publish side. An attacker who compromises a maintainer account or persuades a maintainer to merge a malicious dependency does not need to attack the read API at all; they just publish, and the read API serves the result to everyone who runs npm install. The registry's defenses against this class of attack — required 2FA on publish for popular packages, provenance attestations, npm's malware-scanning pipeline — are not visible to a black-box read probe. None of them are tested, validated, or even observable from this scan.
The pattern an attacker actually exploits
Read the readme field on the response. It is rendered to HTML by the registry's web UI, displayed on npmjs.com/package/react, and embedded into editor extensions, IDE tooltips, and AI coding assistants that pull package metadata for context. An attacker who controls the README controls a chunk of UI surface across the JavaScript ecosystem. This is one of the recurring features of npm-registry attacks: the README isn't just documentation; it's attacker-controllable content that the metadata API distributes into an enormous downstream rendering surface.
Downstream: what a tool wrongly trusts
Every tool that reads from registry.npmjs.org — npm CLI, yarn, pnpm, bun, Renovate, Dependabot, every SBOM scanner, every license-compliance tool, every "package health" dashboard — has to make a decision about which fields to trust. The fields that are cryptographically protected (dist.integrity, dist.signatures) survive a registry compromise because their verification uses external trust anchors. The fields that aren't (maintainers, repository.url, readme, the embedded URL list) only survive a registry compromise if you don't elevate them to security-decision inputs. A tool that flags 'this package's repository URL is now github.com/totally-legit/react' as benign-because-the-API-said-so has trusted the wrong field.
Analysis
Two findings deserve a closer look because they explain how to read the rest of the report.
The /admin, /manage, /debug false positive
The BFLA probe found that /admin, /manage, and /debug all return 200 with JSON. Spot check:
$ curl -s https://registry.npmjs.org/admin | head -c 400
{"_id":"admin","_rev":"...","name":"admin","description":"Drop-in Node.js admin endpoint to help you analyze production issues.","dist-tags":{"latest":"..."},"versions":{...},...}

This is the metadata document for an actual npm package called admin. The registry has a flat namespace where /{package-name} resolves to package metadata for that name (or 404 if no such package). The scanner's privileged-path heuristic was designed against APIs that gate /admin behind RBAC; npm doesn't gate it because there is no admin endpoint at /admin, just a package whose author chose that name. The same is true for debug (the very popular logging library originally by TJ Holowaychuk) and manage (a much smaller package with the same accidental collision).
This is the kind of false positive that's only identifiable with domain knowledge of the API. It's worth calling out because it's a recurring pattern in scanning any registry-shaped API: a flat namespace creates collisions between security-significant paths and user-content paths, and a generic scanner cannot tell which is which.
Why the response is 6.5 MB
The 'response exceeds 1MB' and 'excessively-large response' findings both fire on the same observation. The /react metadata document is 6.5 MB because the registry returns metadata for every published version of the package — 2,791 versions for React, dating back to 0.0.1 in 2011 — in a single response. Each version entry includes its dist object, dependency tree, scripts, files, and most of the package.json. Multiplied by 2,791, the response inflates rapidly.
This is a deliberate design choice. The npm CLI fetches the full metadata document so it can resolve a version selector like ^18.0.0 against the entire publish history without paginated round-trips. Pagination would require multiple requests for every npm install against a large package, and the registry's caching is built around full-document caching at the Cloudflare edge. The cost is that a fresh fetch of react is a 6.5 MB transfer; the benefit is that subsequent requests serve from cache for everyone, the response is gzippable to roughly 700 KB, and the resolution algorithm runs locally without registry round-trips.
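The local-resolution step is easy to sketch. The following is a simplified stand-in for what the CLI's semver logic does with the full versions map (it ignores prerelease tags and the other edge cases the real semver package handles, and the version list is a hypothetical subset of React's publish history):

```javascript
// Simplified stand-in for resolving a caret range (^18.0.0) against the full
// versions map locally, the way npm does after a single metadata fetch.
// Ignores prerelease tags and other semver edge cases.
function parse(v) {
  return v.split('.').map(Number);
}

function satisfiesCaret(version, base) {
  const [vMaj, vMin, vPat] = parse(version);
  const [bMaj, bMin, bPat] = parse(base);
  if (vMaj !== bMaj) return false;        // caret pins the major version
  if (vMin !== bMin) return vMin > bMin;  // any higher minor qualifies
  return vPat >= bPat;                    // same minor: patch must not regress
}

function maxSatisfying(versions, caretBase) {
  const ok = versions.filter(v => satisfiesCaret(v, caretBase));
  ok.sort((a, b) => {
    const [aM, am, ap] = parse(a), [bM, bm, bp] = parse(b);
    return (aM - bM) || (am - bm) || (ap - bp);
  });
  return ok.length ? ok[ok.length - 1] : null;
}

// Hypothetical subset of React's published versions
const published = ['17.0.2', '18.0.0', '18.2.0', '18.3.1', '19.0.0'];
console.log(maxSatisfying(published, '18.0.0')); // → '18.3.1'
```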
For a downstream tool building against the registry, the structural lesson is: the registry is optimized for batch metadata transfer with edge caching, not for paginated lookup. If your tool only needs one version, fetch the version-specific document at /react/19.2.5 instead, which is roughly 5 KB.
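The per-version lookup is a one-liner. A sketch, assuming the /{name}/{version} path shape described above (npm's own client percent-encodes the inner slash of scoped names):

```javascript
// Build the per-version metadata URL instead of fetching the full document.
// Assumes the /{name}/{version} path shape; scoped packages (@scope/name)
// get the inner slash percent-encoded, as npm's client does.
function versionDocUrl(name, version) {
  return `https://registry.npmjs.org/${name.replace('/', '%2F')}/${version}`;
}

console.log(versionDocUrl('react', '19.2.5'));
// → https://registry.npmjs.org/react/19.2.5
console.log(versionDocUrl('@types/node', '20.0.0'));
// → https://registry.npmjs.org/@types%2Fnode/20.0.0
```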
Maintainer emails (the 'PII' finding)
The maintainers field on every package contains {name, email} entries. For React: {name: 'react-bot', email: '[email protected]'} and one other Meta-team entry. The scanner correctly flagged this as PII (CWE-200, sensitive data in response). The reality is that maintainer emails are published intentionally — they're the contact channel for security disclosures, takeover-recovery, and the npm-registry's own provenance model. Removing them would break the disclosure flow that the entire ecosystem depends on. On a real-business API this would be a leak. On a public registry it's a feature.
Industry Context
The npm registry sits in the same architectural family as PyPI (Python), RubyGems, crates.io (Rust), Maven Central (Java), packagist (PHP / Composer), and Go's module proxy. All of them serve public read-only metadata for an enormous number of small packages, all are hit by every install on Earth, and all of them face essentially the same threat model: the read endpoint is operationally robust and largely uninteresting from a security-test perspective; the supply-chain attack surface is on the publish side and on the trust transitivity downstream.
Among them, npm's content-integrity model — SHA-512 integrity hashes plus Sigstore-style signatures plus the recently introduced provenance attestations from npm publish --provenance — is one of the more developed. PyPI added Sigstore-backed attestations (PEP 740) in 2024. RubyGems has had per-version SHA hashes for a long time but only adopted strong signing recently. Maven Central has a different model: PGP signatures verified against a key directory. Across all of them, the guidance for downstream tools is the same: trust the cryptographic fields, treat the rest as advisory.
For a custom registry mirror — Nexus, JFrog Artifactory, GitHub Packages, an internal proxy — the temptation is to copy the upstream registry's response shape verbatim. Most of npm's response-level findings then propagate: wildcard CORS, missing rate-limit headers, embedded plain-HTTP URLs from old READMEs, and no URL versioning on package paths all become inherited findings unless the mirror reshapes the response. The registry's loose authentication model is fine for npm because the data is public, but a private mirror serving private packages with the same shape will fail authentication audits and likely PCI/SOC 2 controls if it's serving paid customers. The mirror needs to layer authentication on top, gate package-name namespaces by entitlement, emit rate-limit headers, and set a tight CORS allowlist. Mirrors that simply reverse-proxy the upstream response inherit a posture that was acceptable for the public read use case and is not acceptable for a private one.
OWASP API Top 10 mapping: API1 (intentional no-auth on a public read), API4 (no rate-limit headers — mitigated by edge layer not visible to scan), API5 (BFLA probe false positives, no real privileged surface), API8 (CORS wildcard + missing security headers, structural), API9 (no URL versioning on package path — partly mitigated by the namespaced /-/v1/ path that the scan didn't probe), API10 (unsafe consumption of external URLs in README content). Six of ten categories touched, none describing actionable registry-level risk.
Remediation Guide
Fetch the abbreviated metadata when full publish history isn't needed
If you're a registry consumer that doesn't need the full version history, send Accept: application/vnd.npm.install-v1+json. The registry returns a smaller document with only the fields the install algorithm uses, dramatically reducing response size.
// Fetch abbreviated install metadata
const res = await fetch('https://registry.npmjs.org/react', {
headers: { Accept: 'application/vnd.npm.install-v1+json' }
});
const meta = await res.json();
// meta.versions[v].dist still has integrity + signatures
// but the response is much smaller (no readme, no time, no users)

Verify integrity hashes on every tarball you fetch
The dist.integrity field is the cryptographic primitive that protects against tarball substitution. Every install must verify it. npm CLI does this by default; if you're writing a custom installer or registry mirror, don't skip it.
import ssri from 'ssri';
// `tarballBuf` is the bytes of the .tgz you fetched
// `expectedIntegrity` is the dist.integrity from the metadata
const expected = ssri.parse(expectedIntegrity);
const actual = ssri.fromData(tarballBuf, { algorithms: ['sha512'] });
if (!actual.match(expected)) {
throw new Error('Integrity mismatch — refusing to install tarball');
}

Verify Sigstore signatures, not just hashes
Modern npm CLI verifies dist.signatures against the registry's published signing key. If you're consuming registry metadata in your own tooling, verify the signature rather than treating integrity as the only check.
// Pseudocode using the npm registry's public keys
// (fetched from https://registry.npmjs.org/-/npm/v1/keys)
const keys = await fetch('https://registry.npmjs.org/-/npm/v1/keys')
.then(r => r.json());
const { sig, keyid } = dist.signatures[0];
const pubKey = keys.keys.find(k => k.keyid === keyid);
const payload = `${packageName}@${version}:${dist.integrity}`;
const valid = await verify(pubKey, sig, payload);
if (!valid) throw new Error('Signature verification failed');

Treat maintainers / repository.url / README as advisory only
These fields are NOT cryptographically protected against registry tampering. Don't elevate them to security-decision inputs without independent verification.
// BAD: trusting repository.url for security decisions
if (pkg.repository.url.includes('github.com/facebook/react')) {
markAsTrusted(pkg); // attacker can claim any URL here
}
// BETTER: verify via npm provenance attestation
const provenance = await fetch(
`https://registry.npmjs.org/-/npm/v1/attestations/${name}@${version}`
).then(r => r.json());
// provenance.attestations[].predicate.buildDefinition.externalParameters
// contains the verified source repo & build trigger from Sigstore.

If you operate a registry mirror, don't reverse-proxy the response shape verbatim
The npm registry's loose authentication and wildcard-CORS posture is acceptable for the public read use case. A private mirror serving paid customers needs to add authentication, gate package-name namespaces by entitlement, emit rate-limit headers, and set a tight CORS allowlist.
// Verdaccio / Nexus / Artifactory custom mirror config
// Layer auth on top of the upstream response shape
app.get('/:package', requireAuth, requireEntitlement, async (req, res) => {
const upstream = await fetchFromNpm(req.params.package);
// Re-emit with mirror-specific headers
res.setHeader('X-RateLimit-Limit', '1000');
res.setHeader('X-RateLimit-Remaining', String(req.rateRemaining));
res.setHeader('Access-Control-Allow-Origin', 'https://your-mirror-ui.example.com');
res.json(upstream);
});

Sanitize README content before rendering it as HTML in your tool
The README contains arbitrary user-submitted Markdown that gets rendered to HTML for display in npm.com, IDE tooltips, AI coding assistants, and package-health dashboards. Sanitize it.
import { marked } from 'marked';
import DOMPurify from 'isomorphic-dompurify';
const html = marked.parse(pkg.readme);
const safe = DOMPurify.sanitize(html, {
ALLOWED_TAGS: ['p','a','code','pre','strong','em','ul','ol','li','h1','h2','h3','blockquote'],
ALLOWED_ATTR: ['href'],
ALLOWED_URI_REGEXP: /^https:/ // reject http:, javascript:, data:
});
renderInTooltip(safe);

Defense in Depth
If you maintain the npm registry, the action items from this scan are minimal — and most of them are structural decisions that have already been made deliberately. The genuinely useful improvements are not visible from a per-endpoint scan; they're on the publish-side and on tooling-trust transitivity, which is where actual ecosystem damage happens.
1. The CORS wildcard, missing rate-limit headers, missing security headers. Each of these is intentional or mitigated at the Cloudflare edge. CORS wildcard is required because the registry is consumed by browser-based tools (Bundlephobia, npm.io, code playgrounds) from arbitrary origins. Rate limits exist but live at the edge layer and aren't surfaced as response headers; surfacing X-RateLimit-* on metadata responses would let abusive clients tune to the boundary, which the registry deliberately doesn't expose. HSTS is set at the host level via Cloudflare; the missing-on-response-itself finding is a structural quirk of how the static-cached metadata response is served.
2. The 6.5 MB response size. Already a deliberate trade-off — full-document caching versus paginated fetches. The introduction of the abbreviated metadata format (application/vnd.npm.install-v1+json, returned when the client sends that Accept header) is the registry's existing answer; clients that opt into it get a smaller document with only the fields the install algorithm needs. Encouraging more clients to use the abbreviated content type is the cleaner improvement than reshaping the default response.
3. The README content (embedded HTTP URLs, the email-pattern finding). The maintainer-email finding is intentional and required for the disclosure flow. The 59 plain-HTTP URLs in the response come from old README content for old package versions; rewriting historical READMEs would be a destructive change and isn't worth it. New READMEs should use HTTPS URLs (most do).
For consumers — every tool that reads from the registry — the defenses are different and worth enumerating clearly:
Verify the integrity hash. Every tarball install must verify dist.integrity against the SHA-512 of the downloaded tarball. npm, yarn, pnpm, and bun all do this by default. If you're writing a custom installer or a registry mirror, do not skip this check. It is the single primitive that protects against tarball-substitution attacks, including ones that compromise the registry itself.
Verify the Sigstore signature. Modern npm clients verify dist.signatures[].sig against the registry's published signing key. If you're writing tooling that consumes npm metadata, validate the signatures rather than treating integrity as the only check.
Treat advisory fields as advisory. The maintainers, repository.url, homepage, and readme fields are not cryptographically protected against registry tampering. Don't elevate them to security-decision inputs without independent verification (e.g., for repository.url, follow it to GitHub and confirm the repo claims the package via npm's provenance link, not just by self-declaration in the metadata).
If you're building a registry mirror. Don't reverse-proxy the response shape verbatim if your mirror serves private packages. Add authentication, add a tight CORS allowlist, add rate-limit headers, gate package-name namespaces by entitlement, and re-sign tarballs if your trust model differs from upstream npm's.
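The signature check from the list above can be demonstrated end-to-end without the registry's real key. This sketch generates a local ECDSA key pair as a stand-in for the registry's published signing key — the P-256 curve and SHA-256 hash are assumptions about the registry's scheme — and verifies a base64 signature over the name@version:integrity payload format described earlier:

```javascript
// Demonstrates the verification mechanics behind dist.signatures using a
// locally generated key pair. ASSUMPTION: ECDSA P-256 / SHA-256 as the scheme;
// the payload format (name@version:integrity) matches the one described above.
import { generateKeyPairSync, createSign, createVerify } from 'node:crypto';

const { publicKey, privateKey } = generateKeyPairSync('ec', {
  namedCurve: 'prime256v1', // NIST P-256
});

const payload = 'react@19.2.5:sha512-llUJLzz1zTUBrskt...';

// Stand-in for the registry's signing step
const sig = createSign('SHA256').update(payload).sign(privateKey, 'base64');

// What a client does: verify the base64 signature over the payload string
const valid = createVerify('SHA256').update(payload).verify(publicKey, sig, 'base64');
console.log(valid); // → true
```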
Conclusion
The npm registry's package-metadata endpoint scored 75/100 with fourteen findings. Eleven of the fourteen are accurate and structural to a public read-only metadata API, three are false positives caused by the registry's flat package namespace (admin, manage, and debug are real packages), and zero describe an actionable risk to the registry itself. The CRITICAL on no-authentication and the HIGH on BFLA paths are scanner-correct but context-wrong. The MEDIUMs on response size and embedded URLs are scanner-correct and structurally accurate descriptions of the registry's deliberate design. The LOWs on CORS, security headers, and versioning are real but mitigated at the edge layer or the namespaced surface.
The honest takeaway is the one the scan can't draw on its own: npm-registry security is mostly not on the read-API surface. It's on the publish flow (2FA, malware-scanning, provenance), the cryptographic primitives in the response (integrity hash, Sigstore signature), and the trust transitivity into downstream tooling. A black-box probe of /react gets you to 'this is a robust read endpoint with a few structural quirks that aren't actionable.' Beyond that, the analysis lives in the publish-side controls and the verification primitives that survive a registry compromise. Both are out of scope for what a per-endpoint scanner can observe.
For a downstream tool consuming the registry — a CLI, a mirror, an SBOM tool, a package-health dashboard, an AI coding assistant pulling package context — the operational guidance is short: verify integrity hashes, verify signatures, treat the advisory fields (maintainers, repository URL, README) as advisory. For a registry-mirror operator, don't inherit the upstream's loose authentication model when serving private packages. For everyone else: the next time the JavaScript ecosystem has an npm-supply-chain incident — and it will — the surface that gets attacked will not be the read API. It will be the publish flow.