
Firmware Security and Why CISOs Should Audit Embedded Code in 2026

Date: 15 May 2026


For the last decade, most enterprise security programs have focused on the layers everyone can see: endpoints, identity, cloud, network perimeter, application code. The investment has paid off. EDR catches most commodity malware. Zero-trust frameworks contain lateral movement. SAST tools shake out the obvious application bugs before they ship.

Then attackers moved down the stack.

Firmware, the code that sits between hardware and the operating system, has quietly become one of the softest targets in the enterprise attack surface. A 2024 ONEKEY report found that outdated firmware is now among the most common entry points into IoT systems, and only 31 percent of surveyed companies regularly test embedded code. Another 22 percent admitted they don't know whether their firmware is being tested at all.

That gap is becoming harder to ignore. UEFI implants, malicious over-the-air updates, baseboard management controller (BMC) compromises, and supply-chain attacks targeting third-party firmware components have all surfaced in major incidents over the past three years. The pattern is consistent: when attackers can't get through hardened operating systems, they go lower.

By 2026, "we patch our servers monthly" is no longer a defensible security posture if the fleet of devices feeding those servers runs firmware no one has audited in three years.

Why Firmware Has Been Ignored for So Long

Firmware security has a credibility problem inside most organisations, and the reasons are structural.

It's been treated as an engineering concern, not a security concern. Firmware lives with the product team, the embedded engineers, sometimes a contract manufacturer overseas. The CISO's mandate has traditionally ended at the operating system boundary.

It's invisible to the tools most security teams use. Vulnerability scanners flag CVEs in the OS and applications, not in the bootloader, BMC, or microcontroller firmware. SBOMs (software bills of materials) are still rare for embedded components, even though they've become standard for application code.

It's owned by no one. Who patches the firmware on the conference-room camera? On the factory-floor PLCs? On the connected medical pumps in a hospital? In most organisations, the honest answer is "we'd have to figure that out."

The result is a class of devices, often numbering in the thousands per enterprise, that run code no one in the security organisation has looked at, can't easily inventory, and frequently can't update without physical access. For attackers, that's a substantial and lightly defended footprint.

The Audit Framework: What to Actually Look At

A firmware audit isn't a single activity. It's a structured review across several dimensions, each producing actionable findings. The framework below maps to controls most CISOs already use, so it doesn't require reinventing the security program. The full discipline of preventing firmware vulnerabilities is broader than any audit, but the audit is where most security programs need to start, simply because you can't defend what you haven't inspected.

Component inventory and SBOM. The first question in any firmware audit is what's actually running. That means an SBOM for every shipping device or every device class on the network. The SBOM should cover the bootloader, real-time operating system (FreeRTOS, Zephyr, VxWorks, or custom), third-party libraries, communication stacks, and cryptographic components. Without this inventory, no other audit step is meaningful.
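
As an illustration of why the inventory step matters, even a trivial completeness check surfaces the opaque components no one can account for. Everything here is a hypothetical sketch: the field names and component entries are placeholders, not a real SPDX or CycloneDX schema.

```python
# Minimal sketch of an SBOM completeness check. Field names and
# components are illustrative, not a real SPDX/CycloneDX format.
REQUIRED_FIELDS = {"name", "version", "supplier"}

def audit_sbom(components):
    """Return names of components missing required metadata."""
    incomplete = []
    for comp in components:
        if not REQUIRED_FIELDS.issubset(comp):
            incomplete.append(comp.get("name", "<unnamed>"))
    return incomplete

sbom = [
    {"name": "zephyr-rtos", "version": "3.6.0", "supplier": "Zephyr Project"},
    {"name": "mbedtls", "version": "3.5.2", "supplier": "Trusted Firmware"},
    {"name": "vendor-blob"},  # opaque binary: no version, no supplier
]
```

The components the check flags are exactly the ones no later audit step can reason about.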

Secure boot and chain of trust. Does each device cryptographically verify its firmware before executing it? Is the verification chain anchored in immutable hardware (a TPM, secure element, or boot ROM)? Devices without enforced secure boot are devices an attacker can persistently compromise below the OS layer, where no endpoint agent will ever see them.
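
The chain-of-trust structure can be sketched in a few lines. This is a simplified hash chain over made-up image bytes; real secure boot anchors signature verification (not bare hashes) in ROM or OTP fuses.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def verify_boot_chain(root_digest, stages):
    """Walk the chain: the hardware-anchored root digest must match the
    first stage, and each stage carries the digest of whatever it loads
    next. Real secure boot verifies signatures, not bare hashes."""
    expected = root_digest
    for stage in stages:
        if sha256_hex(stage["image"]) != expected:
            return False  # broken link: refuse to boot
        expected = stage["next_digest"]
    return True

bootloader = b"bootloader-v2"  # placeholder images
kernel = b"kernel-v5"
chain = [
    {"image": bootloader, "next_digest": sha256_hex(kernel)},
    {"image": kernel, "next_digest": None},
]
root = sha256_hex(bootloader)  # on real hardware: burned into ROM/OTP fuses
```

The audit question maps directly onto the structure: is `root` actually immutable, and does every stage in the chain enforce the check?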

Firmware update authentication. Are over-the-air updates cryptographically signed? Where do the signing keys live, and who can access them? A compromised update server pushing signed-but-malicious firmware has appeared in multiple recent supply-chain incidents. The signing process itself becomes part of the attack surface.
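
A hedged sketch of the verification side: symmetric HMAC stands in here for the asymmetric signature (Ed25519 or ECDSA) a real OTA pipeline would use, and the key and image bytes are invented. In production the signing key lives in an HSM and the device holds only the public verification key.

```python
import hashlib
import hmac

# Stand-in for a private signing key; real pipelines keep this in an HSM.
SIGNING_KEY = b"demo-signing-key"

def sign_update(firmware: bytes) -> bytes:
    return hmac.new(SIGNING_KEY, firmware, hashlib.sha256).digest()

def verify_update(firmware: bytes, signature: bytes) -> bool:
    expected = hmac.new(SIGNING_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)  # constant-time compare

fw = b"ota-image-v1.4.2"
sig = sign_update(fw)
```

Note that even this toy version uses a constant-time comparison; timing-unsafe signature checks are themselves an audit finding.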

Rollback protection. Can an attacker downgrade a device to an older, vulnerable firmware version? Anti-rollback counters and version-enforcement logic are basic controls, but a surprising number of production devices lack them.
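
The counter logic itself is simple, which is what makes its absence notable. A minimal sketch, assuming (unlike real hardware) the stored version is just an attribute:

```python
class AntiRollback:
    """Sketch of version-enforcement logic. On real hardware the stored
    minimum version lives in OTP fuses or an RPMB partition, not in a
    Python attribute, so an attacker cannot simply rewind it."""

    def __init__(self, stored_version: int):
        self.min_version = stored_version  # monotonic: never decreases

    def accept_update(self, candidate_version: int) -> bool:
        if candidate_version < self.min_version:
            return False  # downgrade to an older image: reject
        self.min_version = candidate_version
        return True
```
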

Debug and diagnostic interfaces. JTAG, UART, SWD, and other debug interfaces are essential during development. They're disasters if left enabled in production. The audit should confirm these are disabled or properly authenticated on shipping devices, and that test points are not exposed on accessible PCB locations.

Memory safety in critical code paths. C and C++ still dominate firmware development, which means memory-safety bugs (buffer overflows, use-after-free, integer overflows) remain a structural risk. Where memory safety matters most (security-critical paths, network parsers, cryptographic operations), modern engineering teams are increasingly adopting Rust for new firmware development, eliminating entire bug classes at compile time rather than discovering them through fuzzing, or worse, in the field.

Cryptographic implementation review. Many embedded devices implement cryptography poorly: weak random number generators, hardcoded keys, obsolete algorithms (MD5 and DES sightings are still depressingly common in 2026), or correct algorithms used in incorrect modes. A focused review of the crypto implementation is high-value and low-cost.
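
The obsolete-algorithm part of that review can be partially automated. The shortlist below is hypothetical; a real audit would also check modes, key sizes, and RNG sources, not just algorithm names.

```python
# Hypothetical denylist for triage; not an exhaustive catalogue.
WEAK_PRIMITIVES = {"md5", "sha1", "des", "3des", "rc4"}

def flag_weak_crypto(identifiers):
    """Return algorithm names the audit should question, given
    identifiers pulled from an SBOM, binary strings, or configs."""
    return sorted(name for name in identifiers
                  if name.lower() in WEAK_PRIMITIVES)

found_in_firmware = {"AES-256-GCM", "MD5", "SHA-256", "DES"}
```
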

Network-facing services and protocols. What ports are listening? What protocols are supported? Many embedded devices ship with legacy protocols (Telnet, unencrypted SNMP, hardcoded HTTP admin interfaces) that would be unthinkable on a server but pass without comment on a thermostat.
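
Findings like these can be triaged mechanically from scan output. The port-to-service map below is a hypothetical shortlist; a real review would confirm each service by banner grabbing, not port number alone.

```python
# Hypothetical map from well-known ports to legacy services worth flagging.
LEGACY_SERVICES = {
    21: "ftp",
    23: "telnet",
    69: "tftp",
    80: "http admin interface (unencrypted)",
    161: "snmp v1/v2c",
}

def flag_legacy_ports(open_ports):
    """Map open ports found on a device to legacy services to review."""
    return {p: LEGACY_SERVICES[p] for p in open_ports if p in LEGACY_SERVICES}
```
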

Mapping to Frameworks the Board Already Understands

CISOs presenting a firmware audit programme to the board don't need to invent new vocabulary. Existing frameworks already cover most of the territory:

  • NIST SP 800-193 (Platform Firmware Resiliency Guidelines) addresses protection, detection, and recovery of platform firmware.
  • NIST SP 800-147 covers BIOS protection on enterprise systems.
  • IEC 62443 is the standard for industrial control system security, with detailed requirements for firmware integrity and update management.
  • IEC 62304 governs software lifecycle processes for medical device software, including firmware components.
  • ETSI EN 303 645 sets baseline cybersecurity requirements for consumer IoT devices.

Mapping audit findings to these frameworks gives security leaders a vocabulary the board, regulators, and auditors all recognise. It also turns firmware security from a vague engineering issue into a measurable compliance posture.

The Practical Path Forward

A firmware security programme doesn't have to start at full enterprise scale. Most organisations that successfully bring firmware into the security perimeter follow a similar progression.

Start with an inventory. Pick a device category (IoT sensors, connected medical equipment, networked office hardware) and build the SBOM. Even an incomplete inventory is more than most organisations have today.

Run a focused audit on the highest-risk class. Choose devices that are network-connected, deployed in sensitive environments, or used in regulated workflows. The first audit will surface findings that justify the programme.

Establish ownership. Decide who patches firmware, who reviews update signatures, who responds when a CVE drops against an embedded library. Without explicit ownership, firmware patching remains everyone's problem and no one's responsibility.

Build the audit into procurement. Future device purchases should require SBOM disclosure, signed update support, secure boot, and disabled debug interfaces as contractual minimums. Vendors who can't meet these requirements in 2026 are signalling something important about their security maturity.

Establish detection. Most security operations centres have no visibility into firmware integrity. Tools that monitor firmware versions, detect unauthorised update attempts, or flag anomalous device behaviour are increasingly available and worth piloting.
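
At its simplest, firmware detection starts with version drift: comparing what the fleet reports against an approved baseline. A sketch, with invented device names, classes, and versions:

```python
def detect_drift(approved, observed):
    """approved: device class -> approved firmware version.
    observed: device id -> (device class, version the device reports).
    Returns device ids running anything other than the approved build,
    whether stale, downgraded, or unknown."""
    return sorted(dev_id
                  for dev_id, (dev_class, version) in observed.items()
                  if approved.get(dev_class) != version)

approved = {"camera": "2.1.4", "badge-reader": "1.0.9"}
observed = {
    "cam-01": ("camera", "2.1.4"),
    "cam-02": ("camera", "1.9.0"),  # stale, or deliberately downgraded
    "door-07": ("badge-reader", "1.0.9"),
}
```

Purpose-built tools do far more (integrity measurement, update telemetry), but even this level of baseline comparison exceeds what most SOCs have for embedded devices today.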

The Quiet Strategic Shift

The reason firmware security matters in 2026 isn't that attackers suddenly discovered embedded code. They've been there for years. What's changing is the exposure: more connected devices in every enterprise, longer device lifetimes, deeper integration with critical workflows, and growing regulatory pressure (CRA in Europe, FDA cybersecurity premarket requirements, sector-specific mandates).

Security leaders who treat firmware as out of scope are accepting a structural blind spot in their programme. The ones who bring it into the audit perimeter, even imperfectly, even one device class at a time, are closing a gap their adversaries have already noticed. The frontline moved. The question is whether the security programme moved with it.