Trends, Vol. 6

The Silent Threat of Software Errors

How mathematical techniques can prevent software failures with real-world consequences

—By Shaw Young


On a quiet Tuesday morning, the trading floor of a major financial institution buzzed with its usual frenetic energy. Suddenly, screens began flashing erratic numbers, alarms sounded, and within minutes, the institution had lost millions. Automated trading algorithms executed nonsensical trades, and confidential client data streamed into the wrong hands. The source of this chaos? A silent bug introduced by a compiler—an invisible error in the software development pipeline that led to catastrophic real-world consequences.

Although this scenario is purely fictitious, it serves as a stark reminder of the hidden vulnerabilities that lie within the very tools developers trust implicitly. Compilers, the essential programs that translate human-written code into machine-executable instructions, are supposed to be infallible bridges between intention and execution. Yet, when they falter, the ripple effects can be devastating.

The hidden risks lurking in compilers

Compilers are often taken for granted in the software development process. Developers write code, and compilers convert it into a form that computers can understand and execute. However, the complexity of modern software systems has increased the likelihood of compiler-induced errors. These errors are particularly insidious because they can remain undetected during testing, only to manifest under specific conditions in production environments.

A study presented at the 2023 USENIX Security Symposium, “Silent Bugs Matter: A Study of Compiler-Introduced Security Bugs,” sheds light on this issue. The research found that 21.6% of compiler-introduced bugs lead to system crashes or Denial of Service (DoS), while 19.2% result in information leaks. These statistics translate into real-world risks, affecting everything from personal data security to the stability of critical infrastructure.
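The information-leak category often follows a pattern like this one: a developer zeroes a buffer holding a secret, but because the buffer is never read again, an optimizing compiler may legally delete the store (dead-store elimination), leaving the secret in memory. A minimal C sketch (function names are illustrative, not taken from the study):

```c
#include <string.h>

#define KEY_LEN 32

/* The optimizer sees that `key` is never read after this call and
 * may remove the memset entirely, leaving the secret in memory. */
void scrub_naive(char *key) {
    memset(key, 0, KEY_LEN);
}

/* Writing through a volatile pointer forces each store to happen;
 * this mimics the guarantee of explicit_bzero or C11's memset_s. */
void scrub_volatile(char *key) {
    volatile char *p = key;
    for (size_t i = 0; i < KEY_LEN; i++)
        p[i] = 0;
}
```

Whether the naive version survives depends on the compiler and optimization level, which is precisely why such bugs evade testing: the stores remain at -O0 and may vanish at -O2.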

Formal verification: the key to safer compilers

To address these vulnerabilities, the field of formal verification offers a promising solution. Formal verification involves using mathematical methods to prove the correctness of algorithms underlying a system. In the context of compilers, this means ensuring that the machine code output is a faithful and reliable translation of the source code.

Xavier Leroy, a senior researcher at Inria and a pioneer in this field, developed CompCert, a formally verified C compiler that has been recognized for its reliability and has received prestigious awards for its contributions to software safety. In his paper “Formal Verification of a Realistic Compiler,” published in Communications of the ACM, Leroy notes, “An obviously better approach is to apply formal methods to the compiler itself in order to gain assurance that it preserves the semantics of the source programs.” In other words, formal verification provides mathematical guarantees of a compiler’s correctness, eliminating whole classes of potential bugs.
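Informally, CompCert’s top-level guarantee can be stated as a semantic-preservation theorem (the notation here is a simplified paraphrase, not Leroy’s exact formulation):

```latex
\forall S\,\forall C,\quad
\mathrm{compile}(S) = \mathrm{OK}(C)\ \wedge\ S \text{ does not go wrong}
\;\Longrightarrow\;
\forall B,\; C \Downarrow B \implies S \Downarrow B
```

Here $P \Downarrow B$ means “program $P$ exhibits observable behavior $B$.” Every behavior of the compiled code is one the source program already allows, so compilation cannot introduce crashes or information leaks that were not present in the source.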

David Monniaux, another expert in formal methods, emphasizes the importance of verified compilers in enhancing security. In his paper, “Memory Simulations, Security and Optimization in a Verified Compiler,” published in Proceedings of the 13th ACM SIGPLAN International Conference on Certified Programs and Proof, he discusses how memory simulations and optimizations can be safely handled within a verified compiler framework, reducing the risk of introducing new vulnerabilities during the optimization process.

Catastrophic implications: high-stakes software development

Industries that rely on high-assurance software, such as aerospace, automotive, healthcare, and finance, have the most to gain—or lose—in this arena. In aerospace, for example, the software controlling an aircraft’s flight systems must be impeccably reliable. A compiler bug in such a system could lead to incorrect calculations, putting lives at risk.

Benjamin Altermatt, a software engineer at Bloomberg, underscores the stakes involved in the financial sector. “In our industry, precision is non-negotiable. A minor software error can lead to significant financial losses and damage client trust. Compiler vulnerabilities introduce an unpredictable element that we need to eliminate,” he explains. “We rely on compilers to do their job perfectly every time. Formal verification gives us a tool to ensure that’s the case.”

In the automotive industry, the rise of autonomous vehicles adds another layer of complexity. Darius Kianarsi, a former compiler engineer at NVIDIA, which develops hardware and software for self-driving cars, highlights the challenges. “Autonomous vehicles process vast amounts of data in real-time. A compiler error could cause a misinterpretation of sensor data, leading to incorrect decisions on the road,” he says. “Formal verification isn’t just a nice-to-have; it’s a necessity for ensuring safety.”

Balancing performance and correctness

One of the significant challenges in adopting formal verification techniques is balancing the need for compiler optimization with the assurance of correctness. Optimizing compilers aim to improve the performance of the generated code, making it run faster and more efficiently. However, these optimizations can introduce subtle bugs if not implemented carefully.
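A classic instance involves C’s undefined behavior on signed integer overflow. A programmer writes an overflow check that is itself undefined, and the optimizer, entitled to assume overflow never occurs, deletes it (function names are illustrative):

```c
#include <limits.h>

/* Undefined behavior: if x == INT_MAX, x + 1 overflows. Because the
 * C standard lets the compiler assume signed overflow never happens,
 * optimizers commonly fold this whole test to 0, silently removing
 * the very check the programmer intended. */
int about_to_overflow_fragile(int x) {
    return x + 1 < x;
}

/* Well-defined: compare against the limit instead of overflowing. */
int about_to_overflow_safe(int x) {
    return x == INT_MAX;
}
```

Strictly speaking the compiler is standards-conformant here, but from the developer’s perspective the effect is the same as a compiler bug: code that behaves as intended without optimization fails once optimization is enabled.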

“As someone who has seen the evolution of compilers over decades, I can tell you that while performance has always been a priority, we must now shift our focus to reliability and security,” says Alex Tran, a retired compiler engineer from NVIDIA. “The complexity of modern systems means that even a small bug can escalate into a crisis. Investing in formal methods is not just an option—it’s a necessity if we want to keep pace with the challenges of today’s software landscape.”

Developers often face pressure to deliver software quickly, which can lead to reliance on compilers that prioritize optimization over correctness. “In the race to improve performance, sometimes correctness takes a backseat,” notes Kianarsi. “But when you’re dealing with systems where errors can have serious consequences, that approach is untenable.”

A growing threat to cybersecurity

As software systems become more interconnected and complex, compiler vulnerabilities pose an escalating threat to cybersecurity. Attackers are increasingly sophisticated, seeking out weak points in the software supply chain to introduce malicious code or exploit existing bugs. The 2020 SolarWinds attack, for instance, in which hackers compromised IT monitoring software used by thousands of organizations, demonstrated how subverting a single point in the software supply chain can have widespread consequences.

“We’re seeing an uptick in supply chain attacks where the compiler itself becomes a vector,” says Altermatt. “If an attacker can insert a vulnerability at the compiler level, they can potentially compromise every piece of software built with that compiler. It’s a nightmare scenario.”

This concern is echoed in the broader cybersecurity community: affected US companies reportedly lost 14% of their annual revenue as a consequence of the SolarWinds attack. Although that attack did not involve compilers, it underscored the need to secure every stage of the development process.

The path forward: strategies for mitigation

Addressing compiler vulnerabilities requires a comprehensive approach that integrates technology, industry practices, and education. One key aspect is the adoption of formal verification. Encouraging the use of formally verified compilers, such as CompCert, especially in safety-critical industries, can greatly reduce the risk of compiler-induced errors. Kianarsi emphasizes, “It’s about building trust in our tools. When we know our compilers are verified, we can have greater confidence in the software we produce.”

Another important element is investment in research. Supporting ongoing research into formal methods and verification can lead to new techniques that make these approaches more accessible and practical. Altermatt highlights, “Research is crucial for advancing the field. We need methods that scale and integrate seamlessly into existing workflows.”

Industry collaboration also plays a significant role. Partnerships between academia and industry can accelerate the adoption of formal verification. Leroy, for example, built CompCert by combining his academic position (he is now a professor at the Collège de France) with his research career at Inria. Similarly, CakeML, another formally verified compiler, is developed by academics at several institutions (UNSW Sydney, Chalmers University of Technology, the Australian National University, and others) alongside industry partners including ARM and the Institute for Infocomm Research. These collaborative efforts can result in tools that meet developers’ practical needs while maintaining rigorous correctness standards. Kianarsi notes, “No one can tackle this alone. By working together, we can share knowledge and develop best practices.”

Education and training are essential as well. Incorporating formal methods into computer science curricula can prepare future developers to prioritize correctness from the start. Altermatt stresses, “Education is the foundation. When developers understand the importance of formal verification from the start, it becomes a natural part of the development process.”

Challenges ahead

Despite the clear benefits, there are challenges to the widespread adoption of formal verification in compilers. Complexity and accessibility are significant obstacles, as formal methods can be mathematically intensive and difficult to understand without specialized training. Making these tools more user-friendly is crucial for broader adoption. Performance overheads are another concern, as verified compilers may introduce performance limitations or optimization challenges. Balancing these trade-offs remains an ongoing area of research. Additionally, economic factors come into play, as developing and implementing formally verified compilers can be expensive. Companies may hesitate to invest without clear short-term returns, even if the long-term benefits are substantial.

A call to action

The silent threat of compiler vulnerabilities is not a problem that can be ignored. As software continues to permeate every aspect of our lives—from the phones in our pockets to the cars we drive and the financial systems we rely on—the need for reliable, secure compilers becomes ever more critical.

“Software is the backbone of modern society,” says Kianarsi. “If we don’t ensure the tools we use to build that software are trustworthy, we’re building on shaky ground.”

Altermatt adds, “We have the technology and the knowledge to address these vulnerabilities. It’s time for the industry to take compiler correctness seriously and invest in the solutions that will protect us all.”

 

Feature Photo, Compiler Architecture; Photo Credit, Wikimedia Commons