Creating and Implementing a Liability Regime for Software Vendors
By: McKenzie Stoker
Executive Summary:
Insecure software is a national security risk, costs the U.S. billions of dollars annually, and exposes users’ information to malicious actors. Software developers (vendors) who fail to securely develop their products currently face few legal repercussions, even when they engage in practices the industry itself recognizes as bad. With no legal framework holding vendors accountable for their products, users assume most (if not all) of the risk when cyberattacks occur. Therefore, to ensure that responsibility for secure software is vested in the entities creating, maintaining, and profiting from it, a federal torts-based software liability regime should be implemented, and the Department of Justice (DOJ) should collaborate with state attorneys general (AGs) to initiate lawsuits under state laws.
Challenge:
Digital technology is a core aspect of life in the U.S., with over 90% of teenagers and adults using the internet.[1] U.S. critical infrastructure also heavily relies on such technology.[2] The prevalence of cyber-capable devices in American homes, schools, workplaces, and other vital sectors implicates national security issues and users’ rights. Yet, despite the importance of this technology in America, vendors continuously fail to securely develop their products. According to the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA), “[t]he vast majority of exploited vulnerabilities . . . can be prevented at scale.”[3] Since vendors have both the know-how and the capability to develop more secure products,[4] there is little excuse for those who choose not to.
Insecure software development poses severe security risks and causes significant damage to the American public and the U.S. government. While perfect code is nearly impossible to create,[5] a recent report found that 28% of vendors were not even familiar with secure development practices.[6] Meanwhile, insecure development practices are implicated in over 40% of cyber incidents.[7] CISA maintains a list of bad practices all vendors should avoid, including developing new software in memory-unsafe languages, failing to support multifactor authentication (MFA) procedures (unless MFA would introduce safety risks, such as in emergency medical devices), and releasing products with known exploitable vulnerabilities.[8] However, this list is non-binding.[9]
Releasing software with known exploitable vulnerabilities is especially problematic, since malicious actors routinely target older, unpatched vulnerabilities to maximize the cost-efficiency and scale of their operations.[10] For instance, BadPilot, an arm of Russia’s cyberwar unit Sandworm, has recently been targeting internet-facing software with known but unpatched vulnerabilities to access networks in the U.S., Canada, Australia, and the U.K.[11] Best practice dictates that vendors patch known vulnerabilities prior to software release (and issue a patch to address any newly identified known exploited vulnerabilities, or KEVs), but with no legal requirements incentivizing them to do so, they often do not.[12] For example, a Microsoft employee warned the company of a vulnerability in its software, which was later used by Russian intelligence in the SolarWinds attack, but Microsoft decided against patching it for fear that doing so would jeopardize an upcoming contract with the U.S. government.[13] Other vendors behave similarly.[14]
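To make the pre-release obligation concrete, the screening itself is mechanically simple. The following is a minimal Python sketch, with hypothetical component names and a hypothetical local subset standing in for the KEV catalog CISA publishes, of how a vendor might check a build’s components against known exploited vulnerabilities before shipping:

```python
# Illustrative sketch only: the component names and the CVE for
# "image-codec" are hypothetical; the catalog below is a tiny stand-in
# for CISA's Known Exploited Vulnerabilities (KEV) catalog.

KEV_CATALOG = {"CVE-2021-44228", "CVE-2023-4966"}  # stand-in subset

def kev_blockers(component_cves: dict[str, set[str]]) -> dict[str, set[str]]:
    """Return components whose unpatched CVEs appear in the KEV catalog."""
    return {
        component: cves & KEV_CATALOG
        for component, cves in component_cves.items()
        if cves & KEV_CATALOG
    }

build = {
    "logging-lib": {"CVE-2021-44228"},   # known exploited -> release blocker
    "image-codec": {"CVE-2020-99999"},   # hypothetical CVE, not in catalog
}
print(kev_blockers(build))  # {'logging-lib': {'CVE-2021-44228'}}
```

Under the regime proposed below, shipping with a non-empty result from a check like this is precisely the kind of avoidable bad practice that would trigger strict liability.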
When a cybersecurity breach does occur, users are impacted in a variety of ways. Malicious cyberattacks are estimated to cost the United States between $57 billion and $207 billion annually.[15] Additionally, users often experience loss of personal and professional data, exposure of personally identifiable information, stolen intellectual property, and property damage. Despite the seriousness of the issue, vendors face little to no liability if their products are insecurely designed.[16]
Recommendations:
1. Vendors should be subject to a federal software liability regime based on negligence and strict liability.
The 2023 National Cybersecurity Strategy emphasizes the importance of secure software development and calls for shifting liability for insecure software to vendors.[17] In March 2024, a legal symposium on software liability was held to begin soliciting feedback from vendors on how to craft a liability regime.[18] Unsurprisingly, vendors oppose the imposition of liability on software developers.[19] However, given the pervasiveness of digital technology in Americans’ lives and the cost to the U.S. when cyberattacks occur, vendors should be held liable when they do not develop their products securely.
Because secure software development practices are fluid and evolving, a fair federal liability regime must be flexible enough to adjust with changing industry standards yet predictable enough for vendors to follow, while also recognizing the danger that insecure software poses to the U.S. Therefore, this regime should be based on both negligence and strict liability (which holds entities liable for harm regardless of intent or fault), and should allow for civil enforcement by the DOJ as well as private causes of action.
Some software development practices are so risky that they should be avoided altogether, such as releasing products with known vulnerabilities.[20] Strict liability should be imposed on vendors who incorporate any of the bad practices identified by CISA into their products. Not only do these bad practices increasingly expose the public to malicious cyber actors – as with Microsoft’s known vulnerability exploited in the SolarWinds attack – but imposing strict liability for them would likely be more cost-effective to enforce, make it easier for users to recover, and encourage vendors to maintain minimum standards in their products.
For development practices that fall outside CISA’s bad-practice guidance, negligence is the best approach to determining vendor liability. Under this framework, the duty of care vendors owe their users should be a professional standard of care: the requisite degree of learning, skill, and ability that a reasonably prudent vendor would exercise under the circumstances.[21] This standard should be promulgated as part of the proposed federal software liability regime. For example, the failure to use cryptographically secure random numbers within an encryption scheme has been linked to multiple attacks.[22] Vendors who generate keys and other secrets with a cryptographically secure random number generator would fall within the standard of care, as that is what a reasonably prudent vendor would do in those circumstances. In contrast, a vendor who relied on a predictable random number generator would fall below the standard of care and may be held liable. Furthermore, reasonableness is a flexible standard, so legislation specifying certain factors for courts to weigh in the analysis may be helpful for both judicial efficiency and predictability.[23][24][25]
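The randomness example can be made concrete. Below is a minimal, purely illustrative Python sketch contrasting a predictable pseudo-random generator with the operating system’s cryptographically secure one for producing key material:

```python
import random
import secrets

# Illustrative only: contrasting a predictable PRNG with a
# cryptographically secure generator for key material.

# Falls below the standard of care: random.Random (a Mersenne Twister)
# is deterministic. An attacker who learns or guesses the seed, or who
# observes enough output, can reproduce every future "secret" exactly.
weak_key = random.Random(1234).randbytes(32)

# Within the standard of care: the secrets module draws from the OS's
# cryptographically secure generator, designed to be unpredictable
# even to an observer of prior output.
strong_key = secrets.token_bytes(32)

# Same seed, same "secret" -- the predictability that enables attacks.
assert random.Random(1234).randbytes(32) == weak_key
assert len(strong_key) == 32
```

The point for a negligence analysis is that both calls are equally easy to write; the reasonably prudent choice costs the vendor nothing extra.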
Yet, vendors should not be wholly without defense. First, safe harbor provisions could provide protection from liability under certain circumstances, such as giving vendors time to bring their code into compliance with the proposed federal software liability regime.[26] However, vendors should have less time to cure any practices falling within the strict liability framework, given the danger they pose. The liability regime could mandate that CISA take point on managing the safe harbor provisions and any case-by-case extensions. If a vendor fails to comply by the end of the safe harbor period, CISA would notify DOJ, which would determine whether to initiate a lawsuit.
Second, vendors should be permitted to assert the comparative negligence defense.[27] For example, if users do not update their software in a timely manner, then damages they experience from vulnerabilities with available patches would be reduced based upon the users’ own negligence in failing to deploy them.[28] This would encourage vendors to develop secure software and timely updates, while simultaneously encouraging users to deploy available patches promptly if they want to recover fully. While this may risk incentivizing vendors to constantly deploy patches – burdensome for small vendors and users alike – it remains in a vendor’s best interest to design software that is secure from the start and does not require numerous patches (as the vendor could still be held liable for damages incurred before a patch was available).
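Under the pure comparative negligence model, the damages offset reduces to simple arithmetic. A brief Python sketch – the function name and dollar figures are hypothetical illustrations, not drawn from any statute or case:

```python
# Hypothetical sketch of pure comparative negligence: the plaintiff's
# recovery is reduced in proportion to the plaintiff's own share of
# fault. All names and figures below are illustrative.

def pure_comparative_recovery(total_damages: float, plaintiff_fault: float) -> float:
    """Reduce recovery by the plaintiff's own fault share (0.0 to 1.0)."""
    if not 0.0 <= plaintiff_fault <= 1.0:
        raise ValueError("fault share must be between 0 and 1")
    return total_damages * (1.0 - plaintiff_fault)

# A user suffers $100,000 in damages from a breach; a court finds the
# user 30% at fault for failing to deploy an available patch.
print(pure_comparative_recovery(100_000, 0.30))  # 70000.0
```

The other models described in the accompanying note (less than 50%, no more than 50%, and slight) add only a threshold test that bars recovery entirely once the plaintiff’s fault share crosses the relevant line.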
To be sure, there is opposition to imposing a software liability regime. Some argue that liability will come at the expense of innovation, including innovation in advanced security procedures.[29] While innovation is key to promoting competition and encouraging developers to push the limits of what is technically possible, there is little value in innovating new products if they are prone to cyberattacks in the same (often avoidable) ways as the old ones. Rather than prioritizing speed to market, vendors need to strike a balance between timely innovation and ensuring the release of secure software.
Relatedly, opponents of software liability argue that it would result in more money being spent on litigation and burdensome compliance costs than on actually making software secure,[30] or that vendors would go bankrupt either trying to comply with the law or paying out judgments. The imposition of liability does run the risk that vendors will pay significant legal and compliance costs. However, this does not mean vendors will cease to exist as a result. First, the liability regime suggested here does not require vendors to create perfect products. Aside from being nearly impossible,[31] perfect code is not the goal. The goal is to hold vendors accountable when they deviate from the professional standard of care, as articulated above, that they are capable of exercising when developing their products. Second, vendors are inherently self-interested. They want to protect their bottom line, and if that means spending more money upfront to produce more secure products and avoid the costs of litigation, government-imposed fines, and settlements in the event of cyberattacks, they are likely to do so.
2. DOJ’s Consumer Protection Branch should collaborate with state AGs to initiate torts-based litigation against vendors under their respective state laws.[32]
States are beginning to implement legislation to protect users, but most state laws focus only on how companies collect and use consumers’ information rather than on secure development.[33] However, all states allow negligence-based suits initiated by individuals, companies, or the state AG.
In the absence of a federal liability regime, the next-best approach is state-by-state litigation. DOJ should encourage AGs to bring negligence suits arguing that vendors should be held to the professional standard of care, as articulated above.
However, it is important to note that some state laws may bar negligence-based software liability suits for various reasons. For example, the economic loss doctrine bars claims between parties in contractual privity when the product defect or failure leads to solely economic damages (rather than personal injuries or losses to other property).[34] AGs should argue that damages caused by cyberattacks constitute “other property” and/or should be exempt from the economic loss doctrine altogether. Cyberattacks often result in financial losses, but they can also cause physical damage to software-capable devices and invaluable data loss. More often than not, the wide-ranging effects of cyberattacks extend beyond the software itself.
Additionally, some courts that have considered the issue of software vendor liability have held that vendors are not professionals and thus are not held to a professional duty of care.[35] Holding vendors liable under state law in these jurisdictions would be difficult, if not impossible, under a negligence cause of action. Yet AGs should advocate for these rulings to be overturned, especially given the substantial role digital technology plays in people’s lives – even more so since these cases were decided over a decade ago.
Conclusion:
Vendors presently face few legal repercussions when they do not securely develop their products. The lack of federal protection often leaves users without recourse when cyberattacks occur and they suffer financial or data losses. Beyond those losses, insecure software is a national security risk. Therefore, a federal software liability regime should be implemented allowing for civil enforcement and private causes of action under strict liability and negligence torts theories.
[1] “Teens and Internet, Device Access Fact Sheet,” Pew Research Center, January 5, 2024, https://www.pewresearch.org/internet/fact-sheet/teens-and-internet-device-access-fact-sheet/; Risa Gelles-Watnick, “Americans’ Use of Mobile Technology and Home Broadband,” Pew Research Center, January 31, 2024, https://www.pewresearch.org/internet/2024/01/31/americans-use-of-mobile-technology-and-home-broadband/.
[2] United States Department of Homeland Security, Cybersecurity & Infrastructure Security Agency, “Critical Infrastructure Sectors,” accessed November 1, 2024, https://www.cisa.gov/topics/critical-infrastructure-security-and-resilience/critical-infrastructure-sectors.
[3] Department of Homeland Security, Cybersecurity & Infrastructure Security Agency, Secure by Demand Guide: How Software Customers Can Drive a Secure Technology Ecosystem, August 2024, https://www.cisa.gov/sites/default/files/2024-08/SecureByDemandGuide_080624_508c.pdf.
[4] Jack Cable, “Preventing Ransomware Attacks at Scale,” Harvard Business Review, April 23, 2024, https://hbr.org/2024/04/preventing-ransomware-attacks-at-scale.
[5] Eugene H. Spafford, Leigh Metcalf, and Josiah Dykstra, Cybersecurity Myths and Misconceptions (Boston: Addison-Wesley, 2023), 6.
[6] Linux Foundation, “Why are Organizations Struggling to Implement Secure Software Development?” Open Source Security Foundation, July 5, 2024, https://openssf.org/blog/2024/07/05/why-are-organizations-struggling-to-implement-secure-software-development/.
[7] Linux Foundation, “Why are Organizations Struggling to Implement Secure Software Development?”
[8] Department of Homeland Security, Cybersecurity & Infrastructure Security Agency, “Product Security Bad Practices,” October 16, 2024, https://www.cisa.gov/resources-tools/resources/product-security-bad-practices.
[9] Department of Homeland Security, “Product Security Bad Practices.”
[10] United States Department of Homeland Security, Cybersecurity & Infrastructure Security Agency, “2022 Top Routinely Exploited Vulnerabilities,” August 3, 2023, https://www.cisa.gov/news-events/cybersecurity-advisories/aa23-215a.
[11] Andy Greenberg, “A Hacker Group Within Russia’s Notorious Sandworm Unit Is Breaching Western Networks,” Wired, February 12, 2025, https://www.wired.com/story/russia-sandworm-badpilot-cyberattacks-western-countries/.
[12] United States Department of Homeland Security, “2022 Top Routinely Exploited Vulnerabilities”; Lily Hay Newman, “Sloppy Software Patches Are a ‘Disturbing Trend’,” Wired, August 11, 2022, https://www.wired.com/story/software-patch-flaw-uptick-zdi/; Jon Levenson, “Bad Cyber Hygiene: 60 Percent Of Breaches Tied to Unpatched Vulnerabilities,” Automox, June 18, 2019, https://www.automox.com/blog/bad-cyber-hygiene-breaches-tied-to-unpatched-vulnerabilities.
[13] Renee Dudley and Doris Burke, “Microsoft Chose Profit Over Security and Left U.S. Government Vulnerable to Russian Hack, Whistleblower Says,” ProPublica, June 13, 2024, https://www.propublica.org/article/microsoft-solarwinds-golden-saml-data-breach-russian-hackers.
[14] Levenson, “Bad Cyber Hygiene.”
[15] Anna Scherbina, “How much do US businesses lose due to malicious cyber activity?” The Hill, May 3, 2024, https://thehill.com/opinion/cybersecurity/4641199-cyberattack-businesses-money-loss-malicious-cybersecurity/; Executive Office of the President of the United States, The Council of Economic Advisors, “The Cost of Malicious Cyber Activity to the U.S. Economy,” February 2018, https://trumpwhitehouse.archives.gov/wp-content/uploads/2018/02/The-Cost-of-Malicious-Cyber-Activity-to-the-U.S.-Economy.pdf; Steve Morgan, “Cybercrime To Cost The World $10.5 Trillion Annually By 2025,” Cybercrime Magazine, November 13, 2020, https://cybersecurityventures.com/hackerpocalypse-cybercrime-report-2016/.
[16] Derek E. Bambauer and Melanie J. Teplinsky, “Shields Up For Software,” Lawfare, December 19, 2023, https://www.lawfaremedia.org/article/shields-up-for-software.
[17] The White House, “National Cybersecurity Strategy,” March 2023, https://bidenwhitehouse.archives.gov/wp-content/uploads/2023/03/National-Cybersecurity-Strategy-2023.pdf.
[18] The White House, “Readout: Office of the National Cyber Director Convenes Professors & Think Tank Experts at a Legal Symposium on Software Liability,” March 27, 2024, https://bidenwhitehouse.archives.gov/oncd/briefing-room/2024/03/27/readout-software-liability-symposium/; David DiMolfetta, “As part of a broad cybersecurity strategy, the U.S. wants to create incentives for the tech industry to manufacture products and software that don’t contain major security flaws,” Nextgov/FCW, May 6, 2024, https://www.nextgov.com/cybersecurity/2024/05/white-house-talks-industry-build-legal-framework-software-liability/396330/.
[19] Eric Geller, “The struggle for software liability: Inside a ‘very, very, very hard problem’,” The Record, October 22, 2024, https://therecord.media/cybersecurity-software-liability-standards-white-house-struggle.
[20] Department of Homeland Security, “Product Security Bad Practices.”
[21] This is comparable to duties owed by doctors and attorneys to their patients and clients, respectively.
[22] John Graham-Cumming, “Why secure systems require random numbers,” The Cloudflare Blog, September 13, 2013, https://blog.cloudflare.com/why-randomness-matters/; Ben Buchanan, The Hacker and the State (Cambridge: Harvard University Press, 2022), 65-68, 70-74.
[23] Bambauer and Teplinsky, “Shields Up For Software”; Derek E. Bambauer and Melanie J. Teplinsky, “Standards of Care and Safe Harbors in Software Liability: A Primer,” Lawfare, May 31, 2024, https://www.lawfaremedia.org/article/standards-of-care-and-safe-harbors-in-software-liability--a-primer.
[24] Traditional negligence cases often require a determination that a duty of care is owed to the person. Courts often undergo a factor analysis to determine whether a duty exists. However, it is not uncommon for courts to also employ a factor analysis to determine whether a defendant acted reasonably under the circumstances. See e.g., Sprecher v. Adamson Cos., 636 P.2d 1121, 1128-29 (Cal. 1981).
[25] The damage calculation would also need to provide for both monetary and non-monetary losses outside the traditional negligence framework. These losses include money paid in ransomware attacks; replacing unrecoverable, corrupted physical systems; lost personal and business files; spill over physical damage; and stolen identities.
[26] Bambauer and Teplinsky, “Shields Up For Software”; The White House, “National Cybersecurity Strategy.”
[27] Comparative negligence reduces the damages awarded to a plaintiff based on the proportion of the plaintiff’s own negligence. There are four versions of comparative negligence: pure (recovery is reduced by the percentage of the plaintiff’s own fault), less than 50% (recovery is reduced by the percentage of the plaintiff’s own fault so long as it is less than the defendant’s fault; if a plaintiff’s fault is greater than or equal, recovery is barred), no more than 50% (recovery is reduced by the percentage of the plaintiff’s own fault so long as it is not greater than the defendant’s fault; equal fault allows recovery but if a plaintiff’s fault is greater than the defendant’s, recovery is barred), and slight (a plaintiff can recover if their fault is only slight compared to the defendant’s). A pure comparative negligence model is preferred to hold both vendors and users accountable for poor cybersecurity practices.
[28] If the only damages a plaintiff incurred could have been prevented by deploying a software update, then the plaintiff could not recover. However, if damages resulted from both unpatched, insecurely designed software and patchable software (where the plaintiff did not patch), then the vendor could only be held liable for the former.
[29] Daniel Castro, “Should Software Companies Be Held Liable for Security Flaws?” The Wall Street Journal, June 6, 2023, https://www.wsj.com/articles/should-software-companies-be-held-liable-security-flaws-d2a3f5db.
[30] Steven B. Lipner, “Incentives for Improving Software Security: Product Liability and Alternatives,” Lawfare, May 14, 2024, https://www.lawfaremedia.org/article/incentives-for-improving-software-security-product-liability-and-alternatives.
[31] Spafford, Metcalf, and Dykstra, Cybersecurity Myths and Misconceptions, 6.
[32] National Association of Attorneys General, “Interjurisdictional Collaboration,” accessed November 1, 2024, https://www.naag.org/issues/consumer-protection/interjurisdictional-collaboration.
[33] “Which States Have Consumer Data Privacy Laws?” Bloomberg Law, September 10, 2024, https://pro.bloomberglaw.com/insights/privacy/state-privacy-legislation-tracker/#map-of-state-privacy-laws.
[34] Matthiesen, Wickert & Lehrer, S.C., “Economic Loss Doctrine in all 50 States,” January 13, 2022, https://www.mwl-law.com/wp-content/uploads/2018/02/ECONOMIC-LOSS-DOCTRINE-CHART-1.pdf.
[35] See e.g., Avazapour Networking Servs. v. Falconstor Software, Inc., 937 F. Supp. 2d 355, 364-65 (E.D.N.Y. 2013); Ferris & Salter, P.C. v. Thomson Reuters Corp., 889 F. Supp. 2d 1149, 1152 (D. Minn. 2012).