Avoiding a Legal Disaster: Déjà Vu All Over Again

In April 1995, Datapro Reports on Information Security published a Disaster Avoidance brief (IS38-200-101) entitled “Avoiding a Legal Disaster: Business Continuity Planning for Multinationals.” In that paper, the author analogizes a famous 1932 “technology” case decided by the Second Circuit Court of Appeals in the United States to the growing potential liability of users in managing their technology and information security resources. Specifically, the article states that “In 1932, a famous case entitled The T.J. Hooper (60 F.2d 737; 2nd Circuit, 1932) held that the failure to take advantage of existing and available technology—even though it was not in widespread or common use—was not evidence that the defendant’s duty to take reasonable care had been fulfilled. By analogy, when a disaster occurs, it will not be a defense to argue that a recovery or security system or preventive measure is not commonly in use, especially if using it would have averted the disaster or minimized the loss.”

The article, which focuses on what organizations can do to minimize risk, goes on to note that, “The more reliant business and operations become on technology, the more available preventive and risk management tools become, the less excusable a failure to implement meaningful measures and exercise due diligence over company assets will become to government, employees, customers, suppliers, and shareholders—all potential plaintiffs.”

Now this article and its author would probably have been relegated to obscurity but for an interesting article on I.T. litigation that has just appeared in the February 1, 2004 issue of CIO Magazine, entitled “Courts Make Users Liable for Security Glitches.” The author notes that an interesting turning point arose in the wake of 9/11 when, in October 2001, Hartford Insurance removed computer damages from its general commercial liability policy coverage. The article goes on to cite three recent cases that are beginning to look a lot like a legal trend in this area. The first is a case in which Verizon asked a court to order the State of Maine to refund money because Verizon wasn’t using Maine’s network while Verizon was “down” because of the “Slammer” worm. Verizon had not implemented a Slammer patch, and last April the Court ruled that while one may not be able to control a worm attack, such attacks are foreseeable—no refund (Maine Public Utilities Commission v. Verizon).

In Cobell v. Norton, the U.S. Department of the Interior’s website and computer security became an issue in a case involving benefits allegedly owed to American Indians. The Court was sufficiently irritated by the Department’s conduct related to security audits that the Judge actually commenced contempt proceedings! Finally, in the last case cited by the article, the American Civil Liberties Union hoped to avoid liability for accidentally publishing donor information by pleading that it had outsourced its security to a third-party vendor. Although the case settled, it is doubtful such a defense would have worked, and it is almost certain regulated companies will not be able to escape accountability for compliance by outsourcing regulated activities—the responsibility will remain theirs!

There appears to be an increasing, and not-so-subtle, shift away from the notion that programming errors related to security breaches, computer viruses, worms, logic bombs and other malicious code, or hacker and denial-of-service attacks are somehow equivalent to unpredictable natural disasters like earthquakes or fires—thus not subject to a “fault” analysis, but more appropriately covered by “accident” insurance. Indeed, these and other cases arising in the courts treat breaches of security as fair game for negligence lawsuits—especially where damage has been done to a consumer (e.g., identity theft) or where the assets of a company—tangible or intellectual property—have been compromised. As noted in the 1995 article, the failure to implement available security is increasingly likely to expose both providers and users of technology to liability where negligence can be shown—or even reckless disregard where safety or the protection of assets is concerned. You can read the CIO Magazine article here and, by the way, the obscure author of the 1995 Datapro article can be reached at joseph.rosenbaum@rimonlaw.com should anyone wish to see a copy or discuss the issues raised—then or now!

Got Indemnification?

In a world increasingly dependent on information, technology and intellectual property rights, contract indemnities—especially if you are an innocent third party—can be critical. “Innocent” means you are a licensee or user of technology (e.g., software, database information) from a provider or licensor, and a third party claims that your provider or licensor has wrongfully furnished you with intellectual property that belongs to that third party. While space doesn’t allow us to go into the finer points of contributory infringement, third-party claims, or the distinctions among insurance, breach of representation and warranty, contract claims, and an indemnity, there is enough space to alert you to the fact that a third-party infringement claim—even if you, the user/licensee, have not knowingly done anything wrong—is disruptive and unnerving at best and at worst can lead to damage claims. For example, the third party, if successful, may require a new license agreement with you and new license fees (remember those license fees you already paid your current licensor/provider?). Caveat emptor (or, in this case, caveat licensee)!