Companies Should Share Cyberattack Information
Disclosure of data breaches might help strengthen cybersecurity for everyone. But keeping attacks and responses secret may lead to quicker fixes and less reputational harm.
Damage from cyberattacks comes in layers: direct harm, in the form of theft and other losses; damage to the reputation of the affected companies when news of a breach gets out; and a slow erosion of confidence in online security overall, a malaise that deepens with each new incident.
How do we limit the damage and, more important, restore confidence in online security? That is a question that bedevils policy makers as much as it does network analysts and computer scientists.
Requiring companies to report when they’ve been attacked, and to share details about how the attack was carried out, might help strengthen cyber-defenses for everyone. But it can also complicate the work of securing systems and injure the companies’ reputations in the meantime. Conversely, allowing breached companies to work on solutions in secret may let them fix problems more quickly and avoid reputational harm. But keeping attacks secret may also increase the danger for others.
Making the case for required disclosure is Denise Zheng, deputy director and senior fellow in the Strategic Technologies Program at the Center for Strategic and International Studies. Andrea Castillo, program manager in the Technology Policy Program at George Mason University’s Mercatus Center, argues against such a mandate.
Last year, more than half a billion identities were reported lost or stolen, including the largest data breach in history, involving more than 191 million US voter-registration records.
At the same time, more companies are choosing to keep the scope of the breaches they suffer secret. Both trends are worrisome, and what is reported is likely just the tip of the iceberg. Underreporting of cyberattacks leaves us with an incomplete picture of the magnitude of the threat, and it means we are relying on anecdotal information to determine which defenses work. The breaches that are disclosed generally involve the loss of personally identifiable information or medical records, because reporting of those attacks is mandated by state and federal data-breach-notification laws.
But the laws we have are best described as a patchwork of requirements that is inadequate to protect consumers and burdensome for companies. Notification requirements vary across the 47 states that have them, and federal rules are inconsistent across sectors and lack specificity. The Securities and Exchange Commission clarified in 2011 that “material” cyber-risks and intrusions must be disclosed to investors, but it didn’t offer formal guidance on what counts as “material.” That vagueness means most public companies file generic statements about cyber-risk, and many still don't disclose intrusions at all.
The Cybersecurity Act, signed into law in December as part of a broader spending bill, creates a framework for voluntary sharing of cyber-threat information. That is a significant step, but it, too, doesn't go far enough. Nothing in the law compels companies to disclose incidents or the technical details of breaches. It offers liability protection against suits arising from companies’ efforts to monitor their own networks and share threat information, but no protection for companies sued as a result of a breach.
Of course, even with liability protection in the case of a breach, companies could still suffer reputational harm by reporting it. But the benefits to society of requiring reporting would outweigh the costs to individual companies. Requiring companies to report not only incidents but also the tactics and techniques the hackers used would create greater transparency, allowing businesses, policy makers and consumers to make more informed decisions about how to manage cyber-risk. It would enable decision makers in companies and government to assess both risk and progress.
Not all incidents should be disclosed, and not all the details should be made public. We wouldn't want to give malicious hackers a road map for additional attacks. But major attacks—those with significant consequences for the economy, public health and safety, or national security—should be disclosed to the relevant government agencies and to downstream stakeholders that may be affected. Disclosure creates incentives for improvement. The alternative is to fly blind on cybersecurity, which is what we do now, without much success.
After the congressional and presidential elections in the fall, Congress and the new administration should make a serious effort to overhaul state and federal data-breach-notification laws, harmonize requirements and toughen the standard for data-breach notification.
Congress should also enact new laws encouraging more disclosure of incidents, along with the relevant technical details, so that IT vendors and companies can close the gaps and vulnerabilities attackers could use to mount similar intrusions elsewhere.
WSJ: http://on.wsj.com/1TpjGMk