Cybersecurity Information Sharing Act (CISA): Privacy, Implementation and the Zombie Apocalypse - Fordham Intellectual Property, Media & Entertainment Law Journal

Cybersecurity Information Sharing Act (CISA): Privacy, Implementation and the Zombie Apocalypse


Over the summer, a series of cyberattacks on private companies and the government prompted discussion of several bills in Congress.1 On Tuesday, October 27th, the Senate moved forward with a 74-21 vote on one of them (S. 754, the Cybersecurity Information Sharing Act of 2015, known as CISA), which is expected to be merged with several House bills and soon be ready for the President's signature. The bill sets in motion a chain of events that would "automatically" start when a cyberattack occurs at a company or a government agency.

When an attack happens, the victim's technology systems will send a prompt about the incident to government computer systems shared jointly by the Director of National Intelligence, the Secretary of Homeland Security, the Secretary of Defense, and the Attorney General. That prompt would also include redacted information about the affected data. The government systems would take that information in, determine what was attacked based on what was sent, and "immediately" notify other companies about the attack. Those companies would then process that information right away to determine whether there is anything more they need to do to protect their own data.
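To make the chain concrete, here is a minimal sketch of what that relay might look like. Everything here is an assumption for illustration: the bill does not specify a message format, and the `IncidentReport` fields, company names, and `broadcast` function are all hypothetical.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class IncidentReport:
    """Hypothetical shape of the 'prompt' a hacked company's systems might send."""
    company_id: str
    attack_type: str        # e.g. "credential_theft", "sql_injection"
    affected_systems: list  # which internal systems were hit
    summary: str            # redacted description of the affected data

def broadcast(report: IncidentReport, recipients: list) -> list:
    """Sketch of the government hub: serialize the report and re-send it
    to every participating company."""
    payload = json.dumps(asdict(report))
    return [(recipient, payload) for recipient in recipients]

alerts = broadcast(
    IncidentReport("company-x", "credential_theft", ["hr-db"], "[REDACTED]"),
    ["company-a", "company-b"],
)
```

Even this toy version surfaces the open questions the bill leaves to the agencies: who defines the fields, what counts as "redacted," and how recipients are supposed to act on the payload.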

In some ways, this sounds a lot like what happens when something goes wrong on my own computer while I'm working in Microsoft Word or some other app. A big alert message suddenly pops up (usually after I have not saved my work for a while) and tells me that something unexpected has happened and Word will now close. Right after that, it asks whether I want to send the error information to the company so it can "help diagnose and prevent the problem in the future." Whether I say yes usually depends on my mood and how many times the crash has occurred. (And I'm still not convinced information is not sent even when I say no.) Let's say I say yes; then some fancy arrows on the screen start to move to imply information is being sent. Now, what information are they sending? The error message? The set of tasks I was doing before the crash? Log files? My files? This is not really clear, and different companies have different processes.

CISA implies that there will be one set of information and codes sent from the hacked company to the government and then on to the rest of the companies involved. Each of those companies has different information and different systems. But let's assume some really smart people have already worked out the codes and formats to do this (banks, for instance, have their own systems for passing money among themselves, and large energy companies do too, so it's not improbable, though even those two groups do not use the same methods).

But now the question is: what are they actually sending? When that bad message pops up saying something is wrong in the system, what does the government expect the companies to send? The law implies that these are details for the agencies to sort out. Fair enough; that's what agencies do. So now the agencies will work together to decide what data should be collected from all these various companies when something bad happens. But computers don't just do something when you ask them; you have to actually code them to make these processes happen. First you have to go into your systems and find the places where issues happen, then write code to translate and map those error messages to the codes the government will understand, and then you have to put all of that code in place everywhere it's needed so the required information gets sent to the government (and don't forget to encrypt it in transit, so someone does not hack that too).
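The translation step described above might look something like this in miniature. The internal error strings and the "GOV-" indicator codes are invented for illustration; no such code list exists in the bill, and real transport would happen over an encrypted channel such as TLS rather than in this sketch.

```python
# Hypothetical table mapping one company's internal error strings to a
# shared indicator code the government systems would understand.
# Every company would have to build and maintain its own version.
ERROR_CODE_MAP = {
    "auth failure: too many attempts": "GOV-BRUTE-FORCE",
    "unexpected outbound traffic spike": "GOV-EXFILTRATION",
    "malformed input in login form": "GOV-INJECTION",
}

def translate(internal_message: str) -> str:
    """Map an internal error message to a shared code. Anything the
    programmers did not anticipate falls through to a catch-all."""
    return ERROR_CODE_MAP.get(internal_message.lower(), "GOV-UNKNOWN")
```

The `.get()` fallback is doing a lot of quiet work here: every error the mapping table's authors did not foresee collapses into one uninformative catch-all code, which is exactly the problem the next paragraph describes.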

And what about those times when the only error message you get is that the application quit unexpectedly and there is no error code? Or worse, when you are talking to tech support and they realize your error code is the catch-all: something-happened-somewhere-the-programmer-did-not-identify-there-would-be-a-problem-and-yet-there-was. Because, let's face it, often you can predict an error or an issue. But when something new happens, the system needs new code before it can respond to the new situation.

All of this is also true on the other side of the connection. Though the government agency systems will define how the communications between systems work, the unhacked companies will also need to process the messages coming in and react to them. I sure hope the government programmers are succinct and descriptive with the error details they send out, or there will be a lot of confused systems receiving messages along the lines of "Something bad just happened at company X." (It sounds like the real coding effort should go into keeping a list of competitors and immediately texting that message to the CEO.) And what about when the 100th error of the day comes in? Like the notifications on your iPhone, eventually it's fun to just see how high the number gets before you go in and "Mark All Read."
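The receiving side's alert-fatigue problem can be sketched too. This `AlertInbox` class is purely illustrative, an assumption about what a company's inbound-alert handler might degenerate into: tallying alerts like notification badges rather than acting on each one.

```python
from collections import Counter

class AlertInbox:
    """Sketch of the receiving side: repeated alerts just increment a
    counter, like an unread-notification badge, until someone clears it."""

    def __init__(self):
        self.counts = Counter()

    def receive(self, message: str):
        """Tally an inbound alert instead of reacting to it."""
        self.counts[message] += 1

    def mark_all_read(self) -> int:
        """Clear everything at once; return how many alerts were dismissed."""
        total = sum(self.counts.values())
        self.counts.clear()
        return total

inbox = AlertInbox()
for _ in range(100):
    inbox.receive("Something bad just happened at company X")
cleared = inbox.mark_all_read()  # dismisses all 100 at once
```

If the government's messages are vague, this is the rational response: the 100th identical alert carries no more information than the first.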

Most of the talk about CISA revolves around the privacy questions involved in moving all the affected data to the government agencies (including our current main intelligence agencies leading the collection efforts) and what they might do with the information once they have it. It should be no secret what will most likely happen: the same thing that is already happening. The government is reading our data and doing nefarious things with it, and no one is shocked by this anymore. Old news. (You can also cleanly send data without PII (Personally Identifiable Information) in order to diagnose issues. It would be interesting to see what data the agencies DO actually decide to communicate, and THEN see whether there is still cause for privacy concern.)
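Sending diagnostic data without PII is a solved problem in principle. Here is a minimal scrubbing sketch; the two patterns (email addresses and US Social Security numbers) are just examples, and a real redaction pipeline would need a far broader ruleset than this.

```python
import re

# Hypothetical scrubber: replace common PII patterns before anything
# leaves the company. Illustrative only; real redaction needs much more.
PII_PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),   # email addresses
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),       # US SSNs
]

def scrub(text: str) -> str:
    """Replace each matched PII pattern with a neutral placeholder."""
    for pattern, placeholder in PII_PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

scrub("Breach report: jdoe@example.com, SSN 123-45-6789 exposed")
# -> "Breach report: [EMAIL], SSN [SSN] exposed"
```

Whether the agencies actually require this kind of scrubbing before data is shared is exactly the open question that would determine how serious the privacy concern is.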

This bill is also being pitched as a way to help companies prevent hacks in the future. The reality? Companies get hacked because they are not keeping proper track of their systems, and they get stuck when things go south. But instead of requiring them to put proper security procedures in place, this bill asks companies to spend time on a massive coding effort to send out error messages and to process incoming error messages when issues happen at other companies. Great. Companies that did not prioritize security before will now have to take time to code for a government regulation, and still will not have time to fix the security issues they face. In essence, all the government is asking for is notice that an issue has occurred. Who is actually going to diagnose the problem and fix it so we can stop getting those pesky error messages in the first place?



Natasha Lennard, The Lesson of CISA’s Success, or How to Fight a Zombie, The Intercept (Nov. 3, 2015, 2:19 PM), [].

Luke Winkie, Everything You Need to Know About the Recently Passed, Privacy-Decimating CISA Bill, Vice (Nov. 3, 2015), [].

Andy Greenberg & Yael Grauer, CISA Security Bill Passes Senate With Privacy Flaws Unfixed, Wired (Oct. 27, 2015, 5:30 PM), [].

Sam Thielman, Senate Passes Controversial Cybersecurity Bill CISA 74 to 21, The Guardian (Oct. 27, 2015, 5:29 PM), [].

Jose Pagliery, Senate Overwhelmingly Passes Historic Cybersecurity Bill, CNN Money (Oct. 30, 2015, 1:57 PM), [].

Jonathan Keane, The CISA Act Passed Last Week and Here’s Why You Should Care, Paste Magazine (Nov. 2, 2015, 2:00 PM), [].

Cat Zakrzewski, Senate Passes Cybersecurity Threat Sharing Bill That Tech Hates, TechCrunch (Oct. 27, 2015), [].


The Bill:

Cybersecurity Information Sharing Act of 2015, S. 754, 114th Cong. (1st Sess. 2015), [].



Kathy Walter

Kathy Walter is a product and brand manager turned law student. She has spent almost two decades creating and launching products for firms like Instinet, Gillette, Procter & Gamble, Iron Mountain, the NYC Department of Education, Macmillan New Ventures, and now for her own company, Nsoma. Kathy is a 2L law student studying Education and IP/Information Law at Fordham.