Fencing the Frontier (II)

After getting over my irritation (and mild embarrassment) that the Senate legislative content server doesn’t even have a proper domain name – and that their cybercrime bill 2796 comes in so close on the heels of the United States’ abortive SOPA and PIPA legislative disasters – my next gut response was “okay, just what have our legislators got up to now?”

I’ve been a consumer of Internet content for over a decade, and a programmer and Web developer for almost a decade; and SB 2796 troubles me in a number of ways. I’ll walk through the sections of the bill that bother me.

Cybercrimes against things

SB 2796 is not about your right to use electronic communication with, say, reasonable expectations of privacy. The Declaration of Policy (Section 2) emphasizes protection of computer systems

from all forms of misuse, abuse, and illegal access by making punishable under the law such conduct or conducts.

It is, straightforwardly, about the State adopting “sufficient powers to effectively prevent and combat such offenses by facilitating their detection, investigation, and prosecution at both the domestic and international levels.”

I dislike the imprecision of the definitions in Section 3. Take one mild example, the definition of a “service provider”:

  • any public or private entity that provides to users of its service the ability to communicate by means of a computer system, and
  • any other entity that processes or stores computer data on behalf of such communication service or users of such service.

Now, am I a service provider if I operate a weblog where my readers are allowed to post comments and converse among themselves? How about when I set up a chat service for my friends on my rented web server – are my ISP and I both “service providers”? I was hoping to set up a site that’s a kind of mashup between Mathoverflow and Wolfram Alpha – will that make me a “service provider”? These questions matter when reading Chapter II, “Punishable acts”, where it is made lawful for a service provider to intercept, use, or otherwise disclose the content of activity on its service.

There are a few more bits in Section 3 – about electronic interception, subscriber information (basically any information related to a service user’s location, service details, and billing history), “traffic data or non-content data” (data about a communication other than its content), and that all-important modifier phrase, “without right” – that are germane to the discussion of punishable acts, which I’ll spell out in detail later.

It is Section 4 that defines three categories of “cybercrime” offenses: those that detrimentally affect the confidentiality of data and the reliability of computer systems; computer-assisted fraud and forgery; and content-related offenses.

We Ownses Your Datas

The first category includes two specific offenses, illegal interception and misuse of devices, both of which are of interest to users and, crucially, to system administrators. Illegal interception is

The intentional interception made by technical means without right of any non-public transmission of computer data to, from, or within a computer system including electromagnetic emissions from a computer system carrying such computer data: Provided, however, That it shall not be unlawful for an officer, employee, or agent of a service provider, whose facilities are used in the transmission of communications, to intercept, disclose, or use that communication in the normal course of his employment while engaged in any activity that is necessary to the rendition of his service or to the protection of the rights or property of the service provider, except that the latter shall not utilize service observing or random monitoring except for mechanical or service control quality checks. [Emphasis mine].

It is striking that nothing is said about restrictions on how that data may be handled or used; there is no provision for how particular kinds of system data should or should not be handled. Shouldn’t the bill say explicitly what is broadly prohibited to be done with system users’ data? Shouldn’t information about third parties be protected (for example, by encrypting it) while it is in the possession of the service provider?

I’ll give you an example: as a computer system administrator I may make a USB flash drive copy of a database table containing my subscribers’ transactions on my site, and transfer it to a different (perhaps offline) system inside my company as part of my daily routine. Then I can take the flash drive home, can’t I? Nothing wrong there. What if that database table contains our customers’ credit card billing data – and I lose the flash drive? No problem! It’s a sad loss, but completely legal for my company to be operating without data integrity safeguards. Or I could pass it on to a senior manager, who’ll be converting those tables into mailing lists of our highest-spending site users, which we can sell on to e-marketing firms. All completely legal, as this is done as part of my duties as an employee.
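None of this would be technically hard to prevent. Here is a minimal sketch of the kind of safeguard I mean – my own illustration, not anything in the bill, and it assumes the third-party Python cryptography package and invented file names – in which the export is encrypted before it ever touches removable media, so that a lost flash drive exposes only ciphertext:

    from cryptography.fernet import Fernet

    # The key must live somewhere other than the flash drive --
    # an operations key vault, for example, never beside the data.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    # Hypothetical export file produced by the daily routine described above.
    with open("transactions_export.csv", "rb") as src:
        plaintext = src.read()

    # Only the encrypted copy goes onto the removable drive.
    with open("transactions_export.csv.enc", "wb") as dst:
        dst.write(cipher.encrypt(plaintext))

Losing the stick then costs the company a cheap piece of hardware, not its customers’ card numbers.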

(Update, 25 September 2012: The Data Privacy Act of 2012, specifically Section 11 (General Data Privacy Principles) through Section 16 (Rights of the Data Subject), addresses this issue and specifies the scope of responsibility of so-called “personal information processors.” More on this in later posts.)

There are simply no good business or economic reasons to claim that improving the security around citizens’ data is infeasible, too costly, or too complicated. We largely have the software and semiconductor industries to thank for this. While data encryption, for example, is still compute-intensive, it is no longer as costly as it was only a few years ago. What I mean is that there are programming practices and techniques for, say, writing Web site business logic, or for designing complex desktop programs, that can improve privacy and security using encryption techniques which, only a few years ago, would have required faster processors or more memory. Well, guess what? Today we have those faster processors and more memory.
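To put a rough number on that claim, here is a toy benchmark of my own (it assumes the third-party Python cryptography package is installed): encrypt a megabyte of data and report how long it took. On recent commodity hardware the answer is typically a handful of milliseconds – noise next to the cost of a database query or a network round trip.

    import os
    import time

    from cryptography.fernet import Fernet

    cipher = Fernet(Fernet.generate_key())
    payload = os.urandom(1024 * 1024)  # 1 MB of arbitrary data

    start = time.perf_counter()
    token = cipher.encrypt(payload)    # AES encryption plus an authentication tag
    elapsed_ms = (time.perf_counter() - start) * 1000

    print(f"Encrypted {len(payload)} bytes in {elapsed_ms:.1f} ms")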

Methodology for computing systems design has evolved to meet both time-to-market pressures and software quality goals. There are programming and system administration practices that can be put in place to foil casual data theft by system operators and employees, and a cybercrime law could reasonably mandate them as part of a service provider’s obligations. Our collective know-how has evolved to the point where we can reasonably require third parties that handle citizens’ data to do so with sufficient safeguards for privacy and against unwanted use of that data.
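To make that concrete, here is one such practice sketched in Python (the column and file names are hypothetical, not taken from any real system): before the transaction table from my earlier example is handed to anyone outside operations, drop the card numbers entirely and replace direct identifiers with a salted one-way hash, so the rows can still be analyzed but can no longer be turned into a saleable mailing list.

    import csv
    import hashlib

    SALT = "per-deployment-secret"  # kept out of the extract, like the encryption key earlier

    def pseudonymize(value: str) -> str:
        """Swap an identifier for a salted one-way hash: rows stay linkable
        for analysis, but the raw value cannot be read back without the salt."""
        return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:16]

    with open("transactions_export.csv", newline="") as src, \
         open("analytics_extract.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=["user_ref", "amount"])
        writer.writeheader()
        for row in reader:
            writer.writerow({
                "user_ref": pseudonymize(row["email"]),  # no usable address leaves operations
                "amount": row["amount"],                 # the card number column is simply dropped
            })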

As it stands now, the definition of “illegal interception” provides an implicit, blanket license to “service providers” to do practically anything they want with the information that enters their domain – which may not be the same as what the owner or subject of that data might want.


Possession of ‘ping’ a punishable offense?

This innocuous-sounding definition of “Misuse of devices”, Section 4A(5), is as useless as it is poorly thought out, because it ignores the “dual use” nature of most computing equipment and software. We need only point to two phenomena to demonstrate why: commodity software (including open source software), and malware.

The main point about software being a commodity (free or paid for) is simply that software is ubiquitous; and the crucial thing about commodity operating systems is that each of them – be it Windows, OS X, or Linux – ships with a whole bucket of tools that can be used to find out things about other networks or computers, usually by interacting with them in some way. These tools can be used to analyze, for example, a target Internet site: to find out whether it’s visible on the ’Net, learn what software the site is running, learn its vulnerabilities, and so on – and thence defend it, or attack it. Many software tools used by IT professionals cut both ways: as diagnostic tools, for instance, or for illegal interception. Photoshop, the picture-processing tool beloved of web designers and graphic artists the world over, can likewise be used for digital image forgery.
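A few lines of Python make the “dual use” point concrete (the host name below is a placeholder): the very same snippet is a legitimate diagnostic when pointed at a server I administer – is it up, and what does it claim to be running? – and the first step of reconnaissance when pointed at somebody else’s. Nothing in the code distinguishes the two uses; only authorization and intent do.

    import socket

    HOST, PORT = "example.com", 80  # substitute a host you administer -- or one you don't

    # Connect and ask for the response headers only.
    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        sock.sendall(b"HEAD / HTTP/1.0\r\nHost: " + HOST.encode("ascii") + b"\r\n\r\n")
        reply = sock.recv(4096).decode("ascii", errors="replace")

    # If a 'Server:' header comes back, it names the software the site is running --
    # handy for patching your own machine, or for looking up someone else's weaknesses.
    print(reply)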

Worse, there are inevitable defects, particularly in new software, that make it prone to being used to attack the machine on which it runs. This is why the newest software isn’t always the best thing to have running on your computer, and why Windows XP is still a better choice for privacy- and security-conscious computer users (at least, those with no choice but to run Windows) – it has simply been through a lot more “consumer testing” and accumulated more bug fixes than the newest iteration of that Microsoft operating system.

Which brings us to software written for purposes the computer’s owner does not intend – malware. Anybody who’s been bitten by malware knows the signs: odd behavior from the trusty desktop; increased traffic and reduced Internet access speed; possibly even lost or corrupted files. It is very likely that many more computer users have been afflicted by these pieces of rogue software – passed on by a coworker’s USB stick or, more commonly, downloaded from the Internet – and aren’t even aware that their computers have been compromised.

It would be more useful to specify the creation and dissemination of malware as a punishable act. Otherwise, mere possession of a computer containing ping, telnet, nmap, wireshark, dig, or tcping, let alone socat (a general-purpose network socket tool), puts the holder at risk of falling foul of the law. The committee that drafted this Bill needed to go get a clue, perhaps starting with watching a TED Talk or two, and get a grip on this simple idea: electronic communication devices and software are “dual use” tools. If government intends to be up to the task of prosecuting misuse of these tools, it would do well to specify who and what need protecting, rather than merely naming a broad class of dual-use technology. It could have done better by identifying, generically, breaches of information systems that put life, property, and rights at risk, rather than the tools with which those risks may be created.