In case you didn’t know, many of the rules that built the Internet started as RFCs, or requests for comments. These were online discussions in which proposals were made, discussed, and (sometimes) adopted as official rules. Each one has a number, and many Internet technologies you’ve heard of (or at least use every day) have their origins in one RFC or another. For example, the original email protocol was defined in RFC 196; the original hypertext transfer protocol (HTTP), which all websites use some version of, was first formalized in RFC 1945; and version 4 of the Internet Protocol (IPv4), which specifies the network addressing system that essentially every network-connected device uses, came out of RFC 791. But the Internet Engineering Task Force, which oversees the development of these RFCs, is not without a sense of humor. Every so often, the IETF puts out a joke RFC. Enter: RFC 3514, otherwise known as the “evil” bit specification.
The (humorous) purpose of the “evil” bit was for individuals with malicious intent to mark their online traffic accordingly, so that firewalls, servers, and other systems could safely guard against it. For obvious reasons, nobody ever actually sets it in practice, except in jest.
It’s an ironic bit (pun definitely intended) of Internet humor, as a common problem with new technologies is that the people developing them tend not to think about how they could be misused. Designs tend to focus on what is novel, innovative, or potentially revolutionary about the new algorithm, gadget, or system. Less often is attention paid to how this invention might be used to harm others.
This is where two recent examples come in. One involves a popular programming platform called Node.js. Node has a sizable, vibrant community, and members of that community share code with one another via the Node Package Manager, or npm. Unlike most other software platforms, however, npm’s philosophy is that code modules should be kept very small and single-purpose. This lets people mix and match dozens or even hundreds of so-called “microlibraries” to build their applications. To make things easier for developers and users, npm can automatically install any modules a given Node program depends on. Developers are encouraged to contribute their own modules to npm, too, and plenty of them have obliged. One such developer, Azer Koçulu, had 273 npm modules to his name. I say “had” because, in a fit of pique over a frankly uninteresting trademark dispute, Koçulu decided to take his ball and go home. He used npm’s “unpublish” feature, meant to let developers remove clutter, unused modules, and other dead weight, to remove every last one of his 273 modules from npm’s service.
It just so happened that one of those modules was a mere 17 lines of code and was crucial to the operation of numerous large (and small) projects. Its abrupt removal broke those projects: the code they depended on could no longer be obtained via npm. This debacle was described, perhaps hyperbolically, as “breaking the Internet.” It wasn’t quite that bad; the dependencies were sorted out relatively quickly and new versions were put in place to cover what Koçulu deleted. But it exposed a crucial oversight in npm’s design: the developers behind it failed to account for bad intentions. There can be little dispute that Koçulu intended to harm npm, Inc. (the company behind npm) in retaliation for a perceived slight. But countless users of npm, and the projects that rely on it, were collateral damage, too.
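To give a sense of just how small a load-bearing microlibrary can be, here is a hypothetical sketch in the same spirit as the module in question (this is an illustration, not the actual unpublished code): a single string-padding function, published as its own package.

```javascript
// A hypothetical npm microlibrary: pad a string on the left to a
// target length with a fill character. Tiny, single-purpose code
// like this ended up as a dependency of thousands of projects.
function leftPad(str, len, ch) {
  str = String(str);
  ch = ch || ' ';
  while (str.length < len) {
    str = ch + str;
  }
  return str;
}

// In an npm package, this one function would be the entire module.
module.exports = leftPad;
```

When something this trivial sits at the bottom of a deep dependency tree, unpublishing it breaks every build that transitively depends on it.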
In the end, npm updated its policies to disallow unilateral unpublishing, which is probably how it should have been all along.
Another recent example comes from Microsoft’s latest foray into the world of artificial-intelligence Twitter bots. No doubt the designers of the “Tay” bot thought it would be cute to create an automated Twitter account that emulates the textual mannerisms and personality of a teenage girl and have it interact with and learn from other users. But, again, nobody seemed to think about how others might abuse such an outlet. The jerks of the Internet, true to form, took the opportunity to teach Tay about Hitler, racism, pornography, and genocide. She began spouting racial slurs and Holocaust denial while simultaneously begging Twitter users for sex. Of course, phrasing it that way suggests Tay is alive and has human motivations; it doesn’t. It’s just a program, and one whose designers didn’t bother to undertake the basic due diligence of filtering out racial slurs and other abusive language. Word filters have been a common feature of Web applications since the mid-to-late 1990s; they would have been a bare minimum, but it somehow didn’t occur to Microsoft’s engineers.
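The kind of filter being described here is not sophisticated. A minimal sketch, assuming nothing about how Tay was actually built, might look like this: check incoming text against a blocklist before the bot learns from it or repeats it. (The blocklist entries below are placeholders; a real deployment would use an actual, much larger list.)

```javascript
// Minimal word-filter sketch: reject text containing blocklisted terms.
// 'badword1' and 'badword2' are placeholder stand-ins, not a real list.
const BLOCKLIST = ['badword1', 'badword2'];

function isAcceptable(text) {
  const lowered = text.toLowerCase();
  // Reject if any blocklisted term appears anywhere in the text.
  return !BLOCKLIST.some(word => lowered.includes(word));
}
```

A naive substring check like this has well-known false positives and is easy to evade with misspellings, but even this would have caught the verbatim slurs Tay was fed.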
Beyond spewing random garbage, Tay was also used to attack individuals. At the behest of those polluting her program, she called independent game developer (and frequent target of online harassment) Zoe Quinn a “stupid whore.” In this age of GamerGate, of doxxing, of endless online harassment, what company with any sense would unleash an unmonitored, unfiltered AI bot and put it at the tender mercies of random Internet users? Microsoft, at least, apologized for the incident and took Tay offline within hours of her going haywire. But this was a lesson that shouldn’t have had to be relearned.
No matter what it is, no matter how it’s intended to be used, anything put into the hands of unaccountable Internet users is going to be abused. “Netiquette” is a nice idea, and everyone should do what they can to follow it, but it’s by no means a binding contract or enforceable law. People who design applications, gadgets, and other technologies to be used by others must consider how those things could be used maliciously, as well. Good design isn’t just what brings the “wow” and “cool” factors; it’s what protects innocent people from the potential harms of your brilliant invention.
Update: Microsoft brought Tay back online, only for the bot to immediately begin tweeting about smoking drugs in front of the police, followed by a spammy meltdown. It’s been taken offline again. Give it up, Microsoft!