
Flawed Concept: Security Through Annoyance

Electronic security is important. Just as we have grown to accept intrusive airport security as part of a valid effort to protect travelers from getting blown up by Islamic fascists and other terrorists, we have also grown to accept passwords, PINs, verification codes, CAPTCHAs, and more in the name of electronic security. In this Internet age, you are more likely than ever to be a victim of credit-ruining identity theft. These inconveniences help to protect you.

Electronic security is also important for the government and businesses. Having worked on Department of Defense web sites, one of which included private (though non-classified) information, I’m well aware of the precautions taken on the systems side. DoD sites have to pass the Defense Information Assurance Certification and Accreditation Process (DIACAP), be compliant with the DoD Public Key Infrastructure, and more. When you combine all of this with web browsers and servers that support SSL encryption and rational user account/access policies, it’s pretty tough for bad guys to get information they shouldn’t have access to.

The problem is that increasing security—at least when it directly inconveniences the user—comes with diminishing returns. Extreme security requirements (like those often mandated in government settings) often result in a less secure technological infrastructure.

Complex passwords are a prime example. It makes sense to require that passwords exhibit some complexity, or else you end up with a situation where most people have passwords that are really easy to guess. In 2001, when I worked at a civilian federal agency (non-DoD), probably 80 percent of my federal and contractor co-workers had a password that was either their first name, their last name, a well-known nickname, or the word ‘password’. But it’s just as easy (and misguided) to go too far in the opposite direction. Many government agencies and private businesses now have complex password requirements. For example, each password may be required to be at least 10 characters in length, include at least 2 lowercase letters, 2 uppercase letters, 2 numbers, and 2 symbols/punctuation, and be significantly changed (i.e., not just a couple characters different) regularly—sometimes every 30 or 60 days.
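A policy like the one described above boils down to a handful of counting rules. Here’s a minimal sketch in Python of what such a checker might look like; the thresholds are the example numbers from the paragraph above, not any particular agency’s actual policy, and the function name is my own invention:

```python
import string


def meets_policy(password: str) -> bool:
    """Check a password against the example policy described above:
    at least 10 characters, including at least 2 lowercase letters,
    2 uppercase letters, 2 digits, and 2 symbols/punctuation."""
    if len(password) < 10:
        return False
    lower = sum(c.islower() for c in password)
    upper = sum(c.isupper() for c in password)
    digits = sum(c.isdigit() for c in password)
    symbols = sum(c in string.punctuation for c in password)
    # Every category must appear at least twice.
    return min(lower, upper, digits, symbols) >= 2
```

Under these rules, ‘password’ fails on every count, while a string like ‘LiMorK83$%’ passes—which is exactly the problem discussed next: the passwords the system accepts are the ones nobody can remember.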

A password of ‘password’ might be really easy to crack, but a password of ‘LiMorK83$%’ is essentially impossible to remember. It’ll be even worse 60 days later when, just as you’re starting to remember ‘LiMorK83$%’, you have to change your password to something else like ‘R8^6@ggToc’. So what do people do? They write the password down and put it somewhere near their computer. You can’t assume anymore that government passwords will be ‘password’, but you can usually assume it’s written down somewhere nearby (even though that’s technically against policy—password complexity is enforced by the system, but nobody is rummaging through desks to check on compliance with the other policies).

Another example is the User Account Control system introduced in Windows 6.0 (Vista). It’s not a bad idea to make sure that users approve of potentially system-damaging tasks before they happen, and in that respect it’s quite an improvement over Windows 5.1 (XP). But the UAC prompts (unlike similar prompts in Linux and Mac OS X) occur far too often—sometimes multiple times for a single task—and serve to annoy much more than they protect. As a result, many frustrated users disable the UAC prompts entirely. After disabling UAC you no longer get pestered, but it’s much easier for malicious software to damage your system.

Not only can onerous security policies undermine system security, but they can also interfere with general work/task performance. In my DoD content management work, our client pushed about two years ago to relocate me and my team to a government office (we had been doing our work from a private company office). The servers we were managing content on were located at a Navy-owned hosting facility, and the office we were slated to move to was an Army-operated facility. We wrote up a list of the software we would need to edit web pages, test them, and post them to the server—and, in addition, the network ports and protocols we would need to be able to use to contact and work with the server.

The answer: No. Opening ports was against security policy. Using SSH/SFTP protocols was against security policy. Installing server software (even for testing) on desktop machines was against security policy. Installing web browsers other than Internet Explorer (like Opera and Firefox—again, for testing) was against security policy. Installing web development and management software like Adobe Dreamweaver, WinSCP, or PuTTY was against security policy (unless they had been through a lengthy review and testing process).

So what did we do? Well, we stayed at the corporate office. Because it would be a violation of security policies to do our job on a government-controlled network, we took our job to a corporate network that—even if it was secure—had not been audited or reviewed by the government for security.

Not only was there a potential security risk (though not a practical one, since I know to take appropriate security precautions), but had the circumstances been different the ruling from the Network Czars would have stopped us in our tracks. What if the government had required in the contract that our tasks were to be performed on-site? Then we would literally have been unable to do the job we were contracted to do.

Ultimately, the valid need for security must be balanced against practical concerns. Security through annoyance is not, in the long term, effective—primarily because people quickly find ways to bypass or minimize annoyance. Neither is security through complete lock-down effective, since a complete lock-down negates all benefit of networking and connectivity and people revert to insecure transfer methods. It is not uncommon in government offices for users to burn data to a CD (since thumb-drives are often prohibited) and walk it across the building to a colleague, since the network does not allow (again, for security reasons) the easy transfer of large files through email or file sharing. This ‘sneakernet’ method of file transfer negates any in-built network security since it bypasses the network entirely, and the loss of the CD—through insecure destruction/disposal or accidentally taking it out of secure premises—is likely a much greater security risk than somebody hacking the network/email system.

So what’s the solution? Personally, I am a big fan of biometric validation. A thumbprint reader is more secure than a password of any kind, and thumbprint validation is as easy as putting your thumb on a reader. This makes security easy for the users without making it any less secure. I strongly recommend that the government move to biometric validation, preferably some kind of standardized system that can be implemented throughout the government and the private sector, and eliminate passwords altogether. Once a sufficient level of systemic security is in place, then network administrators can be a bit more lenient in their policies about software, ports, and so on so people on government networks can get some work done. This is where the dollars should be going, not toward million-dollar software to block people from making lunchtime visits to MySpace.com.

Scott Bradford has been putting his opinions on his website since 1995—before most people knew what a website was. He has been a professional web developer in the public- and private-sector for over twenty years. He is an independent constitutional conservative who believes in human rights and limited government, and a Catholic Christian whose beliefs are summarized in the Nicene Creed. He holds a bachelor’s degree in Public Administration from George Mason University. He loves Pink Floyd and can play the bass guitar . . . sort-of. He’s a husband, pet lover, amateur radio operator, and classic AMC/Jeep enthusiast.