If user ignorance is the most obvious cybersecurity vulnerability, is a public awareness campaign the best solution? Or just a waste of money?
Security officials will tell you that the biggest vulnerability in most systems is user ignorance. Hackers are often able to penetrate networks by exploiting well-known flaws in commercial software — flaws for which fixes are readily available if only people would take the time to download the patches or reset their security settings.
Systems administrators would love to find a hidden switch in the human brain that would make users care about security. But the fact of the matter is that technology is not the solution for ignorance; education is.
Consider McGruff the Crime Dog. For three decades now, the cartoon bloodhound has been the icon for a public relations campaign to encourage people to keep an eye out for suspicious goings-on — to “take a bite out of crime.”
Should the federal government launch a new PR campaign to raise awareness and protect itself, the public and the economy from cyber warfare? That is the question we put to the readers at FCW.com and GovLoop — and to Robert Dix.
Dix is vice president of government affairs at Juniper Networks. He was previously staff director for the House Government Reform Committee's Technology, Information Policy, Intergovernmental Relations and the Census Subcommittee.
Click through the following pages to see the expert view and comments from the community.
Cybersecurity: Who Will Be the Lead Dog?
By Robert B. Dix Jr.
The challenge of protecting information and information systems is not new. What is new is that society is being impacted in a different and more sinister manner than ever before. As a result, we must now be thinking about unprecedented ways to protect ourselves and our information assets.
The problem is that many people have no idea how to protect themselves against these growing cyber threats, including identity theft — a consumer fraud and burgeoning underground criminal activity that is costing Americans billions of dollars a year. Loss of intellectual property and trade secrets threatens our economic security and even our national security.
It’s been estimated that as many as 80 percent of exploitable vulnerabilities would be mitigated with basic cybersecurity hygiene, such as patching, antivirus updates, password management and so on. These simple tasks do not require huge investments or large information technology staffs, but they do require greater awareness and education.
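One of those hygiene items — password management — can be illustrated with a short sketch. The function and the tiny common-password list below are hypothetical stand-ins (real screening lists, derived from breach data, contain millions of entries), not part of any standard cited in this article:

```python
# Illustrative sketch: reject passwords that appear on a common-password list.
# COMMON_PASSWORDS is a tiny stand-in for real breach-derived lists.
COMMON_PASSWORDS = {"123456", "password", "qwerty", "letmein", "111111"}

def is_acceptable(password: str, min_length: int = 12) -> bool:
    """Return True only if the password is long enough and not a known weak choice."""
    if len(password) < min_length:
        return False  # too short to resist guessing
    if password.lower() in COMMON_PASSWORDS:
        return False  # appears on the common-password list
    return True
```

Even a check this simple would screen out the passwords attackers try first; the point of the hygiene argument is that such measures cost almost nothing to adopt.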
During his historic speech on cybersecurity in May 2009, President Barack Obama called for a national public awareness and education campaign as one of 10 short-term action items. Who will lead that effort? Who will be the public face that raises awareness for home users, small businesses, nonprofits and computer users of all ages to improve their cyber hygiene and raise the bar of protection for all of us?
History provides us with plenty of examples of failed efforts to legislate or regulate personal behavior. In fact, during my time on Capitol Hill, I saw many examples of a rush to pass legislation that produced unintended consequences, and the cure itself was often worse than the disease.
Instead, we need to do a better job of educating folks about cyber threats, risks, vulnerabilities and consequences — and most importantly, what each of us can do to improve our cybersecurity.
Over the years, we have relied on Smokey Bear to help us learn how to prevent forest fires, McGruff the Crime Dog to raise our awareness about crime prevention, and the Drug Abuse Resistance Education program to warn kids and parents about the dangers of drugs. The ongoing successes of Smokey, McGruff and DARE were achieved after children began to remind their parents of the basic crime and fire prevention tactics that these friendly but persuasive figures taught them.
With the continued leadership of the White House and the help of a McGruff-like figure, we can forge a national campaign that includes our preschool, K–12 and higher education students; provides enhanced training and awareness for employees across the public and private sectors; and recruits Internet service providers in a collaborative partnership that routinely reminds us all to better protect ourselves from cyber thieves and miscreants. The Homeland Security Department has made a good start with its National Cybersecurity Awareness campaign.
So as the president directed, let’s get to it. Hey, it might even put a little fun into cybersecurity for a change.
[Editor's note: Comments have been edited for length, clarity and style.]
What foolishness! Of course, attackers are going to pick the lowest-hanging fruit first. But patching is not easy or affordable for resource-scarce shops. The IT security industry preaches this pap constantly. A well-patched system is not a secure system; it is only a less vulnerable system. Systems were not designed to be secure. All serious adversaries have zero-day attacks sitting on the shelf that can p0wn fully patched systems. Low-assurance systems are sitting ducks!
[Editor’s note: According to UrbanDictionary.com, “p0wn” means to “'own' a person (in video games mostly). To be considerably better than another person.”]
Security for Dummies
We have undergone public awareness programs, both internal to an organization and in open forums such as this. And yet hackers still get in, patches are applied but not properly validated, and people still make moronic mistakes.
The fact that a person can attend a user security awareness refresher class and a few minutes later make an absolutely idiotic mistake tells me that one of three things is occurring. One, there is no accountability for mistakes; if a person makes one, no penalty is applied. Two, there is no absolute requirement to verify that something has been done. Yes, patches are applied. But how many times have we applied patches only to find the same vulnerabilities showing up on the next scan? And three, there appears to have been a major decline in the use of common sense. If the No. 1 password is 123456, we have a larger problem than the need for better training.
We are vulnerable because we really have not made it a priority other than by regulation to practice solid security. Until enforcement becomes personal, all the classes and training will not solve the problem.
— Michael Bartholf
Who Am I?
A major vulnerability is the intersection of the human and digital spaces. Frequently, the attacker is invited in the front door through social engineering. There are readily available solutions to this problem, and they all involve clear and reliable identification of the person seeking entry. That cannot be accomplished with user name and password systems or biometrics alone. The crucial question is who is really using the credential or biometric: their own identity or a stolen one? That question must be answered every time. From that information, seen in context, intent can be determined and social engineering effectively defeated.
— John Ellingson
Awareness has always been an issue, at least since computing power was widely distributed, and probably always will be an issue. Those organizations/agencies that understand this are willing to continuously invest the resources to provide more than lip service to keep the awareness at a high enough level to protect all of their resources all the time. I suspect that a PR campaign will do little more than get various organizations/agencies that don't really understand the importance of this issue to provide some level of lip service to the issue and wait for the next cyberattack/PR campaign.
— Henry Brown
What a PR campaign should do is highlight a sustained, ongoing effort to educate and inform. How many times can you hear a McGruff message before you start to tune him out?
— Christa M. Miller
I agree: with our busy schedules, we will tune it out. But we need to come up with a better way to protect our government systems.
— Maria S. Wieand
News You Can Use
I like the idea but also worry it would be criticized on all sides by the "government conspiracy" folks. Internal training for federal employees could be improved. Make it less of a boring requirement and more practical and useful.
— GovLoop (Steve Ressler)
There are two laws that govern bureaucracies: Parkinson's and Pournelle's. Parkinson's Law states that bureaucracies tend to grow when an official wants to multiply subordinates, not rivals, and when officials make work for one another.
Pournelle's Iron Law of Bureaucracy states that in any bureaucratic organization there will be two kinds of people: those who work to further the goals of the organization and those who work for the organization itself. Examples in education would be teachers who work and sacrifice to teach children vs. union representatives who work to protect any teacher, including the most incompetent. The Iron Law states that in all cases, the second type of person will always gain control of the organization and will always write the rules under which the organization functions.
Too often, the default position is to do nothing. If you do nothing, then you will not have a failed project and cannot be criticized. I think that explains many of the problems with cybersecurity in government.
— Airborne All The Way
The place where I work, which has approximately 2,000 employees, has an extensive security awareness program, including monthly newsletters, intranet reminders, etc. Incidents are reported in the newsletters, and much can be learned from reading these victim stories. Using actual events as the basis for training keeps our employees more attuned to security matters. They'd rather read about the foibles of others than become a victim themselves. This doesn't mean that we are perfect. But relying on just the annual training tutorial is not enough. Security education demands real and constant feedback in order to be effective.
On the day following the Sept. 11, 2001, terrorist attacks, every federal agency in Washington greeted its badged employees with a ride through a magnetometer, and the process still goes on today. Were these badged employees ever a threat? No, but this symbolic action gave the impression that the government was in control of the situation. Lesson: When your only tool is a hammer, every problem begins to look like a nail.
In the case of cybersecurity, we all adhere to NIST 800-53 and similar guidance. By doing so, we merely ensure that our applications and infrastructure are compliant with the National Institute of Standards and Technology's guidelines, not that they are secure. Any bound, hard-copy document on cybersecurity that has gone through agency vetting and formal publication is going to be out of date by the time it is viewed and understood by its intended audience.
On the operational level, every cybersecurity office insists that the application development team develop the risk assessment, security plan and authority-to-operate package; the security office merely evaluates the completed work. These development teams are not necessarily competent to address all of the sophisticated issues surrounding deficiencies in protocols, proprietary products, operating environments and the like that are exploitable by hackers and crackers. Usually, these risk assessments and security plans are cobbled together by application developers and analysts who plagiarize other security plans that previously passed muster with the security office, without regard to the pertinence and completeness of the reused material. It has become a paperwork drill.
The truth is that we don’t have sufficient security expertise and security experts to provide a secure online environment for the entire world. The technology — and the hacker — is moving too fast for that. We would be better off to reconsider:
- Whether everything that is online has to be online.
- Whether everything that has to be online needs to be on a public network. A judicious decision here would at least reduce the vulnerability of our most precious assets.
Paying the Price
An anonymous reader at FCW.com pointed out that everyone in the Defense Department already gets annual training, but they ignore it because DOD officials “never discipline anyone for allowing breaches.” The reader’s solution was to put DOD and the rest of the federal government on its own trusted Internet for mission-critical work, “and make people walk over to a machine in the corner to interact with the outside world.”
So FCW editors put this question to readers: Would security training be more effective if people were indeed motivated by fear? Here are some responses.
Reacting to insider threats is just as important as outside threats. But implementation is a problem. Some govies in some organizations cannot be fired or disciplined, while some contractors are immediately fired for a single security violation. I think we need more lower-impact speeding tickets issued than career-killer felonies and "Bygones" (ref. Ally McBeal).
We've nearly perfected security. Every time a new e-mail comes in, Outlook stops working to scan it. It may take 10 minutes to write three lines, but our security is good.
One thing every security weenie should understand is that perfect security is attainable only by shutting down the operation you support. If your security plan is to move increments toward a shutdown of the functions of the operation you support, then you should be fired. Your job is to secure the fully functioning operation and not hobble or disable it. Get it? If you view the people you support as the enemy, go flip burgers!
M: Maybe you're on to something. If the people we support want to cause a security breach, whether by accident, deliberate ignorance or malice, who are we to stop them? You've got important work to do, and these security things are really blown way out of proportion.