Q&A: Rep. Ted Lieu
- By Adam Mazmanian
- Jul 20, 2015
Ted Lieu is that rara avis on Capitol Hill -- a member of Congress who knows what he's talking about when it comes to technology.
Lieu has a degree in computer science from Stanford, and he can dive in when questioning witnesses about the particulars of the out-of-date computer languages still in use in the federal government. He's also a lawyer, an Air Force veteran (and a member of the USAF Reserve), and a firebrand when it comes to privacy rights, as he showed in a recent hearing on encryption, when he counseled a panel of law enforcement witnesses to "just follow the damn Constitution."
As a freshman Democrat, Lieu doesn't have a lot of clout on the Hill. He occupies a cramped three-room warren of offices on a high floor of the Cannon House Office Building, where the air conditioning barely reaches. But he's making a mark as president of the small 2014 class of House Democrats, and as a vocal member of the Information Technology Subcommittee of the House Oversight and Government Reform Committee.
FCW spoke to Lieu in his office before an IT Subcommittee hearing on cybersecurity at the Department of the Interior. This is an edited transcript of that interview.
FCW: You've been outspoken in opposition to efforts by law enforcement to obtain access to encrypted communications. Computer scientists say that backdoor access, even by law enforcement, introduces risks. Do you think the people advocating for more law enforcement access understand these objections?
LIEU: I respect law enforcement. But their mission is to catch bad guys and prevent bad things from happening. Their mission is not to think about privacy, or about what could happen if you put in an encryption key. It is clear it will help them catch bad guys. But there's a whole series of other consequences, such as weakening encryption systems. And this key is really neutral -- it's just a series of ones and zeros. The computer can't tell if it's the FBI director entering this key or the leader of Hamas or a criminal. All it knows is, if this code matches, it unlocks the encryption. You already have problems with governments being able to keep secrets, so I don't see how there's a way for any government entity to have some secret encryption key that only they can enter, because eventually someone who's not the government will figure it out. The other problem -- we'd essentially be forcing a private-sector product to be weaker because of some possible chance in the future that they might need some information. That seems like a really huge and disproportionate proposal. Because now you're weakening every single iPhone on the off chance that some terrorist might use the Apple iPhone. And it's so easily defeated. As of right now, there are programs on the Internet that let two people encrypt their own communications, which already makes it so law enforcement can't tap them.
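Lieu's point that the key is "just a series of ones and zeros" can be illustrated with a toy sketch (not from the interview; the cipher and key below are purely hypothetical and not real cryptography):

```python
def toy_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher' -- illustration only, not secure cryptography.

    The function sees nothing but bytes: it cannot tell whether the FBI,
    a foreign intelligence service, or a criminal supplied the key.
    Anyone presenting the same bits gets the same plaintext.
    """
    # Repeat the key to cover the message, then XOR byte by byte.
    keystream = (key * (len(ciphertext) // len(key) + 1))[:len(ciphertext)]
    return bytes(c ^ k for c, k in zip(ciphertext, keystream))

# Hypothetical escrowed key -- whoever holds these bytes can unlock.
escrow_key = b"hypothetical-escrow-key"
ciphertext = toy_decrypt(b"private message", escrow_key)  # XOR is its own inverse
assert toy_decrypt(ciphertext, escrow_key) == b"private message"
```

The same property holds for real ciphers: possession of the key bits, not the identity of the holder, is what unlocks the data.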
FCW: On the OPM hack -- with your background in computer science, what are your impressions of the government's cybersecurity posture and overall approach to IT?
LIEU: Both the private sector and the public sector have problems with cybersecurity. You've seen it at OPM, but you've seen it at Anthem, Target, Home Depot, and on and on. This is clearly a cultural problem in the public and private sectors. Within government, I think it does vary by agency and department. The Department of Defense figured out very quickly that we are in a cyber war, and every day hackers are trying to get our sensitive data. That's why they stood up U.S. Cyber Command. That's why they put in two-factor authentication. I was active duty in the Air Force, and I'm still in the reserves. Just to use my word processor, it's not enough for me to have a login and password. I then have to stick in my physical ID card and enter a PIN. That makes it harder for foreign adversaries and domestic hackers to breach. Many of the civilian agencies, if you read the IG reports, just don't have that. In the case of OPM, for years they ignored IG report after IG report that said you need to do two-factor authentication. The last IG report, in 2014, said that OPM was in violation of the administration's own guidance. So it does vary by agency.
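The DOD setup Lieu describes -- a password plus a physical card and PIN -- is two-factor authentication: something you know combined with something you have. A minimal sketch of the general idea, using a time-based one-time code (simplified RFC 6238 TOTP) as a hypothetical second factor in place of a smart card:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, t=None, step=30, digits=6) -> str:
    """Time-based one-time password (simplified RFC 6238, HMAC-SHA1)."""
    t = int(time.time()) if t is None else int(t)
    counter = struct.pack(">Q", t // step)   # 8-byte big-endian time step
    mac = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                  # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10**digits
    return f"{code:0{digits}d}"

def login(password_ok: bool, submitted_code: str, secret: bytes, t=None) -> bool:
    """Both factors must pass: something you know plus something you have."""
    return password_ok and hmac.compare_digest(submitted_code, totp(secret, t))

# RFC 6238 test vector: secret "12345678901234567890", T=59s -> "94287082"
assert totp(b"12345678901234567890", t=59, digits=8) == "94287082"
assert login(True, totp(b"device-secret", t=1000), b"device-secret", t=1000)
assert not login(False, totp(b"device-secret", t=1000), b"device-secret", t=1000)
```

Even a stolen password is useless without the code from the physical device, which is why the IG reports pressed OPM to adopt this.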
To me, there is no reason why OPM should be protecting the database of security clearance data. Security clearances are a national security function. It is not a personnel function primarily. If you are going to have an agency with years and years of weak technology and security controls, you can do that, but you just better not store the crown jewels of American intelligence at that agency, which is what we did. That's why [Oklahoma Republican Rep.] Steve Russell and I are working on legislation to move that responsibility out of OPM and into a department that has as its mission fundamentally either national security or intelligence or homeland security.
Every time I go on reserve duty, there are multiple emails that warn me about cyberattacks. There are designated units in the Air Force that try to fool you [with phishing emails] and tell you what you did wrong. There's annual training and refresher courses on cybersecurity. There's a huge culture within the Department of Defense that tells every employee that cybersecurity is really important. You don't see that at OPM, which wouldn't necessarily be so bad if they didn't have national security information there. It would take a huge amount of resources to ensure that you never have a breach, ever.
But there may be certain databases for which you have to take that view. So the CIA can't have the view that every now and then we'll have a breach, and people will get our list of spies. Which is why they specifically refused to have their database at OPM. We have cost constraints in the federal government, so you may have to do better segmentation in figuring out which databases we're going to spend a lot of money to protect, and which ones we're going to spend a reasonable amount of money to protect. Regardless of whether you spend a reasonable amount or a lot, we have to upgrade every single system in most of the federal agencies.
FCW: Is that something that's possible in the current fiscal climate?
LIEU: I would hope so. I think Congress is now very aware of what happens when you don't do that. It's my hope we can get more funding, more resources. But on the civilian side, as opposed to the dot-mil side, you need a huge change in culture, where everybody goes to two-factor authentication. It will be annoying. It will slow things down. People are going to forget their second level authentication, and it's going to make it so that some people are not productive for half a day. But it is safer, and I think people would rather have safety.
FCW: More generally, how do you think the government does in IT delivery?
LIEU: There's something I've noticed in government that tends not to happen in the private sector. In the private sector, if, for example, Microsoft knows that the next version of Word is going to crash half the time when you launch it, they're not going to release it. They're going to test it over and over again and fix it, so that when they launch, they know it's going to be reasonably reliable. In government, for whatever reason, agencies will launch technology products or upgrades when they know they are unlikely to work.
You saw that with HealthCare.gov. The people who worked on it knew on day one that it was likely to fail, and they still did it. You see it with the Los Angeles Unified School District [which includes some of Lieu's district]. There was a program that was going to track students. There were newspaper articles where people were saying it was not ready to go. The superintendent of LAUSD decided to launch it anyway -- it's one reason he's not there anymore. The program failed. You had kids who were literally unable to go to class because no one knew where they were supposed to go. It was chaos in some of these schools. You see it in Oregon -- their Affordable Care Act website had massive problems. You see it at all levels of government. I don't, to this day, understand why government launches products without adequate testing when it knows they're going to fail. We should just stop doing that.
You know, 46 years ago we sent men to the moon and brought them back. So it's not as if the federal government can't do immensely complicated projects. What you're seeing now is a culture issue. A lot of folks just don't understand how difficult technology can be if you don't do it right. But it's also a matter of expertise. You don't need agency leaders to have computer science degrees, but you would want their CIO to have a very good grasp of cybersecurity and other technology matters, to have the authority to implement policies, and then to have consequences when those are not implemented.
If you look at the multiple IG reports on OPM, it's pretty damning. However, you're going to see different agencies with similar IG reports, identifying either materially weak systems or seriously deficient systems. That's a problem. You need people to have policies and the authority to fix those, and consequences if that doesn't happen. Otherwise, why have an inspector general?