Chris Palmer, Security Engineer, Google Chrome
Chris works at Google as a software security engineer on Chrome, where he focuses on the security of Chrome for mobile platforms (Android and iOS) and on duct-taping over the foibles of the web PKI. Prior to Google, Chris was the Technology Director at EFF, a security engineering consultant at iSEC Partners, and a web developer. Majoring in linguistics and in French literature prepared him well for these careers, weirdly. Chris is a Mentor at Hackbright Academy.
As a Hackbright student or alumna, you probably plan to participate in building the foundation of our shiny new automated world. (Thanks for joining us! We need you.)
Software, firmware, and computing hardware underlie essentially all aspects of our society — the safety systems in our cars (and trains, and airplanes), our financial system, critical infrastructure like energy and water purification, our healthcare system, and our culture. Even hand-crafted clothing is sold on Etsy and is made of cotton spun by a robot.
But it’s not enough that our infrastructure merely work. It has to work well and reliably under all kinds of pressure: human error (operator — and developer!), bad weather, bad luck, radio interference, hardware failure, network outages, criminal malfeasance. Even war.
Security engineering requires adopting a new mindset, at once cautious and conservative, yet also willing to calculate risks and experiment. Either perspective on its own is not enough; we must be of two minds to succeed.
Software security engineers are the professional pessimists who insist that Twitter must encrypt and authenticate all its network traffic even though it might seem less important than, say, banking. (Ironically, we then beg and plead with banks to adopt security at least as good as Twitter’s.) We worry about how impossible it is to audit the hardware which we have to assume is safe. Normal people see a TV, but we see Winston Smith’s telescreen. We are those annoying friends who remind their co-workers that computers cannot, in fact, correctly add two numbers together (not without significant help, at least).
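To make that last jab concrete, here is a minimal sketch in C (the names are mine, not anything from the post) of what the "significant help" looks like. Ask a 32-bit machine to add two numbers whose true sum needs 33 bits, and it silently hands back the sum modulo 2^32; getting a *correct* answer means checking for overflow yourself, before it happens:

```c
#include <stdint.h>

/* 4000000000 + 4000000000 = 8000000000, but a 32-bit unsigned add
   wraps modulo 2^32 and quietly yields 3705032704 instead. */

/* The "significant help": refuse to add when the result won't fit.
   Returns 1 and stores the sum on success, 0 on overflow. The check
   uses only arithmetic that cannot itself overflow. */
static int checked_add(int32_t a, int32_t b, int32_t *sum) {
    if ((b > 0 && a > INT32_MAX - b) ||
        (b < 0 && a < INT32_MIN - b)) {
        return 0;  /* the true sum does not fit in 32 bits */
    }
    *sum = a + b;
    return 1;
}
```

(Signed overflow in C is not merely wrong, it is undefined behavior, which is why the check has to come first.)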
Software security engineers are the professional optimists who try to make computers work safely in spite of Murphy’s best efforts — we will try to program Satan’s computer. We dream of a world in which robot cars tell each other only the truth about their position and speed. We dream of a world in which credit card and ATM fraud is mere statistical noise. We dream of a world in which your phone is really off when you turn it off, and which keeps your communications with your doctor confidential when it is on.
We dream of a world in which books cannot be burned.
If you’re interested in security engineering (and I hope you are, even if you don’t choose to make it your specialty), you can get involved at any point in your career. One of the best ways to get started is — as always — simply getting your hands dirty.
* Use Wireshark to learn what is happening on your network, and learn about the structure of network packets and connections.
* Use an HTTP proxy like Burp to learn what your browser is saying to web servers, and learn what it takes to intercept encrypted communications.
* Check out Michal Zalewski’s excellent Browser Security Handbook to learn why, exactly, the nytimes.com web site cannot read your Gmail. (Hopefully.)
* If you’re interested in cryptography, an excellent beginning book is Cryptography Engineering by Ferguson, Schneier, and Kohno.
* It’s important and hilariously fun to learn the C programming language, and to learn how C programs can go so badly wrong. Get your hands dirty with a debugger and disassembler, and learn what the machine is really doing.
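For one taste of how C programs go so badly wrong, consider the classic footgun (a toy sketch of mine, not an example from the post): a fixed-size buffer plus an unchecked `strcpy`. Copying an attacker-supplied string that is longer than the buffer writes past its end, smashing whatever sits next to it in memory, which is the root of a huge fraction of historical exploits. The bounded version below can truncate, but it cannot overflow:

```c
#include <stdio.h>
#include <string.h>

/* Unsafe pattern: strcpy(dst, name) into a fixed-size dst writes past
   the end of dst whenever name is too long -- a stack buffer overflow.
   Bounded pattern: snprintf writes at most dstlen bytes and always
   NUL-terminates (when dstlen > 0), so the worst case is truncation. */
static void greet(char *dst, size_t dstlen, const char *name) {
    snprintf(dst, dstlen, "hello, %s", name);
}
```

The language will not enforce this discipline for you; step through both versions under a debugger and watch what a too-long `name` does to the stack.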
This post originally appeared on Chris Palmer’s blog.