Rethinking Encryption

Posted by Staci R Norman on Friday, October 16, 2009

As information such as passwords and account numbers moves from computer to computer across the Internet, it is encrypted, or jumbled into an unreadable form.

Encryption occurs when two parties—your personal computer and Amazon.com, for example—connect and exchange a “key.” Your browser then encrypts the information you want to send (like a credit card number), it flows in a jumble to Amazon, and then Amazon decrypts it based on the key that the two computers exchanged.
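To make the idea concrete, here is a minimal Python sketch using the third-party cryptography package. It is a simplification: both sides simply share one symmetric key, whereas a real browser connection uses TLS, which negotiates keys through a handshake rather than sending them around directly.

    # Illustrative only: one shared symmetric key stands in for the key the
    # two computers exchange. Requires the third-party "cryptography" package.
    from cryptography.fernet import Fernet

    shared_key = Fernet.generate_key()        # the "key" both sides agree on
    cipher = Fernet(shared_key)

    # The browser side: the credit card number leaves as an unreadable jumble.
    token = cipher.encrypt(b"4111 1111 1111 1111")
    print(token)

    # The server side: the same key turns the jumble back into the original.
    print(cipher.decrypt(token))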

You know your information is encrypted when you see the small lock icon in your Web browser or when “https” appears in the Web address.

Unfortunately, attackers can intercept encrypted data as it flows through the system and use it maliciously. They can also go after encrypted data where it sits in storage, whether that is on a server you maintain or on one that lives elsewhere (for example, in a huge Google server farm). The latter arrangement, dubbed “cloud computing,” is becoming more and more common and will require changes in the way data is protected.

“Many of the assumptions of the past are changing rapidly in the face of new systems such as cloud computing,” says Assistant Professor of Computer Science Brent Waters.

[Image: Cloud computing]
Computing in the cloud means users connect to and store information on servers outside their immediate control. For example, people can rent server space from Amazon.com. Recently, the City of Los Angeles made headlines as it considered outsourcing its server resources to the cloud (in that case, to Google.com). Facebook, too, runs in the cloud.

In the cloud, encrypted data sits on a server right next to space that others have access to.

“When I encrypt all the information on my hard drive and then my machine gets stolen, that’s one kind of problem,” says Waters. “With cloud storage, this information will be stored on third party servers, and encryption is going to be very important for that.”

Waters is working toward a new paradigm that he and his collaborators call “functional encryption.”

[Image: Assistant Professor Brent Waters]
Traditional encryption involves knowing the identity of every individual who is allowed to see certain data. For example, Brent Waters may have access to items A and B, while Vitaly Shmatikov has access to B and C.

Functional encryption, says Waters, will work by giving people the ability to decrypt information based on their attributes, not just their identity.

For example, city police share a lot of sensitive information about cases within their organization and with other law enforcement agencies. At present, people are given access to this information based on their identity, not necessarily the role they play in an investigation.

Functional encryption could help an organization implement an overall policy (for example, ‘only undercover detectives can see this information’), so that as agents join a case (or leave it) they automatically gain (or lose) access to certain privileged information. The key used to unlock an encrypted record would be based on multiple, dynamic criteria rather than on a static name or ID number.
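As a rough illustration, the Python sketch below models the idea of a key whose attributes must satisfy a policy before decryption succeeds. The attribute names, the policy, and the helper functions are invented for this example; a real functional-encryption scheme enforces the policy cryptographically, not with an ordinary program check.

    # Conceptual sketch only: the attribute names and policy are made up, and
    # the "decryption" here is a placeholder for the real cryptographic step.

    def satisfies_policy(attributes):
        # Example policy: "only undercover detectives assigned to case 42."
        return {"undercover", "detective", "case-42"} <= attributes

    def try_decrypt(key_attributes, ciphertext):
        # An agent's key carries attributes; decryption succeeds only if they
        # satisfy the policy attached to the record.
        if satisfies_policy(key_attributes):
            return ciphertext   # placeholder for the actual decryption
        return None

    record = b"sensitive case file"
    print(try_decrypt({"undercover", "detective", "case-42"}, record))  # recovered
    print(try_decrypt({"patrol", "officer"}, record))                   # None

Granting or revoking an attribute on an agent’s key, rather than editing a list of names, is what lets access track the role someone plays in an investigation.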

“We need to re-envision what we really mean by encryption,” says Waters.

Waters also recently worked with Emmett Witchel on a project related to self-destructing data. Once material is posted or sent over the Internet, there is often no way to delete it permanently, which leaves people’s information vulnerable to privacy breaches. No paper shredder or lighted match exists to erase digital data from history.

But researchers from the University of Washington built a program, called “Vanish,” that promised to make tagged computer data, such as emails and photographs, disappear after about eight hours.

Waters, Witchel, graduate student Owen Hofmann, and postdoctoral researcher Christopher Rossbach showed (unfortunately) that Vanish didn’t work. They created a program called “Unvanish” that makes vanished data recoverable long after it should have disappeared.

“Our goal with Unvanish is to discourage people from relying on the privacy of a system that is not actually private,” says Witchel, assistant professor of computer science.

Waters adds, “Messages that self-destruct at a predetermined time would be very useful, especially where privacy is important, but a true self-destruction feature continues to be challenging to provide.”

From: In Computers We Trust? | TexasScience | By: Lee Clippard | Posted: Friday, October 16, 2009