By Heather Hamilton, contributing writer
A team of researchers from the University of Washington has shown that it is possible to encode malicious software into strands of DNA: when a gene sequencer reads and analyzes the doctored strands, the resulting data becomes a program that can corrupt the gene-sequencing software and take control of the underlying computer.
While the technique sounds impractical for spy or criminal use today, the researchers warn that it could become a more realistic threat over time as DNA sequencing becomes more common. The research also represents what Wired calls a “sci-fi feat of sheer hacker ingenuity.”
Tadayoshi Kohno, lead researcher and University of Washington computer science professor, says that if an adversary has control over the data that a computer is processing, it could lead to a takeover of the computer. “That means [that] when you’re looking at the security of computational biology systems, you’re not only thinking about the network connectivity and the USB drive and the user at the keyboard but also the information stored in the DNA they’re sequencing,” he said. “It’s about considering a different class of threat.”
DNA samples come from outside sources, making them difficult to vet and leaving them potentially open to forward-thinking hackers. Kohno’s team believes that hackers could use fake blood or saliva samples to gain access to university computers, steal information from police labs, or infect genome files in research facilities, potentially exposing valuable intellectual property or tainting genetic analysis in criminal cases. Code placed in the DNA of genetically modified products could also serve as a way to protect trade secrets.
The Atlantic reports that it cost $100 million to sequence one human genome as recently as the early 2000s. Now, it costs less than $1,000, and portable sequencers are readily available. The threat isn’t immediate, but it is potentially significant.
“There are a lot of interesting — or threatening may be a better word — applications of this coming in the future,” says project researcher Peter Ney.
To conduct their research, the team wrote an exploit known as a buffer overflow, which stuffs more data into a region of a computer’s memory than that region was meant to hold, spilling the excess, including malicious commands, into adjacent memory.
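As a rough illustration only, and not the researchers’ actual code, a classic buffer overflow can be as simple as copying input of unchecked length into a fixed-size buffer, as in this hypothetical C fragment:

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical sketch of an overflow-prone routine: the buffer holds
     * 32 bytes, but strcpy() copies however many bytes the input contains,
     * so a longer "DNA read" spills past the buffer into adjacent memory. */
    void process_read(const char *read)
    {
        char buffer[32];
        strcpy(buffer, read);      /* no bounds check: the overflow point */
        printf("processing: %s\n", buffer);
    }

    int main(void)
    {
        /* A read longer than 32 bytes overwrites memory beyond the buffer. */
        process_read("ACGTACGTACGTACGTACGTACGTACGTACGTACGTACGTACGTACGT");
        return 0;
    }

In a real attack, the bytes that spill past the buffer would be crafted so that they redirect the program rather than simply crashing it.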
A DNA sequencer works by mixing chemicals that bind in different ways to DNA’s units of code (the chemical bases A, T, G, and C) and give off different colors of light, which are captured in images of the DNA molecules. To speed up the process, the images of millions of bases are split into thousands of chunks that are analyzed in parallel. This means that, for the research team’s hack to work, their entire payload had to fit into a few hundred bases to increase the chance of it remaining intact throughout the parallel processing.
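Because DNA has four bases, each base can carry two bits of information, so a short binary payload maps onto a short strand. The mapping below (A=00, C=01, G=10, T=11) is one hypothetical assignment shown for illustration; the encoding the researchers actually used is not described here.

    #include <stdio.h>

    /* Hypothetical 2-bits-per-base mapping: A=00, C=01, G=10, T=11. */
    static const char BASES[4] = { 'A', 'C', 'G', 'T' };

    /* Encode bytes as DNA, four bases per byte, most significant bits first. */
    void bytes_to_dna(const unsigned char *data, size_t len, char *out)
    {
        for (size_t i = 0; i < len; i++) {
            for (int shift = 6; shift >= 0; shift -= 2)
                *out++ = BASES[(data[i] >> shift) & 0x3];
        }
        *out = '\0';
    }

    int main(void)
    {
        const unsigned char payload[] = { 0x90, 0x90, 0xCC };  /* toy bytes, not real shellcode */
        char dna[sizeof(payload) * 4 + 1];
        bytes_to_dna(payload, sizeof(payload), dna);
        printf("%s\n", dna);   /* 3 bytes -> 12 bases */
        return 0;
    }

At this rate, a payload of a few hundred bases corresponds to only a few dozen bytes of code, which is part of what makes the attack so constrained.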
Along the way, the researchers discovered that they needed to maintain a particular ratio of Gs and Cs to As and Ts to ensure stability, because DNA depends on a regular proportion of A-T and G-C pairs. And while buffer overflow exploits frequently rely on repeated strings of data, repeated sequences here caused the DNA strand to fold in on itself. Because of this, the group rewrote their exploit code a number of times so that it could survive as DNA. Eventually, their efforts produced attack software that could survive the translation from physical DNA into the digital format, known as FASTQ, used to store the DNA sequence. When the FASTQ file is compressed, it hacks the compression software via the buffer overflow exploit, breaking out of the program and into the computer’s memory.
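That base-composition constraint can be checked mechanically. The sketch below computes the fraction of G and C bases in a candidate strand; the 40% to 60% acceptance window is an assumed range for illustration, not a threshold published by the researchers.

    #include <stdio.h>
    #include <string.h>

    /* Fraction of G and C bases in a sequence; per the researchers, an
     * unbalanced ratio undermines the strand's stability, and repeated
     * sequences can make it fold in on itself. */
    double gc_fraction(const char *seq)
    {
        size_t len = strlen(seq), gc = 0;
        if (len == 0)
            return 0.0;
        for (size_t i = 0; i < len; i++)
            if (seq[i] == 'G' || seq[i] == 'C')
                gc++;
        return (double)gc / (double)len;
    }

    int main(void)
    {
        const char *candidate = "GCAAGCAATATA";   /* toy strand from the example above */
        double gc = gc_fraction(candidate);
        /* Assumed 40-60% GC acceptance window, for illustration only. */
        printf("GC content %.0f%% -> %s\n", gc * 100.0,
               (gc >= 0.40 && gc <= 0.60) ? "plausible" : "needs re-encoding");
        return 0;
    }

A payload that fails a check like this would have to be re-encoded, which is essentially the rewrite-and-retry loop the team went through by hand.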
The attack fully worked approximately 37% of the time; in the remaining runs the payload was cut short or decoded backward, problems the team suggests could be worked out over time.
Luckily, the research team believes that real-world instances of DNA hacking are at least years away. The team also had the luxury of taking shortcuts where it suited them, such as modifying open-source code to insert a flaw that allowed a buffer overflow. Malicious hackers operating outside a research setting would have no such advantages.
Sources: University of Washington, Wired, The Atlantic