Is Science Mostly Driven by Ideas or by Tools?

December 16, 2012

This week’s issue of Science has a pair of interesting essays looking at the history of science and what drives progress in scientific research.  In the first essay, Freeman Dyson examines the question through the lens of two books: Thomas Kuhn’s The Structure of Scientific Revolutions and Peter Galison’s Image and Logic.  Whereas Kuhn argues that new ideas and paradigms (e.g. relativity and quantum theory) drive science forward, Galison sees the invention of new technologies (in particular the move from analog technologies to digital technologies) as having shaped the evolution of science.  Dyson falls more on the side of Galison, arguing that new technologies drive the creation of new ideas (e.g. the invention of the steam engine led to the development of thermodynamics) and that new tools enable the discoveries that create new paradigms:

The great recent discoveries in the physical sciences were dark matter and dark energy, two mysterious monsters together constituting 97% of the mass of the universe. These discoveries did not give rise to new paradigms. We cannot build paradigms out of ignorance. The monsters were discovered by using the new tools of astronomy, wide-field cameras, and digital data processing. We must study the monsters patiently with new and more precise digital tools before we can begin to understand them. Galisonian science will continue to explore, with constantly evolving tools, the structures of space and time and galaxies and particles and genomes and brains.

On the other hand, Sydney Brenner looks at the history of biology in a companion essay and comes to the opposite conclusion:

We can now see exactly what constituted the new paradigm in the life sciences: It was the introduction of the idea of information and its physical embodiment in DNA sequences of four different bases. Thus, although the components of DNA are simple chemicals, the complexity that can be generated by different sequences is enormous. In 1953, biochemists were preoccupied only with questions of matter and energy, but now they had to add information. In the study of protein synthesis, most biochemists were concerned with the source of energy for the synthesis of the peptide bond; a few wrote about the “patternization” problem. For molecular biologists, the problem was how one sequence of four nucleotides encoded another sequence of 20 amino acids.

In essence, the revolution in biology was sparked by the simple idea that DNA encodes information.  Prior to the discovery of the double helix in 1953, biologists had been studying the physical and chemical properties of chromosomes, trying to understand how they could make a cell.  After 1953, however, the ideas born from Watson and Crick’s paper focused experimenters’ attention on the important question of how the information within DNA gets read.  Indeed, the answer to that question forms the central dogma of modern molecular biology.

As a biologist myself, I tend to side more with Brenner.  While it is true that new tools are essential for making new discoveries in science, new ideas are even more important because they define the important questions in science.  Indeed, today new tools such as next-generation sequencing technologies are helping make new discoveries, but in many cases our ability to generate data outpaces our ability to understand it.

Of course, in the end Kuhnian science (driven by ideas) and Galisonian science (driven by tools) constantly feed back upon one another.  New tools can generate new findings that challenge old ideas, perhaps sparking the Kuhnians to seek new ideas.  These new ideas, in turn, generate new questions that the Galisonians can answer with new tools.  This feedback is already apparent in the various efforts to map the connections between all of the neurons in the brain (to create the so-called connectome).  The effort is a purely Galisonian one, to be driven forward by new microscopy technologies and data analysis techniques, but understanding how to interpret the connectome and make any sense of the data will require the invention of new paradigms for how we think of the brain.

Building New Forms of Life

July 16, 2011

Even though life on Earth spans creatures as diverse as bacteria and humans, at its core, all life on Earth is pretty much the same. Life as we know it uses the same four bases to store information in DNA, the same 20 amino acids to build proteins, and the same genetic code to convert DNA sequences into protein sequences1. While astrobiologists and many others dream of finding “life as we don’t know it” – life built from different components and chemistries – some scientists are taking a different approach. Instead of searching for such organisms, they are actively trying to build and engineer unnatural forms of life: organisms that build their DNA from different components, use amino acids not found in nature, and have an altered genetic code.

In addition to being a great achievement in synthetic biology, creating these unnatural forms of life would have many practical applications. Because the genetic material of these organisms would have fundamental differences in how it is built and read, their DNA would make no sense to our cells and vice versa. This genetic firewall separating the unnatural organisms from all other life on Earth would be a great advantage for biomanufacturing purposes; the engineered organisms could not productively exchange genetic material with the outside world and would be resistant to all known viruses. The last point is particularly important, as there have been cases where a rogue virus sneaked into the fermentors at a biomanufacturing plant, forcing the plant to shut down and decontaminate, resulting in millions of dollars in damages and lost productivity.

Two recent papers, taking two different approaches, report progress toward this goal of constructing unnatural life. The first focuses on creating bacteria that use a different building block in their DNA, while the second reports progress on rewriting a bacterial genome to introduce new amino acids and fundamentally change the organism’s genetic code.

A team of researchers led by Rupert Mutzel of the Free University of Berlin has gotten a strain of E. coli to swap the DNA base thymine for an unnatural base, 5-chlorouracil. 5-chlorouracil is actually quite similar to thymine; where thymine has a methyl group hanging off of the base, 5-chlorouracil has a chlorine atom. In the past, other researchers have grown cells in the presence of thymine analogs like 5-chlorouracil and found that these unnatural bases can substitute for up to ~90% of the thymines in DNA. Getting cells to completely switch to the unnatural base has, however, been much more difficult. To coax the bacteria into incorporating only 5-chlorouracil, the researchers devised a clever approach to evolve a population of bacteria capable of surviving in the presence of the unnatural base and the absence of thymine. While slowly ramping up the concentration of the unnatural base and ramping down the concentration of thymine, they carefully monitored the growth rate of the cells. If the cells started dying, they injected a “relaxing medium” to increase the concentration of thymine and allow the cells to begin growing again. Once the growth rate passed a certain threshold, they injected more of the unnatural base to stress the cells and kill off those that could not utilize it as efficiently. At the end of their evolution experiment, the bacteria had accumulated over 1500 mutations (who knows how many actually help the bacteria tolerate the unnatural base) and were capable of surviving in media containing only the unnatural base and no thymine. The thymine content of the evolved bacteria’s DNA was just above the limit of detection at 1.5% (the residual thymine may be due to unidentified thymine synthesis pathways in the bacteria).
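The logic of this selection scheme amounts to a simple feedback loop, which can be sketched in a few lines of Python. Everything here is illustrative: the function name, the threshold, and the growth-rate numbers are my own placeholders, not parameters from the actual experiment.

```python
def evolution_schedule(growth_rates, threshold=0.5):
    """Given a series of measured growth rates, decide which medium to
    inject at each step: 'stress' (more 5-chlorouracil, less thymine)
    while the culture is coping, 'relax' (more thymine) when it is dying.
    The threshold and rate values are illustrative placeholders."""
    actions = []
    for rate in growth_rates:
        if rate >= threshold:
            actions.append("stress")  # push harder on the unnatural base
        else:
            actions.append("relax")   # rescue the population with thymine
    return actions

# A healthy culture gets stressed further; a failing one gets rescued.
print(evolution_schedule([0.8, 0.2, 0.6]))  # -> ['stress', 'relax', 'stress']
```

The key design point is that selection pressure is applied only as fast as the population can bear it, so the culture is always near, but never past, the edge of survival.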

Of course, these bacteria are nowhere near being classifiable as an unnatural form of life. The change to the structure of their DNA is very slight, and furthermore, they can still use the natural base thymine if it is provided. To classify this organism as unnatural life, the researchers would have to continue evolving the bacteria until they could no longer recognize thymine at all. It is unclear whether the current technique would be suitable for this goal. Furthermore, while their evolutionary technique is powerful and elegant, it is unclear whether it could be used to engineer more drastic changes into the structure of an organism’s DNA, such as an altered sugar-phosphate backbone.

While Mutzel’s team used an evolutionary approach to engineer their organism, a team led by Farren Isaacs at Harvard and Peter Carr at MIT used a high-powered genome engineering tool to rewrite the genome of E. coli. Specifically, they set out to remove one type of codon from the E. coli genome. Remember that when the cell reads DNA in order to make proteins, it reads the DNA in three-letter words known as codons, each of which corresponds to a particular amino acid. The table that matches these codons with amino acids is known as the genetic code. Since there are 64 possible three-letter codons but only 20 different amino acids, the genetic code has a lot of redundancy. For example, there are three codons that specify the end of a protein: TAA, TAG, and TGA. Because the E. coli genome contains only 314 TAG stop codons, the team set out to replace all of them with TAA. To do this, they divided the E. coli genome into 32 segments, each containing about 10 TAG codons, and used their genome engineering tool to create 32 strains, each with about 10 TAG-to-TAA substitutions. Next, they developed a clever large-scale genome stitching tool to combine the mutations from these different strains. So the 32 strains with 10 mutations became 16 strains with 20 mutations, then 8 strains with 40 mutations, and then 4 strains with 80 mutations.
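The two halves of this strategy, the synonymous codon swap and the pairwise merging rounds, can be illustrated with a short Python sketch. The sequences and function names below are hypothetical examples of mine, not code or data from the paper.

```python
def replace_tag_codons(gene):
    """Walk the reading frame three letters at a time and rewrite every
    in-frame TAG stop codon as TAA; both mean 'stop', so the encoded
    proteins are unchanged."""
    codons = [gene[i:i + 3] for i in range(0, len(gene), 3)]
    return "".join("TAA" if codon == "TAG" else codon for codon in codons)

def merge_rounds(n_strains, muts_per_strain, rounds=3):
    """Each round of pairwise merging halves the strain count and doubles
    the mutations per strain: 32x10 -> 16x20 -> 8x40 -> 4x80."""
    history = []
    for _ in range(rounds):
        n_strains //= 2
        muts_per_strain *= 2
        history.append((n_strains, muts_per_strain))
    return history

print(replace_tag_codons("ATGTAG"))  # -> ATGTAA (in-frame TAG is rewritten)
print(replace_tag_codons("TTAGGG"))  # -> TTAGGG (out-of-frame TAG untouched)
print(merge_rounds(32, 10))          # -> [(16, 20), (8, 40), (4, 80)]
```

Note that only codons read in frame count: the letters T-A-G straddling a codon boundary are not a stop signal, which is why the replacement must respect the reading frame.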

This stage is, unfortunately, where the paper ends (this is one of the few scientific papers that actually ends with a cliffhanger!). It is unclear whether the genome stitching tool that they developed can stitch together the large fragments of the genome needed to create the final bacterium lacking any TAG codons, or whether they will need to develop a different tool for the final stitching steps. However, once they create this bacterium lacking TAG codons, they can then begin thinking about reassigning the now unused TAG codon to a different amino acid, such as an unnatural amino acid not found in any other organism. Furthermore, the technologies they have developed could allow them to completely rewrite the genetic code of the organism and create an unnatural form of life behind a genetic firewall.

While neither of these two papers actually reports the creation of an unnatural form of life, they present powerful new tools that will greatly aid these endeavors. While the goal of creating unnatural life may seem a mere academic exercise, the tools these scientists have developed will certainly find many uses in the field of synthetic biology and help scientists solve practical problems in biomanufacturing and other related areas.

Further Reading:
Marlière et al. (2011) Chemical evolution of a bacterium’s genome. Angew Chem Int Ed Engl. doi: 10.1002/anie.201100535

Isaacs et al. (2011) Precise manipulation of chromosomes in vivo enables genome-wide codon replacement. Science 333: 348. doi: 10.1126/science.1205822

A piece by New Scientist on rewriting the genetic code to put organisms behind a genetic firewall.

1 There are a few organisms that have already replaced the TAG stop codon with a 21st amino acid, but the great majority of organisms use only 20 amino acids. There are also a number of species with slight differences in their genetic code, but most organisms use the same genetic code.

Hello world!

February 26, 2010

Welcome to This is your first post. Edit or delete it and start blogging!