One of the things that my group is working on – in collaboration with David Schwab at Princeton and Mo Khalil's lab at the BU Center for Synthetic Biology – is thinking about thermodynamics, information, and learning in cells. In particular, we are starting to ask how the laws of physics, and thermodynamics especially, shape and constrain the ability of bacteria and other cells to perform computations and learn about the world around them.
I was trained as a condensed matter theorist, and when I started thinking about biological systems, one of the things that really struck me was the amazing computations that single-celled organisms can do.
We are familiar with the idea of computations and computers – but largely in the context of physical devices. After all, most of us spend way more time than we probably should sitting in front of our computers or using our smartphones.
What’s less obvious is that cells are also continuously performing computations. If we think about what bacteria have to do, it’s clear that they really are performing what can genuinely be thought of as computational tasks. For example, a common task cells face is estimating the concentration of some environmental signal. This signal could be some kind of food, a metabolite, or a repellent released by a predator. After a little thought, it’s pretty easy to convince oneself that cells can’t directly measure the concentration of a molecule, especially when that molecule is present at low concentrations. The way cells get around this problem is by implementing a statistical procedure that estimates the concentration indirectly – they actually compute these statistical quantities.
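To make this concrete, here is a toy sketch in Python of the statistical idea: a receptor is either bound or unbound at any instant, and by averaging many noisy occupancy snapshots a cell can invert the binding curve to infer a concentration it cannot read off directly. The simple binding model, the function name, and all the numbers are my own illustrative assumptions, not the actual biochemistry:

```python
import random

def estimate_concentration(c_true, K_d=1.0, n_samples=10_000, seed=0):
    """Toy model: a single receptor is bound with probability p = c / (c + K_d).

    The cell cannot measure c directly; instead it averages many independent
    occupancy snapshots to get p_hat, then inverts the binding curve.
    """
    rng = random.Random(seed)
    p = c_true / (c_true + K_d)
    bound = sum(rng.random() < p for _ in range(n_samples))  # noisy snapshots
    p_hat = bound / n_samples
    return K_d * p_hat / (1.0 - p_hat)  # invert p = c / (c + K_d)

print(estimate_concentration(0.5))  # close to 0.5, with statistical error
```

The point of the sketch is just that the estimate converges on the true concentration only through averaging – fewer samples means a noisier answer, which is exactly why this is a statistical computation rather than a direct measurement.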
This leads to a series of fascinating questions about how cells compute and how cells can learn about their environments. A lot of people have been tackling these questions. Perhaps the people thinking about them the hardest are synthetic biologists. A central goal of synthetic biology is to design sophisticated biological circuits that can perform complicated “computing-like” behaviors. Synthetic biologists have designed gene circuits with a wide range of functionalities, from switches, oscillators, and counters to basic logic elements such as AND and OR gates. For this reason, there is renewed interest in really thinking about how the biochemical networks in cells allow them to do elaborate computations.
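To give a flavor of what a genetic logic gate looks like on paper, one common caricature models each transcription factor's effect with a Hill function and treats an AND gate as a promoter that needs both inputs to be active. This is a deliberately simplified sketch – the Hill parameters and function names are my own illustrative choices, not a specific published circuit:

```python
def hill(x, K=1.0, n=2):
    """Hill activation curve: promoter activity as a function of inducer level x."""
    return x**n / (K**n + x**n)

def and_gate(a, b):
    """Toy genetic AND gate: the output promoter requires both transcription
    factors, so its activity is roughly the product of two Hill activations."""
    return hill(a) * hill(b)

# The output is high only when BOTH inputs are well above the threshold K:
for a, b in [(0.1, 0.1), (10, 0.1), (0.1, 10), (10, 10)]:
    print(f"a={a}, b={b} -> output {and_gate(a, b):.2f}")
```

Real implementations are of course messier – leaky expression, crosstalk, and cell-to-cell variability all matter – but the sketch captures why such circuits can be composed into larger "computing-like" networks.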
As synthetic biology starts to assemble increasingly complex gene networks that exhibit robust, predictable behaviors, several natural questions arise: What are the physical limitations (thermodynamic and kinetic) on the behavior and design of these gene circuits? How do these laws constrain the inventory of potential biological components and modules? This is where we physicists have a lot to offer.
Back in 1961, Rolf Landauer, working at IBM at the dawn of modern computing, asked similar questions about physical computing devices. Landauer derived a theoretical lower limit on the amount of energy that must be dissipated by computation – specifically, by logically irreversible operations such as erasing a bit. In particular, he showed that there is a fundamental relationship between “information” – an extremely abstract concept – and very physical thermodynamic quantities such as energy and entropy.
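Landauer's limit is a one-line formula: erasing a single bit at temperature T must dissipate at least k_B T ln 2 of energy. A quick back-of-the-envelope calculation shows just how tiny this bound is at room temperature:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact value in SI units)

def landauer_bound(T=300.0, bits=1):
    """Minimum energy in joules dissipated when erasing `bits` bits at temperature T."""
    return bits * k_B * T * math.log(2)

print(f"{landauer_bound():.3e} J per bit at 300 K")  # about 2.9e-21 J
```

Real transistors (and, as far as we can tell, real biochemical networks) dissipate many orders of magnitude more than this per operation, which is part of what makes the bound such an interesting target.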
What’s interesting is that, unlike in 1961 when Landauer was thinking about these things, we now have direct experience with these ideas. When we do a particularly computationally intensive task, such as streaming a high-resolution movie, we know that computers tend to get hot. This heating reflects the production of entropy during the computing process. And intuitively we know that the more intense the computation – the more bits that must be erased and written to memory – the more energy it will consume.
The genius of Landauer is that he recognized that this energy consumption and entropy production are fundamental to the logic of computation itself: thermodynamic irreversibility is a direct consequence of logical irreversibility. In particular, it does not matter what kind of computing device one is considering, whether a physical computer like a laptop or a biological circuit.
Today, we are at the dawn of the synthetic biology era. Engineers are now starting to construct biochemical circuits that can do complex computations. In this exciting time, it makes sense to revisit Landauer’s seminal insights in the context of biology. If we take seriously the idea that bacteria and other cells are doing computation, then we know that there must be a fundamental relationship between the ability of cells to do computations and thermodynamic quantities such as energy consumption and entropy production.
Over the last two years, my group, along with David Schwab at Princeton, has been exploring these questions in the context of one of the simplest biological problems – estimating the concentration of an external signaling molecule. Using methods from nonequilibrium statistical mechanics, we have shown that learning about an external concentration necessarily requires the cell to consume energy. This is the biological manifestation of Landauer’s principle! Furthermore, we have shown that, at least within a certain class of biochemical networks, learning more information about the environment requires more energy. We are in the process of writing up this latter work.
We believe that this is a manifestation of a general principle: we conjecture that no matter the implementation, whether physical or biological, “learning more” requires more energy. We think that this should be true even for non-biological systems. For this reason, we believe that thinking about information and computation in cells is likely to yield new insights into the somewhat mysterious relationship between information and thermodynamics.
More practically, we have also started working with Mo Khalil at the BU Center for Synthetic Biology (of which I am also a member) to think about how we can use thermodynamic principles to design better synthetic circuits. Can we design synthetic circuits that actively consume energy to increase their computational power? In general, it is our belief that by thinking about the constraints placed by thermodynamic considerations such as energy and entropy, we can start designing and building more sophisticated synthetic circuits.