Two articles

Let’s see if I can regularly update this thing. Anyway, here are two recent articles I have written. The first is an obituary for the late, great Richard Levins – The People’s Scientist. He was everything I aspire to be as a scientist. I hope I can live up to his example.


I also have a review of Mathbabe‘s excellent new book, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, coming out in In These Times.


Quanta Article on our recent preprint


Quanta Magazine just wrote a piece on our arXiv preprint relating deep learning and the renormalization group. Deep learning techniques have recently yielded record-breaking results on a diverse set of difficult machine learning tasks, including computer vision, speech recognition, and natural language processing. Deep learning is one of the most exciting new techniques to emerge for unsupervised learning, and companies such as Google, Microsoft, and Facebook have invested heavily in this field. It is commonly touted in both the popular press and the scientific community as a major breakthrough for machine learning.

Despite the enormous success of deep learning, relatively little is understood theoretically about why it is so successful at uncovering relevant features in structured data. In our preprint, we show that deep learning is intimately related to one of the most important and successful techniques in theoretical physics, the Renormalization Group (RG). This suggests that deep learning architectures may be employing a generalized RG-like scheme to learn relevant features from data! For more, check out the Quanta article and our preprint.

Vandana Shiva responds to pro-Monsanto hit piece in the New Yorker



The New Yorker recently published an extremely misleading piece painting anti-GMO activists as anti-science hacks. It focuses on the prominent anti-GMO activist Vandana Shiva and regurgitates hackneyed arguments from the biotech industry painting people who oppose GMOs as irrational Luddites who are harming the world. Luckily, Vandana Shiva has written a spirited and devastating response. Everyone should read it.

While I don’t fully agree with all of Vandana Shiva’s politics, between Monsanto and Shiva I think there is no choice at all: one is an amoral corporation that ruthlessly prioritizes profits over people; the other is a dedicated activist who has helped amplify the voices of everyday peasants devastated by neoliberal science and development policies. Give me Vandana Shiva every time!

Information, Computation, and Thermodynamics in Cells


One of the things that my group is working on – in collaboration with David Schwab at Princeton and Mo Khalil’s lab at the BU Center for Synthetic Biology – is thinking about thermodynamics, information, and learning in cells. In particular, we are starting to ask questions about how the laws of physics, and in particular thermodynamics, shape and constrain the ability of bacteria and other cells to perform computations and learn about the world around them.

I was trained as a condensed matter theorist, and when I started thinking about biological systems, one of the things that really struck me was the amazing computations that single-cell organisms can do.

We are familiar with the idea of computations and computers – but largely in the context of physical devices. After all, most of us spend way more time than we probably should sitting in front of our computers or using our smart phones.

What’s less obvious is that cells are also continuously performing computations. If we think about what bacteria have to do, it’s clear that they really are performing what can genuinely be thought of as computational tasks. For example, a common task cells face is estimating the concentration of some environmental signal. This signal could be some kind of food, a metabolite, or a repellent from a predator. After a little thought, it’s pretty easy to convince oneself that cells can’t directly measure the concentration of a molecule, especially if the molecule is present at low concentrations. The way cells get around this problem is that they implement a statistical procedure for estimating the concentration indirectly, and then actually compute these statistical quantities.
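To make the indirect-estimation idea concrete, here is a toy sketch (not the actual biochemistry of any real cell): assume a single receptor whose probability of being occupied by a ligand follows the standard binding curve p = c / (c + Kd), where Kd is the dissociation constant. A cell can’t read off c directly, but by time-averaging many noisy occupied/unoccupied snapshots it can estimate p and invert the binding curve. The function name and parameters here are purely illustrative.

```python
import random

def estimate_concentration(c_true, k_d=1.0, n_samples=100_000, seed=0):
    """Estimate a ligand concentration indirectly from noisy receptor snapshots.

    Each snapshot is a Bernoulli sample: the receptor is occupied with
    probability p = c / (c + Kd). Averaging the snapshots estimates p,
    and inverting the binding curve gives an estimate of c.
    """
    rng = random.Random(seed)
    p = c_true / (c_true + k_d)                      # true occupancy probability
    occupied = sum(rng.random() < p for _ in range(n_samples))
    p_hat = occupied / n_samples                     # time-averaged occupancy
    return k_d * p_hat / (1.0 - p_hat)               # invert p = c / (c + Kd)

# With enough samples, the indirect estimate converges to the true value.
c_hat = estimate_concentration(0.5)
```

The point of the sketch is that the cell never "measures" c; it accumulates a statistic (average occupancy) and computes the concentration from it, which is exactly the kind of statistical procedure described above.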

This leads to a series of fascinating questions about how cells compute and how cells can learn about their environments. A lot of people have been tackling these questions. Perhaps the people thinking about them hardest are synthetic biologists. A central goal of synthetic biology is to design sophisticated biological circuits that can perform complicated “computing-like” behaviors. Synthetic biologists have designed gene circuits with a wide range of functionalities, ranging from switches, oscillators, and counters to basic logic elements such as AND and OR gates. For this reason, there is renewed interest in really thinking about how the biochemical networks in cells allow them to do elaborate computations.
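As a cartoon of how a gene circuit can act like a logic gate, consider a promoter whose activity depends on transcription-factor inputs through Hill functions. This is a deliberately minimal steady-state model, not a design from any particular paper; the threshold k and cooperativity n are illustrative parameters.

```python
def hill(x, k=1.0, n=2):
    """Hill activation curve: fractional promoter activity at input level x."""
    return x**n / (k**n + x**n)

def and_gate(a, b, k=1.0, n=2):
    """Toy genetic AND gate: output requires both activators to be bound."""
    return hill(a, k, n) * hill(b, k, n)

def or_gate(a, b, k=1.0, n=2):
    """Toy genetic OR gate: either activator alone drives expression."""
    pa, pb = hill(a, k, n), hill(b, k, n)
    return pa + pb - pa * pb    # probability that at least one site is active

# High input = 10, low input = 0.1 (in units of the threshold k):
# the AND gate is only strongly expressed when both inputs are high,
# while the OR gate turns on if either input is high.
```

In this toy picture, "digital" behavior emerges from cooperativity: the steeper the Hill curves, the more the continuous biochemical response resembles a Boolean gate.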

As synthetic biology starts to assemble increasingly complex gene networks that exhibit robust, predictable behaviors, several natural questions arise: What are the physical limitations (thermodynamic and kinetic) on the behavior and design of these gene circuits? How do these laws constrain the inventory of potential biological components and modules? This is where we physicists have a lot to offer.

Back in 1961,  Rolf Landauer, working at IBM at the dawn of modern computers, asked similar questions about physical computing devices. Landauer derived a theoretical lower limit on the amount of energy that must be consumed by computation.  In particular, he showed that there is a fundamental relationship between the idea of “information” – an extremely abstract concept – and very physical thermodynamic quantities such as energy and entropy.

What’s interesting is that, unlike in 1961 when Landauer was thinking about these things, we now have some direct experience with these ideas. When we do a particularly computationally intensive task, such as streaming a high-resolution movie, we know that computers tend to get hot. This heating reflects the production of entropy during the computing process. And intuitively we know that the more intense the computation – the more bits that must be erased and written to memory – the more energy the computation will consume.
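Landauer’s bound can be stated as a one-line formula: erasing a bit at temperature T costs at least kB T ln 2 of energy, dissipated as heat. A quick back-of-the-envelope calculation shows just how far real hardware is from this limit:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(bits, temperature=300.0):
    """Minimum energy in joules to erase `bits` bits at `temperature` kelvin,
    from Landauer's bound E >= bits * k_B * T * ln(2)."""
    return bits * K_B * temperature * math.log(2)

# One bit at room temperature (~300 K): about 2.9e-21 J.
e_bit = landauer_limit(1)

# Erasing a full gigabyte (8e9 bits) at the limit: only ~2.3e-11 J,
# many orders of magnitude below what actual computers dissipate.
e_gigabyte = landauer_limit(8e9)
```

The enormous gap between this floor and the heat coming off a laptop is one reason the bound went untested experimentally for so long, but the bound itself is device-independent, which is what makes it relevant to cells.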

The genius of Landauer is that he recognized that this energy consumption and entropy production are fundamental to the logic of computation itself: thermodynamic irreversibility is a direct consequence of logical irreversibility. In particular, it does not matter what kind of computing device one is considering, whether a physical computer like a laptop or a biological circuit.

Today, we are at the dawn of the synthetic biology era. Engineers are now starting to construct biochemical circuits that can do complex computations. So in this exciting time, it makes sense to revisit Landauer’s seminal insights in the context of biology. If we take seriously the idea that bacteria and other cells are doing computation, then we know that there must be a fundamental relationship between the ability of cells to do computations and thermodynamic quantities such as energy consumption and entropy production.

Over the last two years, my group, along with David Schwab at Princeton, has been exploring these questions in the context of one of the simplest biological problems: estimating the concentration of an external signaling molecule. Using methods from nonequilibrium statistical mechanics, we have shown that learning about the external concentration necessarily requires the cell to consume energy. This is the biological manifestation of Landauer’s principle! Furthermore, we have shown that, at least within a class of biochemical networks, learning more information about the environment requires more energy. We are in the process of writing up this latter work.

We believe that this is a manifestation of a general principle: we conjecture that no matter the implementation, whether physical or biological, “learning more” requires more energy. We think that this should be true even for non-biological systems. For this reason, we believe that thinking about information and computation in cells is likely to yield new insights into the somewhat mysterious relationship between information and thermodynamics.

More practically, we have also started working with Mo Khalil at the BU Center for Synthetic Biology (of which I am also a member) to think about how we can use thermodynamic principles to design better synthetic circuits. Can we design synthetic circuits that actively consume energy to increase their computational power? In general, it is our belief that by thinking about the constraints placed by thermodynamic considerations such as energy and entropy, we can start designing and building more sophisticated synthetic circuits.

Super interesting article on Tyrone Hayes from the New Yorker

Hayes has devoted the past fifteen years to studying atrazine, a widely used herbicide made by Syngenta. The company’s notes reveal that it struggled to make sense of him, and plotted ways to discredit him. Photograph by Dan Winters.

In 2001, seven years after joining the biology faculty of the University of California, Berkeley, Tyrone Hayes stopped talking about his research with people he didn’t trust. He instructed the students in his lab, where he was raising three thousand frogs, to hang up the phone if they heard a click, a signal that a third party might be on the line. Other scientists seemed to remember events differently, he noticed, so he started carrying an audio recorder to meetings. “The secret to a happy, successful life of paranoia,” he liked to say, “is to keep careful track of your persecutors.”

Three years earlier, Syngenta, one of the largest agribusinesses in the world, had asked Hayes to conduct experiments on the herbicide atrazine, which is applied to more than half the corn in the United States. Hayes was thirty-one, and he had already published twenty papers on the endocrinology of amphibians. David Wake, a professor in Hayes’s department, said that Hayes “may have had the greatest potential of anyone in the field.” But, when Hayes discovered that atrazine might impede the sexual development of frogs, his dealings with Syngenta became strained, and, in November, 2000, he ended his relationship with the company.

I encourage people to read the whole article here.