Modeling and the scientific method

February 15, 2011

Last Friday’s Theory Lunch was interesting for a reason the speaker didn’t entirely intend.  In the preamble for his talk, Daniel Beard wanted the audience to agree with him that the vision of the scientific method articulated by John R Platt — devise hypotheses, devise experiments to distinguish among them, perform said experiments, repeat (known as “strong inference”) — is the way all science should be done, and that systems biology has special importance as a way to articulate and test hypotheses.  He was unexpectedly ambushed by Tim Mitchison, who made a spirited argument that hypothesis-driven approaches often limit the size of the conceptual advance that can be made.  If I understood Tim correctly, he was arguing that forcing a hypothesis into clear alternatives that can immediately be tested almost always makes the question too small.  Big ideas, he said, usually start out fuzzy.  The Platt-style scientific method is useful for crisping up the edges of the big ideas, but not for having them in the first place.

Beard, who was courteous and open-minded throughout, allowed himself to be dragged away from his planned topic — not just (or even, not primarily) by Tim but also by other TL participants.  [This does happen, in TL.  It takes a strong-willed speaker to avoid being distracted by the barrage of semi-relevant questions.  I love the fact that TL participants aren’t afraid to ask questions, but sometimes I wish they would stay on point.]  It was a shame that Beard allowed himself to be diverted, because he had an interesting story to tell.  I thought it might be salutary to go through the story he might have told — if he’d been allowed to — and then discuss whether this story is indeed an example of a Platt-style cycle of hypothesis/experiment.

The story I think Beard wanted to tell is discussed in one of the papers he suggested as pre-reading for the discussion (Beard & Kushmerick, 2009, Strong inference for systems biology, PLoS Comp. Biol. 5 e1000459).  This paper gives a summary of Platt’s prescriptions for good science, and then moves on to consider the case of a long-standing belief in cardiac metabolism.

In the late ’80s, RS Balaban and co-workers asked how ATP production in the heart is matched to workload.  They measured the levels of ATP and related metabolites — phosphocreatine, ADP, AMP and inorganic phosphate — in isolated dog hearts forced to beat at different rates.  Surprisingly, they found that all the values they could measure remained essentially constant at different levels of myocardial oxygen consumption (a measure of workload).  These experimental results were taken to disprove the idea that ATP production was controlled by direct or indirect feedback from ATP consumption via the products of ATP hydrolysis, and for the next 20 years or so researchers devised increasingly elaborate hypotheses for how the ATP production rate could be matched to ATP consumption in the absence of this mechanism.  One suggestion, termed “parallel activation”, proposed that the activities of all the enzymes and transporters involved in ATP synthesis are simultaneously (and separately) controlled by changes in the rate of ATP consumption.  The observation that ATP levels remain stable in the absence of obvious feedback control was given the name “metabolic stability”.

In the early-to-mid 2000s, new data began to become available that were not consistent with the notion of metabolic stability.  New methods of measuring inorganic phosphate (Pi) levels showed that [Pi] increases significantly with increasing oxygen consumption, in direct contradiction to the results from Balaban and co-workers.  It’s possible that the reason for the discrepancy is that the older studies were unable to distinguish between 2,3-diphosphoglycerate and inorganic phosphate in ³¹P magnetic resonance spectroscopy.  The original data indicating that ADP levels remained constant as oxygen consumption increased had been indirectly inferred from other measurements, so they were suspect as well.  Could it be that the direct feedback model had been unfairly dismissed because of inaccurate or insufficient data?

Well, yes, it could.  Whatever your view of the scientific method, it shouldn’t surprise you that sometimes researchers reach the wrong conclusion because their data lead them astray.  Beard and colleagues went to great lengths to show, through computational modeling, that (under certain reasonable assumptions) feedback control from ADP and Pi is all that is necessary to link ATP consumption to ATP production.  What’s more, the predictions from these simulations fit rather well to the available (modern) data, and the model can unify results from purified mitochondria and from whole hearts, which had previously seemed to be in conflict.  And whether the computational modeling is perfect or not, Beard & Kushmerick assert that the observation that Pi actually does increase by at least 2-fold as oxygen consumption increases should be enough all by itself to contradict the theory of metabolic stability.
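To see why substrate feedback alone could do the job, here is a minimal toy sketch of the idea — emphatically not Beard’s actual model, and with all parameter values invented for illustration.  ATP hydrolysis releases ADP and Pi, and if the rate of ATP synthesis depends on the availability of those substrates, production automatically rises to meet demand, with modest increases in both pools:

```python
# Toy sketch of substrate-level feedback control of ATP synthesis.
# NOT Beard's actual computational model; parameters are invented
# purely for illustration.

def steady_state(j_consume, v_max=10.0, k_adp=0.05, k_pi=1.0,
                 adp0=0.05, pi0=1.0, dt=0.001, steps=200_000):
    """Euler-integrate until ADP and Pi settle at a steady state.

    ATP hydrolysis (rate j_consume) releases ADP and Pi; oxidative
    phosphorylation (rate j_produce) consumes them again.  Production
    depends on substrate availability, Michaelis-Menten style:
        j_produce = v_max * ADP/(k_adp+ADP) * Pi/(k_pi+Pi)
    """
    adp, pi = adp0, pi0
    j_produce = 0.0
    for _ in range(steps):
        j_produce = v_max * (adp / (k_adp + adp)) * (pi / (k_pi + pi))
        imbalance = j_consume - j_produce
        adp += imbalance * dt   # ADP accumulates when demand > supply
        pi += imbalance * dt    # and so does inorganic phosphate
    return adp, pi, j_produce

# Raise the workload (ATP demand) and watch production follow it,
# driven only by the rise in ADP and Pi.
for demand in (1.0, 2.0, 4.0):
    adp, pi, produced = steady_state(demand)
    print(f"demand {demand:.1f} -> produced {produced:.2f}, "
          f"ADP {adp:.3f}, Pi {pi:.2f}")
```

In this sketch, production settles to match consumption at each workload, and [Pi] at steady state rises with demand, which is qualitatively the behavior the newer measurements showed.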

Beard & Kushmerick tell this story as a cautionary tale: the conclusions from the original papers by Balaban and colleagues were stated too definitely, as final answers rather than hypotheses that were open to testing.  As a result, these conclusions were not challenged for about 20 years.  Even when new data became available, the finding that [Pi] did vary with oxygen consumption was presented as if it were consistent with the older data, and therefore not in conflict with the established belief in metabolic stability.  Perhaps this was due to insufficient attention to Platt’s scientific method, or to the normal human fear of conflict (especially conflict with well-accepted ideas), or both.  Disarmingly, Beard & Kushmerick point out that “one of us [Beard] is guilty of this sort of equivocation intended to avoid conflict.  In discussing our model predictions in comparison to the data on inorganic phosphate, Wu et al. state “our predictions agree with [the older observations] that total Pi …. remains constant within limits of detection at moderate work rates”…. [this] says nothing more than “there is no conflict in the regime where there are no data”…. where there are data, Pi does change significantly with [oxygen consumption].  So the emphasized point of agreement is trivial while the crucial point of disagreement is deemphasized.”

Beard & Kushmerick argue that their demonstration, through computer simulations, that the hypothesis of metabolic stability is inconsistent with the data is a triumph of the Platt-style scientific method.  Maybe so.  Once the new data were available, though, was the computational modeling essential?  Or is the model mostly important to help people who are committed to the old idea recognize the new idea?

One could draw a number of other conclusions from this story, for example:

– Don’t draw firm conclusions from inadequate data.  If you know your data are inadequate and you’re tempted to propose a conclusion anyway, make sure you point out how the gaps in your data could have led you astray;

– If there’s a belief in your field that leads to highly complex and unlikely mechanistic hypotheses, it’s probably worth another look to see what the belief is based on;

– Try out new methods when you can; and

– Scientists, whatever they tell you, are human; the acceptance of new ideas by the scientific community is not always a straightforward matter of logic.

Personally I think it’s pretty hard to define a single path that all scientific progress must follow.  I don’t doubt that most great discoveries can be shoe-horned into the Platt model given a little goodwill, especially since Beard & Kushmerick appear to be willing to count a decade or more of undirected data collection (e.g. the Human Genome Project) as “hypothesis generation”.  This strikes me as generous.  Perhaps something like Thomas Edison’s famous statement that “genius is one per cent inspiration, ninety-nine per cent perspiration” applies here: something like “science is x% exploration and y% hypothesis testing”.  In any case, politics aside, the work of Beard and colleagues (together with the new data collected by others) seems to have made a serious case for the supporters of metabolic stability to answer, and perhaps helped to resolve a deep and long-standing puzzle.

Beard DA & Kushmerick MJ (2009). Strong inference for systems biology. PLoS Computational Biology 5(8): e1000459. PMID: 19714210


You are currently reading Modeling and the scientific method at It Takes 30.

