Shadows on the wall

'In order to thrive,' writes Boston College Director of American Studies Carlo Rotella in a splendid essay on Magic Slim and Buddy Guy, the last of the great 1950s Chicago blues musicians, in The Boston Globe (13 September 2010), 'every genre or style needs both visionary innovators and orthodox practitioners. Without the former, it becomes hidebound. Without the latter, it drifts and loses its center.' But what happens when orthodoxy becomes dogma? What is the fate of innovators when they pose a threat, not to the accepted view, but to the accepted truth?

The best discussion of that situation I have ever read is over 2000 years old. It's Plato's Allegory of the Cave, and it's one of my favorite passages in classical literature.

The allegory is presented as an imaginary dialogue between Socrates and Plato's brother Glaucon, but it's really Plato speaking. Imagine, he says, a group of people who are born and live all their lives in a cave. They are forced to sit in chairs facing the back wall of the cave, restrained so that they cannot look anywhere else. Behind them, at the mouth of the cave, is a large fire, and between them and the fire is a walkway along which people carrying things, including replicas of animals, pass continuously. All the people of the cave can ever see are the shadows cast on the wall in front of them by those passing behind. All they can hear are the echoes in the cave produced by the movements they never see. Would they not, Plato asks, come to believe that those shadows and echoes are reality? Would they not assume that the entire world consists of the cave and the shadows on the wall? Wouldn't they praise as clever whoever could best guess which shadow would come next, as though that person understood the nature of the world? And wouldn't the whole of their society come to depend on the shadows on the wall?

It's a powerful image, but Plato takes it further. Now let us suppose, he says, that one of the people of the cave is freed from his chair and allowed to face the outside. Would he not first be blinded by the fire? And then, as his eyes adapted and he saw the people passing by on the walkway, would he not distrust the things he saw, believing that his eyes deceived him, because what they showed him contradicted what he knew reality had to be?

Then Plato goes still one step more. Let us now imagine that our freed cave dweller eventually acclimates to the world outside the cave, and recognizes that as reality. 'Wouldn't he then remember his first home, what passed for wisdom there, and his fellow prisoners, and consider himself happy and them pitiable? And wouldn't he disdain whatever honors, praises, and prizes were awarded there to the ones who guessed best which shadows followed which? Moreover, were he to return there, and try to explain to them that their reality was all an illusion, wouldn't it be said of him that he went up whole and came back with his eyes corrupted? And if the people of the cave were somehow able to get their hands on him, wouldn't they try to kill him?'

I've been thinking about this allegory a lot lately because there are many things about our current situation that cause me to wonder whether a lot of people haven't been looking at shadows on the wall and mistaking them for reality. It seems to be particularly true in American politics and economics. For example, despite mountains of evidence to the contrary, many Americans believe that Barack Obama is a Muslim (he isn't) and that the Obama Administration was responsible for the financial crisis (it wasn't; it hadn't even been elected yet) - and the number who believe these things is actually increasing. And before you put this down to closet bigotry (which some of it may be), let me remind you that over 75% of my fellow countrymen believe in angels and less than 50% believe in evolution, even though the first do not represent reality and the second does. Perhaps the greatest success of the right wing in the United States is having convinced most middle- and lower-class Americans that their own happiness and material well-being depend on unregulated capitalism, even though every example of that unfettered beast known to date has been characterized chiefly by its feasting on those same Americans.

Declining standards of education - and the creeping hegemony of the religious right over local schools - are one reason for this, but a bigger reason is that it is very easy nowadays to spend your entire life, figuratively speaking, looking at the same comforting set of shadows, without ever having to turn and face the world outside the cave. Ideologically driven cable 'news' channels, which claim to be 'fair and balanced' but are actually neither, make it possible for people to derive all their information from a source that never challenges their view of the world, and the same is true of the plethora of biased internet 'information' sites. Journalism has been replaced by opinion, and objectivity in media is threatening to go the way of the dodo, because people become angry when their worldview is challenged, and advertisers, who call most of the shots these days, don't like angry people.

And woe betide the individual who tries to convince his or her fellow citizens that what they have been looking at are nothing but shadows on the wall. They aren't always killed - Plato was exaggerating for effect, though it has happened, especially in countries where there is an orthodox religion and/or a totalitarian regime - but they are certainly ridiculed, marginalized, scorned, and often abused.

We've all seen that scenario, in many aspects of life. When innovation is viewed as heresy, those who have much invested in the status quo may become not just master journeymen, but witch-hunters. Nothing is more stifling to progress, not only in the arts, education and politics, but in science. Especially in science.

Scientific progress depends on constant challenges to our notion of what reality is. The moment we believe something is completely understood, we lose the drive to explore. At the turn of the twentieth century, many physicists believed that classical physics had provided a complete description of the world; all that remained was to measure things to ever increasing precision. The mavericks who challenged that assumption eventually discovered quantum mechanics, but until people became convinced that the new physics gave a more accurate description of reality, these pioneers were ignored or reviled. This is why Max Planck, in a famous remark aimed at his own detractors, said, 'Truth never triumphs, but its opponents eventually die.'

Of course, many new ideas really are wrong, but it's when we start to assume that any new idea must be wrong because it doesn't fit into what we are certain is right that we become obstacles to progress. Skepticism is a good thing, and extraordinary claims really do require extraordinary evidence, but the most exciting time in science is when paradigms fall, shibboleths become signs of stodginess, and everything is up for grabs.

Looking at biology today, I can see a number of paradigms that seem ripe for toppling, but that will probably evoke a lot of resistance when challenged. Here are a few:

The idea that a number of highly expressed proteins are 'natively unfolded' or 'intrinsically disordered' in the cell. This is most often said about alpha-synuclein, the membrane-associated, Parkinson's disease-related protein that makes up almost 1% of the protein content of neurons. When isolated, often by a boiling step, synuclein behaves as a random coil until incubated with lipids, at which point it acquires a fair amount of helical structure. But, really, how likely is it that it isn't at least partially folded in vivo? Cells have elaborate machinery to fold proteins that have trouble folding, and equally elaborate machinery to degrade those that don't fold. Do you really believe that 1% of the protein content of a neuron is made up of something with all the structural order of a plate of spaghetti? I have grave doubts. That some portions of many proteins are disordered I am sure of, but that an abundant cellular protein should be unfolded most of the time strains credulity. Part of the problem, I suspect, is that the term 'natively unfolded' is ambiguous. If it means unfolded in the cell, as I said, I'm dubious. If it means that it would be unfolded unless it came into contact with lipids, well, I can accept that, but isn't that the case with any integral membrane protein, for example? Yet no one would ever call those 'intrinsically disordered'. But heaven help anyone who challenges the idea that synuclein is unfolded most of the time. This paradigm has completely taken over the Parkinson's research community, and it will die hard.

The notion that prokaryotic cells are much less organized than eukaryotic cells. This one may actually be on the way out, I'm glad to say, but all the biology and biochemistry textbooks I'm aware of still imply it, if they don't say so directly. I suspect many scientists who work on eukaryotic systems still hold to it, if only subconsciously. The more we learn about prokaryotes, though, the more highly organized and complex their interiors seem to be. The view of bacteria as bags of enzymes and nucleic acids, while mammalian cells are models of organizational complexity and sophistication, is probably about as much a description of reality as the ancient notion that the earth rested on the back of a giant turtle. Bacteria may even turn out to be more sophisticated, because they have had to make do with a smaller cell volume and fewer genes.

In genome biology, the idea that all projects aimed at gathering massive amounts of data are equally worthwhile. Most scientists probably wouldn't subscribe to this, at least not publicly, but unfortunately, many science administrators do. Data-mining has become so linked with genomics that it consumes most of the funding, even when, as in the case of projects like structural genomics and genome-wide association studies, the results have proven to be worth far less than their originators' hype proclaimed they would be. I don't mind trying such things out to see if they might be useful, but we seem utterly unable to pull the plug on them, or even phase them out gradually, when it becomes clear that they are not. The human genome sequencing project was a great achievement, and has already repaid its cost many times over in knowledge and in the spawning of other great science. Neither of the projects I just mentioned has done so, and I think it's a pretty safe bet that they never will. We need a balance in the types of science we support and value, but balance is something that seems in short supply these days.

The belief that to model a system is to understand it. Systems biology, which started out as a nice modern version of physiology, has almost been hijacked by this paradigm. I've got nothing against models, but I do question the blind notion that they equate to understanding. Sometimes they do, but far more often they represent, not a more sophisticated view of a complex system, but an oversimplified one (albeit an oversimplified view that, one hopes, can make quite useful predictions). The tacit assumption that those who don't model are archaic reductionists is almost insulting to geneticists and physiologists, who have long understood the importance of considering pathways and processes, interconnected and parallel, in interpreting their experimental data. The other day I heard a computational biologist describe one of his simulations as 'an experiment'. I know what he meant, and I suppose it's okay to use that term for any procedure undertaken to make a discovery or test a hypothesis, but it still made me cringe. The best modelers have one foot firmly planted on measured data from real organisms or molecules (and, to be fair, I think the best experimentalists these days maybe should at least have one toe dabbling in the sea of modeling).

Each of these paradigms is characterized by two things: a sense that it represents the only right view of the world and a coterie of staunch defenders whose reputations and funding depend on acceptance of that view. I've seen people who challenge one of them dismissed, not with a careful critique of their evidence for challenging it, but with the statement that 'everybody knows this is the way it is, so you must be wrong.' It's been said that there are three stages in the development of an idea: (1) that's ridiculous, we all know it's not that way; (2) there may be something to what you say, but it isn't important; and (3) oh, we all knew that all along. What looks like a true perspective may be nothing more than a rut based on untested assumptions, but try telling that to those whose livelihood revolves around it.

We can all fall into this trap so easily. If we're not careful, we can mistake our assumptions about reality for reality itself. We can become comfortable, unquestioning, robotic - even dogmatic. We can forget that science only thrives when everything is examined, everything is questioned, and assumptions are not confused with facts. Mavericks are discomforting, often annoying, but without them we risk spending our scientific lives in a cave, never realizing that the things we believe in are merely shadows on the wall.

Author information

Correspondence to Gregory A Petsko.

Cite this article

Petsko, G.A. Shadows on the wall. Genome Biol 11, 136 (2010). https://doi.org/10.1186/gb-2010-11-9-136
