Excerpt ("Denialists Are Not Really Skeptics") from Chapter VIII (Science Gone Sideways) of The Scientific Attitude by Lee McIntyre

Denialists are perhaps the toughest type of charlatans to deal with because so many of them indulge in the fantasy that they are actually embracing the highest standards of scientific rigor, even while they repudiate scientific standards of evidence. On topics like anthropogenic climate change, whether HIV causes AIDS, or whether vaccines cause autism, most denialists really don’t have any other science to offer; they just don’t like the science we’ve got. They will believe what they want to believe and wait for the evidence to catch up to them. Like their brethren “Birthers” (who do not accept Barack Obama’s birth certificate) or “Truthers” (who think that George W. Bush was a co-conspirator on 9/11), they will look for any excuse to show that their ill-warranted beliefs actually fit the facts better than the more obvious (and likely) rational consensus. While they may not actually care about empirical evidence in a conventional sense (in that no evidence could convince them to give up their beliefs), they nonetheless seem eager to use any existing evidence – no matter how flimsy – to back up their preferred belief. But this is all based on a radical misunderstanding or misuse of the role of warrant in scientific belief. As we know, scientific belief does not require proof or certainty, but it had better be able to survive a challenge from refuting evidence and the critical scrutiny of one’s peers. But that is just the problem. Denialist hypotheses seem based on intuition, not fact. If a belief is not based on empirical evidence, how can we convince someone to modify it based on empirical evidence? It is almost as if denialists are making faith-based assertions.

Unsurprisingly, most denialists do not see themselves as denialists and bristle at the name; they prefer to call themselves “skeptics” and see themselves as upholding the highest standards of science, which they feel have been compromised by those who are ready too soon to reach a scientific conclusion before all of the evidence is in. Climate change is not “settled science,” they will tell you. Liberal climate scientists around the world are hyping the data and refusing to consider alternative hypotheses, because they want to create more work for themselves or get more grant money. Denialists customarily claim that the best available evidence is fraudulent or has been tainted by those who are trying to cover something up. This is what makes it so frustrating to deal with denialists. They do not see themselves as ideologues, but as doubters who will not be bamboozled by the poor scientific reasoning of others, when in fact they are the ones who are succumbing to highly improbable conspiracy theories about why the available evidence is insufficient and their own beliefs are warranted despite lack of empirical support. This is why they feel justified in their adamant refusal to change their beliefs. After all, isn’t that what good skeptics are supposed to do? Actually, no.

Skepticism plays an important role in science. When one hears the word “skepticism” one might immediately think of the philosopher’s claim that one cannot know anything: that knowledge requires certainty and that, where certainty is lacking, all belief should be withheld. Call this philosophical skepticism. When one is concerned with nonempirical beliefs – such as in Descartes’s Meditations, where he is concerned with both sensory and rational belief – we could have a nice discussion over whether fallibilism is an appropriate epistemological response to the wider quest for certainty. But, as far as science is concerned, we need not take it this far, for here we are concerned with the value of doubt in obtaining warrant for empirical beliefs.

Are scientists skeptics? I believe that most are, not in the sense that they believe knowledge to be impossible, but in that they must rely on doubt as a crucible to test their own beliefs before they have even been compared to the data. Call this scientific skepticism. The ability to critique one’s own work, so that it can be fixed in advance of showing it to anyone else, is an important tool of science. As we have seen, when a scientist offers a theory to the world one thing is certain: it will not be treated gently. Scientists are not usually out to gather only the data that support their theory, because no one else will do that. As Popper stated, the best way to learn whether a theory is any good is to subject it to as much critical scrutiny as possible to see if it fails.

There is a deeply felt sense of skepticism in scientific work. What is distinctive about scientists, however, is that unlike philosophers, they are not limited to reason; they are able to test their theories against empirical evidence. Scientists embrace skepticism both by withholding belief in a theory until it has been tested and also by trying to anticipate anything that might be wrong in their methodology. As we have seen, doubt alone is not enough when engaging in empirical inquiry; one must be open to new ideas as well. But doubt is a start. By doubting, one is ensuring that any new ideas are first run through our critical faculties.

What of scientists whose skepticism leads them to reject a widely supported theory – perhaps because of an alternative hypothesis that they think (or hope) might replace it – but with no empirical evidence to back up the idea that the current theory is false or that their own is true? In an important sense, they cease to be scientists. We cannot assess the truth or likelihood of a scientific theory based solely on whether it “seems” right or fits with our ideological preconceptions or intuitions. Wishing that something is true is not acceptable in science. Our theory must be put to the test.

And this is why I believe that denialists are not entitled to call themselves skeptics in any rightful sense of the word. Philosophical skepticism is when we doubt everything – whether it comes from faith, reason, sensory evidence, or intuition – because we cannot be certain that it is true. Scientific skepticism is when we withhold belief on empirical matters because the evidence does not yet allow us to meet the customarily high standards of justification in science. By contrast, denialism is when we refuse to believe something – even in the face of what most others would take to be compelling evidence – because we do not want it to be true. Denialists may use doubt, but only selectively. Denialists know quite well what they hope to be true, and may even shop for reasons to believe it. When one is in the throes of denial, it may feel a lot like skepticism. One may wonder how others can be so gullible in believing that something like climate change is “true” before all of the data are in. But it should be a warning sign when one feels so self-righteous about a particular belief that it means more than maintaining the consistent standards of evidence that are the hallmark of science.

As Daniel Kahneman so eloquently demonstrates in his book Thinking, Fast and Slow, the human mind is wired with all sorts of cognitive biases that can help us to rationalize our preferred beliefs. Are these unconscious biases perhaps the basis for denialism even in the face of overwhelming evidence? There is good empirical support to back this up. Furthermore, it cannot be overlooked that the phenomenon of “news silos” that we spoke of earlier may exacerbate the problem by giving denialists a feeling of community support for their fringe beliefs. Yet this opens the door to a kind of credulousness that is anathema to real skeptical thinking.

In fact, denialism seems to have much more in common with conspiracy theories than with skepticism. How many times have you heard a conspiracy theorist claim that we have not yet met a sufficiently high standard of evidence to believe a well-documented fact (such as that vaccines do not cause autism), then immediately exhibit complete gullibility that the most unlikely correlations are true (for instance, that the CDC paid the Institute of Medicine to suppress the data on thimerosal)? This fits completely with the denialist pattern: to have impossibly high standards of proof for the things that one does not want to believe and extremely low standards of acceptance for the things that fit one’s ideology. Why does this occur? Because unlike skeptics, denialists’ beliefs are not born of caring about evidence in the first place; they do not have the scientific attitude. The double standard toward evidence is tolerated because it serves the denialists’ purpose. What they care about most is protecting their beliefs. This is why one sees all of the cheating on scientific standards of evidence, even when empirical matters are under discussion.

…[I]t seems wrong to classify denialists as skeptics. They may use evidence selectively and pounce on the tiniest holes in someone else’s theory, but this is not because they are being rigorous; the criteria being used here are ideological, not evidential. To be selective in a biased way is not the same thing as being skeptical. In fact, considering most of the beliefs that denialists prefer to scientific ones, one must conclude that they are really quite gullible.

The full book can be purchased here.

Excerpt from Chapter II (The State of Nature: Tribal Truth) of The Constitution of Knowledge: A Defense of Truth by Jonathan Rauch

“Confirmation bias and motivated reasoning have been widely researched and explored in both political and nonpolitical contexts,” write the RAND Corporation’s Jennifer Kavanagh and Michael D. Rich in their 2018 report, Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life. The research would not please Socrates, who teaches us to be humble about our beliefs, to assume we are often wrong, and to seek out challenging information and opinions. Unfortunately, those are usually the last things we want to do. Instead, we seek out congenial beliefs, then look for evidence and arguments to defend them. The British psychologist Peter Wason, who coined the term “confirmation bias,” found in experiments in the 1960s that people who were asked to guess the rule which was used to generate a string of numbers (such as 2, 4, 6) by proposing additional numbers would come up with a rule easily, but then test it only by offering additional numbers that confirmed their guess (such as 8, 10, 12). They hardly ever tested their guess by offering numbers which would disconfirm their theory, such as 7, 8, 9 — which would have worked, because the rule was “increasing integers.” Neglecting to seek disconfirmation is like seeing three black cats, hypothesizing that all cats are black, and then not bothering to look around for any non-black cats.
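Rauch’s point about disconfirmation can be made concrete with a small simulation of Wason’s 2, 4, 6 task. The sketch below is in Python and is not part of Rauch’s text; the function names and the particular guessed rule (“each number is 2 more than the last”) are illustrative assumptions, though the hidden rule (“increasing integers”) and the example probes come from the passage. It shows why probes that merely confirm one’s guess carry no information: only a probe on which the guessed rule and the hidden rule disagree can expose the error.

```python
# A minimal sketch of Wason's 2-4-6 task. The experimenter's hidden rule is
# simply "any strictly increasing sequence"; a typical participant, after
# seeing 2, 4, 6, guesses the narrower rule "each number is 2 more than the last".

def true_rule(seq):
    # The experimenter's hidden rule: strictly increasing integers.
    return all(a < b for a, b in zip(seq, seq[1:]))

def guessed_rule(seq):
    # A typical participant's hypothesis (an illustrative assumption here).
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

def probe(seq):
    # The experimenter only reports whether a probe fits the hidden rule.
    verdict = "fits" if true_rule(seq) else "does not fit"
    agrees = "agrees with" if true_rule(seq) == guessed_rule(seq) else "contradicts"
    print(f"{seq}: {verdict} the hidden rule; this {agrees} the guess")

# Confirming probes: chosen because they satisfy the guess. Both rules accept
# them, so they can never reveal that the guess is too narrow.
probe((8, 10, 12))
probe((14, 16, 18))

# A disconfirming probe: it violates the guess yet fits the hidden rule,
# which is exactly the feedback that exposes the guess as wrong.
probe((7, 8, 9))
```

Running it shows that 8, 10, 12 and 14, 16, 18 satisfy both rules and so leave the guess untouched, while 7, 8, 9 violates the guess yet fits the hidden rule, which is precisely the kind of test Wason’s subjects rarely offered.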

Other studies since then have confirmed the same tendency. Kavanagh and Rich, of RAND, cite research finding that “people will choose search and decision methods that are most likely to lead to desired outcomes or conclusion, not to the best-informed ones.” Confirming partisan beliefs delivers a dose of satisfaction by triggering a little hit of dopamine in the brain, according to Haidt. “Like rats that cannot stop pressing the button, partisans may be simply unable to stop believing weird things,” he writes. “Extreme partisanship may be literally addictive.” By the same token, studies find that people are strongly averse to seeking out, listening to, or even noticing information which challenges their beliefs. One study in 2017 found that two-thirds of subjects would pay money to avoid the discomfort of exposing themselves to the other side’s political views. “Over a third of Obama voters and more than half of Romney voters,” reported The Economist, in its account of these experiments, “compared the experience of listening to the other side’s voters to having a tooth pulled.”

The full book can be purchased here.

Excerpts from the Introduction to The Assault on American Excellence by Anthony Kronman

“[O]ur colleges and universities have resisted the demand to make themselves over in the image of the democratic values of the culture as a whole. Even while striving to make the process of admission more open and fair, they have held to the idea that part of the work of our most distinguished institutions of higher learning is to preserve, transmit, and honor an aristocratic tradition of respect for human greatness.

This is important for two reasons. The first is the preservation of a cultured appreciation of excellence in human living, as distinct from vocational success. The latter produces inequalities of wealth, status, and power. But it is consistent with the democratic belief that no one’s humanity is greater than anyone else’s. This is true if we are talking about political and legal rights. It is false if we assume that the universal powers of enjoyment, expression, and judgment that all human beings possess are more developed in some souls than others — that in some they are particularly subtle and refined, especially when it comes to the most intellectually, aesthetically, and spiritually demanding exertions. This is an aristocratic assumption. In a democracy like ours, it is in constant danger of being derided or dismissed. But if it is, we lose something of value. Without the idea of greatness of soul, human life becomes smaller and flatter. It becomes both less noble and less tragic. Protecting this idea from democratic diminution is the first reason our colleges and universities need to nurture the aristocratic love of what is brilliant and fine.

The second is that this love itself contributes to the strength and stability of our democratic way of life.

Every adult in our country gets to vote. Each has the right to decide for him- or herself which candidates and policies are best. But the forces of conformity are great. The principle of universal equality, and its corollary the principle of individual self-rule, in fact make these forces stronger. The freedom to make up one’s own mind is a large responsibility. Many ease the burden by embracing the opinions of others with little or no independent reflection. The result is a kind of groupthink, partly the result of ignorance and partly of fear. This makes it easier for would-be tyrants to manipulate the democratic masses and eventually deprive them of their freedom. Tocqueville’s greatest concern for the future of America was that conformity of thought would ease the way to despotism.

There are many counterweights to this, of course. Tocqueville puts special emphasis on the role of a free press. An education in human greatness contributes to democratic life as well. To some this will seem paradoxical. How can the cultivation of a spirit of aristocratic connoisseurship make our democracy stronger? The answer is by developing the habit of judging people and events from a point of view that is less vulnerable to the moods of the moment; by increasing the self-reliance of those who, because they recognize the distinction between what is excellent and common, have less need to base their standards on what ‘everyone knows’ or ‘goes without saying’; and by strengthening the ability to subject one’s own opinions and feelings to higher and more durable measures of truth and justice. In all these ways, an aristocratic education promotes the independent-mindedness that is needed to combat the tyranny of majority opinion that, in Tocqueville’s view, is the greatest danger our democracy confronts.

***

[Oliver Wendell Holmes Jr.] used the expression [‘effervescence of democratic negation’] in a talk he gave to the Harvard Law School Association in 1886. The subject of his talk was legal education, but in the course of it he made some striking observations about the nature of education in general.

Education, Holmes says, ‘lies mainly in the shaping of men’s interests and aims. If you convince a man that another way of looking at things is more profound, another form of pleasure more subtile [sic] than that to which he has been accustomed — if you make him really see it — the very nature of man is such that he will desire the profounder thought or the subtiler [sic] joy.’ The ideal of education is threatened by a form of aggressive egalitarianism that Holmes descries. The passage is worth quoting in full:

‘I think we should all agree that the passion for equality has passed far beyond the political or even the social sphere. We are not only unwilling to admit that any class or society is better than that in which we move, but our customary attitude towards every one in authority of any kind is that he is only the lucky recipient of honor or salary above the average, which any average man might as well receive as he. When the effervescence of democratic negation extends its workings beyond the abolition of external distinctions of rank to spiritual things — when the passion for equality is not content with founding social intercourse upon universal human sympathy, and a community of interests in which all men share, but attacks the lines of Nature which establish orders and degrees among the souls of men — they are not only wrong, but ignobly wrong. Modesty and reverence are no less virtues of freemen than the democratic feeling which will submit neither to arrogance nor to servility.’

***

Where it is a question of the weight or worth of different ‘souls’ — by which Holmes means, roughly speaking, the intellectual and spiritual life of human beings, insofar as it is directed toward the great questions of existence and finds expression in the works of science, philosophy, and art that constitute the realm of culture or cultivated experience — the principle of equality is not merely misplaced but destructive.

***

Holmes chooses his words carefully. ‘Democratic negation’ is a program of leveling; of knocking down what is ‘subtile’ and ‘profound’; of denying the greatness of exceptional souls; of making their greatness look less conclusive, comprehensive, or significant than the ‘lines of Nature’ declare. It is a campaign of belittlement that draws its energy from the language and mentality of politics and law, where equality is the norm. But when, like the froth from a bottle of champagne, it giddily overspills its proper domain and invades the province of college and university life, it belittles what it ought to revere: the greatness of the highest and best things that attract the finest souls and afford them a ‘joy’ that coarser ones never know. This is the ‘effervescence’ against which Holmes warns.

***

We live in an age that prides itself on its aspirations to overcome every form of prejudice. But there is one that remains so strong we hardly notice it at all. It is the unspoken belief that, by comparison with the morally enlightened position we occupy today, those who lived before us dwelt in darkness and confusion, groping to find truths we now securely possess. Many believe that we are no more obliged to take their backward views seriously than we are to endorse the beliefs of the medieval astronomers who put the earth at the center of things. We are free, they say, to refashion the past according to our contemporary moral scruples. Indeed, they insist we have a duty to do so. Until we have scrubbed our inheritance clean and brought it into conformity with what we now know to be the truth, the world remains disfigured by emblems of unrighteousness that spoil its integrity from an ethical point of view. The passion for renaming that is sweeping America’s campuses today springs from this demand.

It is a dangerous demand. It destroys our capacity for sympathy with the very large number of human beings who are no longer among the living and therefore cannot speak for themselves. It obscures the truth that we are no more able to see things in a perfect light than our ancestors were, even if we judge their morality to have been, in certain respects, backward or incomplete. It encourages a species of pride that blinds us to the greatness of what was said and done by those whose values correspond only imperfectly to ours.

Our colleges and universities have a special duty to resist this. They are, in an obvious sense, the custodians of the past. Their libraries preserve the works, and their departments the traditions of learning, on which the continued existence of civilized life depends. But beyond this they have a particular responsibility to foster the tolerance for ambiguity and dissonance that is the best antidote to the spirit of righteous conviction that confines the soul within narrow bounds by conferring a moral authority on its existing prejudices and exposes it to the danger of blindly deferring to the opinions of the group or tribe to which one belongs. Many campus monuments need supplemental commentary. But few if any ought to be torn down or erased. Whether one should, in any particular case, is a question that calls for the most careful judgment. How it is made depends on the spirit in which it is approached. To approach it with the evangelical conviction that the past must be remade to look like the present violates an educational duty of the first importance.

What I am calling a ‘tolerance for ambiguity and dissonance’ Learned Hand described as the ‘spirit of liberty.’ It encourages doubt and self-reflection and breaks the tendency to go along with what ‘everyone is saying.’ It is therefore an essential condition of democratic life. It is also a condition of the refinement and growth of the individual human being. Those who succeed in acquiring it take in more of the world and of themselves. They achieve a spaciousness of outlook and feeling that others never reach. Their souls are larger, freer, more developed. They are aristocrats of the spirit, in a sense that both Freud and Whitman would have understood and approved. This is perhaps the most credible sense of aristocracy to which we can still aspire in the ‘democratic centuries’ that Tocqueville forecast.

The campaign for renaming aims to level the distinction between the present and the past. It too is inflamed by the spirit of ‘democratic negation’ that inspires the current understanding of diversity and the demand that campus speech be cleansed for the sake of creating a community of inclusion…[W]e must resist [such aims], for when the egalitarianism that is vital to our political well-being is extended to those islands of aristocratic sentiment that Tocqueville wisely viewed as precious in their own right, and as a needed balance to the excesses of democratic life, it does great damage not just to our colleges and universities but to our civilization as a whole.”

The full book can be purchased here.

Excerpt from Chapter IX (The Populist Plutocrat) from What Was Liberalism? by James Traub

The liberal faith in free speech presupposes a baseline rationality, so that more speech, and more untrammeled speech, enables the collective search for truth. [John Stuart] Mill proposed a sort of ethics of truth seeking, obliging each of us not only to protect the speech rights of those with whom we disagree but to test our own point of view against the strongest version of rival claims. The free-speech argument rests on a doctrine of intellectual humility: we need to remain open to the possibility that we are wrong and our rival is right. Mill, of course, was addressing “society.” He could not have anticipated the totalitarian era in which states would seek to gain control over the truth. It was left to the generation of Berlin, Popper, and Orwell to assert that only an unswerving commitment to reason could defend society from unscrupulous leaders prepared to exploit the beast in man.

[However,] It is not authoritarianism but populism that constitutes America’s native threat to the freedom of speech and thought[, as] Populism virtually requires a conspiratorial cast of mind, since the populist asserts that dark forces are foiling the will of the people…[So far,] American politics have been immune to totalitarianism, but hardly to totalitarian habits of thought.

The full book can be purchased here.

Excerpts from Chapter IV (Isaiah Berlin and the Anti-totalitarians) from What Was Liberalism? by James Traub

Berlin’s field was political and moral philosophy, but he did not think of himself as a philosopher standing in the line of succession from Locke and Rousseau and Mill. Rather, he studied the ideas of those thinkers like an epidemiologist tracing the path of a disease back to the source. Berlin was, in short, a historian of ideas, or, as he came to think of himself, a historian of philosophy. He did not deliver a comprehensive statement of his own views until 1952, when he delivered a series of lectures on Enlightenment and Romantic thinkers…Berlin’s point: the heroic idealism of the Romantic era had offered a blueprint, and a banner, for modern tyrants.

…In the second lecture, on Rousseau…[Berlin] explained that, like a mathematician who has cut the knot of an ancient conundrum, Rousseau solved, or believed he had solved, the apparently intrinsic tension between liberty and authority, or legitimate power. Rousseau viewed both values as absolute: individual liberty was inalienable, but so, too, were the moral rules that govern right behavior and thus must govern society. Must one, then, continually adjust the dial between these two poles, as previous thinkers had supposed? Not at all. Both were absolute. However, Rousseau insisted, a rational man would never will for himself what was at variance with the “general will.” If he did, it was not his true, natural self speaking but the self corrupted by the artifice, the pervasive falseness, of society. In a just society, liberty and authority would be fully harmonized. By this strange turn of logic, Berlin went on, Rousseau, the champion of liberty who helped inspire the French Revolution, ended up betraying the idea of liberty — just as the Revolution itself did. Berlin was hardly the first thinker to lodge this allegation; so, too, had Constant. But Berlin demonstrated the link between the apparently noble principle of the general will and the slave ideology of Lenin and Stalin in a way that no listener in 1952 could have missed:

You want to give people unlimited liberty, because otherwise they cease to be men; and yet at the same time you want them to live according to the rules. If they can be made to love the rules, then they will want the rules, not so much because the rules are rules as because they love them…So Rousseau says, “Man is born free, and yet he is everywhere in chains.” What sort of chains?…If the chains are simply rules the very obedience to which is the most free, the strongest, the most spontaneous expression of your own inner nature, then the chains no longer bind you — since self-control is not control. Self-control is freedom.

In these lectures Berlin asked how liberty, the greatest of all political values, comes to be perverted into its opposite….The grain of totalitarianism is planted when we say that we can achieve all our goals through a single formula, that we can enjoy, as Rousseau insisted, absolute liberty even amidst absolute authority. Modern man, beset by agoraphobia, as Berlin put it, is drawn to the promise of an effortless resolution. The total system is beautiful and sleek, precisely because it is total. But of course it is false, for all good things do not, in fact, go together. We must choose among competing goods — so much liberty, so much equality, so much justice, and so on. There is no “right” choice; each of us will have different preferences. A liberal society does not tell us what we should wish but honors the variety of wishes. Mill remained a hero to Berlin throughout his life, in part because Mill understood the moral importance of variety — of pluralism. He defended the right of every individual to act and think according to his or her own particular and even peculiar nature. Because, Berlin wrote, Mill’s view of human nature accepts individuals’ “perpetual incompleteness, self-transformation and novelty, his words are today alive and relevant to our own problems.”

…One kind of liberty had been enough for John Stuart Mill, but in the twentieth century a new and, Berlin thought, insidious understanding of freedom had gained a powerful purchase in people’s minds. Today, Berlin said, an “open war” is being fought between “two systems of ideas” — liberalism and communism — founded on two very different understandings of what it means to be free.

Berlin began by observing that all classical liberal thinkers have understood liberty as freedom from coercion. He called this doctrine “negative liberty” because it defines the sphere within which no external obligation may be imposed on an individual. For Berlin, as for Mill, political thinking began with the autonomy, and thus the sanctity, of the person. Indeed, in his essay “John Stuart Mill and the Ends of Life,” Berlin observed that, despite Mill’s dutiful effort in On Liberty to provide a utilitarian justification for maximum individual freedom, he plainly regarded liberty as a good in itself. People have different tastes, wishes, ideas, temperaments; the great fact of human life is not uniformity but diversity. To deprive people of their freedom to speak and behave as they wish is to deny them the fullness of their individuality. Berlin…wrote that liberal thinkers naturally disagree on the scope and nature of Mill’s inviolable sphere of private thought and action, but the minimum must be understood as “that which a man cannot give up without offending against the essence of his human nature.”

But here Berlin made a striking observation: Mill, he said, has confused two different ideas. The idea that coercion is bad and liberty good is logically distinct from Mill’s belief that individuals in conditions of liberty will be able to fully realize their own gifts and that a society marked by a commitment to freedom will reach the greatest heights of which humans are capable. The “fiery individualism” Mill held out as an affirmative good often emerged from highly disciplined societies, like the Scottish Puritans, that routinely and intentionally infringed on the sphere of private speech and behavior. Negative liberty, in turn, may flourish in the absence of the kind of self-government that Mill thought of as the end point of a truly liberal society; one need only think of the despot who leaves his subjects alone so long as they obey. The liberal wish for private space and the accompanying vision of human and social self-realization are not only not the same thing, but they may conflict with one another. One, Berlin wrote, is “freedom from” — that is, freedom from coercion — while the other is “freedom to” — that is, to lead a certain kind of life. It is this freedom that Berlin described as “positive liberty.”

…The latter, [Berlin] wrote, springs from the classically liberal “wish on the part of the individual to be his own master.” What’s more, negative liberty, by itself, offers no guidance on how to shape a better life, not only for oneself but for one’s fellow man. Many liberals, Berlin observed, feel beleaguered and at times tormented by the conclusion that their liberty has been purchased at the cost of the suffering of others — through, for example, an unjust economic order. They, therefore, insist on the kinds of large-scale social changes that would lead to “equality of liberty.” [George] Orwell, for example, thought just this way. Berlin accepted that such principles may well constitute “the foundations of liberal morality.” But, he added, crucially, they should not be described as aspects of liberty. If you sacrifice some of your freedom, or society’s freedom, in the name of some other good, you may well have produced more fairness or justice or common decency, but you have diminished the stock of liberty.

This sounds very much like a linguistic quibble. Why is it so important to prevent the misuse of “liberty”?…Berlin’s answer is that totalitarianism has learned to speak the language of liberty, and one must distinguish between true and specious uses of this beguiling word…If one says, as Rousseau did, that, whatever you may think you want, your true self, your rational self, must want to live in accordance with this or that set of principles, then you must put aside your puny wishes. [Johann Gottlieb] Fichte took this dangerous claim one step further by insisting that the individual discovers the highest form of self-realization by submitting to the will of the group, itself the embodiment of true freedom. “Once I take this view,” Berlin concluded, “I am in a position to ignore the actual wishes of men or societies, to bully, oppress, torture them in the name, and on the behalf, of their ‘real’ selves.”

…[F]ascism and communism also posed a dire threat to something so elemental to society that political thinkers of earlier generations had barely needed to take account of it: the very idea of truth. In Darkness at Noon, the 1940 novel that exposed the reality of the Soviet show trials, Arthur Koestler furnished a terrifying picture of a society based on slogans universally understood to be grotesque lies. The struggle to vindicate liberal principles had to be waged not only on political or economic lines but on cognitive ones as well. This conviction lay at the core of Karl Popper’s idea that the “open society” depended above all on the preservation of the spirit of scientific inquiry.

The great prophet of the totalitarian assault on truth was, of course, George Orwell, who first came to recognize the danger during the Spanish Civil War, when he saw that the newspapers of all sides described events that bore no relation to reality. In an unpublished essay apparently written in 1942, he recalled having said to Koestler, “History stopped in 1936” — the year of the Soviet show trials. Thereafter one would never know what was true and what had been fabricated. The experience, Orwell wrote, had left him with the feeling that “the very concept of objective truth is fading out of the world.” This fear came more and more to preoccupy Orwell. In 1947 he wrote that because totalitarian regimes insist that the leadership is infallible, history must be perpetually rewritten in order to eliminate evidence of past mistakes. Totalitarianism thus “demands a disbelief in the very existence of objective truth.” Orwell added darkly that “to be corrupted by totalitarianism one does not have to live in a totalitarian country”; one simply had to surrender to certain habits of thought.

…Like Berlin, Popper looked to reason to counter the pull of the tribal, the magical, the extrahuman. Rationalism, Popper wrote, “is an attitude of readiness to listen to critical arguments and to learn from experience. It is fundamentally an attitude of admitting that ‘I may be wrong and you may be right, and by an effort, we may get nearer to the truth.'” The irrational certitudes of Hitler and Stalin, the incantatory logic that hypnotized whole populations, turned rational skepticism into the sword and shield of liberalism. In his 1947 essay “Philosophy and Politics,” the philosopher Bertrand Russell asserted that “the essence of the liberal outlook lies not in what opinions are held, but in how they are held: instead of being held dogmatically, they are held tentatively, and with a consciousness that new evidence may at any moment lead to their abandonment.”

The full book can be purchased here.

When I Have Fears That I May Cease To Be by John Keats

When I have fears that I may cease to be
   Before my pen has gleaned my teeming brain,
Before high-pilèd books, in charactery,
   Hold like rich garners the full ripened grain;
When I behold, upon the night’s starred face,
   Huge cloudy symbols of a high romance,
And think that I may never live to trace
   Their shadows with the magic hand of chance;
And when I feel, fair creature of an hour,
   That I shall never look upon thee more,
Never have relish in the faery power
   Of unreflecting love—then on the shore
Of the wide world I stand alone, and think
Till love and fame to nothingness do sink.

Excerpts from Chapter XVII (Science and Values) from The Scientific Outlook by Bertrand Russell

Science in its beginnings was due to men who were in love with the world. They perceived the beauty of the stars and the sea, of the winds and the mountains. Because they loved them their thoughts dwelt upon them, and they wished to understand them more intimately than a mere outward contemplation made possible. “The world,” said Heraclitus, “is an ever-living fire, with measures kindling and measures going out.” Heraclitus and the other Ionian philosophers, from whom came the first impulse to scientific knowledge, felt the strange beauty of the world almost like a madness in the blood. They were men of Titanic passionate intellect, and from the intensity of their intellectual passion the whole movement of the modern world has sprung.

…When I come to die I shall not feel that I have lived in vain. I have seen the earth turn red at evening, the dew sparkling in the morning, and the snow shining under a frosty sun; I have smelt rain after drought, and have heard the stormy Atlantic beat upon the granite shores of Cornwall.

…Knowing and feeling are equally essential ingredients both in the life of the individual and in that of the community. Knowledge, if it is wide and intimate, brings with it a realization of distant times and places, an awareness that the individual is not omnipotent or all-important, and a perspective in which values are seen more clearly than by those to whom a distant view is impossible. Even more important than knowledge is the life of the emotions. A world without delight and without affection is a world destitute of value.

The full book can be purchased here.

Matthew vs. Luke: Whoever Wins, Coherence Loses by Tom Flynn

[L]et’s turn to the Christian record. What do the Gospel writers say about Jesus? When it comes to his birth, as a group, they say nothing. The Gospels of Mark and John never mention the Nativity. Only Matthew and Luke describe it.

But it’s misleading to say “Matthew and Luke.” One might better say “Matthew vs. Luke,” for the Gospels bearing their names contradict each other on almost every detail. The popular image of shepherds and wise men side by side before the cradle? Matthew says wise men. Luke says shepherds. Neither says both.

The star in the East? Only in Matthew.

“Hark, the herald angels sing” . . . but only in Luke. Matthew never heard of them.

But then, only Matthew heard of Herod’s slaughter of the innocents []. That’s right, the indiscriminate killing of every male baby in Judea—with one significant exception—did not merit Luke’s attention. On the other hand, no Roman historian chronicles this atrocity either, not even Flavius Josephus. Josephus reviled Herod and took care to lay at his feet every crime for which even a shred of evidence existed. Had Herod really slaughtered those innocents, it is almost unimaginable that Josephus would have failed to chronicle it.

Matthew says Joseph and Mary lived in Bethlehem, moving to Nazareth after their flight into Egypt []. But Luke says Joseph and Mary lived in Nazareth all along; Jesus was born in Bethlehem only because Joseph and Mary had traveled there to enroll in the census []. Roman records mention no such census; in fact, Roman history records no census ever in which each man was required to return to the city where his ancestral line originated. That’s not how the Romans did things.

Our litany of errors continues. Matthew and Luke both claim to catalogue the male ancestors of Jesus—through Joseph—back to King David. Matthew lists twenty-eight generations between David and Jesus. Luke lists forty-one. Matthew and Luke propose different names for Joseph’s father and grandfather. They propose different names for each ancestor separating Joseph from Zerubbabel, a late Old Testament figure. Incredibly, over the five-hundred-year span preceding the birth of Jesus, Matthew and Luke, whom many Christians consider divinely inspired, cannot agree on the name of a single one of Joseph’s ancestors!

This disparity is less troublesome if one views Christianity in historical rather than metaphysical terms. Scholars tell us the Gospels of Matthew and Luke developed independently in discrete Christian communities. Neither evangelist could know that the other had guessed differently about story details or had made different choices about which pagan traditions to borrow. But why should either evangelist include a genealogy through Joseph if Jesus were born of a virgin—in which case Joseph would not be his father?

…Next question: When was Jesus born? No one knows. Estimates that the Nativity occurred a few years B.C.E. arose from scholarly efforts to reconcile Luke’s census and Matthew’s slaughter of the innocents with known history. Modern scholarship tells us that neither event occurred, leaving us without evidence for the year of Jesus’ birth.

…Did Jesus exist? Possibly not—and if he did, surely he bore scant resemblance to the legendary figure of the Christian Gospels. Regarding his birth, we can be less equivocal. So steeped in pagan lore are the dueling accounts of Matthew and Luke, so reflective of the politics of the early Church rather than of any possible history, and so wholly contradictory in their details, that when it comes to the Nativity, Christianity’s foremost sources tell us quite literally nothing at all.

The full article can be found here.

Excerpt from Chapter XI (Technique in Society) from The Scientific Outlook by Bertrand Russell

From the technique of advertising it seems to follow that in the great majority of mankind any proposition will win acceptance if it is reiterated in such a way as to remain in the memory. Most of the things that we believe, we believe because we have heard them affirmed; we do not remember where or why they were affirmed, and we are therefore unable to be critical even when the affirmation was made by a man whose income would be increased by its acceptance and was not backed by any evidence whatever. Advertisements tend, therefore, as the technique becomes perfected, to be less and less argumentative, and more and more merely striking. So long as an impression is made, the desired result is achieved.

The full book can be purchased here.