Category Archives: Religion


For and Against The Case for Christ

The Case for Christ, by Lee Strobel

One of the best Christian apologetic books, “The Case for Christ” makes a weak, diluted case, which speaks volumes about the field. It is written by an apologist via interviews with apologists. Strobel appears to take on a skeptical role, but his acceptance–hook, line and sinker–of poor or speculative explanations for everything from the existence of Jesus to the question of divinity makes one wonder about the sincerity of his effort. If one grants that sincerity, then Strobel is displaying a large dose of believence (credulousness).

Strobel incessantly ignores or gives insufficient arguments against common explanations like myth-making or the possibility of Jesus as an everyday preacher. The argument is made that Jesus’ resurrection must have been true because without it the Christian faith would fall apart… Um, yeah. So he goes with a belief conclusion. This type of forced reverse logic is common with apologists.

One can give the benefit of the doubt for some questions like the existence of Jesus, but Strobel does not raise the burden of proof for supernatural events like Jesus walking around after death. Sorry no, for something like that you’re going to have to do better than ‘people said they saw him.’ Walking dead is no sleight of hand card trick.

For those who want to believe in Jesus and the whole package (biased by that condition) this book will probably do the trick. But those with unclouded reason will recognize the quick insufficiency of Strobel’s conclusions. This point is punctuated in the last paragraphs where he puts forth a horrible analogy, comparing seeing a physical person in real life with “…the witness of the Holy Spirit in our hearts.” No, these two things are not alike.

You are credulous, Mr. Strobel. Case closed.


The Case Against The Case For Christ, by Robert M Price

Robert M. Price’s “The Case Against The Case for Christ” is a critique of both Lee Strobel’s popular book as a whole and its specific arguments. On the overall book, Price points out the major methodological flaw: it is a collection of interviews with Christian apologists rather than with a diverse set of scholars on subjects ranging from the historicity to the divinity of Jesus. Price knows who many of these scholars are, meaning they are not inaccessible.

He cites them, other similar scholars, and his own analysis to yield alternate explanations of oddities in biblical content. He also compares the Bible to other writings of the era, noting similarities in style, content and purpose. Going further back, he notes the same in much earlier writings. These draw a historical trend line as evidence of Christianity being yet another religion derived from previous supernatural beliefs, many rewriting similar elements (virgin birth, flood story, death and resurrection) while adding their own cultural spin.

Of particular note is his observation that ancient writers often prioritized purposeful messages over historical accuracy. Their point was lesson, not history. Moreover, “authors” were not always those who wrote the material; works were often attributed to figures whose names would give the content more legitimacy. Similarly, a named author could be a compilation of writers unified under a fictional name (e.g., Moses).

Moving forward to gospel times, Price points out the same pattern on a small scale–the Gospel of Mark being the earliest writing then the others being rewritten and elaborated on in a pattern consistent with mythopoeia. For more detail on this see Richard Carrier’s “On the Historicity of Jesus.”

But Price is just one person, so let’s put him under the same critical microscope I previously put Strobel under (see my review of “The Case for Christ”). Price’s hypotheses of this or that circumstance are rational but generally lack sufficient evidence–particularly corroborating evidence–to make them reliable, most-likely-true historical conclusions. Is this a criticism? No, but it is a limitation. We are observing a researcher trying to put together pieces of a story puzzle to find a factual story beneath, if one exists. Contrast this to an apologist whose methodology is to twine together, often rationalize, puzzle pieces into a preferential storyline. Price shows himself to be more credible by following through on what he realized he must do to investigate naggingly inadequate apologetic arguments, ironically in an attempt to resolve those inadequacies:

I knew it was a matter of basic honesty that I had to place myself, for the moment, in the shoes of the nonbeliever if I were to evaluate each argument for the historical Jesus or for Bible accuracy. I knew it would be phony for me to try to convince others by using arguments that I did not actually think were cogent. I didn’t want to use any tactics, say anything that might work, as if I were a used car dealer or a mere propagandist.

His journey led him to disbelief. Others with the same intention from the same starting point reach a different conclusion. The difference with “The Case Against The Case for Christ” is that there are no dissonance-inducing moments here, no extrapolations of under-justified preferences, no hypotheses miraculously elevated to Law. I cannot say the same for any apologetic book I have read. Not one.

So in the end give both men their shot. Read Strobel’s book and read Price’s book, one after the other. See where you land.


Religious Science

I have a better internal and intuitive understanding of folklore and myth than science and technology, so in that way fantasy is easier.
-Sarah Zettel, novelist

Josh Peck is a self-described biblical researcher, author and online show host. He has written several books, including “Quantum Creation,” a book about prophecy and “quantum physics from a Christian perspective.” You may be thinking that there is no such thing as “quantum physics from a Christian perspective,” but Mr. Peck tells us otherwise. He explains there is nothing wrong with the experimental results of science, just the interpretation by those who do not cross-reference the holy book. He claims no conflict between science and religion because:

“When we have the proper interpretation of scripture and the proper interpretation of scientific observation, they should both agree in full. They both should act as two pillars holding up the true understanding of reality. If they do not agree then one or both of the pillars are broken and must be fixed, otherwise the whole structure will come crashing down.”

In other words, scientific observation and scripture have the same level of legitimacy and since scripture is correct because, you know…word of deity…any scientific finding that conflicts with scripture must be reinterpreted until it matches. There now, no conflict.

It is hard to swallow that this bastardization of sound methodology is what many believers call science. It is not, though this is what is being taught to the faithful uncomfortable with scientific findings that imply their deity is not the creator of the universe. Their thinking is rationalization (conscious and unconscious) biased by a presupposition of biblical inerrancy. The cognitive blindness is stunning, truthiness applied like a taste preference.

Interestingly, this type of science-off-the-rails often does include some true science. It may even include a great deal, as this presentation by Jason Lisle demonstrates. However, where Dr. Lisle goes off track can be hard to decipher if one does not already have strong science knowledge, an inherent problem. If one has been raised with a religiously dominated education where evolution, geology and psychology have been replaced with creation myth, a flood story and objective moral rules, it is nearly impossible to notice the slips. To a student listening to this mangled science, such presentations can appear to reinforce scriptural texts. Passages are “matched” through numerology-like pattern recognition, subjective interpretation and prophecy-alignment presented as evidence.

Real science does not operate in this manner. It works on a much tougher playing field where objective evidence rules. Results only sufficient for subjective interpretation are used as guideposts for further investigation (and replication by other studies); they are not touted as final conclusions to be taught to the public. (Note: despite this standard practice, scientists are human and can overstep at times, but the fields are aware of this, constantly open to internal criticism and correction. For good coverage on this, check out Robert Burton’s “A Skeptic’s Guide to the Mind: What Neuroscience Can and Cannot Tell Us About Ourselves.”)

The vast majority of scientists follow evidence where it leads them. Often this disagrees with chunks of what religiously-filtered “science” teaches. A young believer can find themselves stuck trying to understand these different positions, akin to being asked to take sides in a parental dispute. Worse, parents and preachers may fight back with accusations of academic conspiracy and ivory tower arrogance. If a student buys this defense they not only learn bad science, they learn to mistrust true experts and even the scientific method.

If a student is able to remain objective she will find the position conflicts unresolvable; where there is disagreement, one side is right and one side is wrong. Contrasting this, a true believer (defined as one unwilling or unable to de-sanctify false beliefs) will learn to swim in a fog of cognitive dissonance, motivated reasoning tools at the ready, perhaps for the rest of her life.

Another factor in the hesitation to accept scientists’ conclusions can lie in the personality trait of mistrust. You know the accusations—scientists are money mongers who deliver results their patrons desire, universities are places where snobby faculty pretend they are smarter than the rest of us, arrogant intellectuals create fancy jargon so they can talk over our heads. Tables are turned, with true scientists being branded as pseudoscientists.

The believer’s solution to this supposed deception? A call to individual critical thinking (paradoxically), self-evaluation of experimental results. In other words, a belief that a single person, believent and less educated, is more likely to reach a better conclusion than a highly educated specialist.

This is purely wrong. First, anyone with less information is by definition less capable of making a good judgement than one with more information, though admittedly we must be wary of researcher confirmation bias. Second, this type of self-confident believer puts more weight on intuition (Type One thinking) than is valid. (See Daniel Kahneman’s “Thinking, Fast and Slow.”) Third, any individual conclusion, be it by a highly educated researcher or a layperson, is prone to error—exactly why scientific conclusion relies on expert consensus, not expert opinion. Further still, scientific consensus is subject to longitudinal review—study over time—and to future refinement or replacement.

Meanwhile unscientific believers apply the tools of intuition, apologetics, argument and reinterpretation to scientific findings, mushing results into scripturally-shaped conclusions of their satisfaction. Have you ever watched a numerologist finding patterns everywhere they look? It really is amazing, the mental gymnastics humans are capable of.

Argument is not evidence, nor philosophy experimentation. Bias avoidance does not include presupposition; it starts with a null position. And “not considering deity” is not a presupposition; it is an appropriate “we don’t know” starting position.

Sadly, expect religious scientism to continue because it appears to relieve believers of some of their dissonance. Many of their conclusions will be wrong of course but lay believers may not recognize this.

Spins your head, doesn’t it? Keep this in mind the next time you consider popping a chad for a candidate who denies climate change, supports funding educational vouchers, or advocates shutting down the Department of Education. Meanwhile teach your children well. Give them science toys as gifts. Challenge their minds. Foster curiosity, wonder and intellectual interest. There is a big real world to learn about and it is much more accessible if they do not have to first dig themselves out of a false information hole.



Inter-Religion Clashes

All the religious wars that have caused blood to be shed for centuries arise from passionate feelings and facile counter-positions, such as Us and Them, good and bad, white and black.

-Umberto Eco

A video has been posted showing a British Christian group, Britain First, walking into a Luton neighborhood (a suburb of London) with a concentrated population of Muslim citizens, intending to antagonize them with large crosses. As expected both sides quickly engaged in angry shouting and middle-finger brandishing. The exchange is unsettling, though no violence ensued.

As demographics grow and shift we will watch this type of event happen with greater frequency. Though groups of humans will always find ways to conflict with one another, there is a desire among many to have us grow out of this part of our nature. Do we really have to continue a conflict that has been raging regionally since 700 C.E.?

I remember watching a movie about Camelot years ago, Arthur the good king coming to power replacing the bad king of yore. It was exciting, romantic, righteous and triumphant. Years later it occurred to me that this narrowly-timed story, a mythical consolidation of English kings, would be but a happy(ish) episode in a long line of sad stories. Without a system to cultivate good kings, it was likely that despicable kings would follow as common as not. The average person, largely powerless, would be dragged through the reign of one ruler after another, the stability and happiness in their lives subject to rounds of Russian Roulette leadership quality. (See too Russian history, when murder and assassination were the typical methods of succession for centuries.)

This is how it feels watching religious adherents clash—endless rounds of righteous rivalry teeing up to claim territory. (This is not to imply that one side is better than the other. Certainly there are differences; your viewpoint can apply those labels.) Though both religions have scriptural elements that teach of peaceful coexistence, they also teach the opposite, that their worldview is the holy correct one and the edge of the sword should be taken up in defense and offense. It is far too easy to cherry-pick justification into righteous conflict. From Roman era conflicts to twentieth century Catholic/Protestant battles to Islamic sectarian rumbles of today, mankind has seen uncompromising threats of misery from literalists, aimed at those who do not yield from without or stay true from within.

But we no longer live in 3000 B.C.E., 700 C.E., or 1800 C.E.; we live in a time when the findings of physics, astronomy, biology, cognitive science and geology have demonstrated that the existence of deities is…no longer the most likely explanation for reality. However we are stuck with brains that evolution has provided—prone to belief. It feels archaic to be among those who hiss, fume and attack in the name of religion but this is modern. It is present day. And a few miles from such street conflicts colleges teach evolution, secular courts enforce civil laws, and stores sell meat products that have not been sorted by cloven hoof. Collectively, we are of mixed mind.

Humanity seems to be going through its adolescent phase—persisting with its early intuitions while, holding a thickening encyclopedia of new knowledge, not yet able to let the new information revise the old. This growth arc, the lifespan of intellectual humanity, seems to be thousands of years long. In maturity years we have just passed our teens.

Can we use cultural tools to help this maturation, to help strong believers get along despite their impulses not to? Hopefully, but the time scale remains unknown. It will take an unprecedented shift within religious communities led by the leaders within. Sacred needs to be corralled. This is not to suggest that strong believers give up their belief; that aspiration is unreachable. But why can’t they agree to let God make the judgments in His appointed time (end of natural life) instead of mortal humans making and imposing judgments here and now? Where is the trust in the deity? Where is the mortal humility? Must we continue to cite writings of human leaders, old and new, to force faith and impose harm onto others? Doesn’t our mortal imperfection disqualify us from imposing irreversible punishments on other mortal beings? Are we that arrogant, that unable to control our tribal impulses?

Scriptural interpretations are just that—interpretations. They may be flawed. This does not mean the scriptures are wrong, but our readings of them can be fallible. Every holy person who has disagreed over the smallest scriptural word, phrase or passage—over thousands of years—has proven this. Sounds like a decent basis for minimally, not harming one another.

In light of these issues, not the least of which are ongoing terrorist bombings, can we also agree to do more as a global population? How about an annual Copenhagen-style worldwide conference on the issue of religious conflict, on the scale of what we did to battle the HIV crisis? Time sensitive. High on the priority list. Ongoing. Until we are done. If we can’t evolve our biology out of religious conflict can we at least evolve our culture?

The Worldview Fallacy

Nothing is more dangerous than a dogmatic worldview—nothing more constraining, more blinding to innovation, more destructive of openness to novelty.
-Stephen Jay Gould

“A particular philosophy of life or conception of the world.”

Worldview is often cited by religious believers as an authorization for logic that yields their desired conclusions. An example of this came up recently at a debate (not mine, but one I attended) where a believer didn’t like a scientific conclusion. He dismissed the evidential outcome, replacing it with his own conclusion and stating that it was valid given a Christian worldview. In other words, he didn’t like the real answer so he changed the rules of analysis.

Worldview has nothing to do with fact. It is an overlay that makes one prone to bias, particularly a worldview based on the sacred texts of a deity. When a person places an immutable prerequisite in front of their thinking, logic becomes unreliable. If a conclusion conflicts with the precondition it is rejected then modifications are applied—bias, rationalization, reinterpretation of evidence, dissonance-reducing confabulation—until an acceptable result is reached.

Worldview has been described as seeing through color-filtered glasses, usually rose-colored to represent a desirable bias. While there is something to this concept—we are all influenced by our knowledge and environment—it is incorrect to assume that all worldviews are equally valid. My Christian friend could have been confronted by a Scientologist, claiming that her worldview accepts that emotional baggage is traceable to engram-inducing in-utero trauma. He could have been confronted by an astrologer whose worldview holds that good and bad days are due to planetary alignment. But on this day he was confronted by a scientifically educated person with high confidence (from the knowledge of a consensus of cosmologists and physicists) that the universe is 13.72 billion years old, rather than the 6000 years his book implied.

Putting forth a false worldview as an analytic shield is like pretending to throw a magic spell that automatically elevates your conclusions to incontrovertible “truth.” At best it is a demonstration of believence in full bloom, at worst a dodge or manipulation.

I have also seen worldview used to encumber: seeing his own argument attacked, a believer tried to counterattack by accusing his opponent of having her own falsifying worldview. This tactic is reminiscent of the accusation that atheists practice a religion—faith in science. In other words, the childish ploy of “Oh yeah? Well you too.” Not exactly high debate. The problem again is the assumption of balance, this time at an equally low level, not admitting (or worse, not recognizing) the deficiency of this plane.

Some will retort that validity is in the eye of the beholder. Who is either side to judge whose worldview is valid and whose is not? Why not default them to equal footings? Rubbish. The best source of knowledge is a consensus of a majority of educated specialists within a particular field, unless of course the subject has already been dismissed (I’m thinking astrology here). Experts don’t always turn out to be correct but they have the best chance of being so. When the consensus shifts, so does the best current knowledge. Is this an Argument From Authority fallacy? No, because a consensus is not a single authority, and we simply do not have a more reliable method. Conception and non-evidential belief don’t even come close. Would you rather be subject to an Argument From Ignorance fallacy?

Until recently we have thought of these tactics as misrepresentation, but they increasingly appear to be honest belief—sad evidence that apologetic teachings are having some success molding opinions. Believers are not unthinking followers, but apologists seed and feed their opinions. Given that these followers have a propensity to accept religious views, they then become resistant to physical explanations of nature. Worse, creation rationalizationists are laying claim to science itself, arguing that science is now proving deity and accusing real scientists of being the deceptive, illogical ones. It’s a classic attack-switch tactic—take what your opposition accuses you of and reverse its direction.

Time will tell if the worldview tactic holds up. So far it’s providing many with a comforting belief bubble, a safe haven to reduce the dissonance stress of being in disagreement with the world’s most prominent scientists. But bubbles are thin…and they can pop.

The Role of Philosophy

What is your aim in philosophy? To show the fly the way out of the fly-bottle.
-Ludwig Wittgenstein

“Blood from the right chamber of the heart goes to -vena arteriosa – lungs – arteria venosa – left chamber…”
-Ibn Nafis (1210-1288 AD)


“We likewise discover that there cannot exist any atoms or parts of matter that are of their own nature indivisible.”
-Rene Descartes (1596-1650)

Philosophy is proto-science, the development of hypotheses and the testing of these by thought experiment using the tool of logic. Its limitation, as the above quotes demonstrate, is that its conclusions cannot be raised to a level of strong confidence since, evidence not being part of the process, there is no way to tell which conclusions are true and which are false. Sometimes it results in a hit, sometimes a miss. This is not to say that philosophy is useless—it is in fact essential—but that its conclusions are preliminary. They are the end of a road that does not continue unless and until evidence becomes available to progress the investigation with physical experimentation. The samples above are both reasonable and logical; however, only one is correct. In time, evidence arose and accumulated to elevate one to the level of accepted theory while the other has been relegated to the dustbin of ideas that didn’t pan out.

Ask yourself then: before evidence was found to substantiate or destroy these hypotheses, what degree of confidence should have been stamped upon their plateaued conclusions? Sans evidence, can any confidence even be assigned? In other words, can the output of philosophy be considered truth?

The answer is no, not in and of itself. Although philosophy may derive what later becomes learned as truth, until that result is proven by evidential experimentation of positive result, a philosophical conclusion is held in a waiting position, queued up hopefully for the scientific method to take the baton and move forward. But if no runner comes along—no evidence arrives—then conclusions remain cemented at the level they have attained, able to advance no further. They are refined speculations, educated guesses, reasoned options, even hopes.

Of what use then is philosophy? Tremendous use, particularly when evidence has not yet been discovered or when evidence may never be discovered. For the latter, consider the question: what is “importance”? As an abstract concept, there is no way to discuss this question without thought argument. The outcome therefore remains hypothetical and conceptual. A vase may be important or unimportant for a variety of reasons, but its physical properties do not change according to its deemed importance. This is analogous to a truth vs. the perception or knowledge of a truth; a truth exists independent of any knowledge or perception of it. Yes, a tree falling alone in a forest does make a sound.

In millennia past, philosophy had a great role in leading us toward truth, though for every truth eventually matured to fact, many alternate dead ends were abandoned. We kindly tend to remember the successes and forget the failures. We revere Isaac Newton for his calculus and theory of gravitation while diminishing to trivia his efforts in alchemy and apocalyptic prophecy.

The knowledge we have gained from the last four hundred years of science has reduced the realm of philosophy—natural philosophy in particular—but there is still much we do not know. (Indeed, we don’t even know how much we don’t know, so perhaps philosophy should be considered to have just moved on to new territories.) Thus philosophy will always have an important role. While science continually moves into new areas, it is often philosophy that first helps us imagine beyond the current one*. And if the history and progress of the philosophy-science team has taught us anything, it is that there will always be new horizons.

However, there is a problem. Among the believent (those with a propensity to conclude belief, particularly when evidence is scant or nonexistent), philosophy is often used beyond its boundary. When faith is criticized or considered insufficient, deities are often rationalized by argument. Religious apologists, lacking physical evidence of the supernatural (by definition), make philosophical arguments to justify not only scriptural teachings but their preferred deity’s existence. This would be fine if only done to the degree of hypothesis without confidence, but they often treat their conclusions as raised to the level of likelihood, even seeing them as “truth.” This is typically an honest error, motivated reasoning being in full bloom, but it is nonetheless incorrect. Problematically, when people group and reinforce such beliefs, the result is a deficit from reality that can result in ideological, educational, political, even physical conflict.

A bigger problem: when one can generate a conclusion that is intuitive or desirable, avoiding the discomfort of the unsatisfactory, the unfamiliar or the unanswered, the search for knowledge stops and sometimes inconvenient evidence is suppressed. This is common because evolution has sculpted us to be intuitive. Intuition is a quick-decision neurological shortcut that enhanced our survival in an environment where there was often no time for slow, deliberate consideration. Infinities, time dilation, “nothing” before the Big Bang do not make intuitive sense yet they have non-supernatural explanations. But settling on a deity explanation, fanciful and teleological, is intuitive and comfortable.

So use philosophy wisely. Value its contribution in the past, present and future. But be aware of its limits and our bias to use it beyond its ken. If truth is what you’re after, philosophy is just the first step.


*Science does not progress only by the philosophical generation of hypotheses. Given the knowledge base we’re standing on now and the technology available, much (most?) of new science is investigating questions that previous work exposed. Just ask anyone who is involved with a planetary exploration project; the backlog of data to be analyzed is monstrous, not even including the reconsideration/reevaluation that future findings will trigger. Almost everything we learn generates exciting new questions. Mathematics too is a field that proposes and generates new horizons, particularly in cosmology.


Atheism as Religion, Simplified

If you can’t convince them, confuse them.
-Harry S Truman

Atheism is commonly attacked as being a religion, a worldview with faith-based presuppositions. The label is applied by theists in an attempt to deflect arguments against their own religion. This is a counterattack from a weakened position (see Upping the Anti Ante), a Tu Quoque fallacy (Latin for “you also”), in which one accuses the other of what they are being accused of—in this case unsubstantiated belief.

The tactic has gotten more slippery of late so let’s give it more nuanced consideration. First the definition from the Oxford English Dictionary:


1. A particular system of faith and worship:
     the world’s great religions
2. A pursuit or interest to which someone ascribes supreme importance:
     consumerism is the new religion

You may notice that this definition omits a reference to deity, an element that many dictionaries include. This is a good demonstration that the meanings of words differ depending on source and usage. Further, definitions shift over time when usage shifts.

Oxford’s first meaning cites faith and worship. Atheists decline these labels, holding that their conclusion is based on evidence and reason. Theists counter that it takes more faith to not believe in God than to believe. To parse this debate would require a breakdown of the meanings of faith and worship. See where this is going?

Next meaning: religion can be just an interest of importance. It’s peculiar that Oxford uses the adjective supreme, a word commonly associated with deity. Is great interest in vegetarianism a religion? Perhaps not if it’s not the most important interest in your life…unless it is. What if you have two great life-consuming passions, theatre and poetry? Is the practice of these arts religion? Can you have two supreme interests, more than one true religion? Is the celebration of one sacrilegious to the other?

By default, by deceit, or perhaps by the misdirecting intent of the accusers, we are led to being unable to answer the original question. Is atheism a religion? It depends who you ask. People can give different answers and apparently all can be right. If not, they can certainly claim to have the right answer, backed up with some form of logic. Words only have meaning in the way they are used. Everyone wins…but then everyone loses. Why do we even bother with language?

This reminds me of the presupposition apologist tactic, trying to get their opponent to admit to possibly being wrong about anything they know (note, possibly not probably, thus the fallacy trap). They never mention to onlookers that if any person can’t be one hundred percent sure of their thoughts (not aware of living in the Matrix, etc), then nobody can—including themselves. This disarming of both sides is pointless and deceptive, as is slipperiness and shifting of definitions within an argument.

So I’ll examine the posing of the question itself. Why would a theist prefer that an atheist be engaging in a religion, be it science or the not-doing of a thing? Could it be a defensive attempt to lower their opponent to the same debased position, that of placing a leap of faith on par with a conclusion derived from evidence?

What happens if government legally deems atheism to be a religion? First off, tax exemption for all atheist groups. Is that your goal, theists? Perhaps so, as a tactic to counter atheists’ advocacy for religions to lose their historical exemption. Better to let weeds grow than pull out the monetary garden, eh? We are already living with the awarding of exemption to organized cults, with all the abuse that has yielded, in particular on property accumulation. And then what—tax exemption for bowling leagues? Or is that hobby deemed not “supreme” enough in interest intensity? Who decides where that line is drawn?

So let’s dump this ridiculous gameplay. How can we know if atheism is a religion? If you can’t handle the reality, apply the walks-like-a-duck common sense test.

Christianity is a religion. Islam is a religion. Stamp collecting is a hobby. Being a Star Wars super-fan is an enjoyable obsession. Buddhism is a philosophy that many practice as a religion. Mormonism is a scam that evolved into a religion. And a mallard is a duck.

Atheism is a disbelief. Look it up.

Moral Evolution

Compassion is the basis of morality.
-Arthur Schopenhauer

A common claim of the religious is that objective morality cannot exist without a deity. How can we know right from wrong without a lawgiving creator who has imprinted the laws of proper behavior upon our souls? This is an insulting assertion: though we have complex brains capable of science, art and abstract thought, we are supposedly incapable of recognizing the wrongness of homicide, theft or adultery.

By contrast, in our world that appears to be functioning by natural forces alone, moral behavior exists nonetheless. Assigning morality’s source to a deity is a crutch for not taking responsibility for one’s own actions, or less harshly, a set of morality training wheels that we are afraid to remove.

So what is morality, objective and subjective, and what is moral relativism? Morality is the principle of distinguishing between good and bad, right and wrong, in behavior.

An objectively moral action is so regardless of circumstance or justification. Genocide, for example, is objectively wrong. In practice however this is a harsh standard to reach, perhaps philosophically impossible. One can justify killing in self-defense and war, or stealing as a desperate measure to keep one’s family alive. In this way objective morality may be considered an aspiration rather than an absolute.

Subjective morality is that based on judgment and opinion, from situational decisions made by individuals to definitions codified by society. Though generalizations are easy to agree on, gray borders surround these delineations and opinions compete to identify right behavior. Consider expressions such as “one man’s terrorist is another man’s freedom fighter.”

Moral relativism is comparatively defined at the societal level. A behavior may be acceptable within one society while being completely abhorrent to another. Consider female circumcision, a cultural practice performed as a method of purification. Time period is also a factor—current day vs past ages. Witch-burning anyone? Relativism is a bothersome concept to many, a source of justification for dubious behaviors.

Morality exists in nature, most visibly in higher animals. Though proof of an evolutionary theory is a work in progress there are examples of both precursors and refined behaviors. Some colony insects work solely for the benefit of others within. Grouped animals share food and warn each other of predators. Protection of the young is innate, a behavior perpetuating the gene line. Parents in most species would starve together rather than eat their young. The behavior is so strong that it can overcome other instincts, as this amazing video demonstrates.

Primate studies continue to show examples of awareness and practice of fairness.

People exhibit moral acts voluntarily using common sense. The Golden Rule is ubiquitous. As an emergent, complex behavior of an animal with a highly developed brain it is a practice of choice and instinct, with benefits. Sam Harris makes a strong argument for inherent morality in his book, “The Moral Landscape.” To distill his thesis, one can imagine two similar acts, one being more moral than the other—for example, killing another person to steal their meal vs buying one for yourself. Consider any two similar acts, situationally close or far apart. We are able to identify the better and worse choices. This is moral distinction, the awareness of a landscape of moral highs and lows.

Collectively, morality develops further at a cultural level–better and more widespread moral behavior evolving as a culture matures. In fact this can be used as a metric for cultural maturity. A civil society is an evolved culture. One that contains a system of fair laws and enforcement is more evolved than one that is anarchistic or totalitarian. The stumbling block of moral relativism can be overcome by agreeing upon cross-cultural standards of decent behavior. Those who act below these standards, individually or societally, may be labeled morally immature, subject to international ridicule and pressure to change. We now recognize human rights globally—disgust with human trafficking, abhorrence of chemical weapons, revulsion toward tortuous imprisonment, etc. More issues will be added to this list as international consensus builds—expanded women’s rights, LGBT acceptance, perhaps even rights to food, energy and healthcare. This is cultural evolution, not biological.

While there will always be usurpers of these standards–dictators and such–the standards remain defined through these setbacks. Politics and nationalism also intervene but eventually the moral arc leans forward.

Religious doctrines are problematic when they advocate practices below the current standards. Two examples are the suppressed role of women in many societies and inhumane forms of criminal punishment. Sacred scriptures are highly resistant to modification. Conservative clergy and apologists apply rationalizations and interpretations to justify old practices, or worse, scale up enforcement. Meanwhile followers remain stuck below global standards, their opinions suppressed.

We all know what moral behavior is with the tiniest effort of thought. It’s one of those ‘know it when you see it’ things. For guidance in a questionable situation, reference the Golden Rule—do to others as you would have done to yourself. For those who worry that we would behave horribly without a supernatural driver, relax; this is just not true. Morality is evolutionarily inherent. We all love our families, respect our friends and neighbors, and have concern for the welfare of others, near and far. Common sense says it is neither practical nor desirable to do otherwise, at least not for long. Those whose actions cross the lines of acceptable behavior are by definition sociopathic; those who, worse, lack the ability to empathize are psychologically pathological, biologically disordered.

So be the good person you are without the baggage of instruction, guilt or coercion. Demonstrate your concern for others. Promote the expansion of moral human rights. Help raise inter-cultural standards and support the spread of those standards to all corners of the globe. It’s the right thing to do.

A Latter Day Saints (Mormon) Experience

Your time is limited, so don’t waste it living someone else’s life. Don’t be trapped by dogma – which is living with the results of other people’s thinking. Don’t let the noise of others’ opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition.

-Steve Jobs

This post is different from what I typically write. A reader has shared his life experience and given permission for it to be published. Applying my last post, you may try to identify the reasons for belief in people you meet—which factor is most dominant, for example. The writer below overcame the sheer strength of both childhood indoctrination and cultural pressure.

This combination is painful because it can deprive a person not only of living in reality, but of desiring or being capable of seeking reality. This is a form of psychological harm, whether intended or not. Luckily he prevailed, so have no fear of reading on. Note that this is not just an LDS situation. The psychological model is the same within any organized religion, though perhaps less extreme.


Mark’s Story 
(not his real name)

 “Belief is an interesting thing, especially where it pertains to things of a spiritual or mystical nature which cannot be proven. I grew up in the LDS faith. From earliest childhood children are indoctrinated. The first Sunday of the month is Fast and Testimony Meeting. Members fast for two meals and donate the money which would have been used for those meals as “fast offerings.” Then at church they take turns standing and “bearing their testimonies” or sharing their faith in their beliefs. People even bring up tiny tots, hold them up to the podium microphone, and whisper in their ear what they should say. It irritated me no end to hear a 2 year old being prompted to say that “I know that this church is true”…”I know that Joseph Smith was a prophet” or “I know that Jesus lives.” The child clearly knows no such matters but is indoctrinated through a program of repetition to believe that he believes.

Also in the LDS faith a child is baptized and confirmed a member of the faith at age 8, which the Mormons deem to be the age of accountability in knowing and understanding right from wrong. I see this as only slightly better than the thought of Catholic babies being baptized shortly after birth. I would prefer to see children allowed to wait until they are 18, and fully capable of truly researching the claims of that faith before taking on baptism and confirmation as members. The way it stands boys are trained from early childhood that it is their duty to serve a two year mission for their church. Girls are trained that they should marry returned missionaries and have babies, but it’s also allowable for them to serve a two year mission. The minimum age used to be 19 for males and 21 for females, but the LDS church has just recently lowered the age to 18 for males and 19 or 20 for females. Most of these young men never question the policy, and as it stands with the lowered age, these youths are practically being dropped off at the mission training center straight after they receive their high school diplomas. Belief is a powerful thing. These kids spend two years in earnest effort toward converting as many people to their faith as they can and most of them know very little about their own faith or the true history of the faith and the deeper doctrinal teachings, other than what’s in the missionary handbook and what they’ve been taught to memorize of the missionary lessons. Yet their belief is usually rock solid because of an entire childhood and adolescence of having been indoctrinated.

 I did leave that faith about 10 years ago, because I had been studying the religions of the world, with emphasis on my own faith at the time. The leaders of the LDS church strongly caution Mormons not to read what they believe to be anti-Mormon literature. That boils down to any book about the church or its beliefs not written by an LDS prophet or apostle, or not overviewed and sanctioned by the First Presidency of that faith. I’ve learned that the claims made in their book of scriptures, The Book Of Mormon, are not true. Particularly their claim that the Native Americans in North, Central and South America were descendants of a Hebrew named Lehi, who with his wife, his sons and their wives, allegedly fashioned a ship and sailed to these continents to escape the coming enslavement of the Hebrew people, and because God had appeared in a dream to Lehi and told him what to do. The LDS church stood by that claim for decades, even in light of DNA testing which proved that the Native Americans don’t bear Semite DNA. Church apologists have sometimes stated that the testing was inaccurate and have clung steadfastly to the teachings in the BOM. I also found a great many contradictory teachings between the Bible, the Book of Mormon and another LDS book of scriptures, The Doctrine and Covenants. In learning the truth about these doctrinal teachings and the true and unwhitewashed history of the faith and behavior of its leaders, I determined that the LDS faith was little more than a cult. The clincher is that they sing hymns of praise to their human leaders, whom they deem to be prophets.

The general attitude of a Mormon apologist appears to be that even when you can point out many reversals of earlier ‘prophets’ edicts, they wave it away with ‘that was then…this is now’. They’ve justified the murder of American families who were emigrating to California through Utah. I would reference the Mountain Meadows Massacre and suggest reading about it. When I was a teen and taking the compulsory before school doctrine classes, the claim was that the Fancher party in that wagon train was comprised of people who had participated in the Hauns Mill Massacre (a Mormon settlement) or in the execution of Joseph and Hyrum Smith, or in burning out Mormon homes and crops, and that they boasted about it and told the Mormon settlement in Cedar City that they were going to round up some more Missouri Wildcats and return from California to finish the job they had started in Missouri. This is what we were taught, that the Mormons in the settlement decided to kill them because they felt threatened. This was not the case. Mormon apologists continue to attempt to justify the murder of every man, woman and child age 8 and above with lies.

 Indoctrination is a very powerful tool when used to create and reinforce a belief system.

 Part of the difficulty in leaving a denomination such as Mormonism, Jehovah’s Witnesses and Scientology is that the individual who leaves often finds himself shunned by family and friends who are of those faiths. If one considers Mormonism in Utah, one realizes that an ex-Mormon is pretty much surrounded by people who believe in their heart of hearts that this individual will be restricted from entering into the highest level of God’s kingdom (The Celestial Kingdom), where only the most faithful who have performed all the rites and rituals deemed necessary may enter. They believe that they’ll be excluded from their eternal family units, as they believe that families will be together forever under God’s plan according to Mormonism. Many who have left the faith have also lost employment, as their employers were also Mormons, and a family who leaves as a family might find that their children are shunned as well and not included in former friends’ activities. This isn’t the case 100% of the time, but it does happen a lot. Also, families who are not Mormon who have moved into close-knit smaller Mormon communities in Utah and in Idaho are often approached numerous times in attempts to convert them, and if they continue to refuse they’ll often be treated as pariahs.

 The same happens in many cases to Mormons who are LGBTQ. Many a gay or lesbian youth or young adult has found themselves booted to the curb by parents who are active in the faith. Some are given a choice: either stop being gay, or you’re on your own. The suicide rates of LGBTQ teens and young adults are extremely high in these communities.”


In the most fundamentalist LDS cults (Mark was not in a fundamentalist church), youths are never taught life skills, to say nothing of how the outside world varies and works. This is psychological and practical entrapment. The above experience is not just an LDS situation. Many who have escaped religion say they had no idea what they were missing—concepts of physics, astronomy, even personal autonomy. They were living in a very small world, unable to hear a Who.

If they have the strength, will and intellect to overcome their early teachings, they are often shunned by their friends and family, their only “sin” being that they realized the real world made more sense than the one historically taught. This is sadder yet because belief is not a choice; it is a conclusion. One can attempt to keep it at bay—try not to think about bothersome stories, avoid contact with and knowledge of alternate viewpoints, seek out information to support a desired position, hold contrary information without attempting to explain it—but there are differing degrees of this ability, the strength of one’s believence. Punishing a family member for coming to a particular conclusion would seem to have more to do with the punisher’s believence (perhaps an attempt to keep away temptations of doubt, in themselves and their believing circle of family and friends) than that of the errant family member. Certainly most believers are deeply concerned for their unbelieving kin, heaven or hell being the stakes to them, but is cutting off a family member from their kin really a good way to maintain hope?