Sunday, October 14, 2012
Humanity has at last arrived at an interesting juncture in our development and our search for understanding. More and more people have come to distrust or altogether write off spirituality, especially archaic and irrelevant organized faiths. This shift is the logical conclusion of thousands of years of scientific inquiry into the unknown; we have at last come to understand, more or less, how the universe operates, how it began, and the laws that govern and produce life itself. Myths, it seems, no longer have a practical purpose in a world of knowledge.
Interestingly, that shift has in many ways led to a movement toward different, alternative forms of spirituality. We somehow crave it, even if we find it illogical; the drive to discover a sense of meaning and purpose in life is integral to the human condition. As a result, a revival of pagan religions has been especially strong, as has an interest in Eastern religions. This shift is natural, considering that these belief systems typically do not attempt to prescribe a “literal” answer to the more earthly questions (such as, How did the universe form? or How did humans come to be?). They are, in many ways, a perfect fit for humanity in its current state: we have knowledge about the physical world we inhabit but very little concrete information (if such a thing is possible) about our spiritual existence.
In perfect concurrence with this shift has been an increased interest in psychoactive drugs. With organizations such as NORML (http://norml.org/) working tirelessly to correct a century of lies and propaganda about cannabis, and with the push toward medical marijuana laws, we can undoubtedly expect major reform by the end of the next decade . . . perhaps even sooner. Once people begin to view cannabis in a new, more level-headed light, it will be natural for them to begin accepting other psychoactive drugs as legitimate, too. Psilocybin would likely be the next drug considered, and eventually, I posit, DMT. I do not expect to see much conversation surrounding the legalization and regulation of addictive or habit-forming drugs (such as methamphetamines or opiates) anytime soon, because they do not offer the same types of experiences as the three mentioned above (and others) do, and they are potentially very dangerous to consume.
So, what is a psychoactive drug, exactly, and how does it work?
Unfortunately, that question is difficult to answer fully with our current scientific understanding. Yes, we understand that psychoactive drugs are substances that cross the blood-brain barrier and thereby significantly alter how our brains receive and perceive information. But at the end of the day, that’s about all the factual, scientific knowledge we have: how they interact chemically, but not what’s going on “behind the scenes,” so to speak.
Some scientists have tried to study these substances, but their work often gets marginalized by the academic community or shut down by the government. Timothy Leary’s studies of LSD are now famous, and Rick Strassman’s research into the nature of DMT is well known to drug enthusiasts. Many other scientists are looking, often secretly, into the effects of other psychoactive substances. And I think it goes without saying that plenty of would-be-scientist college students have experimented with the mental effects of cannabis.
Recently, a respected neurosurgeon, Dr. Eben Alexander, wrote a piece for Newsweek titled “Heaven Is Real: A Doctor’s Experience with the Afterlife” (read it here: http://www.thedailybeast.com/newsweek/2012/10/07/proof-of-heaven-a-doctor-s-experience-with-the-afterlife.html). In it, he describes his own shift from skeptic to believer during a coma brought on by an E. coli infection in his brain. In the fascinating article, he details an out-of-body experience in which he came into contact with what he has interpreted to be angels and, apparently, a deity. Near-death experiences have been well documented and discussed ad nauseam by the scientific community, debunked by skeptics, and turned to by believers as indisputable, if anecdotal, evidence.
Ever quick to join a reason/faith fray, PZ Myers posted a response a few days ago on his blog, Pharyngula (read it here: http://scienceblogs.com/pharyngula/2012/10/09/newsweek-panders-to-the-deluded-again/). In his typical and often justifiably dismissive tone, he explains that, at best, Dr. Alexander is struggling with confabulation, a phenomenon wherein the human mind constructs artificial memories to fill in gaps, especially common in cases of trauma. At worst, he suggests that the doctor is simply brain-damaged.
But then, what of the near-death-type experiences often described and sought by users of DMT? The most common response to this powerful psychoactive drug is a sense of having had an out-of-body, spiritual experience. Even granting that the religious texts we hold today are bogus, or that an E. coli-induced coma could be a traumatic experience leading to confabulation, isn’t it possible that there’s something to these experiences brought on by experimentation with psychoactive drugs? Is it possible that there is “something” more, but we just haven’t been able to grasp it yet?
In short, I don’t have an answer. As an atheist with a profound interest in consciousness and a curiosity about whether humans have a spiritual side that we’ve simply misinterpreted for so many millennia, it occurs to me that maybe we need to open our minds a bit more to alternative forms of consciousness and to scientific research simultaneously in order to start answering some of the big questions humanity is left with. Then we can finally shake off our undying need for systems of faith to provide us with meaning and purpose. I don’t think the answers are going to come from reviving old religions, but I don’t think they’ll come from closed-mindedness, either. They’ll come from humanity finally venturing into that next frontier: understanding our brains, our consciousness, and—perhaps most exciting of all—our humanity itself.
Thursday, October 11, 2012
Humanity Thirsts for Experience
The meaning of life is experience. How sad and pathetic that so many come to inhabit this world, seeking the answer, and fall away unsatisfied. Worse are those who nobly die with the righteous belief that their purpose for existence was subservience and devotion to a deity. It would shatter their very beings to open their minds and see that God is the greatest lie ever told. So instead they live, day after day, consuming purpose via obtuse old rituals written by power-hungry, delusional men. Even the few who break free of the traditional paradigms may still fall into lesser belief structures, such as Paganism, adopting ritual and worship again for the comfort and sense of purpose they provide.
But true purpose comes from humanity alone. We have evolved from the primordial soup, risen above the base masses of primitive animals, to achieve the ultimate success of intellect, of self-awareness. Of individuality.
So, where do we go from here? Oddly, we fight against our own progress, and have done so since day one. We develop order and patriotism, laws and structure, then cleave to them as the monkeys we left behind still cling to the absurd comfort, safety, and familiarity of the trees we came down from. The very nature of humanity is change, is daring to take the ultimate, terrifying leap into the unknown. That nature was once fueled by curiosity, but that human urge has at last been placated by over-saturation. Why bother to be curious now, when all that is knowable (or truly worth knowing, given our limited view of the universe) is already known?
That, however, is a fatal fallacy. We have focused on facts rather than experience, emotions, and empathy. A world at peace, a love without jealousy, a humanity graced with universal acceptance . . . these experiences are real, they are possible, but they remain, to this day, mere dreams—as the notion of a man walking on the moon may once have seemed to Copernicus. Should we not, at least, abate our hunger for facts, having fully supped, and indulge a while in our thirst for experience? It’s not that we’ll never be hungry again, nor thirsty forever; rather, humanity itself is a single organism working in unison, as our cells and organs compose us, and like all organisms, its needs—so far as sustenance is concerned—are varied.
We’ll never come to truly understand how the universe operates until we first experience why it matters at all that we know. After all, we did not discover the horrible wonders of the split atom until we experienced a need for that understanding. But can’t wisdom come from positive, progressive needs, rather than superficial ones—protection, domination, self-preservation in the face of an enemy—and thereby carry with it a mark of pride and accomplishment hitherto unknown to the human race?
In short, if we may submit ourselves to love and benevolence, rather than fear and hatred, can we not better unlock human potential, solely through the experience of the most beautiful aspects of our shared humanity?
Monday, September 24, 2012
A Rallying Cry to Ban Technology!
America is facing a new dependency epidemic. Across the country, in countless dark basements, private rooms, and secluded offices, average American citizens are plugging into technology and wasting away their lives. It is time that we band together as a democracy and do the right thing: impose an age limit on technology. Frankly, it must be made a controlled substance.
Some reports show that as many as 8 out of 10 Americans currently use technology on a daily basis. That number may actually be higher for individuals under the age of 18! They use computers, iPods, tablets, cell phones, and a host of other devices. I’ve even heard of some individuals starting to use these items as early as the age of two. Certainly across college campuses, technology use is ubiquitous—even encouraged by those left-leaning, elitist professors—much to the detriment of the development of America’s youth.
Technology addiction leads to anti-social behavior; regular consumption of pornography; piracy of art and media; obesity; and, in extreme cases, freedom of speech. It is our responsibility as good American citizens to set a better example for our children and for the free world by placing age restrictions on the use of technology. Personally, I think we should follow the example set by the drinking age—nobody under 21 should be allowed to use technology, and well-informed adults who do use technology should do so out of sight of minors. In addition, any adults providing minors with access to technology (either directly or indirectly) should be forced to pay a hefty fine and should have their right to technology revoked. After all, a similar system has worked exceedingly well for alcohol, as everyone knows, and for tobacco products. Gone are the days of underage drinking, of high-school students smoking cigarettes, of teenagers having destructive weekend parties while their parents are away, thanks to our efforts to clean up our country. We need to follow suit, consistently, with technology.
Maybe with time, we can even take a stance similar to our highly successful drug war. As everyone knows, illegal drugs are almost impossible to get hold of, because they are illegal. The country is undoubtedly safer as a result. In fact, I hope that in 10-15 years’ time, we can have an outright constitutional ban on technology. Although this will certainly, at first, lead to a black market and the appearance of technology cartels, we can use our vast resources as the world’s most powerful nation to shut them down, just as we did during alcohol prohibition and just as we are doing today in our ongoing (but winning) drug war. The tax burden of a few more corrupt citizens being put into prisons for ignoring the laws would be minor, and most good citizens would do the right thing and give up technology completely if the government said so. After all, they’ve been taught since birth to believe everything they are told, to tremendous effect.
Economically, much good would come of a technological ban. Foremost, we could increase the police force across the country significantly at first, thereby putting more Americans to work. And by replacing modern technology with more wholesome books, landline phones, cable television, and records, we’d reestablish dying industries and revitalize our economy fully. I wouldn’t be surprised if other countries followed suit, seeing how successful our efforts become, and start importing our goods again.
Imagine a world without technology, where nobody is ever exposed to the temptation to connect, share ideas, exchange information, and learn about anything and everything all at the same time. It’s not difficult to look at the older generations, in their infinite wisdom and enlightenment, as a model for the future. I’m proud of the government’s efforts so far with such attempts as SOPA and PIPA to begin to cut down on this ugly modern tendency, and I truly hope the Obama administration will not back down on its efforts to cut Americans off from their technology addiction.
In closing, I hope you’ll all join me in writing your Congressmen to tell them that you want to see a serious discussion about the merits of a technological ban in the forthcoming sessions. They’ve proven so adept in the past at obeying our wishes, as they’ve been elected to do, that I see no reason to suspect they won’t do so again, should enough of us raise our voices.
Friday, May 25, 2012
An Atheist Perspective on Morality
Conversations with Christians:
“I am an atheist.”
“So, you don’t believe in sin then?”
Morality is a human construct. It has no absolute, tangible reality; it is purely a fabrication of the human mind, crafted and honed over generations of experience and emotive development. Fortunately, it is a rather useful human construct that does much to maintain peace and order. It’s hard to imagine anybody who would disagree—even psychopaths acknowledge its power. Indeed, it is their love of promulgating chaos that leads them to defy morality altogether. Because morality is catholic (although unique and relative, to some extent, to every individual), it is tempting to point to it as evidence of the existence of a creator, a deity, a god. However, I intend here to lay that temptation to rest once and for all by explaining an alternative source for morality. I base it on my own experience and welcome responses from anyone who may agree or disagree.
The key to understanding how a human comes to label any action as moral or immoral, oddly enough, lies within the so-called “Golden Rule.” This “rule,” so ubiquitous throughout the religious and mythological world—“treat others as you would like to be treated”—need not come from a deity (or a group of deities, for that matter) in order to be explained. Indeed, it is no stretch of the imagination to posit that the same minds that have come to untangle the mysteries of gravity or evolution were able, in more primitive forms, to discover a sentiment universally felt and accepted. It did not need to be (and indeed was not) revealed to humanity through divine intervention, but through rationalization, emotion, reflection, and empathy.
To start, it is useful to consider what is often cited as the quintessential moral issue: murder. No rational, emotionally developed human being will make the case for the virtues of murder, precisely because it is so atrocious on even a global scale. But murder is not bad because Yahweh, Zeus, Allah, or Vishnu says it is so. Murder is bad, immoral, because of how it makes people feel.
The human imagination is a remarkable vessel for compassion and for manifesting glimpses of purely fabricated situations. In fact, the latter is a particularly potent skill humans possess, as demonstrated in our ability to experience dreams while we sleep, recall a loved one’s face rather vividly from memory, or even just imagine the taste of a vanilla ice cream cone. How these processes occur is not yet fully understood, but that they do occur is obvious and has been one of life’s great mysteries for millennia.
To prove my overarching point, let’s devise a brief thought experiment. Unpleasant though it may be, close your eyes and try to imagine a scenario in which someone you love very much has been killed. It need not even be by murder. Or, if this is an experience you’ve already had, you may try to recall the precise feeling you had when the news was first broken to you, and the sentiment that no doubt lingered in your mind for a very long time after. The feelings you are presently experiencing are not unique. Our human faculty of empathy allows us to powerfully experience in our minds both things we’ve never encountered and events we have already met with in our lives. This fact is central to understanding my conception of how morality arose and how I, as an atheist, have shaped my own understanding of morality.
I do not kill people precisely because I would not want someone else to kill me or someone I care about. Although I suppose I would not be aware of my own death if someone were to kill me, I can consider now, while still alive, how the murderer would have cut short my small sliver of allotted time to live, experience, and love. The very notion of my murder thus affects me deeply, owing to the sheer ignorance and cruelty of its perpetrator. Moreover, when I consider (selfishly, I admit) the pain I would experience upon having anyone important to me wrenched from my life, it elicits a visceral response deep within me. My sense of empathy prevents me from doing anything I would not want done to myself.
It’s pretty easy to see, then, how even a primitive human mind could arrive at such basic moral understandings as the wrongness of murder, rape, or theft, purely through reflection upon one’s own feelings. That is not to say that all primitive humans stumbled upon this understanding, but at least one human did, and likely shared it with his or her peers. What’s important to grasp is that this knowledge did not come from some outside force but from humanity itself. Very likely, it came from real experience and empathy, not from imagination at all. Given the human mind’s impressive ability to assimilate facts into knowledge and comprehension, it seems very probable indeed that the first person to acknowledge the wrongfulness, the immorality, of any action was a victim of that very action.
Beautifully, empathy does not merely restrain one from committing heinous acts. On the contrary, it encourages benevolent behavior. If one avoids bad deeds out of fear of reciprocation, then it follows that one will also actively share good deeds in hope of reciprocation. One might then make the case that all good deeds are committed selfishly, in order to promulgate an economy of good deeds in the hope that some will spring back upon the self. However, the same argument could easily be made (and rather frequently has been made) that religious morality and charity are simply the carrying out of duty in hopes of reward from a deity. It’s impossible to get around this conundrum, for even if one should say that good deeds are inherently good, it is really the good feeling that comes along with them that makes them occur at all.
Fortunately, this age-old paradox isn’t that troubling at all. In fact, it furthers my suggestion that morality is based on emotions. The same primitive minds that conceived of the pain caused by murder, and so decided not to murder in order to avoid reciprocation, likely stumbled upon the truth that kindness often propagates kindness. Over time, these two basic truths—cruelty begets cruelty, benevolence begets benevolence—came to be expressed in various mythologies the world around via the Golden Rule. Consequently, I believe that humanity does not possess morality because of religion, but that religion possesses morality (albeit often perverted) because of humans. This question is no “chicken or the egg” paradox, but a very easily understood anthropological phenomenon. Humans invented morality, and then they invented religion as a source of higher authority for that morality. Religion was likely invented, at least in part, as a safeguard to maintain and enact this economy of kindness and this agreement to avoid immorality.
Interestingly, I posit that it is precisely this introduction of religion to enforce morality that both spread basic moral codes and ultimately corrupted the essence of morality. The threat of punishment by a supreme deity (or deities) would certainly keep a less sophisticated mind from committing so-called “immoral deeds,” and it did a fine job of doing so for a very long time, I admit. In some respects, it continues to do so today. But various other rules were introduced over time that lack any moral basis—kosher laws and circumcision are perfect examples from the Judeo-Christian tradition. These laws, some suggest, were introduced in response to legitimate dangers in the world (for example, shellfish may have been banned as a result of food poisoning, and circumcision may have been healthier for both partners in a society lacking sophisticated hygiene techniques). Moral codes thus became useful survival tools, but ones far less founded on morality. (Richard Dawkins suggests that this use of religion as a survival tool is precisely why religion itself has survived: the tribes whose religious laws prevented them from coming to harm were more likely to survive than those without them, and thus more likely to reproduce and pass on their moral codes.) Worse, political groups eventually grew to recognize how easily religion could be exploited to maintain order and control; the Roman Empire’s adoption of Christianity is a clear, famous example.
When government became involved in religion, moral law began to lose its utility. The cause is the tendency of politicians to line their own coffers while “representing” the people, as can still be seen today in pork-barrel laws or the accommodations politicians make for powerful lobbyists. I see no reason to believe that such a practice is new, nor to deny that it had a hand in crafting many of the more bizarre laws found in various religious codes. Moreover, scientific advancements and fuller medical and anatomical understanding have done much to render many old religious laws obsolete. Unfortunately, many have survived on tradition alone, not on logic or emotion at all. Often, this survival can be dangerous to a society.
In closing, it’s time for humanity to band together to produce a universal moral code based on empathy alone. No more should we tolerate a world that incites religious wars and acts of terror. No more should we tolerate a world that punishes a woman for revealing too much flesh. No more should we tolerate a world that challenges the importance of rational thought and skepticism in favor of blind faith. It’s time to shake off the stubborn stain of religion-based morality in favor of something progressive, logical, and based on empathy alone. Such a change would lead to impressive, desperately needed social and political changes. The only path to a truly moral world is to finally give up on the gods.
Monday, March 12, 2012
A Line in the Sand: Atheists versus Theists . . . Or, Pick a Camp, Casual Theists!
I’m drawing a line in the sand today. As of now, only two camps exist in the world of religious views: atheists (my camp) and theists (the other camp). A theist is anyone who believes in a higher power—a “god,” so to speak. This group includes Catholics, Zoroastrians, Hindus, agnostics, Protestants, deists, Jews, Muslims, and non-practicing or non-denominational believers. While I could spend a great deal of time from here attacking and criticizing each of those groups, I have a specific subset in mind I’d like to tackle today, although it’s really an amalgam of a few groups. So, I’m going to give them a new name, a new categorization: the “casual theists.”
A casual theist may be an agnostic, a deist, or a non-practicing believer. What they all have in common is that they believe in a higher power (or, in the case of agnostics, don’t rule it out as a good possibility) but either do not believe in or do not follow any of the organized religions. Simply put, their stance is that god exists, but he is not the god of any standing religious creed—although he most certainly shares many characteristics ascribed to him within them.
The problem comes in with where the idea of god comes from in the first place. Granted, it is impossible to know with any certainty its origin, but speculation and clearheaded, disinterested consideration lead to a fairly likely theory: God is a human fabrication.
Let’s look at things another way. If the concept of god did not exist, but god himself did, then nobody today would believe in him. Even Christians would have to agree with this logic—their belief is based on scriptures, the revealed word of Yahweh. It would be impossible to believe in a god without his (or her . . .) intervention, any more than a child could believe in Santa Claus if the familiar myth did not exist. It is simply not a belief that would even be an option.
Similarly, let’s say Santa Claus did exist, but it was becoming trendy amongst certain skeptical circles to deny his existence. Sooner or later, someone would get a photo of jolly old Saint Nick. Or an expedition to the North Pole would ensue, resulting in his capture and, quite possibly, relocation to Guantanamo Bay for interrogation. With proof of Santa Claus’s existence, nobody in his or her right mind would be able to say that Santa Claus is “made-up.” But, of course, Santa Claus is made-up. So, in the real world, upon reaching a certain age, children are told the truth, and nobody in their right mind still believes he exists.
But let’s push the Santa Claus analogy a bit further. Imagine if some very sadistic parents went to great lengths to keep their unfortunate little son—we’ll call him Simplicio (from the Latin for “simple”)—from ever discovering that Santa was not real. All through Simplicio’s life, his parents sneak to his house and deliver presents, even when he’s a grown man. They stage elaborate bits of evidence (reindeer tracks in the snow outside, letters from Santa, etc.) and point to them constantly as proof that Santa is real, knowing all along that it is not true.
These three scenarios describe the three major groups of believers. The first, in which a child could believe in Santa only if the idea had first been invented, is the deists’ situation. These are the believers in the so-called “watchmaker creator,” an idea literally invented during the Renaissance (more than likely, at least, but possibly slightly earlier) and popularized in the Age of Enlightenment (the 17th and 18th centuries). The basic idea is that a creator exists who made the universe (like a watch), then stepped back from his creation and let it do its thing. He has no involvement in daily life, he doesn’t listen to prayers, he doesn’t judge people. He is simply the creator, and that’s it. But this idea was fabricated during a time of enlightenment and reason, when people were becoming disillusioned by organized religion but uncomfortable with the idea of a world without a creator. That is, deism was “made-up.” The idea did not come from god himself, but from the human imagination—just as the idea of Santa Claus would not come to a child from Santa Claus himself (who does not exist), but from the human imagination. As such, it carries no weight.
In the scenario where Santa does exist but nobody believes in him until hard proof is given, we find the atheists. The point is, until someone goes out and pulls God aside and proves that he is in fact real, atheists will not believe in him—even if the vast majority of people still do so. It is simply irrational to do so without hard proof—of which not a single shred exists.
And the final scenario, the one where poor Simplicio is being tricked by his malevolent parents into believing in Santa Claus—well . . . those are all the people the whole world over who subscribe to any religious institution. Simply put, they are being tricked through false evidence (miracles are the big one), forged letters (religious texts), and an abhorrent abuse of trust. To discover the truth, they really need only look and see that Santa’s handwriting is the same as their parents’ (just as the book of Genesis is the same as the Enuma Elish or the Epic of Gilgamesh); that presents are a distraction from the truth (like the promise of an afterlife); or even consider the ridiculous notion of an old man, complete with toy-making elves, delivering presents all over the world in one night based on his personal judgment of people’s morals, a highly subjective matter (sort of like thinking about talking snakes, virgin births, burning bushes that can talk . . . or maybe just an omnipotent old being, complete with messenger angels, delivering people to an afterlife based on his revealed judgment of people’s morals, a highly prescriptive subject). But, alas, they proudly wear the blinders of “faith” to block their gaze from the truth, thereby remaining blissfully unaware that their lives are based on fiction.
Yet another group exists, though: the first-graders who hear from one or two friends that Santa Claus is not, in fact, real. At first they reject the judgment, offering all sorts of “evidence”: the letters, the sound of bells on the roof on Christmas Eve, the time their father got coal, etc. These same youngsters inevitably go home and ask their parents, and are told that their friends lied, that Santa is most definitely real. So the children go on believing, trusting their parents. But the more they think about it, the more reasons they find that the evidence is faulty and insufficient . . . and sooner or later, they stop believing altogether. They become atheists. But what if some people continued believing in Santa Claus, despite no good evidence? These are the casual theists, waiting to grow up.
But enough foolish talk of Santa Claus—my overall point is, of course, that belief in a god is as reasonable as belief in Santa Claus. Hardly a new idea, this notion has recently been popularized by the likes of Richard Dawkins and the Pastafarian Movement with its Flying Spaghetti Monster. What I’m getting at is the danger of being a casual theist. If you wish to suspend all disbelief, submit to the scriptures, and believe in any denominational god, that’s your decision. It’s a bad one, I posit, but not even so bad as being a casual theist. Here’s why.
Casual theists have no evidence to support their suspicion. After all, without regard for and faith in the scriptures, the revealed, divine word of a god, what proof is there of his existence? If the real god is something entirely different from the way he’s portrayed in all religions, then we have no real portrayal of him at all. Indeed, the truth is, we are simply hoping, insecurely, that he exists while mentally conceiving of him as something different from the scriptures in existence. But do we see the catch here? If not for the preconceived notions of god given through religion, we wouldn’t be considering the conundrum at all. Casual theists are the hurt first-graders, recently confronted with the truth that they’ve been lied to by their own parents, but who continue to believe in Santa nonetheless. Well, it’s time to grow up.
From my experience, two major forms of religious mobility exist. Theists can become atheists and casual theists can become theists. Rarely do atheists move to either camp, and rarely do casual theists move to the atheist camp (perhaps they are simply too indifferent about god in the beginning to move further to the point of erasing him from their lives entirely?).
Casual theists tend to move toward theism for the same reason they cleave to a belief in a god to begin with—as I say above, insecurity (so often, this pathetic emotion is the cause of life’s mistakes . . .). Insecurity with the idea of a world without a creator, insecurity with the notion of life being finite and death being the end, insecurity with a world without absolute morality, insecurity with a life without higher purpose or meaning. Rampant insecurity. As a result, the false promises of various theist organizations (that is to say, churches) invariably begin to sound downright appealing—especially considering the “evidence” presented and the crooked historical and theological scholarship backing it. Churches have a way of convincing members to join, and their method is simply telling them what they want to hear. They know it’s hopeless trying to convert atheists, so they go after the casual theists instead. And traditionally, atheists were happy to lump themselves alongside the casual theists (how many surveys have a checkbox next to “Atheist/Agnostic,” as if they were the same thing?). But I’m no longer happy to do so. To quote Ken Kesey, “You’re either on the bus or off the bus.” No in-between exists.
To all you casual theists out there reading this rant, pick a side. Choose a team. Either come join the atheist camp or join the theist camp. But if you join the other team, be warned: We atheists are growing, we are spreading, and we are tired of being silenced. Tread carefully.
Note: Thanks to Kari Kohler for the image. I use it without permission.
Tuesday, February 21, 2012
Change
The overweight bum holds out his pampered hand
and requests, unabashedly,
“Change.”
So I stop and snap, politely,
“The election’s over;
you can stop campaigning.”
I’d normally toss him a buck,
but they’re getting harder and harder
to obtain, impossible to retain
since the record companies sued me —
175-zillion Monopoly dollars
for pirating “The Times They Are a-Changin’” —
and the bank’s demanding I repay
that $100K loan I took out
to cover tuition when I was 18,
too young for a cold one,
but ripe for credit crunch
exploitation.
But I guess I’ll find a job someday
with my BA
in Contemporary American BS:
diggin’ for dimes
but pickin’ up
pennies.
This whirlwind black-hole avaricious
economy
has sucked all the Change out of me,
leaving the bitter copper taste
of pennies, like phlegmy blood,
lining my throat with a pinch.