PSA: Use Genghis Khan To Avoid Godwin’s Law

Since we live in what the comedy show Futurama called the Stupid Ages, where an astonishing number of people are willing to admit evil only in truly extraordinary cases, it is very tempting, when trying to discuss a moral subject, to invoke Adolf Hitler and the Nazi party. Unfortunately, Mike Godwin ruined this by coining Godwin’s Law. If you’re not familiar, Godwin’s Law states:

As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one.
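
Read literally, the law is a claim about a limit. If we let P(n) be the probability that a Nazi comparison has appeared by the n-th comment (the notation is mine, added just to make the joke precise), it says:

    \lim_{n \to \infty} P(n) = 1

Note that this holds for any event with a fixed, nonzero, independent chance of occurring in each comment, which is part of why the law sounds more profound than it is.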

Godwin claims that he set out to reduce the incidence of glib Nazi comparisons, but I think that his main supporters are people who really don’t like the ability to clearly and unambiguously point out an instance of evil. This is a different time and place from the one in which Godwin coined the law, but certainly the majority of times I’ve seen it invoked, it has been as a defense against thinking.

It occurred to me recently, though, that Genghis Khan works approximately as well as Hitler as an example of a person who did what is (almost) universally acknowledged to be great evil while he thought that he was doing good. He’s not interchangeable with Hitler in all contexts, but he is in an awful lot of them. Where he is, using him will probably save you some trouble while trying to stimulate thinking. Since he’s not quite as well known as Hitler, though, I do recommend always mentioning that he raped an awful lot of people. That will ensure that people think of him in the correct frame of mind.

(Note: this may not work as well in Mongolia, but you probably won’t need to do this as much in Mongolia anyway.)

Atheists and Morality

When I was young, I heard a fair amount of argumentation that “atheists can be just as moral as Christians,” and—neglecting the duty of piety to God, which an atheist must necessarily transgress—there is a very theoretical sense in which this is true. As time has proven, though, it’s an irrelevant sense.

There are two possible meanings to “atheists can be just as moral”, though they end up in the same place. The first meaning is that an atheist can, through practice and effort, form virtuous habits according to whatever theory of morality he holds, and continue in these virtuous habits all the days of his life. This is true, especially for people who become atheists as adults, since adults tend to be fixed in their habits relative to children. Habits formed in youth are, certainly, possible to keep even if the reason for them has been rejected. It is not easy for such an atheist, since the continuation of his habits will have only the support of his prejudices and not of his intellect, but it is certainly doable. At least so long as he doesn’t face temptation. Upper middle class bachelors, with cheap hobbies, few wants, and no family obligations, will in fact tend to be harmless until they return to dust.

What this misses is that there is more to morality than simply being harmless, or even than simply being “a productive member of society”. These are fine things, but human beings were made for greatness, not for being somewhat more convenient to their neighbors than absolutely nothing would be. It doesn’t really matter, anyway, because of the second possible meaning.

The second possible meaning of “atheists can be just as moral” is that they can be just as moral, according to their own moral standards. If this sounds good to you, consider that Genghis Khan thought that pillaging, raping, and murdering were heroic actions and that he was a really great guy for doing them. Being moral according to your own moral standards—assuming you’ve done your best to ensure your moral standards are correct—may be sufficient to make you saveable by the salvific power of Christ on the cross atoning for the sins of the world, but it doesn’t mean that you aren’t being immoral according to the correct moral standards.

To say that an atheist can follow his own moral standards is, in fact, to say that he is liable to be immoral precisely to the degree in which his standards differ from real morality. It is no answer to say that he will approve of himself while he fornicates, adulterates marriages, and murders. It is no answer to say that he will only murder the people he believes it’s perfectly fine to murder, or that he will only murder people he declares aren’t human beings before he murders them. Approving of his evil actions doesn’t make him less wrong, it makes him more wrong.

When the famous Enlightenment philosopher said, “there is no God, but don’t tell that to the servants or they will steal the silver,” it is no answer to say, “perhaps, but they will not think that it’s wrong to steal the silver while they steal it.”

This does bring up something of a side note, which is sometimes talked about in this context: people who believe that God sees everything and punishes wrongdoing in the afterlife are more likely to think that they cannot get away with a moral transgression than are people who think that no one will catch them now or later. There is something to this, since the feeling that someone will know what you do is a support to doing the right thing. This is not the entirety of morality, however. People can habituate themselves to virtue such that they do not need this support in order to do the right thing, i.e. so that they can do the right thing even if they don’t believe anyone will know, merely because it is the right thing to do. Which brings us back to the second point: that does no good if the person’s theory of right and wrong is wrong.

If someone’s theory of right and wrong is wrong, he will not just do wrong (that he approves of) when no one is looking, he will also do wrong when everyone is looking.

Many atheists of the late 1800s and early 1900s took great offense at the idea that an atheist could not recognize moral standards. All people know right from wrong, they indignantly said. That modern atheists are busy vilifying those older atheists for not living up to the different moral standards of modern atheists tells you what you need to know about this silly claim.

In the end, it doesn’t matter whether atheists can, through effort and practice, create virtuous habits and stick to them even if no one is looking, until the day they die. History tells us, if common sense didn’t already, that they will alter their moral standards before then. Once that happens, it’s only a matter of time and temptation before they form new habits accordingly.

Murder, She Wrote & Stereotypes

Murder, She Wrote has been accused of being formulaic, sometimes in very funny ways, but that’s not really true unless you construe the formula so generally that a show with a genre must be formulaic. (On that subject, you might want to read Writing Formulas and Formulaic Writing and In-Genre Fiction is Dull Outside of It.)

What people are really referring to is that Murder, She Wrote generally used very clichéd characters. Writers, businessmen, police detectives, lawyers, real-estate agents—each was a variant on the standard cliché of his profession. Businessmen did business and cared about numbers and money, especially if the numbers represented money. Real estate agents were always desperate to make sales, no matter what. TV actors were always full of themselves and demanding. (When I say “always” what I really mean is “almost always.”) There was a reason for this, though.

Once you subtract the commercials, coming attractions, intro sequence, and closing credits, an episode of Murder, She Wrote was approximately 45 minutes long. That’s not a long time to tell a complicated story, and if—as I’ve done recently writing episode reviews—you pay close attention to how many plot elements there are in a typical episode, you’ll discover that they’re actually quite densely packed. Once I started describing and analyzing all of the plot elements, it started taking me about a week to review one. This density comes from the need to establish why Jessica is present, to give several people plausible motives, to do some basic investigation to uncover clues, and to throw in a red herring or two. Once you consider how much needs to be done, it becomes clear that there just wasn’t much time for characterization.

The characters in Murder, She Wrote were not, in the main, clichés because the writers had no creativity. I don’t want to oversell this; the writers were TV writers. They churned product out on a tight schedule and were not, for the most part, brilliant people. However, they did tend to flavor their characters in creative ways, if sparingly. What they were really doing was using clichéd characters in order to save time. We’ve all seen these characters a hundred times before, generally more developed in those other places. Even where they weren’t more developed, a hundred variants on the same idea flesh it out in our imagination. The cliché is shorthand.

Had Murder, She Wrote used really original characters in each episode, the episodes would have had to be two hours long. That is, they would have been movies.

This shorthand is also part of what makes Murder, She Wrote a kind of comfort food. We’re familiar with all of these characters; we’ve spent a long time with them and now we’re seeing them again—sometimes even played by the same actors we’re used to watching play them in the old days. This isn’t really a coincidence; it’s part of the show’s theme that old things are still good.

Prime Day Cometh

I recently got a notification that Amazon’s “holiday,” Prime Day, is coming this summer. This amuses me on several levels.

The first is that Prime Day is actually two days long. So far as I know, this isn’t a sundown-to-sundown thing, either. Amazon just has its own special definition of “day.”

Also, for those who haven’t heard of it, Prime Day is basically Amazon trying to make up its own Black Friday, which no one else celebrates. In theory it is a “day” on which there are all sorts of amazing deals, such as are usually found only on Black Friday.

At least based on previous Prime Days, there aren’t any amazing deals to be had, at least not on anything that anyone wants, or in comparison to what items normally sell for. There is a certain amount of removing all discounts in the days leading up to Prime Day so that re-applying the ordinary discounts can be claimed to be a sale, but that’s not exactly impressive. There are, of course, the occasional significant markdowns on overstocked items, but if one pays attention those always come up from time to time. If they’re still doing it, those are available year-round in Amazon’s “gold box”. I once got a very nice kitchen knife for $20 that way. (I think it may have been overstocked because the factory forgot to sharpen it; out of the box, it couldn’t cut a paper towel. Since one has to periodically sharpen kitchen knives anyway, this wasn’t exactly a big deal to me.) A “day” dedicated to buying things at basically the same prices but with more fanfare than normal isn’t very exciting.

Which actually brings me to Black Friday. There was a short time when Black Friday sales really were a thing: big-box stores offered exceptionally deep discounts on some items in order to lure people into the store, at which point they would start doing the rest of their Christmas shopping while they were there. As the saying goes, though, while you can fool all of the people some of the time, you can’t fool all of the people all of the time. Competition from other stores, combined with the awareness that it would be better to hold off on one’s other shopping until the crowds weren’t of quite such deadly sizes, rendered the practice of huge discounts on attractive items economically untenable. And whatever can’t go on forever, won’t.

Black Friday is already passing into the sands of time. Cyber Monday never really became a thing. I find it quite funny that Amazon is trying to invent a pretend holiday which Amazon can own based on holidays already on their way out.

Missed One

After writing my previous post about the taxonomy of atheists, I realized that I missed one: the atheist who hates human nature. This is probably more commonly known as the atheist who wants to be immoral, though the two amount to the same thing. The example which comes to mind quickest is Bertrand Russell.

I wrote about Bertrand Russell a bit in my post about his famous teapot argument. The short version is that after seeing how stupid his teapot argument was, and knowing that he was a well-educated man, I knew he had to have an ulterior motive, and so I knew that he was a bad man. And briefly looking up his biography turned up that he was a serial adulterer.

This sort of atheist is not limited to those who wish to contravene sexual morality, though that may be the most common form of it. People who wish to be something other than they are can also fall into this trap, since it amounts to hating God for giving them the nature that they have and not the nature that they want. People can hate God for giving them the wrong hair, or the wrong skin, or making them short instead of tall. It will, of course, be most common where people think that they can do something about it and God is standing in their way. That is why you see this more commonly with morality, since somebody who believes that he should be, by nature, a bigamist will tend to think that if he simply practices bigamy he will be what he wishes to be. And, indeed, if our nature were our own creation, he would be right. If God exists, however, essence precedes existence and he is what God made him, regardless of what he’s futilely trying to remake himself into.

This, by the way, is why the life history of people trying to make themselves into something that they’re not is always extremely depressing. I’m currently reading the biography of a guy who was certain in his heart that he was a rock star—and he absolutely hated reality for disagreeing with him. The harder he pursued his delusion, the angrier he got at everyone around him. This is really what Sartre was talking about when he said, “hell is other people.” If you take the existentialist position seriously (that existence precedes essence), other people will be hell because, being just as real as you are, they will inevitably prove that the essence you’re trying to give yourself is a lie.

Contingency

The argument for God’s existence from contingency and necessity is not very long, so there aren’t many ways to attack it. Mostly they consist of holding that reason doesn’t actually work. However, one of the more reasonable approaches is to question whether anything is contingent. The traditional approach looks at whether something exists at all points in time, but since time is so mysterious this can be questioned simply because anything regarding time can be, despite our direct perception of time. But then, we also directly perceive our free will, and yet determinists exist, too. Perhaps a way of helping such people is to look at space rather than time. If a thing does not exist at all points in space simultaneously, it is clearly not necessary, since there are places where it is not anything at all – the places where it isn’t. If a thing exists in one place and not another, its existence in the one place and not the other must be contingent on something other than itself. If that thing is not necessary, then it must be contingent on something else. There cannot be an infinite chain that doesn’t terminate in something necessary, or else there would be nothing at all, since no contingency would ever be fulfilled. That necessary thing all (reasonable) men call God.
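
For those who like to see the skeleton of an argument, here is one possible formalization of the spatial version: a rough sketch only, with the predicate names (At, Contingent, Grounds) being my own illustrative labels rather than standard notation.

    % P1: whatever fails to exist at some point in space is contingent
    \forall x \,[\, \exists p\, \neg\mathrm{At}(x,p) \rightarrow \mathrm{Contingent}(x) \,]
    % P2: every contingent thing is grounded in something other than itself
    \forall x \,[\, \mathrm{Contingent}(x) \rightarrow \exists y\, (y \neq x \wedge \mathrm{Grounds}(y,x)) \,]
    % P3: grounding chains cannot regress infinitely, since an unterminated
    %     chain would leave every contingency unfulfilled and nothing would exist
    % C:  since things do exist, some chain terminates in something necessary
    \therefore\; \exists n\, \neg\mathrm{Contingent}(n)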

A Taxonomy of Atheists

I’m planning to do a video soon in which I present a taxonomy of atheists. I wanted to present my rough draft for it here in case anyone has feedback to give to help me improve it:

  1. Philosophical Atheists — atheists who think about fundamental questions
    1.a. Spiritual atheists — Nietzsche, etc.
    1.b. Parasitic atheists — Modern Philosophers, the sort of people you find in academia where all they can do is deconstruct what others have done
  2. Simple atheists — engineer types who don’t think about big things at all, ever
  3. Cultural atheists — rabbits who just go with the herd and don’t think; the dominant culture is secular, so they are too; had it been religious they would be exactly that religious
  4. Teenage rebels — raised religious, either in an irrational tradition or in a rational tradition but not taught it; when they started having questions their parents had no answers
  5. Angry atheists — people with daddy issues, often a lack of a father in their life, though a weak father, a neglectful father, a drunkard, etc. can all produce this; they grew up too early and so think of themselves as the ultimate authority figure in their life, they need everything to make sense on their terms, etc.
  6. Cult atheists — atheists who have found some measure of community and purpose in Atheism™ brand disbelief in God, and their actions reflect their attachment to this community and the purpose they find in it.

An individual atheist can be in multiple taxons at once, though mostly that’s true of taxon 6: an atheist starts out in one of the other taxons and then moves into it.

In the video I will of course clarify that #6 doesn’t mean that there’s one big cult; rather, there is a loose network of cults, primarily online, and less intense than the sort where everyone moves onto one compound and practices drinking suicide Kool-Aid. There are similar relationships, though: a similar drive to find new members, a similar us-vs-them mentality, a similar view that their life isn’t lacking purpose and meaning.

Have I missed any taxons?

From the Archives

In February in the year of our Lord 2016, I wrote a post titled Reductio ad Absurdum Isn’t Straw. The first paragraph does a fairly good job of describing what the post is about, since for once I didn’t wander around too much:

Reductio ad Absurdum is a criticism of a position which shows that it is false by demonstrating that absurd conclusions follow from it. A Straw Man is a fake position that sounds like someone’s real position, constructed by an opponent because it’s easier to disprove than the person’s real position. (It is often the case that the straw man is accidentally constructed because the attacker has never understood his opponent’s real position.) These two are often confused for each other, which is a bit odd, and I think that a big part of the explanation is Kantian epistemology. (I wrote about Kant’s substitute for knowledge here, and this blog post won’t make much sense unless you read that first.)
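
In schematic form (the notation is mine, not the post’s): a reductio starts from the opponent’s actual position P and derives an absurdity from it, while a straw man substitutes a lookalike position P′ and refutes that instead.

    \text{reductio:}\quad (P \vdash \bot) \;\Longrightarrow\; \neg P
    \text{straw man:}\quad (P' \vdash \bot) \;\Longrightarrow\; \neg P' \quad \text{(which proves nothing about } P\text{)}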

I think it’s a useful explanation on its own, but it also serves as an example of how Kantian epistemology has practical effects. As they say, read the whole thing.

Bishop Barron’s Tribute to Bob Dylan

In honor of Bob Dylan’s eightieth birthday, Bishop Barron sang a verse from one of his favorite Dylan songs in tribute:

He does a really good job. This is one hell of a birthday tribute.

As a side note, Bob Dylan is a curious figure—one of the most popular singers of all time, but not exactly gifted with a great voice. To be fair, he’s famous as a singer on the strength of his writing rather than his singing; there are many of his songs I haven’t heard, but of the songs I know, the best versions are all covers. Heck, Bishop Barron sings the song better than Dylan did in the original recording that I looked up.

Just to pick an example at random, Jeff Healey did When the Night Comes Falling From the Sky sooo much better:

I think just about every cover of Blowin’ in the Wind was better than Bob Dylan’s. My favorite is Peter, Paul, and Mary’s:

That, by the way, doesn’t have the best audio—it clips sometimes. This version is a much cleaner recording:

I think you can make a decent case that even William Shatner’s version of Mr. Tambourine Man is better than the original, but I don’t think that there’s much argument that the Byrds’ version is the best:

When it comes to The Times They Are a-Changin’ I think that Dylan’s version is closer to some of the covers, but still Simon and Garfunkel did it way better:

To be fair, his voice works much better for Like a Rolling Stone:

If you compare it to the Rolling Stones’ version, I think it’s about a draw:

Anyway, here is the full version of the song which Bishop Barron sang a verse from (Every Grain of Sand):

The lyrics really are brilliant. Consider this verse, which comes shortly before what Bishop Barron sang:

I gaze into the doorway of temptation’s angry flame
And every time I pass that way I always hear my name
Then onward in my journey I come to understand
That every hair is numbered like every grain of sand

It’s quite profound. Or again, this verse that comes right before it:

Oh, the flowers of indulgence and the weeds of yesteryear
Like criminals, they have choked the breath of conscience and good cheer
The sun beat down upon the steps of time to light the way
To ease the pain of idleness and the memory of decay

Somehow these few words capture both the complexity and the simplicity of sin; and of the desperate need to escape it once you can see clearly enough to see it for what it is.

From the Archives

In January of the year of our Lord 2016, I wrote the post Kant’s Version of Knowledge. If you want to understand the modern world, it’s important to understand the substitute that Immanuel Kant came up with for knowledge, as it leads to many of the very strange things we observe today. The post begins:

For those who don’t know, there is a school of philosophy called, unfortunately enough given the passage of time, Modern Philosophy. It had several features, but the main one was that it denied that knowledge was really possible. It was rarely that explicit, and oddly enough started in the 1600s with René Descartes’ proof that knowledge is possible. It ended with Immanuel Kant’s work in the 1700s trying to come up with a workable substitute for knowledge.

I know that it’s my own post that I’m recommending, but nevertheless it’s worth reading the whole thing.

From the Archives

Back in 2016 I wrote a post called Analysis of Detective Fiction, which was both about how detective fiction tends to include analysis of itself and also an occasion for giving some of my own analysis of detective fiction.

To give a taste of the beginning:

Detective fiction is a curiously self-referential genre. Other genres may discuss themselves, for all I know, but this does seem to be a very common theme in detective stories. Sherlock Holmes talked about C. Auguste Dupin, Dorothy L. Sayers talked about the plot of one of the Father Brown stories in Busman’s Honeymoon, and both Agatha Christie and Sayers introduced successful female mystery writers as important characters into their stories.

As they say, read the whole thing.

Singin’ in the Rain

In order to explain the dad joke of singing, “swinging in the rain,” to my eight-year-old son, who was, the other day, swinging in the rain, I had to show him the most famous clip from the movie Singin’ in the Rain:

About a decade ago I took the time to actually watch the movie. It has an interesting premise—it’s about a bunch of actors right at the time of the switch from silent film to talkies. The leading man, played by Gene Kelly, has a pleasant voice and can make the switch, while the leading lady is beautiful but, unfortunately, has a voice for silent films.

There are also some funny parts where the studio that Gene Kelly’s character works for turns its latest silent film into a talkie by just recording the audio while they film. The dialog was written to be physically expressive, not to really make sense, and so it sounds stupid. (“I love you! I love you! I love you! I love you! I love you!”) There are also some amusing parts where, since no one has any experience with recording sound yet, the microphones are put in bad places and the actors don’t face them, so the mics don’t pick up their voices.

There’s also a really bizarre scene where someone is pitching a new kind of movie and then instead of hearing his description we see it—for about three minutes it’s a kaleidoscope of color and sound, mostly of dancing but with some singing, and then when it’s finally over about three minutes after it should have been, we get the joke, “I don’t know, I’d have to see it.”

At its core, it’s a love story. Gene Kelly’s character pursues a chorus girl whom he accidentally meets and falls in love with. Unusually for love stories, Kelly’s character is popular and well off, not struggling and unknown and yet to make his way in the world. He faces some struggles, it’s true, but it’s a very different dynamic. In fact, given that Kelly’s character uses his fame and influence to make his love interest’s career in films, it’s almost a Pygmalion story, though not nearly as much as, say, My Fair Lady.

It’s one of those movies where I really wonder if I’m going to show it to my children some day. On the one hand, it’s absolutely a classic. On the other hand, I’m not sure it’s worth the time, given how many other things there are to see. Especially when you can watch the best four and a half minutes of it on YouTube.

The Progress of Technology

One of the very curious things about reading literature from the early 1900s is that people were extraordinarily impressed with the rate of technological change. World War I notwithstanding, life was generally getting better in ways that would have seemed like magic to people’s grandparents. Walking and horses gave way to motor cars. Telephones allowed people to talk at a distance of hundreds of miles. Radio allowed people to get news from hundreds of miles away within seconds. Indoor plumbing and central heating produced comfort and convenience like never before.

As we come to the 1930s, cars were becoming much faster. Travel by aeroplane was becoming possible and even affordable for upper middle class people. Travel by sea was safe and comfortable thanks to large metal ships. In 1927 Charles Lindbergh crossed the Atlantic Ocean from New York to Paris, and who knew how much more would be possible soon?

Less often remarked upon but equally significant, astonishing advances in chemistry were underway. Synthetic rubber, invented in the 1920s, made pneumatic tires vastly more practical and affordable. Bakelite, the first synthetic plastic, was invented in 1907, and plastics took off rapidly from there. (It would be some time before they came to be used to make cheap junk; at first their amazing insulating and other properties enabled the creation of all sorts of things, including many kinds of electrical inventions.) Steels were getting stronger and more corrosion-resistant.

There were many other reasons why people thought about progress, and by no means did everyone think that the world really was getting better. All change comes with loss, even if it sometimes comes with greater gain, and there has never been a time when this wasn’t noticed. Still, the world was changing and even where it was changing for the worse, old ways of life would no longer apply. Women no longer wanted to be mothers; they wanted to get drunk and do drugs at wild parties and slave away at typewriters during the day—this may have been a turn for the worse, but teaching girls motherhood was no longer preparing them for the life they would live. Etc. etc. etc.

This theme of change lasted a long time. The flappers who rejected their parents in the 1920s were rejected by their own children in the 1950s, and they were rejected by their children in the 1970s. Society was overthrown and then overthrown and then overthrown again; people who worried about being in advance of their age were always behind the next one when it came.

I grew up in the 1980s, and I don’t recall the same level of expectation that I meet in literature from earlier in the century. It’s the 1990s, though, that I really remember well, since I was a teenager during them, and while there was an expectation of change, it was not at all the same sort of thing as, say, during the 1920s. I think that part of it was that the changes were more of quality than of type.

There really wasn’t anything like the bicycle before it was invented. The horseless carriage had a predecessor, of course, but it was an enormous change from it. There was nothing like an aeroplane before they were invented. Radio and telephone were remarkably unlike shouting very loudly.

By the 1990s, the only thing that was really utterly unlike what came before it was the computer, but there were primitive computers around as far back as I can remember. (If I could remember all the way back to when I was three or four there weren’t, at least in the home.) So while home computers were new in my lifetime, they were new so early on that they were barely noticeable.

There was also the internet, of course, but that was just an extension of how computers could call each other up on the phone—and people could already do that. E-mail was great, but it was just, well, electronic mail. We already had mail. This was just better mail. DVDs were just better VHS tapes. Internet video was just better television.

(Also, it should be noted, not everything even stayed the same. We put a man on the moon in 1969 and then (let’s ignore the later Apollo missions) never again. By the 1980s, we couldn’t have if we had wanted to. Then the space shuttle started blowing up…)

Fast forward to 2021 and I don’t feel like life has changed all that much since I was a teenager. I will grant that, objectively, very little is exactly the same. People have smartphones and yell at each other on social media; all sorts of businesses are possible because of the internet; Craigslist has killed off newspapers and YouTube is killing off television. (That last one is kind of hard to separate from Hollywood writers having become emancipated from decency, with the result that almost everything they write is shiny garbage.)

And yet, I don’t get the sense when talking to people my age and younger (or even a little older) that they feel like they’re living in an exciting era of wonderful new things with even more amazement to come. In fact, the idea that change is bad is, if anything, more pervasive now than it was when I was a kid. Organic food, which to most people’s minds means food grown in older, more traditional ways, is phenomenally popular. Skepticism about vaccines, antibiotics, and most parts of modern medicine seems to be on the rise.

Even apart from that, with change having been constant our whole lives, change is normal. People who grew up in the last fifty years or so never expected the way we do things today to be the way we’ll do them ten years from now, so when it isn’t—it’s not amazing, it’s just work to get used to the new way of doing the same basic things. Now instead of emailing our friends, we DM them on Discord or Telegram or Signal or, heaven help us, on Facebook (where we probably won’t have them as friends for long). To quote annoying teenagers from several years ago: amazeballs. (Yes, that was actual slang in the late 2010s.)

I think that the era of technological excitement is well and truly over. Technological advancement continues, but it no longer produces a world that we don’t recognize, or discontinuities between the generations. No one suggests that because teenagers have cell phones, adults have no right to tell them that they should be honest or that fornication is wrong. We have, of course, the results of decades of people claiming that fornication is, in fact, good, but at least the thing is defended on its own merits (“I wanna!”) rather than on the absurdly irrelevant grounds that the aeroplane now exists and who knows whether tomorrow people will travel into space. It’s no longer a whole new world. It’s the same old world, except that instead of having to be patient with the village idiot from your village, you have to be patient with the village idiot from every village on the planet, because they all have Facebook accounts. But at least we may start to have parents with the courage to tell their children that twice two is four, or at least, if they fail to, it will probably be on the grounds that maybe one of the twos identifies as a three, rather than because radio now has pictures and who knows, perhaps in the future it will be in 3D.