Punk Rock Can Be Quite Tame

Over at Amatopia, Alex has a post about Geek Conformity. The main subject is a former drummer of the band Nirvana who left to join the army. When asked why he left, he explained that punk rock was too stifling. About this, Alex says:

But his comments–“strict rules for practicing the right kind of individualism”–perfectly encapsulate the contempt I feel for a lot of so-called scenes involving “free thinkers.”

And this is something I find interesting. I grew up with music from my parents’ generation, not my own. The Beatles, Simon & Garfunkel, Peter, Paul, & Mary—that’s what I listened to while I was growing up in the 1980s. The closest I came to anything contemporary was borrowing my father’s Billy Joel’s Greatest Hits CD. Punk Rock was something very foreign to me that moreover seemed dangerous.

Then as an adult, I actually listened to some. (Note: I'm not enough of an expert to tell the various genres of people in death-inspired makeup apart. If this is Emo or some other genre that I don't know the difference between and regarded as equally dangerous in my youth, please forgive me.) And I found that it wasn't actually very different.

Granted, there’s what for simplicity I’ll call “shout rock”, which tends to be very fast and might as well not have lyrics because one can’t tell what any of them are, but amongst the supposedly dangerous stuff where the singers actually sing, well, it turns out to be a bit underwhelming in its satanic majesty.

Readers of this blog might have seen me mention that song before, because it was the opening theme to Friday Night Live's performances. Anyway, this is hardly scary. In fact, it sounds somewhat reminiscent of The Beatles. Compare it to A Hard Day's Night:

I'm not saying that you're going to mistake the one song for the other, but if it turned out that Friday I'm In Love was inspired by A Hard Day's Night—or even was intended as a tribute to it—it would hardly be surprising.

There’s another song by The Cure which I discovered I like:

It's a nice song. Granted, the lead singer is gotten up to look a bit like Pinhead from Hellraiser, but it's basically a breathy love song. The music itself sounds a bit unfamiliar because of its heavy use of synthesizer, but as this acoustic cover shows, you can easily substitute a wind instrument and it sounds similar:

At the end of the day, it looks strange, but it isn’t actually that strange. And that makes sense when one remembers the context: punk music is a performance. It was also a product, but that just means that it was a performance intended for a large audience.

Now, the thing about performances is that in order to be successful they must be intelligible to the audience. I know that it’s popular for artists to say that they make their art for themselves but that’s (about 99%) nonsense. If they did that, they wouldn’t perform it. And the thing about audiences is that they don’t vary that much in what they find intelligible. They do vary—hence the existence of shout rock and jazzy noise (I don’t know the technical term for the genre of jazz in which the musicians, all heavily under the influence of mind-altering drugs, play random notes for interminable lengths of time with traditional jazz instruments)—but the further outside of common experience one goes, the vastly more one limits the potential size of one’s audience.

Anyone who wants to make a living off of their performances requires either an impressively rich audience or a fairly large one, and the former is much harder to come by than the latter. The consequence is that the performer—if he wants to be different—must cleverly disguise what the audience is familiar with as something that it isn't. Those who wish to be popular may only ever use novelty as a spice, never as a main course.

All the same, it's tempting to think that Rob Paravonian is right: "Punk Music's such a joke, it's really just baroque."

The Best Defense Can Be Recognizing What’s Not A Threat

A few days ago, I was stung by a yellow jacket. (If you’re not from the north-eastern United States, they’re a type of wasp.) I was mowing the lawn in a section where I had let the grass grow a little too long and so didn’t notice their burrow. They felt threatened by my presence and so one stung me.

It was quite painful, and has been for a few days since, but ultimately, that’s not too important. I can simply avoid the nest. But my children play in my back yard and young children can’t be expected to remember where the yellow jacket nest is. So, shortly, the yellow jackets are going to die.

There’s an interesting lesson here—neither I nor my children would normally bother with the nest at all, except that they consider us a threat. By playing it safe—in the sense of not taking chances on what’s a danger and what isn’t—they’re actually playing it dangerous. Because when my children’s safety is on the line, I’ve got some very powerful tools to ensure that not a single one of the things survives to sting my children. And unlike them, I can make sure that they’re completely gone.

There's a lesson in there, when it comes to dealing with potential threats. In a sense it's related to Ed Latimore's dictum:

Never make enemies for free.

But the general rule is fairly simple. If you’re not sure that somebody is an enemy, be sure that you can beat them before you guarantee that they’re an enemy.

Actually, there’s an additional aspect to that rule. Before you make somebody an enemy make sure that you can beat them and their friends. In this case, the yellow jackets don’t stand a chance against my friends Amazon and the Sawyer company, who will provide me with this rather potent insect poison to pour into their nest:


(There are a lot of alternatives; I'm actually trying permethrin out for the first time. Something I've done before which is cheap and effective is to pour 91% isopropyl rubbing alcohol into the nest. Note: when doing this, keep your eyes open and be ready to run quickly when they notice who is disturbing them. If you're dealing with hornets—which I've heard will pursue people long distances—dress appropriately and have an escape route indoors that you can get to quickly. In that case, though, you should probably invest in a more special-purpose wasp killer that can be deployed from 20+ feet away. And as always, since this is the internet: don't do anything yourself and instead hire a professional to do it for you. Including hiring the professional. Hire a professional hirer to hire the professional.)

It’s Not Easy Watching Movies Any More

If one pays attention to how movies are made, it’s hard not to notice that they’d probably be more virtuous enterprises if they were written, directed, performed, etc. by prison inmates. I’m not even really referring to what Rob Kroese described in this tweet:

(If it ever gets deleted, he said, “Nice to see celebrities taking time off from raping each other to condemn prayer”.)

That doesn’t help, of course, but ultimately that concerns the personal virtue of the people involved, which is between them and God. As Chesterton said in a different context, “for [their] god or dream or devil they will answer not to me”.

What really bothers me is the degree to which Hollywood wants to sell evil as good. It's not a single-minded occupation; they also want to make money, to be praised, and to fulfill several other self-interests. But the problem is that there's a certain amount of trust involved in listening to someone's story, and people who actively mean you harm are hard to trust.

Part of how I work around this is that I rarely watch new movies and—when I watch movies—tend to re-watch movies I already know are good. But even that is getting harder. Part of it is becoming aware of how long Hollywood has desired to destroy the concepts of decency and goodness in those who watch their movies. Merely watching old movies isn’t safe. Part of it is that it’s easy to become paranoid; to borrow an analogy, when 23 out of 24 M&Ms in the bowl are poisoned, one begins to wonder about one’s skill at picking out the good ones.

The thing is: fear is not a good master. Paranoia is rational, in a certain sort of very limited sense, but it's not healthy. I suspect that the right way forward is to emphasize how not everyone in Hollywood is actively attempting to be evil, and good can slip through the cracks. The devil is not well organized. He can't be, since evil is the privation of good and order is good. Or, as Saint Paul said, "where sin abounds, grace abounds much more."

There Was a TV Show Called Grand Jury

I was recently watching the Mystery Science Theater 3000 episode featuring the movie The Sinister Urge, and then I watched Plan 9 From Outer Space with the Mike Nelson commentary. Out of curiosity, I looked up the cast, because one of the major villains in Sinister Urge was a good guy in Plan 9. His name was Carl Anthony. Here's a picture of him:


He's the one in the police uniform. Here's another picture of him:

His IMDB page only lists four credits:

  • Plan 9 From Outer Space (1959)
  • The Sinister Urge (1960)
  • Grand Jury: Boxing Scandal (1960)
  • Raw Force (1982)

Two things caught my attention. The first was that after some initial success, it was 22 years before he got another part. It’s not much of a movie, but it is a real movie. The IMDB synopsis for Raw Force is:

A group of martial arts students are en route to an island that supposedly is home to the ghosts of martial artists who have lost their honor. A Hitler lookalike and his gang are running a female slavery operation on the island as well. Soon, the two groups meet and all sorts of crazy things happen which include cannibal monks, piranhas, zombies, and more!

The second thing I noticed was that there was a TV show called Grand Jury. Carl Anthony was in the episode called Boxing Scandal. The Wikipedia page has next to no information besides an episode list. Yet there are writers for each episode, and it has actors, so presumably it was some sort of crime drama or detective show about a grand jury. I'm very curious what their premise was since grand juries don't have continuity. They're assembled for a few weeks, hear a bunch of cases, determine whether there is sufficient evidence to proceed to trial, then disband. Their goal is not to determine guilt or innocence, which would make it a very odd fit for a detective series.

Then it occurred to me to check to see whether there are any episodes on YouTube. Luck! There is:

It’s curious, to be sure. After a bit more research I discovered that Grand Juries actually do have investigative functions, such as the ability to compel witnesses to testify. The basic format is that some people committed a crime, then the grand jury investigates and tries to catch them. It’s actually a bit reminiscent of Columbo, in the sense of being a how-catch-em rather than a who-dunnit. That said, at least the episode Fire Trap is really more about the tragedy of how crooks fall apart than about investigations by the grand jury.

The acting was decent. The writing was… adequate. Really, it was kind of minimal. It wasn’t subtle, to be sure. To some degree it actually felt like a morality play—somebody made the choice to do a wicked deed and his life fell apart because of it. Of course, I have no way of knowing whether this was a typical episode. Still, I can see why this was not one of the great TV shows which is still remembered. That might be why Carl Anthony got a part in it (this was after Plan 9 but before The Sinister Urge).

This ties in to my earlier post, Most of Life is Unknown. People worked on this show and—presumably—people watched it. Most of them are probably dead by now, and the show is barely known. The Wikipedia page which gives its list of episodes is presumably copied from somewhere since most of the entries are TBA. Even the things which we think should last disappear quite quickly from human memory. Fame promises so much and delivers so very little.

It’s a good thing that God is in charge of the world, or we’d be really screwed.

Of All Things To Predict, The Future is the Hardest

Yesterday, I wrote about The Future of Cars According to (Disney in) 1958. This reminded me of a conversation I had, about two decades ago, when I was a freshman in college. One of my majors was computer science (at the time I was triple-majoring in Math, Computer Science, and Philosophy) and I was speaking with a senior who was also a computer science major. We were discussing Windows versus Linux. I can't recall exactly how it came up, but he said, very confidently, that in 10 years neither Linux nor Windows would exist any more, both having been replaced by something completely new.

Twenty years later, both Windows and Linux are going strong, though the writing is on the wall for Windows and it is gaining ever-more Linux-like features. (Including, recently, a way to easily install Linux within a virtual machine.) Which brings me to the title of this post—the future is notoriously difficult to predict. And I'd like to consider a few of the things which people tend to get wrong.


For some reason, people tend to think of current companies, products, etc. as either completely changing or completely static. Which really amounts to the same thing—the idea that whatever comes next will be completely new. And yet, this is rarely the case. Not never, of course, but rarely.

What prognosticators tend to leave out, here, is the market forces which operate on technology. Most people don't like completely new interfaces; that means a lot of learning for marginal benefit. Far preferable are modifications to existing tools, so that the amount of learning required is in proportion to the benefit received. An extra button or two is a small price to pay for new capabilities; having to learn all-new muscle memories for mostly old capabilities and some new ones is a price which is usually too large.

In many cases there is also the issue of interoperability. Computers benefit tremendously from backwards compatibility, but so do tools. A drill which cannot take normal drill bits renders one's existing collection of drill bits worthless. Or worse, drill bits which can't be chucked into one's drill collection render one's drill collection worthless—and one generally has more money in the drills than in the bits.

These are powerful forces which tend to make new technology look and feel like old technology wherever possible.

And of course, people don’t like going out of business. Those who already have tools are motivated to improve them in order to stave off replacements—and this often works.

Technology Changes, Human Beings Don’t

Another place where prognosticators often fall down is forgetting that the human beings who are served by technology don't change. We still have two eyes, two hands, are social, need to eat and sleep, etc.

So for example people look at the steering wheel, see that it's been around for a long time, and think "that's due for a change!" What they don't consider is the problem at hand: we need a control interface to a car which translates fairly large movements into fairly small movements (for precision) while still allowing large movements, but where large movements are impossible to do by accident. This interface needs to control one dimension (left-to-right), should be usable for long periods of time without fatigue, and should be operable without looking at it. The steering wheel does all these things and does them well. Joysticks, knobs, and other such video-game replacements do at least one of these things poorly. It is of course possible that other sorts of interface will be made that can do the job just as well, but how much better is it possible to do the job, given the limitations of human physiology?

There's a related example in 3D movies. Some people take 3D as the next obvious step, since silent movies to talkies went well and black-and-white to color went well and low-definition to high-definition went well. But what they miss is that all of these things made movies more realistic. Stereoscopic 3D information gives us distance information—and for typical movies these distances are almost always wrong for our viewing. Granted, they would probably be about right for The Secret of NIMH (which is about mice) on a 50″ TV about 10′ away—if the director would refrain from close-up shots. If you watch a movie, you will realize that there is almost no configuration of movie shots and TV size/location that will make the distance information to the characters correct for the viewer. Worse, if it were, instead of an immersive world of photographs, you'd have a window you're looking through onto a stage play. The Nintendo 3DS had this problem, with its tiny screen—looking at it one had the feeling of watching a tiny animated diorama. It was novel, but not really interesting. The problem here is that the human being is simply not being taken into account.

Which is why, incidentally, the filming of 3D movies using two cameras at approximately eye-distance apart has all but been abandoned. Instead, 3D is (essentially) painted into a scene in post-processing by artists, who artistically fake the 3D to make it look good when viewed. I’m told that this can really help with scenes that are very dark, making it possible to distinguish dark grey blurs from a black background. This is adapting the technology to the realities of the human being, which is why it’s actually successful. But it’s also quite expensive to do, which greatly limits the appeal to viewers since it translates into much higher ticket prices.

Better Implementation Often Trumps A Better Idea

This is one of the more under-appreciated aspects of technological change. A mediocre idea implemented well is often superior to a good idea implemented in a mediocre way. I’m not just referring to actual fabrication; ideas for technology are themselves developed. Mediocre ideas which are developed well can produce better results than good ideas which are not much developed.

Perhaps the best example of this I can think of is the x86 processor. It has what is called a Complex Instruction Set. This is in contrast to the ARM processor which has a Reduced Instruction Set. (These give rise to the acronyms CISC and RISC.) The x86 processor was developed during the days when CISC was popular because computer programs were largely hand-coded in machine instructions. The complex instruction sets saved the programmers a lot of time. The trade-off was that it took far more silicon to implement all of the instructions, and this resulted in slower CPUs. After the advent of higher level languages and compilers, the idea of RISC was born—since a program was generating the machine instructions anyway, who cares how hard a time it had? The trade-off was that the compiler might have to do more work, but the resulting CPU used far less silicon and could be much faster (or cheaper, or both). In the modern environment of all programs being compiled, RISC is clearly the better approach to computing. And yet.

Because Intel CPUs were popular, Intel had a ton of money to throw at the engineering of their chips. AMD, who also manufactured x86 processors, did too. So RISC processors were only a little faster and actually significantly more expensive. And then something interesting happened—Intel and AMD figured out how to make processors such that they didn't need nearly so much silicon to implement the complex instruction set they had. In effect what they did was to introduce a translation layer that would translate the complex instruction set into a series of "microcode" instructions—basically, internal RISC instructions. So now, with x86 processors being internally RISC machines, the penalty paid for supporting a CISC instruction set is an inexpensive translation layer in hardware—a penalty which generally isn't paid unless the complex instructions are actually used, too. And compilers generally don't bother with most of the complex instructions.
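As a purely illustrative sketch of what such a translation layer does (the instruction names and micro-op format here are invented for illustration, not real x86 or microcode), one can picture a decoder that expands a complex instruction into simpler internal operations:

```python
def decode(instruction):
    """Expand one complex, CISC-style instruction into simpler micro-ops."""
    op, *args = instruction
    if op == "ADD_MEM":  # hypothetical complex op: add a memory operand into a register
        dst, addr = args
        return [
            ("LOAD", "tmp", addr),     # micro-op 1: fetch the operand from memory
            ("ADD", dst, dst, "tmp"),  # micro-op 2: plain register-to-register add
        ]
    return [instruction]  # simple instructions pass through essentially unchanged

print(decode(("ADD_MEM", "r1", 0x40)))
```

The point is just that the outside of the chip can keep speaking the old, complex language while the inside executes only simple operations.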

The result is that ARM processors have a small advantage in not having to do translation of opcodes to an internal microcode, but this usually takes only 1 step out of the 12 execution steps in an x86 pipeline anyway. And Amdahl's law should always be borne in mind—the maximum improvement an optimization can produce is bounded by the share of total execution time taken by the part being optimized. That is, eliminating 1 step out of 12 can produce at most a 1/12th improvement.
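Amdahl's law can be put as a one-line formula: if a fraction p of the execution time is sped up by a factor s, the overall speedup is 1 / ((1 − p) + p/s). Here is the 1-step-in-12 case worked out:

```python
def amdahl_speedup(p, s):
    """Overall speedup when a fraction p of the work is sped up by factor s."""
    return 1.0 / ((1.0 - p) + p / s)

# One pipeline step out of twelve, eliminated entirely (s -> infinity):
print(amdahl_speedup(1.0 / 12.0, float("inf")))  # ~1.09, i.e. at most ~9% faster overall
```

Even removing the decode stage completely buys less than a tenth in total performance, which is why it isn't a decisive advantage.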

But then modern Intel and AMD CPUs also do something interesting—they cache the results of decoding instructions into microcode. So this decode penalty is only paid for code that isn't executed very often anyway. The most performance-critical code—tight loops—doesn't pay the decode penalty. As I said, a better implementation often trumps a better idea.

(People familiar with x86 versus ARM may point to the difference in performance on mobile phones, but this is really about what the respective CPUs were designed for. x86 processors are designed for achieving the maximum efficiency possible at high power draws, while ARM CPUs are designed to achieve the maximum efficiency at low power draws. ARM CPUs are therefore better in very low power draw environments, but the tradeoff they make for that low-power efficiency is that they simply cannot be scaled up to high power draw environments. There is no power configuration possible in which an ARM processor will come close to the single-threaded performance of an x86 processor. This has nothing to do with the technical superiority of either x86 or ARM processors, but with the fact that they're designed for extremely different workloads and environments. The thing which Intel and AMD need to be careful about is that the amount of processing power which they can provide at 15W, 35W, or even 60W may simply be far more than is necessary in a laptop (since the tasks which laptops are asked to do are becoming more standardized as the development of software slows down), at which point ARM might eat their lunch not because of technical superiority but because its capabilities have become a better match for a changing problem-space.)

The Future of Cars According to (Disney in) 1958

In the 1950s, Walt Disney started a television show in order to fund the theme park he wished to build called Disneyland. In 1958, this television show featured a segment called The Magic Highway, which discussed the future of cars:

I am very fond of historical projections of the future, and this is an especially well-done one. Though I should mention that one bit of the future it predicted—larger, straighter highways designed for faster travel—was not much of a prediction as the US interstate highway system was already underway (it began in 1956).

There are some things which are remarkably common to predictions of the future which can be seen in this movie too. One is the way that user interfaces are either predicted to change when they won't or remain static when they will change. In this case the computer which drives the car is given its destination via punch card. (In another scene data is entered via toggle switches.) On the other hand, the steering wheel was replaced by joysticks, which are actually a terrible interface for driving a car. By contrast, steering wheels are actually a very good interface for driving a car, so there's no real reason to replace them.

There is also the assumption that energy will be free and consequently used in the most lavish of fashions (like heating highways to dry them off from the rain). I think that was related to the expectation of the coming nuclear age and how it would provide almost unlimited power. That was still highly optimistic, since at a minimum one needs to maintain an electrical distribution grid, and wires have to be sized to the electricity they're carrying, which means copper or aluminum fabrication and distribution.

But details like distribution aside, the predictions that nuclear power would result in free power never panned out for the simple reason that the nuclear part of a nuclear power plant is actually a small fraction of the work which goes on in a nuclear power plant. At their heart, nuclear power plants are giant hot-water heaters. The electricity is produced in turbines which are turned by steam. The only significant difference between a nuclear power plant and a coal power plant (or oil, natural gas, etc.) is what heats the water.

Anyway, check it out. It’s very interesting to see the ways in which futurians are often wrong and sometimes right.

The Three Endings of Clue

If you're not familiar with the movie Clue, it's based on the board game of the same name (known as Cluedo in Britain). It's not the greatest movie ever made, but it's a lot of fun. I own it both on DVD and Blu-ray and recommend it if you like murder mysteries and fun.

A curious feature of the movie is that it was filmed with three different endings. Movie theaters would be given one of the three endings, so viewers would see a different ending depending on which movie theater they saw it in. However, this posed something of a problem for the VHS version of the movie.

Technically, of course, they could have made three versions of the VHS cassette, but it probably would have been prohibitively expensive. Thousands of people see the copy of a movie sent to a theater, whereas each VHS cassette would have cost $15 or $20 at the time. The labor involved in shuffling the cassettes couldn't have been worth it.

Whatever the reason, three different copies of the movie was not the approach taken by the makers of the VHS. And since VHS was a linear medium, it wasn't possible to shuffle the endings. So two of the endings were presented as possible endings, while the third was presented as "what really happened". A nice touch is that this information is presented in the sort of text cards that one might see in a silent film; it fits nicely with the setting in the 1950s (despite that being long after the era of silent films, curiously) and with the fact that it's a movie based on a board game. (Which would be ludicrous except that the movie doesn't take itself seriously, though not in the modern wink-at-the-audience way which we all know and hate.)

About ten years after the release of Clue (the movie), DVDs hit the market. Unlike VHS cassettes, DVDs did not have to be linear and it became possible to play one of the theatrical endings at random. And the first DVD of Clue that I bought indeed had that option, though thankfully it also had (and defaulted to) the option to play the VHS ending, i.e. the three endings together with the title cards identifying the first two endings as possibilities. I’m thankful because I far prefer that ending; it’s in keeping with the fun and tone of the rest of the movie.

But this raises the interesting question: if VHS hadn't had the technical limitation of being purely sequential, would the three-in-one ending ever have been made? There's no way to know, of course, but it points to a larger issue of providence and limitations. Limitations often force people to be creative in ways they would not have been without them. Perhaps the best example of this I can think of is Star Wars: episodes IV-VI were made under tight limits, and when one compares them to episodes I-III, made when George Lucas, now very rich, had (effectively) no limits, it's clear that he did much worse work without them.

The issue of providence is probably fairly obvious, but we as finite creatures don’t see the big picture; we chafe at our limitations. But our limitations often guide us to the work which we’re supposed to be doing. The things which frustrate us are often safety rails. Not so much that they protect our health—though they occasionally do that—but they protect the good work which we’ve been given to do. This is all the more helpful because we so rarely recognize that work until long after it was (all but) forced on us. If we even recognize it then. Something to remember is that God loves Beetles.


Man Was Always Small

In Orthodoxy, Chesterton has a great line:

It is quite futile to argue that man is small compared to the cosmos; for man was always small compared to the nearest tree.

To give context:

Herbert Spencer would have been greatly annoyed if any one had called him an imperialist, and therefore it is highly regrettable that nobody did. But he was an imperialist of the lowest type. He popularized this contemptible notion that the size of the solar system ought to over-awe the spiritual dogma of man.

It's something I've seen various versions of from contemporary atheists. The most common is that the universe is so large that even if God existed he couldn't possibly care what human beings do. Other versions tend to run along the lines that the universe is so big our petty concerns can't matter.

Chesterton’s point is a very good one—that man never took his importance from his size relative to the world. It reminds me greatly of a quote from C.S. Lewis in his book The Problem of Pain:

It would be an error to reply that our ancestors were ignorant and therefore held pleasing illusions about nature which the progress of science has since dispelled… even from the beginnings, men must have got the same sense of hostile immensity from a more obvious source. To prehistoric man the neighboring forest must have been infinite enough, and the utterly alien and hostile which we have to fetch from the thought of cosmic rays and cooling suns, came snuffing and howling nightly to his very doors… It is mere nonsense to put pain among the discoveries of science. Lay down this book and reflect for five minutes on the fact that all the great religions were first preached, and long practiced, in a world without chloroform.

Of course, few atheists will put the idea this baldly—atheists rarely state any of their ideas without dressing them up a bit, or simply failing to consider how they relate to their other ideas, in my experience—but that one meets this sort of thing at all is very curious. One of the problems which some atheists seem to have in relating to older ideas is that they can’t relate to older peoples.

To the degree that there are solutions to this, I suspect that they are largely going to be narrative. The modern narrative is one of being utterly cut off from our ancestors. And yet we’re also seeing counter-narratives emerging; the revival of older traditions, the preference for older ways of doing things. It’s not material whether the putatively older ways of doing things are in fact accurate to how they used to be done, what matters to this purpose is whether they are believed to be in continuity. When they are—when people believe themselves to be in continuity with their ancestors—that’s when they’ll stop seeing their ancestors as aliens and see them as human, instead.

Most of Life is Unknown

We live so awash in stories and news that we get a very skewed perspective on how much of real life is known to more than a few people and God. What got me thinking about this was watching the following song:

It was the theme song used by a sketch comedy group at the university I went to for my undergraduate degree. They were called Friday Night Live (as an homage to the TV show of similar name) and would put on a show about three times a semester. They had apparently been hugely popular when they started, which was at least a few years before I attended. I was in the rival sketch comedy group, Pirate Theater. At the beginning of my freshman year the attendance at our shows was lower than that of Friday Night Live, but by the end of my senior year the positions had reversed. Our audiences were 3-4 times larger and theirs were smaller even than our audiences had been in my freshman year.

A few years ago I ran into someone who was currently a student at the university and I asked about the shows. He said that Pirate Theater was still reasonably popular but Friday Night Live no longer existed. In fact, he had never even heard of it.

There had been a reasonably friendly rivalry between the two shows, and it turned out that our efforts brought success, in a sense. I don’t think that any of us pirates wanted to kill off Friday Night Live, and ultimately I suspect that it was the quality of their writing which did them in. For whatever reason (I never wrote sketches for FNL, so I couldn’t say what it was like to do so), Pirate Theater managed to attract far more of the skilled writers on campus. It also didn’t help that FNL insisted on having an intermission and hiring a local band to play in it. I think in all the time I was there—and I didn’t miss an FNL show all four years—they had one band that I didn’t leave the auditorium to get away from. Really, really cheap bands tend to be so inexpensive for a reason.

As you might imagine, I was never alone outside the doors when the band was playing. And they were always ear-hurtingly loud, too. Adding injury to insult, I suppose.

Towards the end of my senior year, there were probably fewer than a hundred people attending the Friday Night Live shows. The total number of people who saw their trajectory over those four years I watched it was not very large; the number who remember it now is probably much smaller. And yet the actors did work on their sketches, however little humor was in them. The “Tourette’s family”—where the “joke” was that there was a lot of yelling and cussing—may not have been entertaining to watch, but it’s no easier to memorize unfunny lines, or to say them at the right time when you’re live on stage. (There was approximately one of those per show, for all four years, by the way.)

One of the actors was limber and would do physical comedy with a folding chair, getting stuck in it. It wasn’t brilliant physical comedy, and (due to the lack of skilled writing) never really fit into the sketches it was in; one could see it coming a mile away as the sketches it was used in were basically an excuse to do the wacky chair antics. But someone did write the sketch, and people memorized it, and the actor did twist himself through a chair on stage which is not an easy thing to do. And I do have to say that the final time he did it—which was the last FNL show I attended since he graduated the same year I did—was actually kind of funny because it was actually a protracted goodbye dance with the chair, complete with sad music and longing glances.

This was all very real; people put in real work, went through real happiness and sadness, and now it is mostly forgotten. That is ultimately the fate of (almost) all human endeavors. This was captured quite well, I think, by Percy Bysshe Shelley in his poem Ozymandias:

I met a traveller from an antique land,
Who said—“Two vast and trunkless legs of stone
Stand in the desert. . . . Near them, on the sand,
Half sunk a shattered visage lies, whose frown,
And wrinkled lip, and sneer of cold command,
Tell that its sculptor well those passions read
Which yet survive, stamped on these lifeless things,
The hand that mocked them, and the heart that fed;
And on the pedestal, these words appear:
My name is Ozymandias, King of Kings;
Look on my Works, ye Mighty, and despair!
Nothing beside remains. Round the decay
Of that colossal Wreck, boundless and bare
The lone and level sands stretch far away.

Most things fade from memory far faster, of course. How many people now remember more than a few fragments of Friday Night Live’s sketches? I don’t think I remember more than a few fragments of the Pirate Theater sketches which I wrote, let alone those I merely performed in. I do have a DVD with a collection of the video sketches we did (many of which I’m in), but it’s been quite some time since I watched it.

It may well be longer until I watch it again. In the intervening decade and a half, I’ve gotten a profession, married, bought a house, had three children, published three novels, made a YouTube channel with over 1,973 subscribers, and a whole lot more. As much as I value my memories of my college days, I don’t want to go back in the way that video takes one back.

But the thing is, that’s not a strength. I just don’t have the time and energy for it. But all the things I’ve done since which so occupy me now will also fade in time. Eventually I will die; eventually this house will fall down or be demolished; eventually my children will die. Nothing has any permanence within time.

So the only hope we have is for permanence outside of time. There’s a great metaphor, which Saint Augustine uses in his Confessions, of God, at the end of time, gathering up the shattered moments of our lives and putting them together as a unified whole. And that’s really the only hope we have for any of our lives to be real.

The Death of Rock and Roll With Zarathustra’s Serpent

You may recall my blog post The Death of Rock-n-Roll. After writing it, I invited Zarathustra’s Serpent to talk with me about the subject because he’s studied popular music quite extensively. This is the conversation we had. You can also watch the video on YouTube:

When I Feel Sorriest For Atheists

Of all the things which rightly make an atheist an object of pity, the one for which I feel sorriest for him is when he realizes that all the pleasure, satisfaction, and joy he experiences is (according to him) nothing more than some chemicals in his brain. This is for two main reasons:

First, because he then accords Joy no significance. When this happens one can almost hear the sound of the cell door slamming shut on the mental prison in which he is trapped. It is a prison with no windows and no sunlight can enter it.

Second, because he will soon notice that there is, therefore, no distinction in kind between real happiness and what is produced with recreational drugs. And recreational drugs—the hard-core ones, I mean—are basically a form of slow suicide. (Not because their side-effects cause death, but because their main effect is basically a temporary suspension of living in a haze of mere feeling.)

There are many things for which to pity this atheist, but this one has always affected me the most. Once the door of this mental prison has been shut, I do not know of any natural force which can open it. I doubt that there is anything to do for a person in such a case but pray for them.

You Can’t Get an Ought From an Is In Hell

One of the questions which comes up in discussions of morality is whether you can get an “ought” from an “is”. This is relevant primarily to discussions of atheism, since to the atheist everything is a brute fact, i.e. an “is” which is not directed towards anything, and therefore an atheist cannot get any “oughts” out of their description of what is. Or in simpler language, if God is dead then all things are permitted. (Note for the unpoetic: by “God is dead” we mean “there is no God”.)

There are two reasons why if God is dead all things are permitted:

  1. If God is dead, who is there to forbid anything?
  2. If God is dead, then there is no ultimate good because all is change and therefore nothing has any lasting reality.

If you argue this sort of stuff with atheists long enough, somewhere along the line while you’re explaining natural ends (telos) and natural morality, you may come by accident to a very interesting point which the atheist will bring up without realizing it. It often goes something like this:

OK, suppose that what God says is actually the only way to be eternally happy. Why should you be eternally happy? Why shouldn’t you do what you want even though it makes you unhappy?

This question sheds some very interesting light on hell, and consequently on what we mean by morality. Our understanding of morality tends to be like what Saint Augustine said of our understanding of time:

What then is time? If no one asks me, I know what it is. If I wish to explain it to him who asks, I do not know.

Somehow or other atheists tend to assume that ought means something that you have to do, regardless of what you want to do. It’s very tempting to assume that this is a holdover from childhood, where ought meant that their parents would make them do it whether or not they wanted to. It’s tempting because it’s probably the case. But whatever its origin, that is not an adult understanding of ought, because ultimately we can’t be forced to be good. (Or if this raises your hackles because I’m “placing limits on God”, then just take it as meaning that, in any event, we won’t be forced to be good.)

Hell is a real possibility. Or in other words, it is possible to see two options and knowingly pick the worse option.

What we actually mean by saying that we ought to do something is that the thing is directed towards the good. And we can clarify this if we bring in a bit of Thomistic moral philosophy: being is what is good. Or as the scholastic phrase goes, good is convertible with being. But being, within creation, is largely a composite entity. A statue is not just one thing, but many things (atoms, molecules, etc.) which, in being ordered toward the same end, are also one thing which is greater than their parts.

And you can see a symphony of ordering to a greater being, in a human being. Atoms are ordered into proteins (and many other things like lipids, etc), which are ordered into cells, which are ordered into organs, which are ordered into human beings. But human beings are not at the top of the hierarchy of being, for we are also ordered into community with other created things. (Please note: being part of a greater whole does not rob the individual of his inherent dignity; the infinite goodness of God means that creation is not a competition. Also note that God so exceeds all of creation that He is not in the hierarchy of being, but merely pointed to by it.)

And so we come to the real meaning of ought. To say that we ought to do something is to say that the thing is ordered towards the maximum being which is given to us. But we need not choose being; we can instead choose non-being. The great lie which the modern project (and, perhaps not coincidentally, Satan) tells us is that there is some other being available to us besides what was given to us by God. That we can make ourselves; that we can give ourselves what we haven’t got. And these, not at all coincidentally, are the things which we ought not to do—that is, the things not ordered toward being. They’re just what the atheist says that all of life is—stimulating nerve endings to fool ourselves that we’ve accomplished something.

And yet atheists complain when one says that, according to them, they’re in hell.

God, at least, has a sense of humor.

Gold Covered Chicken Wings

If you haven’t heard, there’s a restaurant which came up with the idea of gold-covered chicken wings. While there are all sorts of things which could be said about the wisdom of buying such things, the thing I really want to talk about is the symbolism of the thing.

(Since there’s too much outrage on the internet, I think I should note in passing that due to gold’s astonishing brilliance with only a few atoms of thickness the wings are not actually wildly expensive. You can get 10 wings for $30, which for the location is probably a 3x markup—wasteful, but not very wasteful in absolute terms. You can easily get less food for more money in Manhattan.)

To see the symbolism of the thing, we need to consider what gold-plated food is. Unlike many heavy metals, metallic gold is (basically) inert, which is why it is safe as a food additive. But the fact that it’s inert also means that it has exactly no nutritional value, either. It’s not bad for you, it’s not good for you; it’s just there.

As such it’s an almost pure waste. I say “almost” because it does look pretty, though its beauty is in the wrong place. If gold is to be present, it should be on the plates, where its beauty is not destroyed by the act of eating. It should not be on the food itself, where the beauty is destroyed by the act of eating. And that is, I think, the key to the symbolism.

My favorite version of the baptismal promises includes the questions:

Do you reject Satan?

And all his empty promises?

But there is another translation of the second question:

And all his empty show?

Gold-covered chicken wings seem to me an almost perfect illustration of Satan’s empty show. It looks like it has value—but has none—and the acceptance of it destroys even the slight good it uses as a bait.

Hearing the Same Story Twice

One of the great benefits of having friends who are at least twenty years older than oneself is that they have a wealth of life experiences that they are happy to share. This enables one to circumvent the problem in the popular saying:

Good judgment comes from experience and experience comes from bad judgment.

Having significantly older friends means that one can benefit from their experience. (The same is true of parents, if one can bring oneself to listen to them.)

But there is a problem with listening to the stories of people who are several decades one’s senior: they tend to tell you each story several times. Contrary to popular belief, this is not because they’re old, but because while stories are memorable, the act of telling them isn’t. In fact, telling a story is actually quite hard to remember because the storyteller’s attention is on the story, not on the telling.

Further, older people simply have far more to remember because they’ve got much fuller lives than young people do. Our culture’s obsession with youth notwithstanding, older people have far more friends and acquaintances than young people do. They also have vastly more people’s lives and concerns to keep track of.

And since one very remarkable experience—that is, one good story—will touch on many aspects of life, in conversation with one’s older friends their especially good stories will come up from time to time, and they will probably not remember that they already told you that story three years ago. As I said, the story is far more memorable than the telling of it.

There are, at this point, three options:

  1. Interrupt them to tell them they already told you the story.
  2. Let them tell it then tell them that they already told you the story.
  3. Let them tell the story and appreciate it again.

Of the three, the second is the worst option. It’s basically throwing a gift back in the giver’s face. Don’t do this.

The first can be polite, but it’s tricky to pull off. If the story is recognizable in its first few words, you can probably find a pause in the first sentence (or so) to interrupt and ask if it’s the story you’re thinking of—and bear in mind you might be wrong, because sometimes different stories sound similar. If it is, then tell the friend how much you like the story. The danger of interrupting is that you might seem ungrateful or unappreciative of the wisdom being conveyed; telling them how much you appreciate the story—not merely that you appreciated it in the past, but that you have kept its lessons with you—ensures that the proper reaction of gratitude is conveyed.

The third option is often the best option. First, because it is the most grateful option. Second, because the same story is often told with different details filled in, so one gets a more complete version of it by putting the two together. Third, because one will probably learn new things from hearing it again. And fourth, because, given the impossibility of perpetual novelty (while maintaining quality), happiness depends upon the ability to appreciate good things one has already experienced. Hearing a good story again is excellent practice at this.

One should not lie and pretend that one has not heard the story before, but it almost never comes up, and if it doesn’t, there’s no need to bring it up.

And you’re vastly better off having heard the same story twice than not at all.

Atheists’ Bluster

Around a quarter century ago, in my early teens, I did online Christian apologetics in various forums (AOL, usenet, etc.). And something I came across was the habit of atheists using bluster—the extremely confident assertion of things that, if pressed, they couldn’t defend.

In my later teens I took a hiatus from apologetics to spend time learning, to better prepare myself. It ended up being a fairly long hiatus, and by the time I was ready to get back to apologetics I was Catholic and now it was called evangelization. And in the great deal of thinking and reading and so forth that I did in those years, I came to the conclusion that reasoned argument was not what most people needed. Atheism was not so much an intellectual position as a mental prison. The atheist is in a tiny, cramped little universe, so much smaller than a human mind. What atheists really need—as Chesterton said of the madman in his masterpiece, Orthodoxy—is not arguments, but air. He needs to come in contact with enough truth that he will realize it can’t fit inside his prison, at which point he will realize that he’s not actually inside of a prison, and leave.

But being an open Catholic online and hanging out with the sort of people I hang out with does bring one into contact with a lot of atheists—though almost all of a few related kinds. And in meeting the same sorts of people I was arguing with 25 years ago, I found that they were still using bluster—making assertions with impressive confidence. But as an adult in my 30s, this was nowhere near as intimidating as it was to me when I was 13. And I found something very interesting when I would respond to bald-faced assertions with contrary bald-faced assertions.

I somewhat naively expected to simply come to a standstill of assertions that would result either in agreeing to disagree or providing space for a real discussion to take place. Instead, the atheists tended to get angry. Very angry. And what was curious was that it was the sort of anger one sees from a dog owner who isn’t any good at dog training when their dog fails to perform on command. It’s the anger of, “you’re not doing what you’re supposed to!”

You’ll see this all over the world, from all sorts of people. Doubtless many atheists have gotten this from irate grandmothers. But they were holding themselves up as rational inquirers, and if you scratch the surface—as with gold leaf—you find out that their rationality is just a coating which is only a few molecules thick.

And I started noticing that this applied in other places, too. The people who scream, “only believe things because of evidence!” get awfully huffy when you ask them for evidence of their honesty. They don’t put it that way, but apparently that you’re supposed to take on faith.

“Don’t believe things without evidence!”

“OK, do you have any evidence that you’re not a moron?”

Again, their principle apparently comes with a lot of unstated qualifications. In theory, this should be an entirely reasonable question since you’re just asking for evidence. Instead you’ll typically hear about “ad homs” (argumentum ad hominem, i.e. arguing that the man is bad as if that proved his conclusion is false, see here for more), which is rather bizarre since a question cannot be a fallacious argument since it is not any kind of argument.

It’s been rather fascinating to see, since these people have great conviction, but it’s not conviction in their own principles. I still haven’t really found what their conviction is in. (I have my suspicions, and it will vary with the individual, of course. But I haven’t come to any definite conclusions yet.)

But it’s been very interesting to see how little there is behind atheists’ bluster.