Wedding Flowers Is Off to the Editor

For anyone who is interested in my novels: a few days ago I sent the manuscript of Wedding Flowers Will Do For a Funeral (the second chronicle of Brother Thomas) off to Silver Empire publishing (they published the first Chronicle of Brother Thomas). Next come edits, and if all goes well it will be published in the first half of 2020. It’s been a long time coming, and I’m really looking forward to finally having it published.

Why Do Moderns Write Morally Ambiguous Good Guys?

(Note: if you’re not familiar with Modern spelled with a capital ‘M’, please read Why Moderns Always Modernize Stories.)

When Moderns tell a heroic story—or more often a story which is supposed to be heroic—they almost invariably write morally ambiguous good guys. Probably the most common form of this is placing the moral ambiguity in the allies whom the protagonist trusts. It turns out that they did horrible things in the past, they’ve been lying to the protagonist (often by omission), and their motives are selfish now.

Typically this is revealed in an unresolved battle partway through the story, where the main villain has a chance to talk with the protagonist, and tells him about the awful things that the protagonist’s allies did, or are trying to do. Then the battle ends, and the protagonist confronts his allies with the allegations.

At this point two things can happen, but almost invariably the path taken is that the ally admits it, the hero gets angry and won’t let the ally explain, then eventually the ally gets a chance to explain (or someone else explains for him), and the protagonist concludes that the ally was justified.

In general this is deeply unsatisfying. So, why do Moderns do it so much?

It has its root in the modern predicament, of course. As you will recall, in the face of radical doubt, the only certainty left is will. To the Modern, therefore, good is that which is an extension of the will, and evil is the will being restricted. It’s not that he wants this; it’s that in his cramped philosophy, nothing else is possible. In general, Moderns tend to believe it but try hard to pretend that it’s not the case. Admitting it tends to make one go mad and grow one’s mustache very long:

(If you don’t recognize him, that’s Friedrich Nietzsche, who lamented the death of God—a poetic way of saying that people had come to stop believing in God—as the greatest tragedy to befall humanity. However, he concluded that since it happened, we must pick up the pieces as best we may, and that without God to give us meaning, the best we could do is to try to take his place, that is, to use our will to create values. Trying to be happy in the face of how awful life without God is drove him mad. That’s probably why atheists since him have rarely been even half as honest about what atheism means.)

The problem with good being the will and evil being the will denied is that there’s no interesting story to tell within that framework.

A Christian can tell the story of a man knowing what good is and doing the very hard work of trying to be good in spite of temptation, and this is an interesting story, because temptation is hard to overcome and so it’s interesting to see someone do it.

A Modern cannot tell the story of a man wanting something then doing it; that’s just not interesting because it happens all the time. I want a drink of water, so I pick up my cup and drink water. That’s as much an extension of my will as is anything a hero might do on a quest. In fact, it may easily be more of an extension of my will, because I’m probably more thirsty (in the moment) than I care about who, exactly, rules the kingdom. Certainly I achieve the drink more perfectly as an extension of my will than I am likely to change who rules the kingdom, since I might (if I have a magical enough sword) pick the man, but I can’t pick what the man does. And what he does is an extension of his will, not mine. (This, by the way, is why installing a democracy is so favored as a happy ending—it’s making the government a more direct extension of the will of the people.)

There’s actually a more technical problem, which arises because one can only will what is first perceived in the intellect. In truth, that encompasses nothing, since we do not fully know the consequences of any action in this world, but this is clearer the further into the future an action is and the more people it involves. As such, it is not really possible for the protagonist to really will a complex outcome like restoring the rightful king to the throne of the kingdom. Moderns don’t know this at a conscious level at all, but it is true and so does influence them a bit. Anyway, back to the main problem.

So what is the Modern to do, in order to tell an interesting story? He can’t tell an interesting story about doing good, since to him that’s just doing anything; and in any case the reader is not the protagonist, so the protagonist getting what he wills does the reader no good. Granted, the reader might possibly identify with the protagonist, but that’s really hard to pull off for large audiences. It requires the protagonist to have all but no characteristics. For whatever reason, this seems to be done successfully more often with female protagonists than with male protagonists, but it can never be done with complete success. The protagonist must have some response to a given stimulus, and this can’t be the same response that every reader will have.

The obvious solution, and for that reason the most common solution, is to tell the story of the protagonist not knowing what he wants. Once he knows what he wants, the only open question is whether he gets it or not, which is to say, is it a fantasy story or a tragedy? When he doesn’t know what he wants, the story can be anything, which means that there is something (potentially) interesting to the reader to find out.

Thus we have the twist, so predictable that I’m not sure it really counts as a twist, that the protagonist, who thought he knew what he wanted—if you’re not sitting down for this, you may want to sit now so you don’t fall down from shock—finds out that maybe he doesn’t want what he thought he wanted!

That is, the good guys turn out to be morally ambiguous, and the hero has to figure out if he really wants to help them.

It’s not really that the Moderns think that there are no good guys. Well, OK, they do think that. Oddly, despite Modern philosophy only allowing good and evil to be imputed onto things by the projection of values, Moderns are also consequentialists, and consequentialists only see shades of grey. So, yes, Moderns think that there are no good guys.

But!

But.

Moderns are nothing if not inconsistent. It doesn’t take much talking to a Modern to note that he’s rigidly convinced that he’s a good guy. Heck, he’ll probably tell you that he’s a good person if you give him half a chance.

You’ll notice that in the formula I’ve described above, which we’re all far too familiar with, the protagonist never switches sides. Occasionally, if the show is badly written, he’ll give a speech in which he talks the two sides into compromising. If the show is particularly badly written, he will point out some way of compromising where both sides get what they want and no one has to give up anything that they care about, which neither side thought of because the writers think that the audience is dumb. However this goes, you almost never see the protagonist switch sides. (That’s not quite a universal, as you will occasionally see it in spy thrillers, but there are structural reasons for that which are specific to that genre.) Why is that?

Because the Modern believes that he’s the good guy.

So one can introduce moral ambiguity to make things interesting, but it does need to be resolved so that the Modern, who identifies with the protagonist, can end up as the good guy.

The problem, of course, is that the Modern is a consequentialist, so the resolution of the ambiguity almost never involves the ambiguity actually being resolved. The Modern thinks it suffices to make the consequences—or as often, curiously, the intended consequences—good, i.e. desirable to the protagonist. So this ends up ruining the story for those who believe in human nature and consequently natural law, but this really was an accident on the part of the Modern writing it. He was doing his best.

His best just wasn’t good enough.

Sequels Shouldn’t Reset To the Original

One of the great problems that writers have when writing sequels is that, if there was any character development in a story at all, its sequel begins with different characters, and therefore different character dynamics. If you tell a coming-of-age story, in the sequel you’ve got someone who already came of age, and now you have to tell a different sort of story. If you tell an analog to it, such as a main character learning to use his magical powers or his family’s magic sword or his pet dragon growing up or what-have-you, you’ve then got to start the next story with the main character being powerful, not weak.

One all-too-common solution to this problem is to reset the characters. The main character can lose his magic powers, or his pet dragon flies off, or his magic sword is stolen. This can be done somewhat successfully, in the sense of the change not being completely unrealistic, depending on the specifics, but I argue that in general, it should not be.

Before I get to that, I just want to elaborate on the depending-on-the-specifics part. It is fairly viable for a new king with a magic sword to lose the sword and have to go on a quest to get it back, though it’s better if he has to entrust it to a knight who will rule in his absence while he goes off to help some other kingdom. Probably the most workable version of this is the isekai story—a type of story, common in Japanese manga, light novels, and animation, where the main character is magically abducted to another world and needs to help there. Being abducted to another world works pretty well.

By contrast, it does not work to do any kind of reset in a coming-of-age story. It’s technically viable to have the character fall and hit his head and forget everything he learned, but that’s just stupid. Short of that, people don’t come of age and then turn back into people with no experience who’ve never learned any life lessons.

So why should resets be avoided even when they work? There are two main reasons:

  1. It’s throwing out all of the achievements of the first story.
  2. It’s lazy writing.

The first is the most important reason. We hung in with a character through his trials and travails to see him learn and grow and achieve. If the author wipes this away, it’s as if none of it ever happened. And there’s something worse: it’s Lucy pulling the football away.

If the author is willing to say, “just kidding” about character development the first time, why should we trust that the second round of character development was real this time? Granted, some people are gullible—there will be people who watch the sequel to The Least Jedi. I’m not saying that it’s not commercially viable. Only that it makes for bad writing.

Which brings me to point #2: it’s lazy writing to undo the events of the original just to re-write it a second time. If one takes the lazy way out in the big picture, it sets one up to take the lazy way out in the details, too. Worse, since the second will be an echo of the first, everything about it will either be the first warmed over or merely a reversal of what happened the first time. Except that these reversals will have to work out to the same thing, since the whole reason for resetting everything is to be able to write the same story. Since it will not be its own story, it will take nearly a miracle to make the second story true to itself given that there will be some changes.

A very good example of not taking the lazy way out is the movie Terminator 2. Given that it’s a movie about a robot from the future which came back in time to stop another robot from the future from killing somebody, it’s a vastly better movie than it has any right to be. Anyway, there’s a very interesting bit in the director’s commentary about this. James Cameron pointed out that in most sequels, Sarah Connor would have gone back to being a waitress, just like she was in the first movie.

But in Terminator 2, she didn’t. James Cameron and the other writer asked themselves what a reasonable person would do if a soldier from the future came back and saved her from a killer robot from the future, and impregnated her with the future leader of the rebellion against the robots. And the answer was that she would make ties with gun runners, become a survivalist, and probably seem crazy.

We meet her doing pullups on her upturned bed in a psychiatric ward.

Terminator 2, despite having the same premise, is a very different movie from Terminator because Terminator 2 takes Terminator seriously. There are, granted, some problems because it is a time travel story and time travel stories intrinsically have plot holes. (Time travel is, fundamentally, self-contradictory.) That said, Terminator and Terminator 2 could easily be rewritten to be about killer robots from the Robot Planet where the robots have a prophecy of a human who will attack them. That aside, Terminator 2 is a remarkably consistent movie, both with itself and as a sequel.

Another good example, which perhaps illustrates the point even better, is Cars 2. The plot of Cars, if you haven’t seen it, is that a famous race car (Lightning McQueen) gets sentenced to community service for traffic violations in a run-down town on his way to a big race. There he learns personal responsibility, what matters in life, and falls in love. Then he goes on to almost win the big race, but sacrifices first place in order to help another car who got injured. (If you didn’t figure it out, the cars are alive in Cars.)

The plot of Cars 2 is that McQueen is now a champion race car and takes part in an international race. At the same time, his buddy from the first movie, Mater, is mistaken for a spy and joins a James Bond-style espionage team to find out why and how an international organization of evil (I can’t recall what they’re called; think KAOS from Get Smart or SPECTRE from James Bond) is sabotaging the race. McQueen is not perfect, but he is more mature and does value the things he learned to value in the first movie. The main friction comes from him relying on Mater and Mater letting him down.

As you can see, Cars 2 did not reset Cars, nor did it try to tell Cars over again. In fact, it was so much of a sequel to Cars, which was a coming-of-age movie, that it was a completely different sort of movie. This was a risk, and many of the adults who liked Cars did not like Cars 2, because it was so different. This is the risk to making sequels that honor the first story—they cannot be the first story over again, so they will not please everyone who liked the first story.

Now, Cars 2 is an interesting example because there was no need to make it a spy thriller. Terminator 2 honored the first movie and was still an action/adventure where a killer robot has come to, well, kill. But there was a practical reason why Cars 2 was in a different genre from its predecessor while Terminator 2 was not: most everyone knows how to grow up enough to not be a spoiled child, but precious few people in Hollywood have any idea how to keep growing from a minimally functioning adult into a mature adult.

If one wants to tell a true sequel to a coming-of-age film, which mostly means a film in which somebody learns to take responsibility for himself, the sequel will be about him learning to take responsibility for others. In practice, this means either becoming a parent or a mentor.

This is a sort of story that Hollywood has absolutely no skill in telling.

If you look at movies about parents or mentors, they’re almost all about how the parent/mentor has to learn to stop trying to be a parent/mentor and just let the child/mentee be whatever he wants to be.

Granted, trying to turn another human being into one’s own vision, materialized, is being a bad parent and a bad mentor, but just letting them be themselves is equally bad parenting and mentoring. What you’re supposed to do as a parent or a mentor is to help the person to become themselves. That is, they need to become fully themselves. They must overcome their flaws and become the perfect human being which God made them to be. That’s a hard, slow process for a person, which is why it takes so much skill to be a parent or a mentor.

There’s a lot of growth necessary to be a decent parent or mentor, but it’s more subtle than growing up from a child. Probably one of the biggest things is learning how much self-sacrifice is necessary—how much time the child or mentee needs, and how little time one will have for one’s own interests. How to balance those things, so one gives freely but does not become subsumed—that is a difficult thing to learn, indeed. That has the makings of very interesting character development.

The problem, of course, is that only people who have gone through it and learned those lessons are in a position to tell it—one can’t teach what one doesn’t know.

At least on purpose.

Art is a great testament to how much one can teach by accident—since God is in charge of the world, not men.

But I think that the world really could do with some (more) decent stories about recent adults learning to be mature adults. I think that they can be made interesting to general audiences.

The Scientific Method Isn’t Worth Much

It’s fairly common, at least in America, for kids to learn that there is a “scientific method” which tends to look something like:

  1. Observation
  2. Hypothesis
  3. Experiment
  4. Go back to 1.

It varies; there is often more detail. In general it’s part of the myth that there was a “scientific revolution” in which at some point people began to study the natural world in a radically different way than anyone had before. I believe (though am not certain) that this myth was propaganda during the Enlightenment, which was a philosophical movement primarily characterized by being a propagandistic movement. (Who do you think gave it the name “The Enlightenment”?)

In truth, people have been studying the natural world for thousands of years, and they’ve done it in much the same way all that time. There used to be less money in it, of course, but in broad strokes it hasn’t changed all that much.

So if that’s the case, why did Science suddenly get so much better in the last few hundred years, I hear people ask. Good question. It has a good answer, though.

Accurate measurement.

Suppose you want to measure how fast objects fall. Now suppose that the only time-keeping device you have is the rate at which a volume of sand (or water) falls through a restricted opening (i.e. your best stopwatch is an hourglass). How accurately do you think that you’ll be able to write the formula for it? How accurately can you test that in experimentation?

To give you an idea, in physics class in high school we did an experiment with an electronic device that fed a long, thin strip of paper through it and burned a mark onto the paper exactly ten times per second, with high precision. We then attached a weight to one end of the paper and dropped the weight. It was then very simple to calculate the acceleration due to gravity, since we just had to accurately measure the distance between the burn marks.

The groups in class got values between 2.8 m/s² and 7.4 m/s² (it’s been 25 years, so I might be a little off, but those are approximately correct). For reference, the correct answer, albeit in a vacuum while we were in air, is 9.8 m/s².
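To make the arithmetic concrete, here is a minimal sketch of the calculation (the mark positions are generated numbers for illustration, not my class’s actual measurements): with marks burned every 0.1 seconds, each gap between consecutive marks grows by a fixed amount under constant acceleration, and that growth divided by the square of the time interval is the acceleration.

    # Estimate g from ticker-tape marks burned every 0.1 s.
    # The mark positions below are illustrative, generated from the
    # constant-acceleration formula, not real classroom data.

    DT = 0.1  # seconds between burn marks

    # Distance of each mark from the release point (meters), falling from rest:
    # x = 0.5 * g * t^2, using g = 9.8 purely to generate example numbers.
    positions = [0.5 * 9.8 * (DT * i) ** 2 for i in range(8)]

    # Distance covered between consecutive marks.
    gaps = [b - a for a, b in zip(positions, positions[1:])]

    # Under constant acceleration each gap exceeds the previous one by g * DT^2,
    # so the second difference of the positions estimates g.
    estimates = [(later - earlier) / DT**2 for earlier, later in zip(gaps, gaps[1:])]

    print(sum(estimates) / len(estimates))  # ~9.8 m/s^2

With real data you would replace the generated positions with distances measured off the tape; the scatter in the class’s results comes entirely from how accurately those distances (and the tick rate) can be measured.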

The point being: until the invention of the mechanical clock, high-precision measurement of time was not really possible. It took people a while to think of that.

It was a medieval invention, by the way. Well, not hyper-precise clocks, but the technology needed to make them. Clocks powered by falling weights were common during the high medieval period, and the earliest surviving spring-driven clock was given to Philip the Good, Duke of Burgundy, in 1430.

Another incredibly important invention for accurate measurement was the telescope. It was first invented in 1608 and spread like wildfire because it was basically just a variation on eyeglasses (the first inventor, Hans Lippershey, was an eyeglass maker). Eyeglasses were another medieval invention, by the way.

And if you trace the history of science in any detail, you will discover that its advances were mostly due not to the magical properties of a method of investigation, but to increasing precision in the ability to measure things and make observations of things we cannot normally observe (e.g. the microscope).

That’s not to say that literally nothing changed; there have been shifts in emphasis, as well as the creation of an entire type of career which gives an enormous number of people the leisure to make observations and the money with which to pay for the tools to make these observations. But that’s economics, not a method.

One could try to argue that mathematical physics was something of a revolution, but it wasn’t, really. Since the time of Ptolemy, astronomers had had mathematical models of things whose nature they didn’t actually know and didn’t inquire into. It’s really increasingly accurate measurements which allow the mathematization of physics.

The other thing to notice is that in any field where taking accurate measurements of what we actually want to measure is prohibitively difficult or expensive, the science tends to be garbage. More specifically, it tends to be the sort of garbage science commonly called cargo cult science. People go through the motions of doing science without actually doing science. What that means, specifically, is that people take measurements of something and pretend they’re measurements of the thing that they actually want to measure.

We want to know what eating a lot of red meat does to people’s health over the long term. Unfortunately, no one has the budget to put a large group of people into cages for 50 years and feed them controlled diets while keeping out confounding variables like stress, lifestyle, etc.—and you couldn’t get this past an ethics review board even if you had the budget for it. So what do nutrition researchers who want to measure this do? They give people surveys asking them what they ate over the last 20 years.

Hey, it looks like science.

If you don’t look too closely.