Month: March 2017
Predictability vs. Recognizability
Something I didn’t talk about in my post on writing formulas and formulaic writing is that not all predictability is bad. In fact, there is a great deal of any story which one wants to be predictable. If we are reading a murder mystery, we will be irked if there is no murder, and doubly irked if there is no mystery. And if the mystery is not about who murdered the victim, we will be very hard to win over. I think that G.K. Chesterton summarized it when he said that as a boy, he would put down any book which didn’t mention a dead body on the first page. (And once again I can’t track this down. I need to get better at actually sourcing Chesterton quotes.)
Now, while these examples are obvious, they illustrate the point precisely because they are so obvious as to not normally be worth mentioning. When we complain of fiction being predictable, we don’t mean simply that we were able to predict elements of what was in the story. In most cases, that we are able to predict elements in a story is, in fact, a necessary prerequisite to being willing to read the story at all. It is good advice not to judge a book by its cover, in so far as that advice goes, but I think you will find that a book with a completely blank cover (that is, having neither words nor pictures upon it, so that there is no suggestion whatever of what is behind the cover) will not get read very often.
I do not bring this up to be pedantic, but because those cases where we do not say what we mean often conceal interesting truths. Certainly it has been my experience, anyway, that whenever a man says, “everyone knows what I mean” he is wrong. Usually the more certain he is that he is universally understood, the more wrong he is, because the only person who can be completely convinced that everyone understands him is a man who’s never found out what anyone else thought he meant. But, be that as it may, I think that this is a particularly interesting topic because what we really mean reveals much about how we enjoy fiction. It also reveals the real reason why books should never be reviewed by people unfamiliar with their genre.
Chesterton once said that an artist is glad of his limitations:
You may, if you like, free a tiger from his bars; but do not free him from his stripes. Do not free a camel of the burden of his hump: you may be freeing him from being a camel. Do not go about as a demagogue, encouraging triangles to break out of the prison of their three sides. If a triangle breaks out of its three sides, its life comes to a lamentable end. Somebody wrote a work called “The Loves of the Triangles”; I never read it, but I am sure that if triangles ever were loved, they were loved for being triangular. This is certainly the case with all artistic creation, which is in some ways the most decisive example of pure will. The artist loves his limitations: they constitute the thing he is doing. The painter is glad that the canvas is flat. The sculptor is glad that the clay is colourless.
And so it is with fiction; the elements which we want to be predictable form what sort of story it is. And this can get very specific. Within murder mysteries, there is the sort of story which can be referred to as an armchair cozy, and within those, there is a yet more specific sort of story which can be referred to as a Christie. Armchair cozies tend to feature a very intelligent detective who uses his wits more than his fists, and Christies tend to have that intelligent detective, at the conclusion of the story, gather up all of the witnesses and suspects into a room and explain the facts as he found them and then set them into an orderly and coherent picture, while clearing away red herrings, lies, and mistaken inferences others had been making.
Now, there are those who criticized Agatha Christie’s work as predictable and formulaic because it always ends with the detective gathering everyone together and summarizing the plot prior to revealing the solution. These are, in a sense, the mortal enemies of those who love the form of the Christie and feel the lack in the ending of The Maltese Falcon, where the solution to the mystery is a mere afterthought. To those who take these plot elements to be part of the form of the story, a story which does not include them is defective. To someone who does not take these plot elements as part of the form of the story, the stories are predictable and formulaic.
Which is to say, whether writing is predictable and formulaic is in no small part a matter of how one conceives of the story before one reads it. If one thinks of science fiction as just plain old literature, all of those space ships and other worlds become very predictable and formulaic. Thought of as romances, murder mysteries sure do use the same old device to bring the couple together—if they even remember to have a couple to bring together. And so forth; in a sense this is just remembering that a thing can only be good when thought of as what it is—hammers make terrible pillows, etc. But things like hammers and pillows are relatively clear in what they are while stories rarely fit perfectly into any genre and thus are always defining new sub-genres. Indeed, the fact that there is a type of murder mystery called a Christie testifies to that very fact since they are named for stories with plot elements like those found in many of Agatha Christie’s stories.
And there is a sense in which even thinking in terms of genres is a mistake with fiction because it implies a comparison; it is always a mistake to allow the goodness of one thing to eclipse the goodness of another thing. Perfect happiness cannot rest on infinite novelty since infinite novelty is not possible. (Perfect happiness must instead come from the ability to appreciate a good which has already been appreciated, whether in some greater good, or in the thing itself already experienced.) That said, in a world with imperfect creatures thinking within genres is unavoidable, and so a clever (or charitable) author will help the reader to understand what sort of story he’s getting into and what he may expect, so that he will know where to look for surprises. Because a large part of enjoying a story is knowing where to look for surprises in it.
(There is the obvious exception of books with “twists”, that is to say, books which signal that they are one sort of thing and then suddenly reveal that they are something else. Being more a re-reader than a reader of new things, my own opinion is that these are rarely good stories because the twist is typically a gimmick. Having managed one thing to startle the reader, the authors of twists often seem to not bother themselves with putting in anything else which is novel, and so there’s no value to re-reading them. There are exceptions to that, though, where the books are worth reading even if you know the twist, so I don’t mean to over-generalize.)
At this point I suspect that the relationship of this post to whether writing formulas encourage formulaic writing should be clear. If the reader is familiar with the formula and reads stories written according to it as if the formula defines a genre, then formulas will not encourage formulaic writing at all (except in so far as they elevate formulaic writing that otherwise would have been unreadable to the level of being readable, as I discussed in the post I linked above). On the other hand, if readers do not understand the formulas as a type of writing, there is a good chance that they will find fiction written according to the formula to be formulaic, because they will be looking for novelty in the wrong place.
This same phenomenon can be seen in music appreciation, by the way. A friend of mine who studied music in college pointed out that each type of music has its typical structures (allowable chords, chord progressions, repeats, and so on) inside of which musicians play around and differentiate themselves. Those familiar with these structures hear the music as music, while those who aren’t familiar with them will often hear the music simply as noise. This is why new genres often gain popularity with the young, who have not imprinted on already accepted musical structures and who can easily adapt to a new one. Later, these genres spread as those who need more time to learn the new music’s structures finally do.
There’s even something analogous in looking at the “long hair” of the Beatles. By modern standards, their hair is within norms for businesslike hair styles. In fact, on this album cover they almost look like modern bankers:
Not quite; bankers do have an extremely recognizable style that has shifted only very little with the times. But in their time, the Beatles were icons of rebellion. Today, outside of a few niches like banking, we barely have any standard hair styles for men—except possibly that mullets are bad—and so nothing violates those standards. (Again, except mullets, for some reason.) But the curious upshot of that lack of standards is that if anyone’s hairstyle is recognizable, it is therefore derivative and boring. There is, I think, a lesson to be learned there.
Glory to God in the highest.
Admitting One’s Weird
In an interesting essay I suggest reading, Ed Latimore gives “5 Lessons From Growing Up in the Hood.” One of them in particular caught my eye:
1. Good manners go a long way.
I fought a lot as a kid. That’s just par for the course growing up in the hood. I would have fought a lot more if it wasn’t for one simple phrase: “My bad.” For those of you that don’t speak hood, “My bad” is the equivalent of saying “I’m sorry.”
You bump somebody in a crowd? ‘My bad’ goes a long way. Step on someone’s foot on a crowded bus? Dude might get mad, but you can cool it quick by just saying ‘My bad.’ Say something a little too offensive that gets guys in the mood to fight? Just say ‘My bad’ and dial it down. It’s amazing what an apology can do to cool tempers in the hood.
I didn’t grow up in the hood, nor even particularly close to it, but I found the same thing applies to situations with much lower stakes: being willing to admit error where one can truthfully do so goes a long way to smoothing out human interactions. And the curious thing is that where one is telling the truth in admitting error, most people are very willing to accept that and move on. People, by and large, don’t tolerate affronts to their dignity, but they are very willing to tolerate other people’s human imperfection where it is acknowledged as such and where a person is willing to put in the work to make things right afterwards.
This applies quite a lot in the context of business. If one makes a mistake in a professional setting, simply admitting it in a straightforward way tends to turn such mistakes into a non-issue. Professionals are there to earn money, which they do by solving problems. Co-workers’ mistakes are just one more problem to solve. This can of course become excessive to the point where you are causing more problems than you are solving, but if that’s the case you’re probably a bad fit for your job and should move on for everyone’s sake. But where you are competent at your job, people just don’t really care deeply about the occasional mistake, and if you own up to it, there’s nothing left to talk about, so people just move on.
And it’s that last part that I want to talk about in another context. Most people are weird but hide it; and most people are made very uncomfortable by other people being different (which is just another way of saying that they’re weird). At its root this comes from a tribal instinct; it is not good for man to be alone—and we know it. Differences make us fear rejection, though a little bit of life experience and sense teaches us which differences matter and which don’t. But sense is surprisingly uncommon and learning from life experiences is—for quite possibly related reasons—similarly rare. So a great many people fear whatever is different from them. This can be people who look different but I think it’s far more common to be afraid of people who act differently. And one thing people do when they’re uncomfortable is talk about it.
And this is where admitting that one is weird can be a very useful strategy. To give a concrete example, I shoot an 80# bow. (For a long time it was actually 82#, but string creep eventually set in and for some reason they couldn’t get it back up.) That’s pretty uncommon, these days, especially for someone with a 30″ draw length. Most men shoot a bow somewhere in the range of 55#-70# (women tend to shoot in the 35#-50# range). You’d think that an 80# bow wouldn’t seem that odd to people shooting a 70# bow, but for reasons relating to how many reps you can do in weight-lifting being a function of how close you are to your one-rep max, it actually is a pretty big jump for a lot of people. They could draw the bow, but only a few times an hour. I’m not that strong, but I’m a relatively big guy (6′ tall, over 200lbs) and so I can comfortably shoot my bow for an hour or two at a stretch without losing more accuracy than if I were shooting a 70# or a 60# bow (really the main thing affecting accuracy is that your shoulders get tired of holding the bow up at arm’s length). So it’s a very reasonable thing for me, personally, to do, but it’s pretty odd among people at the archery shop I go to. And moreover it’s not really necessary. Where I live the only common big game is whitetail deer, and you can reliably kill a whitetail with a 40# bow if you’ve got a good broadhead/arrow setup and are a good shot. I do it because I like it, and because it acts like insurance. With the double-edged single-bevel broadheads I use on top of 0.175″ deflection tapered carbon fiber arrows, the whole thing weighing 715 grains, shot from an 80# bow, if I make a bad shot and hit the large bones my arrow will most likely go right through and kill the animal anyway. And I could use the same setup for hunting moose or buffalo without modification, should I ever get the opportunity. (That would fill the freezer with meat in one shot!)
So, as you can see, from my perspective this is a reasonable thing to do. But from most everyone else’s perspective, it’s weird. And moreover, it’s more than most men at the archery shop I go to can do. Some people there can’t even draw my bow, and many who could would find the strain too much to do more than a few times. It would be easy for people to suspect that I look down on them as lesser because of it, and to reject me in self-defense. If someone you respect looks down on you, it’s painful. If someone you reject as mentally deranged looks down on you, it’s irrelevant.
So when people make jokes about me/my bow being atypical, I go along with it. I will cheerfully admit that I’m engaging in massive overkill; I will joke along with them about the way deer are wearing bullet-proof vests these days. (My setup could probably go through a lighter bullet-proof vest, since broadheads are razor sharp and can cut through kevlar. It has zero chance against the sort of vest with ceramic plates in it.) If someone characterizes me as crazy, I smile and say, “nuts, but I like it.” And in general the joking lasts for a minute, then is forgotten about and things are normal. This is, I think, for two reasons:
- I have signaled that I know I am abnormal and am happy with the status of being abnormal. I am clearly indicating that I am not the standard against which others should be measured so I am no threat to anyone’s social standing or sense of self.
- It smothers the impulse to joke about me, in the sense of taking the air away from a flame. If you say that someone’s crazy and he smiles and says, “certifiable,” you just don’t have anywhere to go. Joking/teasing requires a difference of opinion. If someone agrees with you, there’s nothing left to say since a man looks like an ass if all he does is repeat himself.
Of course, this does depend on the content of what’s being said about me being something which I can agree with. In this example, “crazy” just means “abnormal,” which is quite true. If someone were to accuse me of being a criminal I would defend myself, not agree with them. The point is not to be a carpet for people to walk on but rather to learn how to pick one’s battles and only fight the ones that need to be fought. That’s a general principle of skill, by the way; skill consists in applying the right amount of force to the right place to generate the best results. A lack of skill wastes force first in applying it to the wrong place and so needing far more force to achieve the desired result, and then in needing to apply more force to correct the problems caused by having applied force to the wrong place. That’s as true of picking one’s battles as it is of swing dancing or balancing in ice skating. Or, for that matter, archery; missing the target in archery often means that you have to spend a lot of effort to pull your arrow out of a tree.
God’s Blessings on March 1, 2017
God’s blessings to you on this the first day of March in the year of our Lord’s incarnation 2017.
The popularity of videos is an interesting subject, especially for someone who runs a youtube channel. Here are my last few with their views, from most recent to oldest:
Christian Asceticism: 87
Thoughts about Bishop Barron prayer: 132
Believing the Incomprehensible: 248
Why I Don’t Debate Atheists: 948
Just for Fun: A Debate Challenge from Deconverted Man: 491
Channel Update & Thoughts on Disagreement: 171
The Value of Debate: 171
To Err is Human: 214
What the Burden of Proof is: 179
The Burden of Proof Isn’t a Logical Fallacy: 375
Good and Evil are Asymmetric: 249
Discussing Social Media w/ Russell Newquist: 142
Logic Lesson for Atheists: 528
Why Atheists Can’t Logic: Answering Deflated Atheism: 1,527
The Burden of Proof: A Few Quick Thoughts: 293
Sci-Fi Author Brian Niemeier, A Conversation: 139
Chesterton’s Post: 183
Occam’s Razor: 459
(I should note that the way that youtube works is that there is a big bump in views in the first day or two for a video as subscribers notice it and it goes through whatever recommendation process is used for recommending new videos, then things tend to settle down to a steady state of getting a few new views most days. Thus a video with the same number of views as one which came out before it is more popular.)
So as you can see, it’s all over the place. There are some definite themes; things which are explicitly about atheism tend to do better than things which aren’t. In particular I find it interesting that Why I Don’t Debate Atheists is about 5.5 times more popular than The Value of Debate, despite being more recent, and despite being basically just an application of The Value of Debate. In fact, in the description of Why I Don’t Debate Atheists I suggest watching The Value of Debate instead, for a more positive take on the subject.
Now, in fairness, there is a three minute section in the beginning of Why I Don’t Debate Atheists in which I sarcastically summarize it as:
- I’m too scared to. If I ever heard an atheist say, “where’s your evidence” or “that’s not evidence” my faith would shatter.
- I’m too arrogant to. I already know everything.
- This makes me a bad Christian because Christians should always treat public blasphemy with the utmost respect.
I think a lot of Christians have been accused of all this many times, so I suspect that my sarcastic “executive summary” in the beginning was cathartic for more than just myself. So it might have gotten shared or recommended more often. Still, it’s interesting to consider what effect the subject matter and title have on viewership. And I hope it should be obvious that I don’t blame anybody for watching only what they think is likely to be of direct interest to them and their lives; we all have very limited time and a great deal of things clamoring for our attention. Anyhow, it’s interesting to observe and consider.
Glory to God in the highest.