Determinism, Free Will, and Predestination

In this video I answer a question about (atheistic, materialistic) determinism, free will, and Calvinism-style predestination.

That last part is important to point out because there is a Catholic doctrine of predestination, but it only means that God has a plan (being outside of time) and in no way contradicts human free will. I also talk about how Martin Luther, like John Calvin, denied free will, though I don’t go into great depth.

Bicep Curls are Practical, Actually

Curls and other exercises that primarily work the biceps (biceps brachii) have something of a bad reputation; they’re frequently dismissed as non-functional exercises for insecure gym bros, good only for looking better in the mirror when flexing. I’m not sure why this is the case, though, because bicep curls (with a curl bar or with dumbbells) are actually quite functional.

So, when in normal life does one pick up something from a bit below one’s hips and bring it up to one’s shoulder? One does exactly that when picking up a child who is old enough to walk. Admittedly, sometimes one has to bend over a little because the child’s armpits are closer to knee height than to waist height, but the lift becomes a bicep curl once you stand up.

The most common way to pick up a child who can walk is when they stand in front of you and lift their arms up to indicate that they want to be picked up, in which case you tend to use two hands, one under each armpit. Sometimes you’re already holding something, though, and so you need to pick them up with only one arm. This is when bicep curls really come in handy, since all of the child’s weight is being lifted like a dumbbell. (Pro tip: have the child lift a leg so you’re picking them up by their femur while they hold onto your upper arm with both of their arms. If you try to do it under just one armpit it will probably hurt them unless they’re very little or can pull down with that arm hard enough that their latissimus dorsi flexes to bear the weight.)

Fun fact: little children enjoy it when you do reps of bicep curls with them, though in my experience they tend to max out at around 5 reps before they want you to just hold them like normal.

Eating Carbs To Lose Weight Is Strange

(I probably should append “part 2” or “part 3” or something to the title, but I don’t recall what the number should be and I don’t think anyone will really care if I don’t look it up.)

The advice to eat carbohydrates and as little fat as possible in order to lose weight is very strange advice. I’ve talked about this before, and for people who have the dysfunction of insulin resistance (or worse) it’s downright insane advice. (I don’t use the term as hyperbole, but to mean that a person who recommends that people who have trouble processing carbohydrates, or worse, outright diabetics, eat primarily carbohydrates for energy is not meaningfully connected to reality. It is possible to be insane only when certain subjects come up, rather than completely insane, i.e. insane about all subjects, such as the man who thinks he’s a poached egg and tries to sit, motionless, in an egg cup all day.)

(Before I proceed, I should note that there are a few caveats to what I’m saying, here, which primarily apply to athletes. If you need to maintain maximal athletic performance for competition while losing weight, you are in a specialized situation and specialized strategies will apply.)

The argument for eating primarily carbohydrates for energy when losing weight mostly comes down to the observation that carbohydrates are less energy-dense than fats are. Carbs contain 4 calories per gram while fat contains 9 calories per gram. So carbs fill you up more than fat does, and for the same calories you won’t be as hungry and won’t want to eat more!

First, this is a stupid satiety model which is entirely ignorant of how human satiety works. Anyone who has ever been to a large meal such as Christmas dinner, eaten the main course until feeling completely stuffed and unable to eat another bite, then suddenly found plenty of room for dessert a few minutes later, knows this. This sort of ignorance is entirely inexcusable; it would be like giving people gardening advice without knowing that plants need sunlight.

The second problem is that, even if one ignores the bad satiety model, the model isn’t even given the right inputs. Stomach expansion is a matter of volume, not mass. Looking it up, olive oil has a density of 0.92 grams per cubic centimetre, while granulated sugar has a density of 1.59 grams per cubic centimetre. Thus 1 cubic centimetre of olive oil will have 8.28 Calories, while 1 cubic centimetre of granulated sugar will have 6.36 Calories. If you eat the same number of calories in olive oil and in granulated sugar, the sugar will only take up 30.2% more space. (Granulated sugar, by the way, is not as dense as sugar can get, since the granules are not tightly packed.) It’s more space, but not by a lot.
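If you want to check that arithmetic yourself, here is a minimal sketch in Python, using the same approximate density and calorie-per-gram figures quoted above (they are the figures from this post, not precise laboratory values):

    # Rough check of the volume-per-calorie arithmetic above.
    FAT_KCAL_PER_G = 9        # Calories per gram of fat
    CARB_KCAL_PER_G = 4       # Calories per gram of carbohydrate
    OIL_DENSITY = 0.92        # grams per cubic centimetre, olive oil
    SUGAR_DENSITY = 1.59      # grams per cubic centimetre, granulated sugar

    oil_kcal_per_cm3 = FAT_KCAL_PER_G * OIL_DENSITY        # ~8.28
    sugar_kcal_per_cm3 = CARB_KCAL_PER_G * SUGAR_DENSITY   # ~6.36

    # For the same number of Calories, how much more volume does the sugar occupy?
    extra_volume = oil_kcal_per_cm3 / sugar_kcal_per_cm3 - 1
    print(f"olive oil: {oil_kcal_per_cm3:.2f} Calories per cc")
    print(f"sugar:     {sugar_kcal_per_cm3:.2f} Calories per cc")
    print(f"sugar takes up {extra_volume:.1%} more space for the same Calories")

Running it prints the roughly 30% figure used above.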

A bigger problem is that it’s extremely doable to add bulk to food while adding minimal calories. 100g of butter plus 100g of baby spinach will have only a few more calories than 100g of butter (and mostly in protein, curiously), but will take up way more room in the stomach than 103g of sugar will.
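To put rough numbers on that, here is a quick sketch; the per-100g calorie figures are approximate reference values I’m assuming for illustration, not numbers from the argument itself:

    # Rough illustration: adding a lot of bulk while adding very few calories.
    # The per-100g figures below are assumed approximate reference values.
    BUTTER_KCAL_PER_100G = 717    # approx. Calories in 100g of butter
    SPINACH_KCAL_PER_100G = 23    # approx. Calories in 100g of raw baby spinach

    butter_only = BUTTER_KCAL_PER_100G
    butter_plus_spinach = BUTTER_KCAL_PER_100G + SPINACH_KCAL_PER_100G

    increase = butter_plus_spinach / butter_only - 1
    print(f"butter alone:          {butter_only} Calories")
    print(f"butter + 100g spinach: {butter_plus_spinach} Calories")
    print(f"calorie increase:      {increase:.1%}")   # only a few percent more

The calories go up by only a few percent, while the volume in the stomach goes up enormously.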

The general defense of telling people to eat carbs, not fat, is that most people can’t handle the complexity of actual food-volume calculations. In an abstract way, this is true, but again a person is straight-up delusional if they think the average person can’t handle “eat a certain number of calories and try to make them take up as much room in your stomach as possible.”

And then we come to the issue of satiety-over-time. If you want to make your stomach full on few calories without concern for how long that lasts, just drink a glass of water.

The moment that we care about satiety over time, though, the fact that the human stomach takes many more hours to process fats than it does to process carbohydrates becomes relevant, even on a garbage model of satiety like pure-stomach-pressure.

When one takes a moment to consider all of the false assumptions required to make the carbs-not-fat recommendation work, it’s really quite astonishing that anyone ever had the temerity to propose it in public.

Fingerprints And Forensic Evidence

My oldest son and I recently watched The A.B.C. Murders, and at the end there was a part, as Poirot was detailing the evidence against the murderer, where he added that a fingerprint of the man Poirot was accusing was found on the typewriter that the murderer used. Later, Hastings commented that the fingerprint produced a strong effect (the suspect tried to commit suicide).

“That fingerprint clinched things, Poirot,” I said thoughtfully. “He went all to pieces when you mentioned that.”
“Yes, they are useful—fingerprints.”
He added thoughtfully:
“I put that in to please you, my friend.”
“But, Poirot,” I cried, “wasn’t it true?”
“Not in the least, mon ami,” said Hercule Poirot.

One of the curious things about detective fiction is that it comes on the scene almost contemporaneously with the advent of forensics, the use of technology to solve crimes, and police forces organized in the modern manner. Francis Galton only published his statistical analysis establishing fingerprints as a viable means of unique identification for police use in 1892. The first arrest and conviction of someone on the basis of fingerprint evidence came ten years later, in 1902. The golden age of detective fiction, if we include Sherlock Holmes in it (which we should), begins before the use of fingerprints as evidence in crimes.

As I mentioned in Fingerprints in Detective Stories, it’s not difficult to see why fingerprints are almost never used as real evidence in detective stories. We want detective stories to be interesting and the detective to be brilliant. “There was a fingerprint on the dagger in the victim’s back, we checked it against everyone’s fingerprints, it turns out to belong to his brother, therefore the brother is the murderer, the end” isn’t much of a story, and doesn’t require a brilliant detective.

Which actually brings me to the relationship between forensic science and fingerprints, because it is interesting to consider that while fingerprints were rarely used in detective stories, plenty of golden age detective stories were primarily about forensic science. Sherlock Holmes was often conducting scientific experiments to prove a case, though to my recollection rarely as the main story. This may have reached its apotheosis in Dr. Thorndyke. I’ve read that when the short stories were published they would include photographs of what the good doctor would have seen through his microscope as described in the story, and other such things. Thorndyke also made extensive use of photographic enlargement and other forensic technologies. The stories have faded, considerably, in the public’s memory—to some degree the fate of everything whose main attraction was being on the cutting edge of science or technology. They were, so I read, immensely popular at the time. Their role is probably taken, these days, by police procedural television shows, whose stock-in-trade is often the cutting edge of forensic science.

I can’t help but wonder if it was G.K. Chesterton’s Father Brown that helped move detective fiction from a focus on forensics to include psychology. Chesterton first wrote Father Brown in 1910, which was still early in the golden age. To be sure, more than half of Sherlock Holmes had been written by then, and Holmes was no slave to forensics, nor was he ignorant of human psychology. Still, he was an expert in physical details. He could identify over one hundred brands of cigar by their ash and could tell where a patch of mud on the trousers was picked up in London by its composition, just from looking at it.

Father Brown was not an expert—at physical details. He was an expert in the human being, which proved far more interesting.

This move to psychological mysteries brought with it what has, I think, made the murder mystery so enduring: the puzzle. Once forensics were established as a norm, murderers began to use their cunning to fake the forensic evidence and lead the forensic detectives astray. The psychological detective was necessary to combat this newer breed of criminal. It was at once more interesting and also more accessible. It is not really worth anyone’s time to minutely study cigar ash, but anyone can (if sufficiently clever) figure out the meaning of a particular kind of cigar ash being found in a particular place.

Poirot very much represents this transition. He said many times that he does not get on his hands and knees to find the clues, as anyone can do that. His job is to understand what the clues mean. The A.B.C. Murders was published in 1935, when the fascination with forensic detection was still fresh. It’s curious to see traces of this in the Poirot stories.

Trust and Trustworthiness

A few years ago I read an article about how awesome Sweden was because it’s such a high-trust society that all sorts of things are easy and convenient and efficient. The author gave as an example that there were no turnstiles at the entrance to the trains; there was merely a place where you’re supposed to scan your ticket, but it didn’t get in the way of the flow of people. He gave other examples of how much better life was because citizens were just trusted to do the right thing without any enforcement, and wondered how we can get people in the United States to be more trusting. I thought it very telling that he never once asked how to get people in the United States to be more trustworthy.

What I find especially interesting about this is that it’s an inversion of approximately every serious classical view of virtue and its effects that you can find in any culture, at any time. Trust is a choice that other people make, and therefore you cannot control it. Trustworthiness, however, is entirely within your control, and therefore is the only thing to worry about. A man should strive, always, to be trustworthy. At the same time, he should never demand that people trust him, for how can anyone but him know that he is trustworthy? Thus the trustworthy man should always be willing to give guarantees, to give proofs of what he says, and in general to require as little trust from others as possible. To not require trust from others in no way diminishes his trustworthiness, so he is in no way the loser. A trustworthy man may accept it when others decline to take his collateral, or to look up his proofs, because they trust him. A trustworthy man would not demand it, though.

This is especially true when the trustworthy man is dealing with a stranger. Since the trustworthy man goes to the trouble of being worthy of trust, he knows what signs there are that he is trustworthy, and therefore knows that the stranger has not seen any evidence of his trustworthiness.

This modern obsession with being trusted without first being trustworthy is indicative, I think, of how utterly childish moderns tend to be. It arises from wanting benefits without having put in the work. It wants benefits without putting in the work because it fails to consider things from anyone else’s perspective. It doesn’t really take the existence of the rest of the world seriously. This is excusable in a child because they simply don’t know enough about the world to take it seriously, in the sense of being able to consider how it works in their absence. An adult, however, should know that there are real consequences if the people who ride a train do not pay for tickets to ride it.

Perhaps the great problem of our time is that so few people grow up, even late.

The one good thing to say about that is that people who have not grown up when they should have still have the ability to grow up. It’s not as good as doing it when they should have, of course, but they do still have the ability. Which means that the trick is figuring out how to help them actually do it.


(Curiously, though it does not bear on the main point, a Swedish friend said that not checking the validity of your ticket is only a Stockholm thing; the rest of Sweden verifies your ticket.)

nVidia’s Faked Presentation

There are various news articles around about how, during a presentation, a few seconds of the presentation were not of the CEO, Jensen Huang, but of a computer-generated fake of him instead. What I’d like to discuss is how misleading the initial articles reporting this were. The first one was from Tech Radar; it reported on a blog post from nVidia and ran under the headline, “Jensen’s Kitchen Was a Lie.”

In fact, only a second or two of Jensen Huang’s kitchen was CGI; the CGI portion (which included a digitally generated Jensen) was only in the digital kitchen for a second, then it transitioned to a nearly black, obviously computer-generated set. The computer-generated set and CEO lasted for only fourteen seconds, and the computer-generated figure was actually very small in the frame. Here’s a screenshot from that section of the video:

In context, and if you’re familiar with the state of the art in this sort of thing and how much work it normally takes, this was still an impressive demonstration of computer technology. That said, the reports of it made it sound wildly more impressive than it actually was. Which brings me to why.

First, I’m 99.9% certain that this was an honest mistake. nVidia’s blog post was written from a very tech-centered point of view. It was very detail-oriented in terms of which nVidia technologies did what. Basically, it’s how engineers tend to write, because engineers can only do what they do because of tunnel vision. But that tunnel vision also tends to make them bad at communicating with non-engineers unless they consciously frame-shift.

Then we come to the tech reporters who took the nVidia post in the most sweeping way possible. Again, I think that they did this honestly. I think it highly likely that the writer believed every word he wrote.

So, what happened?

I strongly suspect it’s just selection bias at work. Tech reporters are tech reporters because they love technology. They want technology to be amazing. If tech reporters want technology to be amazing, tech readers want that tenfold. A hundredfold. This creates a selection bias; reporters who report on technology being amazing get more readers, because they provide the thrill that the readers seek. Ordinarily, this means that they report the same things as others, but do so in a more thrilling way. Tech reporting benefits tremendously from the world producing news on, approximately, a schedule. The ever-increasing performance of computers on a roughly yearly schedule means that there is a steady supply of genuine news. (Granted, news that only tech enthusiasts find interesting. But we do find it interesting.) This is one massive advantage that tech news has over regular news, which only gets newsworthy events rarely and haphazardly, and so has to make up most of what it reports in order to fit its schedule (it makes things up mostly in the sense of inflating the importance of insignificant events rather than outright fabrication, but the spirit and effect are the same).

The issue comes in when the tech news to be reported is ambiguous. The enthusiastic, optimistic reporters who readers select for will tend to interpret the ambiguities in the most optimistic, impressive way, because that’s how they are and they’re the popular ones because readers like that.

Another advantage of tech news is that it doesn’t really matter. No one is going to do anything of any lasting effect because they believed for a few days that nVidia was able to fake their CEO for longer than they did, or more convincingly than they did. Tech news also tends to be fast to correct in part because real news will come along quickly to replace any mistakes. General news may go months or even years without anything that people need to pay attention to on a daily basis.

Beware of news.

New Religions Don’t Look Like Christianity Either

To those familiar with religions throughout the world, new religions like environmentalism, veganism, wokism, marxism, etc. are pretty obviously religions, and they are causing a lot of damage because that’s what bad religions do. People who are not familiar with any world religion besides Christianity frequently miss this, because they think that all (real) religions look like Christianity but with different names and vestments.

I suspect that the idea that all religions look like Christianity was partially due to the many Protestant sects which superficially looked similar, since even the ones that did away with priests and sacraments still met in a building on Sundays for some reason. I suspect the other major part is that there is a tendency to describe other religions in (inaccurate) Christian terms in order to make them easier to understand. Thus, for example, Shaolin “monks”. There are enough similarities that if you don’t plan to learn about the thing, it works. It’s misleading, though.

You can see the same sort of thing in the practice of working out a tidy Greek pantheon, in which each god has specific roles and relationships, and presenting it to children in school. It’s easy to learn, because it’s somewhat familiar, but it’s not very accurate to how paganism actually worked.

All of this occurred to me when I was talking with a friend who said that the primary feature of a religion, it seemed to him, was belief in the supernatural. The thing is, the nature/supernature distinction is a Christian distinction, largely worked out as we understand it today in the Middle Ages. Pagans didn’t have a nature/supernature distinction, and if you asked them whether Poseidon was supernatural they wouldn’t have known what you meant.

Would the ancient pagans have said that there were things that operated beyond human power and understanding? Absolutely, they would. Were they concerned about whether a physics textbook entirely described these things? No, not at all. For one thing, they didn’t have a physics textbook. For another, they didn’t care.

The modern obsession that atheists have with whether all of reality is described in a physics textbook is not really about physics, per se, but about one of two things:

  1. whether everything is (at least potentially) under human control
  2. whether final causality is real, i.e. do things have purposes, or can we fritter our lives away on entertainment without being a failure in life?

The first one is basically an Enlightenment-era myth. Anyone with a quarter of a brain knows that human life is not even potentially under human control. That it is, is believable basically only by rich people while they’re in good health and are distracted by entertainment from considering things like plagues, asteroids, war, etc. Anyone who isn’t all of those things will reject number 1.

Regarding the second: ancient pagans didn’t tend to be strict Aristotelians, so they wouldn’t have described things in terms of final causality, but they considered people to be under all sorts of burdens: to the family, to the city, and possibly beyond that.

If you look at the modern religions, you will find the same thing. Admittedly, they don’t tend to talk about gods as much as the ancient pagans did, though even that language is on the rise these days. In what sense the Greeks believed in Poseidon as an actual human-like being versus Poseidon simply being the sea is… not well defined. Other than philosophers, who were noted for being unlike common people, I doubt you could have pinned ancient pagans down on what they meant by their gods even if you could first establish the right terminology to ask them.

As for other things, environmentalism doesn’t have a church, but pagans didn’t have churches, either. Buddhists don’t have churches, and Hindus don’t have churches, and Muslims don’t have churches. Heck, even Jews don’t have churches. Churches are a specifically Christian invention. Now, many of these religions had temples. Moderns have a preference for museums. Also, being young religions, their rites and festivals aren’t well established yet. Earth Day and Pride Month and so on are all fairly recent; people haven’t had time to build buildings in order to be able to celebrate them well. (Actually, as a side note, it also takes time to commercialize these things. People underestimate the degree to which ancient pagan temples were businesses.)

Another stumbling block is that modern environmentalists, vegans, progressives, etc. don’t identify these things as religions—but to some degree this is for the same reason that my atheist friend doesn’t. They, too, think of religions as basically Christianity but maybe with different doctrines and holy symbols. They don’t stop to consider that most pagans in the ancient world were not in official cults. There were cults devoted to individual gods, and they often had to do with the running of temples. Normal people were not in these cults. Normal people worshiped various gods as convenient and as seemed appropriate.

There is a passage in G.K. Chesterton’s book The Dumb Ox which is related:

The ordinary modern critic, seeing this ascetic ideal in an authoritative Church, and not seeing it in most other inhabitants of Brixton or Brighton, is apt to say, “This is the result of Authority; it would be better to have Religion without Authority.” But in truth, a wider experience outside Brixton or Brighton would reveal the mistake. It is rare to find a fasting alderman or a Trappist politician, but it is still more rare to see nuns suspended in the air on hooks or spikes; it is unusual for a Catholic Evidence Guild orator in Hyde Park to begin his speech by gashing himself all over with knives; a stranger calling at an ordinary presbytery will seldom find the parish priest lying on the floor with a fire lighted on his chest and scorching him while he utters spiritual ejaculations. Yet all these things are done all over Asia, for instance, by voluntary enthusiasts acting solely on the great impulse of Religion; of Religion, in their case, not commonly imposed by any immediate Authority; and certainly not imposed by this particular Authority. In short, a real knowledge of mankind will tell anybody that Religion is a very terrible thing; that it is truly a raging fire, and that Authority is often quite as much needed to restrain it as to impose it. Asceticism, or the war with the appetites, is itself an appetite. It can never be eliminated from among the strange ambitions of Man. But it can be kept in some reasonable control; and it is indulged in much saner proportion under Catholic Authority than in Pagan or Puritan anarchy.

Why Moderns Abhor Violence

One of the most noticeable characteristics of thoroughly modern people is that they have an absolute abhorrence of violence (when they can see it). One of the other most notable characteristics of thoroughly modern people is that their philosophy utterly undermines any moral restraint on violence and also eliminates all possibility of rational reconciliation, leaving power the only relationship between people. This may not be a coincidence.

In particular, it may be that people will only indulge in being modern (Modern philosophy, post-Modern philosophy, etc.) when they feel protected from the violence which is its natural consequence. An analogy may be the various stupid ideas children will entertain as long as they’re not the ones paying for things. (Ideas like: payment should be based on the amount of time someone puts into a job rather than the quality of their work.)

In like manner to how being vegan is a luxury good only made possible (to the degree that it even is long-term possible) by advanced technology and massive trade infrastructure, believing that morality is just an evolved set of preferences where none are any better than any others may be a luxury good for people who have an effective security force that does not believe this ready to ensure one’s safety. Or like how having a philosophy that only works for non-reproductive people is a luxury good for people with a steady supply of converts from reproductive people.

Tricking The Murderer Into Confessing With False Evidence

More common, I think, in television mysteries than in detective novels, is the technique a detective may use when the murderer has managed to commit the perfect crime, at least with regard to admissible evidence: the detective falsifying evidence in order to trick the murderer into confessing. I wonder how this was ever considered legitimate.

The fundamental problem with it is that, symbolically, the detective catching the murderer is supposed to be the triumph of truth over lies. The detective is supposed to be a Christ figure. The whole problem is that the murderer has misused reason to throw the world into disorder. The detective is supposed to triumph over evil through superior intellect, not through inferior morality.

A good way to see the problem with this approach is to consider that the confession is entirely unnecessary. If the detective knows who the murderer is and then fabricates evidence sufficiently well, that would be enough to secure a conviction without the confession. If a conviction is justice being served, then this is sufficient for justice to be served. Would anyone think it’s a good detective story if the murderer is convicted and hanged based entirely on evidence that the detective fabricated?

In fact, if the detective is willing to fabricate evidence to get a conviction, why bother with a trial at all? Why not have the detective cut to the chase and just assassinate the murderer without bothering to fake any evidence?

Oh, wait. That’s already happened. (That said, Dexter the TV series is categorized as “crime drama” and the novels as “supernatural crime horror”, not as mystery or detective fiction.)

As I said, I think that this trope is more common in television than in novels, and I can’t really think of any golden-age mysteries that feature it. I suspect that’s because it’s a crutch—a technique for writers who have written themselves into a corner and have a deadline approaching too fast to fix the problem. That could happen, of course, with short stories, or even with serialized novels where the author didn’t plan out the novel before the first five-sixths of it had been published. That’s why I don’t want to say that it never happened. Still, I can’t think of any examples.

I really wish that TV writers didn’t give in to it so often.