Good morning on this the seventeenth day of December in the year of our Lord 2016.
As I write this, the two oldest children are watching How the Grinch Stole Christmas for the second time. It’s one of my favorites too. Boris Karloff does an amazing job narrating and acting the story, and of course I saw it at least once per Christmas season as a child myself. There’s a lot made of the massive discontinuity between how children grow up and how their parents grew up, brought about partly by cultural changes but especially by technology. That’s certainly true, but in many ways technological change is slowing down, and those of us who grew up with technology are having our children grow up with at least similar technology. If we don’t tell many stories around the fire any more, we do watch the same Christmas specials. I’m not interested in arguing that it’s the same (it probably isn’t), but it is continuity. There are things my children do which are just like the things I did, and these form points of connection. As nice as it is to have things in common with my children, I think it’s much more important for them to have things in common with me. My oldest son found it very interesting that I used to watch Scooby Doo, and he’d often ask for “the Scooby Doo you used to watch as a kid.” Granted, it was very well done and I still enjoy it now (see my post about formative fiction), but I think that personal connection was important to him, too.
And on the subject of technology, there was for a while (I think most concentrated roughly in the 1950s through the 1990s) the idea that technology was on an exponential curve of improvement. You can find people who will talk about the singularity, the point at which technology really starts accelerating because technology is able to make itself without our intervention, which I jokingly summarize as, “and the word became silicon, and dwelt among us” (see John 1:14). And yet, this is not how a great deal of technology actually develops in practice. Consider cars, for example. From 1910 to 1960, the top speed of (ordinary) cars went from something like 20 miles per hour to around 70 miles per hour. By the 80s, however, the practical top speed of cars was something like 85 miles per hour. Again talking about ordinary cars, you wouldn’t want to drive a car made in the 2000s above 90 miles per hour. The engine and drive train and so on can take it; the problem is that the aerodynamics are awful. It’s not just a matter of air resistance, but the fact that the air pushes so hard on a vehicle at that speed that it isn’t safe to go faster. Between aerodynamic lift and sideways forces from the wind, it’s just dangerous to drive a common car that fast. I don’t think there’s much of a difference between cars made in the 2000s and cars made in the 2010s in that regard, and I don’t expect much improvement in cars made in the 2020s either. Most roads don’t permit you to go anywhere near 80 miles per hour anyway, so why pay lots of extra money and make trade-offs in convenience and interior space to be able to drive at such high speeds once every few years? And here we come to one of the most significant retarders of technological progress in the modern world: economics.
There are all sorts of things it’s technologically possible to do which do not get done because no one finds them to be worth the money. There was, a few years back, a high-powered rifle which used a Linux-based computer and a high-quality digital camera to identify targets; when you pull the trigger, it waits until the gun is aimed exactly where it needs to be to hit the target, and only then fires. It could accurately hit targets almost two miles away, I believe, but it cost well in excess of $50,000. So no one bought it, because, well, why would you? It’s very expensive and takes all the fun out of shooting. My guess is that they probably had military applications in mind and were just using the civilian market as a means of proving that it worked, but who knows? They stopped making it because of a lack of interest, and it no longer exists, so far as I know. It’s not that we can’t make it, it’s just that we don’t. (The “we” being our species.)
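If it helps to picture it, here is roughly how that hold-fire-until-aligned trigger logic might work. This is only a toy sketch of the idea as I described it, not the actual product’s code; every name and number in it is invented for illustration:

```python
import random

# Toy simulation of a "tag and shoot" trigger: the shooter marks a target,
# holds the trigger, and the rifle only releases the shot once the aiming
# error falls within tolerance. All figures here are made up.

TOLERANCE_MRAD = 0.05  # allowed aiming error, in milliradians (invented)

def current_aim_error():
    """Stand-in for the scope's tracker: how far (in mrad) the current
    ballistic solution is from the tagged point. Real hardware would
    compute this from image tracking plus a ballistics model."""
    return abs(random.gauss(0.0, 0.5))

def guided_trigger(max_checks=100_000):
    """Hold fire until the aim error is within tolerance, then 'fire'."""
    for _ in range(max_checks):  # loop only while the trigger stays held
        if current_aim_error() <= TOLERANCE_MRAD:
            return "fire"
    return "no shot"  # trigger released before the aim ever lined up

print(guided_trigger())
```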
Televisions are another interesting example of this. TV makers have a big problem: people don’t replace TVs very often. There was a big boom in demand back when everyone was switching from CRTs to LCD TVs, and they really want another replacement boom, but despite the fact that it’s now possible to replace one’s 40″ TV with a 60″ TV, most people don’t find that very necessary, and while they might go for a bigger TV when their current TV finally breaks, it’s not compelling to spend the money now. TV makers also hoped that 3D was going to be huge and drive another lucrative replacement cycle, but 3D offers very little over 2D (not nearly as much as color offered over black-and-white) and is generally too much of a pain in the neck to be worth it. 4K TVs are the current hoped-for rainbow with a pot of gold at the end, but the actual quality improvement over 1080p, as human beings evaluate it, is very minor. (Technically 4K has four times as many pixels as 1080p, though only twice as many along each axis, and it makes for a very slight increase in enjoyment to a human being.) I’m going to come back to this topic later and give it a more thorough treatment, but the upshot is: technology is slowing down in a number of key areas because our capacity to enjoy technology is becoming saturated. So I think we’re going to feel like there is more continuity between the generations than the people in the 1960s through the 1980s felt. Just a guess, but it’s looking like it, at least on the technological disruption front.
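For anyone who wants to check that resolution claim, the arithmetic is simple (1080p is 1920×1080 and consumer 4K UHD is 3840×2160):

```python
# Pixel counts for 1080p versus 4K UHD: four times the pixels overall,
# but only twice the resolution along each axis.
formats = {"1080p": (1920, 1080), "4K UHD": (3840, 2160)}

pixels = {name: w * h for name, (w, h) in formats.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

print("pixel ratio: ", pixels["4K UHD"] / pixels["1080p"])          # 4.0
print("linear ratio:", formats["4K UHD"][0] / formats["1080p"][0])  # 2.0
```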
God bless you.