Technology Feed

Peter Hitchens On the Automobile

At The Lamp's blog, Hitchens has a great little essay on the wrong turning civilization took when almost everyone got a car. It starts as a personal matter with him. He just doesn't like cars, period:

Life would be a lot easier if I did not hate motor cars. But I just do hate them. I have tried not to. I even learned to drive at the age of thirty-one, a terrible surrender made as I sought to fit in with what felt increasingly like a compulsory faith. But I never really submitted, and have since drifted away from it....

I would be dishonest if I pretended to have this fine and unequivocal disdain for the thing. Like most Americans, I got my driver's license as soon as I possibly could when I turned sixteen. I was never exactly what one could call a car person, but there was a period of a couple of years in my teens when I read hot rod magazines and assembled plastic model cars. When Tom Wolfe published The Kandy-Kolored Tangerine-Flake Streamline Baby I knew what the title referred to. I knew who Ed "Big Daddy" Roth was. I coveted a Jaguar XKE.


I spent weekend nights riding around in the nearby towns, although what I was driving was hardly cool: a low-end 1959 Chevrolet with a 6-cylinder engine and a rusting body. It was not improved by the small fire I started in the back seat when I flipped a cigarette out the driver's window and into the open window behind me.

(The one I drove was not decorated like this one.)

A friend of mine sometimes had the use of a T-Bird. My one-time girlfriend had a white Mustang. Those were about as close to cool as I ever got.

By the time I was nineteen or twenty that interest had vanished completely. For some while a sort of detached admiration for certain cars persisted. I recall seeing a dark green Mercedes on Sundays at my parish decades ago, and thinking it was nice looking, but I had no desire to own one. I've never owned a vehicle that would merit any interest or even respect from a person seriously interested in cars.

Even the mild appreciation that continued faintly into middle age dissipated altogether. The arrival of the so-called "sport utility vehicle" really killed it. Once upon a time the term had a connection to reality, but that's gone. I have an active disdain for the really big ones, the Tahoe and such, and the luxury ones, like Mercedes and BMW "SUV"s. Some Americans seem to look back at the big cars of the '50s and '60s and congratulate our modern selves on having better taste than the people of those days. One can argue about the aesthetics but there is absolutely no room for most contemporary Americans to look down on those of the 1950s for their love of the big powerful cushy automobile.

I now drive a 2010 Honda Civic and plan to continue driving it until either I die or it does. But I admit that I still like to drive under certain conditions, still enjoy a long drive on a highway with not too much traffic. So to that extent I am not on the same page as Hitchens.

It isn't just his personal dislike, though. He also hates what the automobile has done to English cities, towns, and countryside. And there I'm very much with him:

And, if you do not love automobiles for what they are, I think you are bound in the end to hate them for what they do. Look at the way they spoil every prospect. A line of garishly colored cars parked in a beautiful city square wrecks the proportions of the place. Their curious shapes, inhuman and flashy, clash violently with almost every style of architecture except the most brutal concrete modernism. The incessant noise and smell of them, the horrible danger they represent to soft human bodies, the space they take up, are all outrages against peace, beauty, and kindness. Near where I live, there are several roads where drivers are officially encouraged to park on the sidewalk, because if they parked on the narrow road, it would be impassable. The logic of this is inexorable, once you have assumed the supremacy of the car. But if you are a car heretic, the thing is a blood-boiling outrage. 

I probably have (if it's possible) an even more intense resentment on that point because of the scale on which America has done this. I doubt that England has as many of the 100% auto-age urban areas that we do. We have vast cityscapes that did not exist before 1950 or so. Some of these have arisen around an existing city of significant size, some have only a little town at their centers, surrounded by a sprawl many times larger. And all of it is built on the assumption that everyone has a car and will get into it and drive somewhere for any activity that does not take place in his own home.

I grew up fifteen miles or so from Huntsville, Alabama. At the beginning of World War II Huntsville was a small town of roughly 13,000 people. Because a military weapons development center was located there, the population grew during the war and was up to 20,000 or so by its end. After that the military installation was mainly devoted to the coming thing in weaponry, guided missiles, and then a major piece of NASA was located there. Its currently listed population of something over 200,000 doesn't tell the story of the growth of the area, as Huntsville and/or its suburbs now extend literally to the driveway of the house I spent most of my childhood in. The two smaller towns within commuting distance have also grown and sprawled so that the three have almost grown together. The connecting highways are clogged. Most of the area between my former home and the middle of the city, the old town square which is only a relic, is literally unrecognizable to me now. When I go there I find it difficult to stop complaining. 

Here is the link to the Hitchens piece again. It's called "The Great God Zil," and I'll leave it to you to discover what Zil is or was. And I'll add that his denunciation of the huge pompous belligerent motorcades that now move our rulers around is not the least important and savagely enjoyable part of the article.

Man Bites Nuclear Dog

Finland's Greens Now Fully Behind Nuclear Power

I don't think I've ever written a post about climate change here. I don't post all that much about political and social issues anyway, but still, it's mildly surprising that in all the years (eighteen!) I've had this blog I've never written a post specifically about it, considering how much attention it gets. I've mentioned it here and there, but as far as I can tell only in passing. Part of the reason, the major part I guess, is that I don't take it all that seriously.

As a threat, I mean. I'm willing to believe that it's happening, and to believe that industrialism, the automobile, and so forth are causing it, or at least contributing to it. I'm willing to believe it because many people who know a lot about the climate tell me so. And I'm willing to believe that the rising temperature will or at least may cause problems. I say "may" because there is such a blatantly tendentious (to put it mildly) effort to link any problem, especially but not necessarily if it includes weather or any aspect of the natural world, to climate change. Maybe some of those are valid, maybe not, but the political intention is so obvious that it invites skepticism--practically requires it.

But I'm unwilling to believe that climate change is going to make the planet uninhabitable, or result in the deaths of some large percentage of the human population with the survivors becoming hunter-gatherers, or have any of the other world-ending or otherwise massively catastrophic consequences that are predicted. 

I don't mean just that I reserve judgment, or am somewhat skeptical, but that I actively disbelieve it. One reason is that these predictions, like the attribution of existing phenomena, have an obvious political motivation. Another is that many of them seem implausible, predicting consequences that seem far in excess of the predicted rises in temperature. But the biggest reason is that the environmentalists and others who are loudest in their alarms do not appear to truly believe what they are saying. Maybe they're purposely exaggerating for effect, as political crusaders generally do, apparently not having absorbed the lesson of the boy who cried wolf.

It's not only because so many of the crusaders are hypocritical--owning multiple enormous homes, jetting to climate conferences, and all that. Yes, it's hypocritical, but hypocrites we will always have with us. The biggest reason is that for the most part they don't allow a place for nuclear power in mitigating the problem. Compared to fossil fuels, especially from the point of view of climate change concerns, nuclear power has some major benefits. I won't bother reproducing the arguments on that score, as they are easily found all over the net. 

Obviously it has significant dangers, too. But to rule it out entirely, and put all your hope on the unlikely prospect that wind and solar power can replace fossil fuels anytime soon, only indicates that you don't really believe that the danger is as great as you are saying. If there's an 18-wheeler coming straight at you, and your only alternatives are to go into the ditch or have a head-on collision which will certainly kill you, you don't say "The ditch is not an option. Much too dangerous."

It's good to see at least one environmentalist group be realistic on this point. I get the impression that a not-insignificant number of young people are being driven into terrified despair by the wild alarmism of many. Poor Greta Thunberg is blaming the wrong people for stealing her childhood. 


What's In These Names?

Just humor, I guess:

Allen Doss, Darron Tuff, Jan Fugg, Russell Fiery, Angelo Legend, Amy Haggis, Andreas Weeder, Jasen Foul, Adolfo Slaughter, Daron Matins, Seneca Zen, Curtis Isogloss, Curt Hubble, Shea Roo, Charles Heavy, Jamie Bovver, Brant Verve, Dominick Thrawn, Jayson Nil, Hassan Sass, Jamil Point, Doyle Dyke, Bennie Fleer, Moshe Fraud, Kimberly Waker, Martin Beth, Rex Pochard, Jeffery Croon, Louis Kauri, Kenneth Disc, Bryce Fedora, Scott Grocer, Marcus de Brief, Maurice Jar, Guillermo Balk, Andy Pitt, Sammy Gearing, Leon Brandish, Norman Purple, Harold Dazzle, Esteban Woolly, Jeremie Cue, Erwin Antics, Brent Clean, Brandon Fretsaw, Cristopher Homely, Paulo Sketch, Marlin Haymaker, Derik Cayman, Mickey Mustang, Loren Sequin, Jorge Pure

I've been trying out a writing tool called Scrivener. It includes a good set of word processing tools, but goes beyond that with tools for managing a book-length project. You can have lots of separate pieces of text, from a paragraph to a chapter or whatever, and move them around easily, and these components are not separate files but are all right at your fingertips. This is a bigger deal than you might think if you've never tried to write a book and had trouble organizing it. 

And there's much more. I think it's going to be a big help to me, but right now I'm still trying to adjust to it and figure out what it can do. Poking around in the menus, I found, several levels down, something called Name Generator. I looked at it and it's exactly what the name says:


Of course I had to try it out, although I'm not writing a novel. And the names above are what I got. I couldn't understand why it would propose such weirdness for someone writing a novel. Then I realized that though I had set the first-name type to "Popular US Names (Male)," there is no corresponding list of surnames, so I checked "Potential Dictionary Surnames." (There is a "Popular British Surnames" choice, and, oddly to an American, also "Popular London Surnames." I'll guess that it includes more non-English names.) I'm pretty sure no more than a third or so of these have ever existed in real life, though there is one that's only one letter off from that of someone I know. I suppose you might use some of them if you were writing some kind of Douglas Adams type thing. "Maurice Jar" seems un-randomly close to "Maurice Jarre".
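If you're curious why that setting produces such oddities, it's easy to reproduce: pair any popular first name with any dictionary word pressed into service as a surname, and you get exactly this flavor of weirdness. Here's a hypothetical sketch of the idea (the word lists are my own stand-ins, not Scrivener's actual data):

```python
# Illustrative sketch of a "dictionary surname" name generator.
# The lists below are stand-ins; Scrivener's real lists are much larger.
import random

first_names = ["Allen", "Curtis", "Mickey", "Maurice", "Jorge"]
dictionary_words = ["Doss", "Isogloss", "Mustang", "Jar", "Pure"]

def generate_name(rng):
    # Any dictionary word can become a "surname" -- hence the weirdness.
    return f"{rng.choice(first_names)} {rng.choice(dictionary_words)}"

rng = random.Random(0)  # seeded so the output is repeatable
names = [generate_name(rng) for _ in range(3)]
```

Every result is a plausible first name welded to an arbitrary word, which is more or less what the list above looks like.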

So if you ever come across a fictional character called Mickey Mustang, you can figure the writer used Scrivener. 

Listening to a Book Vs. Reading It

There's an interesting article by Art Edwards in (at?) Quillette about the difference between reading a book and hearing it read: Listening to Literature—What We Gain and Lose with Audiobooks. It's rather long for online reading, but worthwhile if you're interested. The author wonders whether listening to a book really counts:

The one constant of my reading life is that I always want to read more. If audiobooks offered me nothing else, they offered me that.

Or did they? Was I reading these books? I didn’t know. A search online revealed a piece on the subject quoting Daniel Willingham, a psychologist from the University of Virginia and author of 2017’s The Reading Mind: A Cognitive Approach to Understanding How the Mind Reads: “What you find is very high correlations of reading comprehension and listening comprehension.” In other words, one’s ability to listen well to an audiobook corresponds directly to one’s ability to read well—the issue is largely a matter of personal preference.

And preference is strongly affected by, so to speak, competence. The writer does most of his listening while driving, some of it involving long and fairly demanding non-fiction, and feels that he has understood these more or less as he would have done if he had read them. I could not do that. Not only would I not absorb the book, I would find the whole process unpleasant. I've often suspected myself of having a mild case of attention deficit disorder, though I think that would be an overly dramatic way of saying simply that my powers of concentration are low. (I think I'll just put the question of basic intelligence aside.) When I read something very complex, I have to make a more or less continual effort to keep my mind on it, even if I'm sitting on the couch at home with nothing and no one to distract me, and frequently have to re-read a sentence or a paragraph. It's a problem even with less demanding fiction, such as murder mysteries. 

And while driving? No, impossible, not to be considered. Of course I would not read some of the stuff this guy reads anyway. Capital in the 21st Century by Thomas Piketty? On the face of it, unreadable in any case, even if it were not 800 pages long. But twenty-five hours on twenty CDs, while driving? I would only do it for a substantial hourly wage and on the condition that no one would ever ask me to demonstrate that I had understood it. 

I can't help wondering if this fellow is a hazard on the highway when listening to such things. I find that any driving conditions much more demanding than continuing straight on a rural interstate with light traffic disrupt my listening. It's strictly fiction for me when driving, and it can't be very densely written. Even under the best conditions, it happens fairly often that I realize I've missed something--or, possibly even stronger evidence of distraction, am not sure whether I missed something or not--and have to back up for half a minute or so. But then, as I say, I'm easily distracted. 

Just a couple of weeks ago I was making one of my fairly frequent drives to north Alabama and back, a 350-mile trip that takes at least five and a half hours (each way). I had borrowed two audio books from the local library, The Saint Zita Society by Ruth Rendell and Sacred Clowns by Tony Hillerman. The latter was meant as a sort of fallback, something I knew I would enjoy, in case the former was unsatisfactory in some way. And that turned out to be the case. I started the Rendell book. It opens with a gathering of people and immediately introduces a half-dozen or so characters. I was already somewhat distracted by having had several errands to run before I hit the road, and made the mistake of starting the book before I was out on I-65. I immediately had trouble getting the characters straight, and switched to the Hillerman (not his best, by the way, but I enjoyed it, as always).

But apart from my own limitations as to what can be enjoyably listened to I certainly agree with the author that listening to a well-performed book can actually be a richer experience than reading it. I mentioned one such last fall, Josephine Tey's The Franchise Affair (see this post). There have been a number of others: Lewis's "space trilogy," Brideshead Revisited, several of Patrick O'Brian's Aubrey-Maturin novels, various English mysteries. You'll note the common thread of Englishness in all these. I'm not sure whether it's the gifts of the actors doing the reading, their ability to voice characters differently and effectively, or just the fact that they aren't reading in the colorless voice which is what I get when I read them in print. Mostly the former, I think. I can't offhand think of any American books read by Americans that gave me quite the same striking coming-alive sense as the English ones.

On the other hand, as the Quillette writer also notes, a reader you don't like can spoil or at least get in the way of your enjoyment of a book. There was an element of that in my reaction to Cormac McCarthy's The Road. And on another recent trip I listened to a thriller by Lee Child (Night School), and although I found the book entertaining enough for the most part, the reader had some mannerisms that I found annoying enough to discourage me from choosing anything else by that author read by that reader. Though now that I think of it, it was also this book that caused me to write a brief blog post complaining about cringey sex scenes in novels.

Le Spleen de Samsung?*

My DVD player is having problems so I started looking for a new one. Of course all sorts of stuff has changed since I bought this one--everything is Internet-enabled, etc. So, trying to figure out what I actually need and/or want, I was reading the users' questions and answers for one I'm interested in, and saw this:

Question:  Does it have app for MLB tv
Answer:     i’m not good with tech stuff - i don’t know what MLB means and i hate apps
By Brandy [name redacted] on March 5, 2019

I would almost sort of like to meet Brandy.

* (Baudelaire, Le Spleen de Paris: "The title of the work refers not to the abdominal organ (the spleen) but rather to the second, more literary meaning of the word, 'melancholy with no apparent cause, characterised by a disgust with everything.'")

More AI Nonsense

On stilts. At Vox: "Robot priests can bless you, advise you, and even perform your funeral".

I'm slightly surprised that the author is a former religion editor of The Atlantic. Sadly, I'm not at all surprised that a Catholic theologian--a Franciscan sister, no less--is on hand to add some extra touches of fatuousness. 

“So would I want a robot priest? Maybe!” she said. “A robot can be gender-neutral. It might be able to transcend some of those divides and be able to enhance community in a way that’s more liberating.”

Skynet Is Never Going To Become Self-Aware

For the benefit of those who have managed not to have seen the Terminator movies, or even to have picked up the pop culture lore that originated with them: Skynet is the computer system that initiated nuclear war of its own volition, and began to rule the world in its own interests--which were not those of its inventors. Never mind the rest of the plot, as you probably either know it or don't care.

Today is the day (or rather the 22nd anniversary of the day) on which, in the movie, the catastrophe occurs:

The system goes online August 4th, 1997. Human decisions are removed from strategic defense. Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th. In a panic, they try to pull the plug. (from IMDB)

Hardly a week goes by that I don't see some mention, either a "science" article in the news, or discussion of a movie or a TV show, which deals with some variation of the idea that artificial intelligence will at some point develop human consciousness, "self-awareness," an independent will, emotions, and interests, and so forth: in short, become a person. Sometimes this is presented as a bad thing, sometimes a good thing. 

If this worries you, or if you are eagerly anticipating it, I'm here to tell you: forget it. Don't worry about it. It is never going to happen. If you like stories that play with the idea, fine, have fun, though they're not usually to my taste. But don't concern yourself with the possibility that it might actually happen. 

How can I be so sure? Well, of course I can't say it with 100% certainty. But I'll commit to 99.999%. The reason is that I know how computers work. I am not a computer scientist, just a journeyman programmer, but I do know how the things work at the most fundamental level. And I also can see, as anyone who bothers to think about it can, that the idea that they can develop consciousness is based on a naturalistic assumption about the nature of consciousness--that it is, in us, an epiphenomenon of the brain, and therefore a probably inevitable development in computing machinery that mimics certain functions of the brain. This is a pure act of materialist faith. There is no evidence for it. No one can actually describe in detail how it can happen; it's simply postulated.

We speak of computers "knowing" things, and in a sense they do. But in the same sense it can be said that a light bulb "knows" when it should glow. It "knows" because you flipped a switch that supplied electricity to it. If you set up an array of 256,000,000,000 lights (very roughly the number of on-off switches in a 32 gigabyte computer memory), and rigged up an elaborate mechanism in which symbols representing letters and numbers were encoded as sets of lights that are either on or off, and could be manipulated so that the information represented by the symbols was constantly shifting in accordance with your instructions, do you think the array would somehow have the capacity to "know" what the symbols mean? 
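The arithmetic and the symbol-shuffling in that thought experiment can be sketched in a few lines of Python. (The 8-bit ASCII encoding and the "bump every code up by one" instruction are my own illustrative choices; a real memory controller works differently, but the point is the same: switches flip, and nothing anywhere "knows" what the patterns mean.)

```python
# The thought experiment's arithmetic: a 32-gigabyte memory as an
# array of on-off switches (bits). 32 billion bytes x 8 bits each.
bits_in_32_gb = 32 * 10**9 * 8  # 256,000,000,000 switches

def to_lights(text):
    """Represent each character as 8 on/off 'lights' (its ASCII code in binary)."""
    return [format(ord(c), "08b") for c in text]

# Symbols encoded as patterns of lights that are either on or off...
lights = to_lights("CAT")  # ['01000011', '01000001', '01010100']

# ...and shifting in accordance with instructions: here, an "instruction"
# that raises every code by one. The lights change state; no part of the
# machinery has any idea what "CAT" means.
shifted = "".join(chr(int(pattern, 2) + 1) for pattern in lights)  # 'DBU'
```

The manipulation is real and useful, but it is all the machine does.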

The fact--I feel justified in calling it a fact--is that there is no reason to believe that consciousness is a by-product of any physical process. For dogmatic materialists, it's a necessary belief: consciousness exists, therefore physical processes produced it. To the rest of us it only sounds plausible and reasonable because we're so used to looking at the world through materialist assumptions, or at least prejudices. 

If you want to worry about the risks of artificial intelligence, worry about the extent to which it is able to simulate human mental processes by means of a combination of calculation and symbol manipulation, and thus do things which once required human intelligence. In combination with very advanced robotics, these machines can do an awful lot of jobs that are now, or were once, done by people. That's been going on for some time now, and it has serious social implications. But the machines aren't going to start conspiring against us.

The speech recognition and synthesis involved in, for instance, Apple's Siri, and the fact that the computer that does it can fit in your pocket, do seem almost miraculous compared to the technology of forty years ago when I first got involved with computers. But we all know Siri is not in fact a person. That's part of the reason why it can be amusing to ask "her" silly questions, give "her" impossible commands, and so forth. I think if you dug around you could find predictions made thirty or forty or fifty years ago that computers with the speed and capacity of your phone could and probably would develop true conscious intelligence. But that's no closer than it ever was. Or ever will be.


Machine vs Bugs

I logged in to Facebook and there was a link to a YouTube video from a band I "liked" (and like), Laki Mera. So I thought I'd listen to the song. It was good. Then I noticed on the YouTube sidebar a video called "Bothering Bald-faced hornets with an Action Drone AD-1". Why did YouTube connect that to the music video? I have no idea. But it caught my attention.

*That looks interesting.* No it doesn't. *Liar.* It's 13 minutes long. *So what?* Okay, I admit I'm curious. *Of course--who wouldn't want to see a drone bothering a hornets' nest?* But I don't want to take that much time. *Seeing a drone bothering a hornets' nest is worth 13 minutes of your time.* Yeah, but it's getting kind of late, I should do something more useful. *I repeat, seeing a drone....* Okay, okay. *You just have to take out the recycling, you have plenty of time for that.* OKAY.

So the voice in italics won and I watched it. You can, too. It is exactly what the title says--well, a bit more than "bothering"--it's a drone attacking a hornets' nest. And it is kind of fascinating. But I will warn you that at several points the hornets are flying so thick that it's creepy, and I'm a little concerned that it could give me a nightmare, so I'm going to get away from the computer for a while before I go to bed.

I suggest watching/listening to the Laki Mera video to help clear that out.


Sunday Night Journal, April 22, 2018

It seems I'm not the only person who thinks the most noticeable thing about Twitter is the amount and level of venom it seems to produce, very often in combination with stupidity. I freely admit that this is unfair on my part, because I'm not on Twitter, and so all I hear about for the most part is the controversies that spill out onto the Internet at large--the "Twitter mobs," as they are aptly called, which form and attack someone who has attracted their hostility. But the other day I saw a graphic associating various "social media" platforms with one of the seven deadly sins, and whoever composed it assigned Wrath to Twitter.

I know it isn't all bad. Much of it is harmless, and some of it is probably good. My wife the archivist occasionally mentions Twitter posts from archivists, or from museums or libraries, which contain interesting bits of lore. The Archbishop of Mobile uses it to tell the world that, for instance, he will be in Thomasville doing confirmations this weekend. I've seen jokes from Twitter that were actually funny.

But none of that changes my basic animosity toward it, which pre-dates any of the pathological phenomena. I was ill-disposed to it from the moment I heard of it, because of its name and because what one does with it is to "tweet." I took this as an open declaration that it was designed to be a vehicle for noisy, frequent, and trivial remarks. "Venomous" did not immediately occur to me as a likely possibility, though it probably should have; it's not as though the Internet was an altogether benign place before Twitter.

I had an instant conviction that it was not for me and that it was to be ignored, and I noticed with mild horror as it grew over the years and became one of that very small group of Internet platforms that, taken together, almost seem to define the net itself: Google, Facebook, YouTube, Twitter. My disdain has even extended to a reluctance to refer to it at all, and a great resistance to beginning a sentence with "So-and-so tweeted that...." How can you possibly take seriously anything that follows those words? It's like saying "So-and-so yapped...." "So-and-so grunted...." "So-and-so belched..."

Never of course did it cross my mind that a duly-elected president of these tenuously-united states would use Twitter as his favored means of addressing the nation and its problems (to say nothing of his own personal grudges) and that he would do it every day, so that no day would pass without a news story beginning "The president tweeted that...."

I found myself unwilling to use the word "tweet," whether noun or verb, in a sentence, without putting it in quotation marks. To do otherwise seemed to give it some kind of legitimacy that it didn't deserve, to suggest that I was somehow approving of it as a means of rational conversation. But this week I've had a slight change of heart. Very slight. I'm giving up the quotation marks.

This change was catalyzed by a case you probably heard of: a nutty professor who took the occasion of Barbara Bush's death to tweet (there, I've done it) that the former First Lady was, among other bad bad things, an "amazing racist," and that her death was an occasion for joy. A great furor immediately erupted, of course, including, apparently, a lengthy exchange of hostilities between the professor, Randa Jarrar of Fresno State University, and people who criticized her. If you have somehow managed to remain ignorant of it, this Washington Post story has the basics. And it struck me: why should I feel any need to find a word more serious than "tweet" for this sort of jibber-jabber? Tweeting is appropriate and pleasant from a goldfinch. Coming from a human being it's ridiculous. If people are going to use this medium which makes them appear bird-brained, why should I try to dignify their twittering with a less silly-sounding word?

Almost as striking to me as the professor's charge of racism was the way she used the word "amazing." This is a verbal tic which I associate with teen-aged girls, who seem to use it as a sort of all-purpose positive hyperbole, the way they used to use "awesome" (maybe they still do). Anyone the teenager likes is "an amazing person." To enjoy oneself is to have "an amazing time." She just ate "an amazing apple." And so on. I think this is the first time I've heard it used in a negative sense, though I suppose the word itself is just as applicable to something amazingly bad as amazingly good.

Actually, looking more closely at what the professor said, I'm not sure that she meant "amazing" as negative. Her words were "Barbara Bush was a generous and smart and amazing racist." What exactly does "amazing" mean there? Positive, but ironic? But then, who cares? It was a tweet.

I can't say I was much surprised to learn that what the professor professes is creative writing. Maybe she can write. Who knows, and, again, who cares? I don't plan to find out. I was actually a little surprised to find that her title is in fact "professor." The news media tend to use that word for anyone who teaches at the university level, though it is in fact a specific title, coveted and respected.

Naturally there have been calls, loud calls, for her firing. Some conservatives have spoken against that and I think they're right. In truth Jarrar seems like a rather pathetic person, who would be more pitiable than anything else if she didn't have such a mean streak.


Here is Kevin Williamson's summary of the role of Twitter in his firing from The Atlantic.  

Where my writing appears is not a very important or interesting question. What matters more is the issue of how the rage-fueled tribalism of social media, especially Twitter, has infected the op-ed pages and, to some extent, the rest of journalism. Twitter is about offering markers of affiliation or markers of disaffiliation. The Left shouts RACIST!, and the Right shouts FAKE NEWS! There isn’t much that can be done about this other than treating social media with the low regard it deserves.

You may be able to read his side of the whole story here at The Wall Street Journal. (Thanks to Grumpy for the link.) I found that from my phone that link was blocked as being subscriber-only, but from my laptop it wasn't.


Speaking of people shouting "RACIST!": a month or two ago I stepped briefly into a discussion on Facebook in which Rod Dreher was being taken to task for having a "race problem." This seems to be what you say when you want to call someone a racist but don't have enough evidence to justify that word and are not, as someone like the professor above is, willing to use it anyway. Dreher had linked, in one of his many, many columns, to an article in which a former Peace Corps volunteer who had once spent a year in Ethiopia talked about how much she had disliked it. The context was Trump's alleged remark about not wanting immigrants from "s**thole" countries; Dreher was thinking aloud, as he often does, about whether some countries are in fact very messed-up places whose emigrants might not be desirable.

I didn't think Dreher's link was evidence of racism, especially considering that Dreher has written often and sympathetically about the situation of black people, and said so, remarking that the charge of racism is becoming meaningless. That was immediately taken to mean that I was denying the existence of racism. I of course intended for "meaningless" to apply to the making of the charge, not to the existence of the thing itself. It rankled me that I was being misunderstood, but there were at least a couple of people in the conversation who seemed very eager to make the charge, and it looked to me like any attempt to clear things up would only result in my acquiring a "race problem," if in fact I hadn't already, if only because I was defending Dreher. So I made a sort of quick intuitive cost-benefit, risk-reward assessment and dropped out of the discussion.

In our current polemical environment, to attempt to defend yourself from a charge of racism is one of the dumbest things you can do. It's like blood in the water for the attackers, and you cannot win. The more you protest your innocence, the guiltier you will look to them. Best to just walk away.

Why am I even relating this little incident? Because it did rankle, and it gives me some sort of satisfaction to say publicly and clearly what I meant. Why didn't I say it then and there? Because of the factors I've already mentioned, and because it was Facebook, and I don't personally know any of the people involved. I have no reason to think any of them reads this blog, but if they want to argue with me here, I can require a level of respect and decorum that I can't on someone else's Facebook post. In all the years I've been blogging I don't think I've deleted more than two comments, but I like having the tool available. 


And speaking of Dreher: I have often criticized his overly agitated approach, and in fact have for long periods not read him at all. Lately I've gotten into the habit of reading him regularly. That's not altogether a good thing, because he does focus on all the bad things that are happening, and specifically on the deteriorating situation of Christianity in this country and in Europe. But though those are not the whole story, they are happening, and it's well to keep an eye on What Is Actually Happening.


I don't know whether this is referring to some of the German bishops or what.


Sunday Night Journal, January 7, 2018

How, how is it possible that this is the year 2018 A.D.?  I have a clear memory of sitting in Mrs. Bruce's 6th grade class, which means it was 1959 or '60 and I was eleven or twelve years old, and wondering for the first time (as far as I remember) how old I would be in the year 2000. I remember doing the arithmetic to find out...borrow one, ten minus eight is two, nine minus four is five.


Astonishing! Inconceivable! I truly had no way to conceive of that length of time, much less what it would be like to be that age.

A few years later I was reading science fiction, much of which was set in that far-distant time. As late as 1968 Arthur C. Clarke and Stanley Kubrick thought it reasonable to set their drama of the next step in human development (guided by those substitute gods, super-intelligent extraterrestrials) in 2001. They furnished their vision with a space travel infrastructure comparable to that which existed for air travel at the time. Well, that has strikingly failed to appear, along with many other visions of a future set in or just beyond the beginning of the millennium. The bright sleek shiny ultra-modern future has in general failed to materialize (as has the dystopia that apparently became the preferred vision sometime after the 1960s). The only really significant technological development since 2000 has been the emergence and spread of the pocket-size supercomputer, otherwise known as the smart phone. Arguably the web of information and communication, and the computers which constitute and give access to it, are the only really significant technological developments since 1970--at least if by "significant" we mean something that changes daily life in some important way for almost everyone.

And culturally--well, some would say we've advanced, some that we've declined, and I favor the latter view, but all in all the change has not been so dramatic as science fiction writers expected. We've been pretty well stalled for most of the past fifty years, actually: the cultural revolution of the late 1960s happened, and things were rather different in 1975 from what they had been in 1965. But since then they've been fairly static, really, at least domestically--the culture war started and the antagonists have been locked in a struggle ever since, neither side winning a clear and decisive victory. 

Most striking to me is the swiftness of the passage of the eighteen years since 2000. It's a tiresome thing for an old person to say, I know, but it keeps occurring to me because it's so astonishing. The turn of the millennium is roughly as many years in the past for us now as the end of the Second World War was at the arrival of the Beatles. This year, children born in 2000 will be finishing high school. They've gone from being newborns with everything to learn to having at least some level of education and ability to manage for themselves. But for me it's mostly been a sort of plateau, and I've traversed it very quickly.

For someone my age the phrase "the year 2000" once had the aura of distant, exciting, hardly imaginable futurity. Now it's just an ordinary and rather dull bit of the past, and for me not a very far distant past, as it must be for someone who will be, say, just old enough to vote and buy alcohol this year. 

And fifty-two seems fairly young. Or at least not old.

I'm happy to say, by the way, that I never was much worried about the Y2K disaster. There was a guy who gained a fair amount of fame for himself at the time by predicting, with a great deal of confidence and a fair amount of irrelevant or invented data, that civilization as we know it could not survive. I thought he was nuts, unless he was lying. A few years after the day had come and gone with barely a ripple, I went looking online to see what he had to say about having been proved wrong. I couldn't find any trace of him.


I'm also happy to say that I've managed pretty well to keep to my resolve to stay out of the controversy surrounding Amoris laetitia, and in general the controversies surrounding the papacy of Pope Francis. I've not only stayed out of them but have gotten pretty good at ignoring them. But yesterday someone recommended to me this piece by Christopher Altieri at Catholic World Report as being "the most balanced explanation" of the situation he'd read. Well, to be honest, he actually said "of the current mess." And it does seem to be something of a mess. Maybe it shouldn't be, maybe it's really much ado about nothing. But in that case the fact that it seems so important would have to be counted as part of the mess. At any rate, Altieri's piece does seem a pretty good explanation of why it isn't crazy (pharisaical, etc.) to have concerns about what the pope seems to want to do.

It strikes me, and it is not a happy stroke, that what's going on in the Church looks a lot like what's going on in American politics:  bitter factions hurling invective at each other, "Pope Francis can do no wrong" vs. "Pope Francis can do no right." As with the political situation, though, I don't really see that much effect in my own life and in my own place. My diocese, the parishes within it, and my Ordinariate group muddle along as usual, trying to practice the faith to the best of our ability. The Pope is far away and the fight doesn't directly affect us (not counting the clergy, for whom it may have immediate practical import in their ministry). I think I'll be glad if the era of the celebrity pope fades away. Much as I loved John Paul II, I was always a little concerned about that. 

One happy effect of all this, as Altieri notes (though not happily), is that the folks at America magazine, and the similarly-minded, now have the opportunity to denounce their opponents as "dissenters." It's a pretty silly charge, but it must feel pretty good to be able to make it, after several decades of digging in their heels against the last two popes. So let them have their fun, I say.


I've been meaning to mention that Francesca Murphy is now doing a bi-weekly blog post at First Things. They're excellent, and I think this piece, "The Secrets of the Confessional," is my favorite so far, a reflection on the fact that "Feeling great after confession is probably the most widespread experience in Catholicism, a religion not founded on religious experience as such."


I never was very good at learning the rules of grammar, and am only bothered by the mistakes which I don't usually make. One such has been bothering me lately. I had thought the relevant distinction was between "transitive" and "intransitive" verbs, but strictly speaking that's about whether a verb takes an object; what I have in mind is whether the subject is doing something or having something done to it. Whatever the correct terminology is, the distinction seems to be breaking down. I keep running across sentences in which a transitive verb is used as if it were intransitive, with the thing acted upon as the subject. For instance, I was about to subscribe to a certain magazine a few days ago, and the web site referred to "the Spring issue which releases in March." Shouldn't that be "is released"?

And this: if you do this or that on your computer, you're told, "a message displays." Shouldn't that be "is displayed"?

And this: "It transforms into...." Shouldn't that be "is transformed into"?

Like I said, I don't really even know the rules. I just play by ear. But these are like off-key notes to me.

I'm giving up on "I'm going to lay down" and "He said to John and I." Those battles are hopelessly lost.

But I reserve the right to laugh when I hear individual persons referred to as "they" and "them" in deference to gender-bending fads. As in "My boyfriend has recently come out as transgender, and they are saying I'm a bigot because I'm thinking of breaking up with them." I actually read something like that a few days ago, though I can't find it now. As I've mentioned before, those who think Donald Trump is inaugurating the reign of Orwellian Newspeak and thoughtcrime are barking up the wrong tree entirely.  


We had an actual cold snap last week. By "actual" I mean that the temperature was low enough (mid-20s Fahrenheit), and the wind from the north strong enough (mid-20s MPH), to qualify as being, so to speak, objectively cold. Sometimes in winter, or in a hurricane that hits at the right angle, a strong north wind can blow much of the water right out of the bay. That was the case when this picture was taken on New Year's morning. Under normal conditions the only thing you would see besides water and sky in this picture would be the top of that furthest post. This debris, by the way, is partly trees fallen years ago and partly the remains of the city's original sewer line. I really should try to get them to remove it.


Banana trees do not take the cold very well. This was taken the following morning.


Sunday Night Journal, December 17, 2017

One of the things that make being a pessimist less enjoyable than it might be is that the pleasure of being proven right about the impending collision with the oncoming train is substantially diminished by the discomfort of experiencing the collision itself. On balance, I would rather be wrong than right about many of my predictions and expectations.

Recently someone remarked that the culture of Catholicism on the Internet seemed to be dominated by those whose motto was taken from Flannery O'Connor's Misfit ("A Good Man Is Hard to Find"): no pleasure but meanness. You can certainly see his point, and of course it's not only the Catholic subculture that has this problem. It's everywhere. I'm not on Twitter, and I don't think I ever will be, so my impression may be distorted, but that impression is that some very large portion of what goes on there consists of people savaging each other, and sometimes of a large number of people attacking one person in what can fairly be described as a verbal lynch mob. 

Almost twenty-five years ago, in the Fall 1993 issue of Caelum et Terra, I published a short piece in which I considered the potential for the world of online communication to increase the level of hostility in the world rather than to bring people together. I don't think I had heard the term "World Wide Web" at that point. It did exist but if its existence was known to me at all I'm fairly sure I hadn't yet experienced it, as the original browser, Mosaic, had not yet been introduced to the world at large. But I'd had five or six years of experience in pre-Web online forums, including Usenet, the text-only discussion groups on the Internet, and at least one of the commercial services which provided similar capabilities by subscription. (America Online, or AOL, you may remember, was pretty big at the time.) And I'd seen the way people could turn on each other with startling viciousness.

That piece was called "Global Metropolis," and I've thought about it often since the Web changed our lives so strikingly. But I don't think I'd read it since it was published, so I decided to dig the magazine out of the closet and, if it still seemed relevant, spend some of the time this evening when I would have been writing typing it in instead. Well, I did find it interesting as a view from 1993, and at least part of it seems pretty accurate. The next-to-last paragraph is the one that seems a fairly accurate prediction of the Internet's potential for exacerbating rather than pacifying conflicts.

I sometimes think it might be more accurate to say that we are creating a global metropolis, a violent one like those of the United States, where people perceive each other socially either as naked individuals in isolation from family and community, connected, if at all, by financial ties, or as anonymous components of a class or race. In these great warrens, people live in close physical quarters but without much sense of belonging to the same community; with, in fact, a dangerous sense of having tribal enemies everywhere. What this combination of proximity and anonymity has done for the great cities, television, along with increasing economic interdependencies, may perhaps do for the world at large: to increase both the level of tension and the degree of isolation.

You can read the whole piece here (it's not very long, about a thousand words). I certainly didn't imagine anything like Twitter, which as far as I can tell is an active deterrent to substantial discussion, a medium seemingly designed to reward the quick and superficial remark. The glimpses I get of it make me think of an enormous number of dogs barking hysterically and ineffectively at each other. And if I had imagined it I would never have dreamed that we would have a president who sits in the White House and uses such a ridiculous medium to bark at anyone among the citizenry who annoys him.

I'm not saying that the Internet in general and "social media" in particular are altogether bad--I mean, here I am. And I even have a Facebook account. Still, it seems that the "bringing people together" effect of it is very often to bring them into tribes united by their detestation of another tribe, and thereby to purify and concentrate their anger. The net did not cause the political divisions in this country, but it is certainly exacerbating them.

Still, when I look at the people around me going about the ordinary business of their lives, I don't see this division, and that's a hopeful thing.


A couple of weeks ago on the first Sunday of Advent our priest made the point in his homily that Advent is as much about the Second Coming as it is about the Nativity. "We await the birth of a sweet little baby who will one day destroy the world," was more or less the way he put it. 


While watching The Crown on a chilly evening last week I discovered that my old dog, who is too blind and deaf to notify us of strangers in the vicinity, still is capable of service as an all-natural toe warmer. 

I enjoyed The Crown considerably, by the way, and I think my wife enjoyed it even more. It's wise to keep in mind that they filled in some historical blanks with their own speculations and that it's a mixture of fact and fiction, not a documentary. Claire Foy's performance as Elizabeth is extremely convincing. I must say, however, that I remain puzzled, to say the least, about the function of the British monarchy.

Have I mentioned that we also recently watched the third series of Broadchurch? I suppose it's not surprising that I didn't like it as much as the first two, but it's still very good. There is apparently little to no chance of a fourth series, which might be a good instance of quitting while you're ahead, but this one left an important matter unresolved, which is a little frustrating. Well, maybe it isn't unresolved--maybe that was just the end of...well, if you've seen it you know what I'm talking about, and if you haven't I shouldn't say any more.

Sunday Night Journal, November 19, 2017

One of the milder vexations of getting old is that I find my hands and feet, especially my feet, feeling cold at times when they would not have in the past. It seems disconcertingly old-man-like. It's especially noticeable when I spend several hours sitting at the computer, which I do almost every day. In summer I turn up the thermostat as soon as my wife, who is still going off to work every day, leaves the house, and let the temperature get into the upper 70s. In winter I sometimes bring a little electric space heater into my office, just like those women in offices whom I've teased for many years. All this annoys me, and I'm a little embarrassed to admit it. The only good thing about it is that getting those feet and fingers warm when they're chilled feels pretty good.

This morning, for instance, it was a little cool--the outside temperature was 50-ish, not exactly cold, but there was a strong and chilly north wind. I went out to walk the dog wearing a light jacket over a t-shirt, and when we got out onto the beach where there's nothing to get in the way of the wind I was quickly cold enough to be uncomfortable, especially as I spent a few minutes dragging some newly-washed-up pieces of Hurricane Nate debris away from the water's edge, thus getting my hands wet and well-chilled.

Back at the house it was time to have a shower, and that warm water felt really good on my hands, and started me reflecting on what a luxury the hot shower is. I remember a conversation with my father some years ago when I was talking about people voluntarily giving up the conveniences of modern life in order to be more self-sufficient or environmentally conscious, or something--I don't remember exactly what the context was--but he remarked that "Not many people are going to want to give up their hot shower."

Well, that's certainly true. I certainly don't. And this morning was one of those moments when I realize just how soft life is for most of us in the industrialized world compared to what it has been for most people in most times and places throughout human history. We take it for granted most of the time, and we tend to assume that what has now been invented and produced and widely distributed will remain available. Not all of us by any means: the post-apocalyptic return to pre-industrial conditions is a staple of fiction and film, and often the subject of very serious and dire warnings from environmentalists. We may deplore this or that aspect of technological civilization--I deplore a number of them, for instance the dominance of the automobile. But almost nobody wants to give it all up. And most of us most of the time don't anticipate that we--humanity at large if not ourselves--will lose it, that what we now regard as basic necessities will disappear.

That's partly because we don't really appreciate the expenditure of time, effort, intelligence, and labor that produced our material way of life, and, more importantly, sustains it. All over the world millions of people are doing millions of things that keep this whole machine operating. Now and then I run across some small business that makes a living by supplying some little thing or service--some type of fastener, maybe--that's a component of some bigger thing and that in the normal course of events no one apart from the people who provide it and the people who make the bigger thing ever gives a second's thought to. And yet the bigger thing, and the bigger thing or things which depend on it, may be something which the rest of us expect to be there whenever we need it, and which would be sorely missed if it didn't exist--which it wouldn't, without that little thing and many other little things.

A complex web of law and custom is an equally essential part of this apparatus. And so my train of thought about material progress soon switched toward the question of moral and spiritual progress, and to that "arc of history" which is "long, but bends toward justice" and is sometimes invoked as a guarantor that in the long run progress, as conceived by the person invoking the principle, will prevail. Maybe it will. Christians believe that it will, though the trajectory runs right out of the far end of history into another realm. But Christians have a pretty clear and definite explanation for this movement, and for its source. What's striking about many of those who seem to take it as an article of faith is that they don't seem to ask the obvious questions it raises. It's a faith, but a very vague one: toward what is it directed? What is it faith in? The Force? What? Those of us who hold a more traditional faith, one that after two thousand years is very well thought out yet frequently treated as irrational, are entitled to question this one; to interrogate it, as the post-modernists like to say.

To say that history has a moral arc implies that it is being directed by something or someone. Who or what is this? The direction must be conscious--surely no one can seriously maintain that matter and energy alone are imbued with moral convictions and the power to impose them. There is certainly no evidence anywhere outside of human affairs that life is cognizant of justice or aspires to it, much less that the concept of justice even has any meaning to the cosmos at large.

And what is that thing called "justice" anyway? Does it exist in some abstract but nevertheless objectively real sense? Is it an absolute and intrinsic good, and its opposite, injustice, an absolute and intrinsic wrong? If so, there is at least one moral absolute, and there may be others. But if not, if it is not an absolute, and can change, how can history be aiming for it? 

The whole idea seems to point necessarily to a moral intelligence that guides, if not the entire cosmos, then at least human history. But my impression is that most of those who cite this inevitable direction of history don't inquire about its source. If you press them on how they know what justice is and why it's a good thing, they scoff and say that it's obviously good, and they don't need anyone to tell them that. And they don't see that the very fact that they hold that belief is in itself a very interesting phenomenon and one that does not, in fact, explain itself, at least not very satisfactorily. What is conscience? Why do we have one and where does it come from, and where does it get its principles? The answers are not obvious. If you don't want to dig for them, that's fine; not everyone is obliged to. But if you shrug off the questions you shouldn't strike a pose of intellectual superiority, and act as if you have the answers.

At any rate, I don't see that the moral structure of our civilization is any more necessary and inevitable and permanent than its material structure, or any less dependent on a complex web of small but significant things--in this case principles, and prejudices in favor of those principles. Contemporary developments pretty well confirm that, in fact. Christian writers have been saying for more than a hundred years that modern civilization is far more indebted to Christian and Judeo-Christian morality than it knows. Those who like the "arc of history" figure also tend to think that the progress it describes will include the abandonment of primitive delusions such as Christianity. It may, at least for some time, but if it does the arc may bend in a direction rather different from the one expected by contemporary progressives. Concern for the weak and vulnerable is hardly an historical constant.

The idea, by the way, seems to have originated with a New England Transcendentalist, Theodore Parker. That's fitting. Matthew Rose, writing at First Things, has an extensive and interesting piece on all this, "Our Secular Theodicy." Also by the way, on the question of moral progress, I lean toward C.S. Lewis's idea that good and evil progress simultaneously. 


Well, once again I've gone on about one thing rather longer than I had intended, and now I don't have time to say much about the other thing I had meant to write about: Stranger Things 2. I'll just say that I have watched it, and I liked it. Neither my wife nor I could remember very much about the first series, so we watched it again before the new one. I liked it better this time, and as of now I'd say I like the first one a little better, but they're both good. The second one is more spectacular, has more, and more disturbing, monsters, and more gore. It's closer to being a horror film than the first, I think. What makes them both so engaging is the principal characters, most of all "the kids," the middle-school group at the center of the story, and some of the adults, especially Winona Ryder as the slightly nutty but passionately devoted mother, and David Harbour as the sheriff. And in spite of all the terror and violence, there is an underlying sweetness about both series, with love playing a major role. That makes them very different from some other popular series, such as Breaking Bad, or the dark detective shows like The Killing.

I hereby open Janet's Undead Thread v3.0 for spoilers-allowed discussion of Stranger Things.



Black Mirror: Informing, not Recommending

I recently watched the six available episodes, plus a sort of holiday special, of a British TV series called Black Mirror. It's a sort of sci-fi production, with each episode being a complete story, and no connection other than a thematic one among the episodes. The theme is the extent and effect of our immersion in electronic media and other technology, and the uses to which more advanced developments might be put. I didn't make the connection till I read it somewhere after I'd already seen the show, but the title refers to the glossy black screen of a smart phone, and is the sort of brilliant stroke that any writer would hope to come up with for a title, combining a direct and specific reference with symbolic import.

The reason I'm telling you about it is that it's very well done, very insightful about its theme, and very effective in drawing out its darker possibilities. The reason I'm only informing you about it, not actively recommending it, is that it's all too successful in that effort, and is very disturbing. Just to give you one example: the first episode involves the prime minister of Great Britain being placed in the position of having to perform a truly disgusting act on live television in order to prevent an assassination. 

I was somewhat shaken by that episode, and you will naturally wonder why I watched any more. Well, part of the reason is that I thought surely the other episodes wouldn't be so disturbing. (Why did I think that? I don't know.) And part of the reason is that it did present important insights in a clever and powerful way. None of the other episodes involved anything as disgusting as the first, but every one was disturbing in one way or another. There is little physical violence, but there is a lot of psychological violence, even torture.

So: watch at your own risk. In fact, having written this post, I'm thinking of discarding it. A compromise with my better judgment: I'll post this, but warn you that the episode called "White Bear" and the 90-minute "holiday" special (some holiday!), "White Christmas," involve what seemed to me the most extreme, almost unbearable, psychological torture. (Interesting that they both have the word "white" in the title.) "White Bear" is the 5th episode. Had it been the first, I definitely would not have continued.

There is another series in the works. I think I'll definitely skip it.

Apparently I'm Not the Only One Who Wonders

I started to search for "is recycling worth it," and when I had gotten as far as "is rec", this is what Google offered:


If you're interested, here's an article at Popular Mechanics that seems reasonable and plausible. The short answer is "yes". The slightly longer answer is that recycling may not be "worth it" by a short-term dollars-and-cents calculation--it may be a money loser--but that it clearly is worth it by a longer-term resource consumption calculation.

I'm glad to see that Popular Mechanics still has articles about slightly goofy inventions like a knife that toasts bread as it cuts.

End of an Era

I mentioned, in talking about Pat Metheny's "It's For You" last week, that the phrase would soon lose its telephone-related significance. For many it already has, but I suppose the majority of people now living would at least recognize it, even if it hasn't been part of their lives for a while. 

I suppose it hasn't been part of mine, either, for a while, because my wife has used her mobile phone almost exclusively for a long time, and although I was several years behind her I was going that way, too. For a year or two now we have rarely gotten a call on our home phone that wasn't a sales or fund-raising pitch, or a political robo-call. A few months ago it stopped working, and weeks passed before we even bothered to do anything about it. Attempting to report it--via the web, of course--we were taken through a series of trouble-shooting steps only to be told in the end that there was something wrong with our equipment, and that it couldn't be fixed. Of course it could have been, if we had pressed the issue. But what would be the point? We had hardly missed it, only checking--again, via the web--every day or two to see if we had any voicemail messages.

So, after several hours in the AT&T store, and on the phone with AT&T customer support, we transferred our home number to my mobile phone, and cancelled the home phone. (I kept that number because for over twenty years it's been the number known to family, friends,  and businesses, and very few people other than my wife ever called my mobile number.) Lots and lots of people are doing this, of course. I'm not sure any of my children have ever had a "land line" phone of their own.

I'm tempted to draw some kind of conclusion about social fragmentation from this. But really, aren't we all a little tired of saying and hearing things like that? What really strikes me more is a mild nostalgia, and a sense that the world in which I grew up is passing away. Fourteen years into the twenty-first century, we are leaving the culture of the twentieth behind in more and more ways. I'm always inclined to think things in general are getting worse, and always having a conversation with myself about whether that's really true. But it does seem to me that many of the hopes of the twentieth have faded, especially the hope that followed the Second World War, at least in this country; it seems a meaner world, all in all, notwithstanding the fact that we then had the threat of global nuclear war hanging over our heads--that was the big picture, but the nearer picture held promise.

The family telephone seems quaint and old-fashioned now. This morning at breakfast my wife and I were discussing the old rhyme that children not quite into adolescence used to tease each other with, when a boy or girl was suspected of liking a girl or boy. Using our names, it would go:

Mac and Karen, settin' in a tree
K-I-S-S-I-N-G
First comes love, then comes marriage
Then comes Karen with a bay-bee carriage

That seems pretty quaint and old-fashioned, too.

To Architect Or Not To Architect

In the comments a few days ago discussing "graduated college" vs. "graduated from college," there were mentions of several current verbal tics that get on people's nerves. Rob G mentioned that he'd heard one from the IT (information technology) world that really bugs him, but he couldn't remember what it was.

I wonder if it was the use of "architect" as a verb. Not only a verb, but one commonly found in the past tense, as "architected." As in "We architected this system to be very responsive." There are multiple annoying usages that come from the IT world, but this one makes me snarl.

It means, as you can gather, nothing more or less than "designed." There is a reason for its existence, and the reason reveals a lot about the linguistic knowledge and sensitivities of technologists. Going back forty years and more, there has existed a sub-category of computer engineering called computer architecture, or sometimes system architecture. It refers to the structure of a system, or some major feature of it, considered as a whole, comparable to the ordinary use of the word "architecture" in relation to building.

Some person with absolutely no feel for language apparently decided that architecture is the result of the activity of architecting, and began to speak in ugly sentences like the one above. It's caught on to some degree, though it's clumsy enough that I hope few outside the technical world will adopt it.


Voyager 1 Departs

Although there is still some argument about it, many experts seem to believe that the Voyager 1 spacecraft has left the solar system. In the absence of a marker out there with a sign saying "You are now leaving the sphere of Sol," there are subtle differences in the relative emptiness of space that distinguish solar from interstellar space, and Voyager is detecting them. 

I don't think it's very likely that we'll ever go very far into space, and I strongly suspect there's really nowhere to go. (Yes, I know all those probability-based arguments about life on other planets, and maybe they're right, but they can't be anything more than conjecture.) Still, there's something a little awe-inspiring about this: that little bit of metal, sailing and sailing and sailing, at what would be an impossibly high speed on earth but is extremely slow in cosmic terms.

The probe has spent 36 years travelling 12 billion miles. A quick and very rough calculation: a light year is about 6 trillion miles, so that's about 1/500 of a light year. So 36 x 500 = 18,000 years to travel one light year. The nearest star (not counting the sun, obviously) is 4.2 light years away. That comes to a bit over 75,000 years. That may be longer than the human race as we know it--language, tools, and culture--has existed, although I'm sure there are widely divergent views on that.
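For the curious, the same back-of-envelope arithmetic can be written out as a few lines of Python, using the rough figures quoted above (12 billion miles in 36 years, 6 trillion miles to a light year, 4.2 light years to the nearest star):

```python
# Rough figures from the text above -- not precise astronomical values.
miles_travelled = 12e9        # about 12 billion miles in 36 years
years_elapsed = 36
miles_per_light_year = 6e12   # a light year is roughly 6 trillion miles
nearest_star_ly = 4.2         # distance to the nearest star, in light years

fraction_of_ly = miles_travelled / miles_per_light_year  # about 1/500 of a light year
years_per_ly = years_elapsed / fraction_of_ly            # about 18,000 years per light year
years_to_nearest_star = years_per_ly * nearest_star_ly   # a bit over 75,000 years

print(round(years_per_ly), round(years_to_nearest_star))
```

Running it gives 18,000 years per light year and 75,600 years to the nearest star, which matches the "bit over 75,000 years" above.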

I wonder how long Voyager will stay intact. Millions of years? I really don't know. In any case, I doubt that this is its future:


Motoring To A New World

Here is a vision of how the automobile would change the world, as seen in the Illustrated World of May, 1922.

It's easy to laugh at a lot of this: "There can be no slums in the future." But really, when you take away the naive gosh-it's-wonderful tone, and the irony of the predictions of peace and quiet and uncongested cities, the article is quite accurate in its broad point, that a whole way of life would be built around the automobile, and that town and country would be brought much closer together.

Appreciating the convenience of these highways, people move out of the city to have little homes and gardens along the concrete roads. Thus begins the building of residential communities, any number of which can now be found outside the large cities of the country.... Then the small country school disappears and a consolidated school...appears.

And so, too, the farmer that lives ten miles or fifteen miles from town on the concrete is closer in time and convenience than was the farmer who lived three miles along a mud stretch. Ten miles of concrete and an automobile makes the remote farmer almost a town citizen.

It was indeed the beginning of a new world, for better or for worse.

I had stashed that piece away for comment several weeks ago, and now I can't remember how I came across it. But that's a pretty interesting web site. See, for instance, this account of portable radio, 1916 style.

Everybody Has Wanted to Do This

Destroy an office printer, I mean. Maybe you've seen this before, but I'm sure you'll enjoy it anyway. I wouldn't be surprised if I've posted it before. The first two scenes here are from earlier in the movie, the rest near the end. (Office Space, and it's very funny.) Warning: a bit of rough language, and I think that stretch of silence may represent the removal of some of the crude stuff on the rap sound track.


Personally I hate and fear printers and copiers and avoid using them. 

A Week in CrazyWorld

I have resisted the smartphone trend, partly out of general contrary reluctance but more out of parsimony: the dang things are expensive, and they cost a lot to use. I do have a mobile phone. I've had one for something like ten years, since I embarked on a long drive alone and my wife talked me into getting one just in case I needed it, and to let her know I was all right. I've gone through several of them now, because she is rather attached to them and interested in them, and whenever she got an upgrade I got her hand-me-down, up until the point a couple of years ago (I guess) when she got an iPhone. 

For a long time I rarely used the mobile, mainly for simple purposes such as calling home or work to say I'd be late. But of course the usage has slowly grown, and people at work have the number, so that I can be reached if I'm away from home and there's an emergency there. I even learned to send text messages.

Last Christmas we upgraded the iPhone belonging to one of our children, and I got her former model as a hand-me-down. It doesn't function as a phone, because I have no intention of paying the monthly fee for that service. But I took it because in my job I really need to have some familiarity with these things, and this was a way to do so without spending any money. It does have Wi-Fi, so I can do all sorts of things with it if I have that connectivity, including reading my email. Mostly I've used it as an mp3 player.

This week I'm at a technology conference. I've been away for the whole week, and there are a lot of problems that come up at work every day that no one else knows how to handle. So I've been carrying around with me both the ordinary mobile phone, for emergencies, and the iPhone, for less pressing matters that can be handled by email (the convention center has Wi-Fi, naturally). 

It's crazy-making. I sit in a conference session in which someone is talking about some terribly complicated technical matter, and the phone buzzes with a text message. I try to handle whatever problem it presents me with while not completely losing the thread of the talk. If there's a lull I pick up the iPhone and check my email, to see what's transpiring with the problems I've been dealing with that way, and probably encounter a new one. 

My attention span and ability to concentrate, not so hot at best, are so attenuated that there is never a moment when I'm entirely focused on one thing. It doesn't help that being in a large crowd always tends to make introverted me uneasy and distracted. 

When I leave this conference I won't be tied to these devices anymore, but such seems to be a normal way of life for an awful lot of people. Setting aside the introvert part, I can't believe that the adoption of these habits isn't going to have the effect of making people in general less able and/or willing to think deeply and to reflect. I hope they're more resilient than I am.

No Doubt You Have Often Wondered...

I have, whether an object dropped through the upper surface of a bubble would also pierce the lower surface, or instead arrive at that location after the bubble had finished bursting and was no longer there. Of course the result would vary with the size of the bubble etc., but this answers the question for a fairly typical case.


You Are Being Watched

Or at least your email is. I don't think this is exactly a break in my promise not to post about politics till after the New Year, because it's really beyond politics in the usual sense: an NSA whistleblower says everyone in the U.S. is under virtual surveillance. This is a Russian web site, and I normally wouldn't pay much attention to this sort of thing, but I have heard other reports along these lines that seemed more or less credible. Worse, sad to say, it is the sort of thing one might expect from our government. 

Why Is It Always Me?! (II)

As anyone who has a Gmail account knows, Gmail shows you advertising based on the contents of your email, which is creepy and annoying, and occasionally amusing. Like this one:


In case anyone reading this doesn't know, I am not a girl of any standing at all, whether standout, stand-in, stand-up, or stand-down. No word relating directly to the product being advertised occurs anywhere in my mail (I checked), unless you count words like "girl," "woman," and "female." And if those were responsible you'd think I would get a constant stream of such ads, which I don't. The only thing I can figure is that it was triggered by the word "standout," which appears twice, most recently in an email that referred to a movie containing "a number of standout performances." I wouldn't have minded it showing up once, but it's been there most of the time for several days now.

I briefly considered clicking on the link to see how standout girls shine, but if I do that Gmail will never believe I don't aspire to be one.

My World

And yours, too, though you may not know it, every time you read a blog, log on to Facebook, or use a service like Gmail or Hotmail. From an IT trade publication:

Cloud computing builds on automated virtualized infrastructures by using extended management software to deliver environments where users can define, provision, and operate their own services in service models such as IaaS [infrastructure as a service], PaaS [platform as a service], or SaaS [software as a service].

Someday I won't have to think about this stuff anymore. 

The industry likes the term "cloud computing," because it sounds nice and sweet and dreamy and simple. It's anything but. Considered from top to bottom, it might be the most complex thing the human race has yet constructed. The quote above is part of a discussion of a particular set of sub-problems associated with the technology.