Technology Feed

Le Spleen de Samsung?*

My DVD player is having problems, so I started looking for a new one. Of course all sorts of stuff has changed since I bought this one--everything is Internet-enabled, etc. So, trying to figure out what I actually need and/or want, I was reading the users' questions and answers for one I'm interested in, and saw this:

Question:  Does it have app for MLB tv
Answer:     i’m not good with tech stuff - i don’t know what MLB means and i hate apps
By Brandy [name redacted] on March 5, 2019

I would almost sort of like to meet Brandy.

* (Baudelaire, Le Spleen de Paris: "The title of the work refers not to the abdominal organ (the spleen) but rather to the second, more literary meaning of the word, 'melancholy with no apparent cause, characterised by a disgust with everything.'")


More AI Nonsense

On stilts. At Vox: "Robot priests can bless you, advise you, and even perform your funeral".

I'm slightly surprised that the author is a former religion editor of The Atlantic. Sadly, I'm not at all surprised that a Catholic theologian--a Franciscan sister, no less--is on hand to add some extra touches of fatuousness. 

"So would I want a robot priest? Maybe!” she said. “A robot can be gender-neutral. It might be able to transcend some of those divides and be able to enhance community in a way that’s more liberating.”


Skynet Is Never Going To Become Self-Aware

For the benefit of those who have managed not to see the Terminator movies, or even to pick up the pop culture lore that originated with them: Skynet is the computer system that initiated nuclear war of its own volition and began to rule the world in its own interests--which were not those of its inventors. Never mind the rest of the plot, as you probably either know it or don't care.

Today is the day (or rather the 22nd anniversary of the day) on which, in the movie, the catastrophe occurs:

The system goes online August 4th, 1997. Human decisions are removed from strategic defense. Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th. In a panic, they try to pull the plug. (from IMDb)

Hardly a week goes by that I don't see some mention--a "science" article in the news, or a discussion of a movie or TV show--of some variation of the idea that artificial intelligence will at some point develop human consciousness, "self-awareness," an independent will, emotions, interests, and so forth: in short, become a person. Sometimes this is presented as a bad thing, sometimes a good thing.

If this worries you, or if you are eagerly anticipating it, I'm here to tell you: forget it. Don't worry about it. It is never going to happen. If you like stories that play with the idea, fine, have fun, though they're not usually to my taste. But don't concern yourself with the possibility that it might actually happen. 

How can I be so sure? Well, of course I can't say it with 100% certainty. But I'll commit to 99.999%. The reason is that I know how computers work. I am not a computer scientist, just a journeyman programmer, but I do know how the things work at the most fundamental level. And I also can see, as anyone who bothers to think about it can, that the idea that they can develop consciousness is based on a naturalistic assumption about the nature of consciousness--that it is, in us, an epiphenomenon of the brain, and therefore a probably inevitable development in computing machinery that mimics certain functions of the brain. This is a pure act of materialist faith. There is no evidence for it. No one can actually describe in detail how it can happen; it's simply postulated.

We speak of computers "knowing" things, and in a sense they do. But in the same sense it can be said that a light bulb "knows" when it should glow. It "knows" because you flipped a switch that supplied electricity to it. If you set up an array of 256,000,000,000 lights (very roughly the number of on-off switches in a 32 gigabyte computer memory), and rigged up an elaborate mechanism in which symbols representing letters and numbers were encoded as sets of lights that are either on or off, and could be manipulated so that the information represented by the symbols was constantly shifting in accordance with your instructions, do you think the array would somehow have the capacity to "know" what the symbols mean? 
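To make the point concrete, here is a minimal sketch in Python (purely illustrative; nothing in the argument depends on it, and the details are my own). The letter "A" sits in memory as nothing more than a particular pattern of on-off states, and "manipulating the symbol" is just flipping switches according to instructions somebody supplied. At no point does anything have to know what an "A" is.

    # The letter "A" is stored as the number 65, i.e. the bit pattern 01000001:
    # eight tiny switches, some on, some off.
    letter = "A"
    bits = format(ord(letter), "08b")
    print(bits)       # 01000001

    # "Shifting the information" means flipping switches per our instructions.
    # Turning on one particular switch (the 32s place) yields the pattern for "a".
    flipped = chr(ord(letter) | 0b00100000)
    print(flipped)    # a

The imagined array of lights would be doing exactly this, only with 256 billion switches at once.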

The fact--I feel justified in calling it a fact--is that there is no reason to believe that consciousness is a by-product of any physical process. For dogmatic materialists, it's a necessary belief: consciousness exists, therefore physical processes produced it. To the rest of us it only sounds plausible and reasonable because we're so used to looking at the world through materialist assumptions, or at least prejudices. 

If you want to worry about the risks of artificial intelligence, worry about the extent to which it is able to simulate human mental processes by means of a combination of calculation and symbol manipulation, and thus do things which once required human intelligence. In combination with very advanced robotics, these machines can do an awful lot of jobs that are now, or were once, done by people. That's been going on for some time now, and it has serious social implications. But the machines aren't going to start conspiring against us.

The speech recognition and synthesis involved in, for instance, Apple's Siri, and the fact that the computer that does it can fit in your pocket, do seem almost miraculous compared to the technology of forty years ago when I first got involved with computers. But we all know Siri is not in fact a person. That's part of the reason why it can be amusing to ask "her" silly questions, give "her" impossible commands, and so forth. I think if you dug around you could find predictions made thirty or forty or fifty years ago that computers with the speed and capacity of your phone could and probably would develop true conscious intelligence. But that's no closer than it ever was. Or ever will be.

 


Machine vs Bugs

I logged in to Facebook and there was a link to a YouTube video from a band I "liked" (and like), Laki Mera. So I thought I'd listen to the song. It was good. Then I noticed on the YouTube sidebar a video called "Bothering Bald-faced hornets with an Action Drone AD-1". Why did YouTube connect that to the music video? I have no idea. But it caught my attention.

That looks interesting. No it doesn't. Liar. It's 13 minutes long. So what? Okay, I admit I'm curious. Of course--who wouldn't want to see a drone bothering a hornets' nest? But I don't want to take that much time. Seeing a drone bothering a hornets' nest is worth 13 minutes of your time. Yeah, but it's getting kind of late, I should do something more useful. I repeat, seeing a drone.... Okay, okay. You just have to take out the recycling, you have plenty of time for that.  OKAY.

So the voice in italics won and I watched it. You can, too. It is exactly what the title says--well, a bit more than "bothering"--it's a drone attacking a hornets' nest. And it is kind of fascinating. But I will warn you that at several points the hornets are flying so thick that it's creepy, and I'm a little concerned that it could give me a nightmare, so I'm going to get away from the computer for a while before I go to bed.

I suggest watching/listening to the Laki Mera video to help clear that out.

 


Sunday Night Journal, April 22, 2018

It seems I'm not the only person who thinks the most noticeable thing about Twitter is the amount and level of venom it seems to produce, very often in combination with stupidity. I freely admit that this is unfair on my part, because I'm not on Twitter, and so all I hear about for the most part is the controversies that spill out onto the Internet at large--the "Twitter mobs," as they are aptly called, which form and attack someone who has attracted their hostility. But the other day I saw a graphic associating various "social media" platforms with one of the seven deadly sins, and whoever composed it assigned Wrath to Twitter.

I know it isn't all bad. Much of it is harmless, and some of it is probably good. My wife the archivist occasionally mentions Twitter posts from archivists, or from museums or libraries, which contain interesting bits of lore. The Archbishop of Mobile uses it to tell the world that, for instance, he will be in Thomasville doing confirmations this weekend. I've seen jokes from Twitter that were actually funny.

But none of that changes my basic animosity toward it, which pre-dates any of the pathological phenomena. I was ill-disposed to it from the moment I heard of it, because of its name and because what one does with it is to "tweet." I took this as an open declaration that it was designed to be a vehicle for noisy, frequent, and trivial remarks. "Venomous" did not immediately occur to me as a likely possibility, though it probably should have; it's not as though the Internet was an altogether benign place before Twitter.

I had an instant conviction that it was not for me and that it was to be ignored, and I noticed with mild horror as it grew over the years and became one of that very small group of Internet platforms that, taken together, almost seem to define the net itself: Google, Facebook, YouTube, Twitter. My disdain has even extended to a reluctance to refer to it at all, and a great resistance to beginning a sentence with "So-and-so tweeted that...." How can you possibly take seriously anything that follows those words? It's like saying "So-and-so yapped...." "So-and-so grunted...." "So-and-so belched..."

Never of course did it cross my mind that a duly-elected president of these tenuously-united states would use Twitter as his favored means of addressing the nation and its problems (to say nothing of his own personal grudges) and that he would do it every day, so that no day would pass without a news story beginning "The president tweeted that...."

I found myself unwilling to use the word "tweet," whether noun or verb, in a sentence, without putting it in quotation marks. To do otherwise seemed to give it some kind of legitimacy that it didn't deserve, to suggest that I was somehow approving of it as a means of rational conversation. But this week I've had a slight change of heart. Very slight. I'm giving up the quotation marks.

This change was catalyzed by a case you probably heard of: a nutty professor who took the occasion of Barbara Bush's death to tweet (there, I've done it) that the former First Lady was, among other bad bad things, an "amazing racist," and that her death was an occasion for joy. A great furor immediately erupted, of course, including, apparently, a lengthy exchange of hostilities between the professor, Randa Jarrar of Fresno State University, and people who criticized her. If you have somehow managed to remain ignorant of it, this Washington Post story has the basics. And it struck me: why should I feel any need to find a word more serious than "tweet" for this sort of jibber-jabber? Tweeting is appropriate and pleasant from a goldfinch. Coming from a human being it's ridiculous. If people are going to use this medium which makes them appear bird-brained, why should I try to dignify their twittering with a less silly-sounding word?

Almost as striking to me as the professor's charge of racism was the way she used the word "amazing." This is a verbal tic which I associate with teen-aged girls, who seem to use it as a sort of all-purpose positive hyperbole, the way they used to use "awesome" (maybe they still do). Anyone the teenager likes is "an amazing person." To enjoy oneself is to have "an amazing time." She just ate "an amazing apple." And so on. I think this is the first time I've heard it used in a negative sense, though I suppose the word itself is just as applicable to something amazingly bad as amazingly good.

Actually, looking more closely at what the professor said, I'm not sure that she meant "amazing" as negative. Her words were "Barbara Bush was a generous and smart and amazing racist." What exactly does "amazing" mean there? Positive, but ironic? But then, who cares? It was a tweet.

I can't say I was much surprised to learn that what the professor professes is creative writing. Maybe she can write. Who knows, and, again, who cares? I don't plan to find out. I was actually a little surprised to find that her title is in fact "professor." The news media tend to use that word for anyone who teaches at the university level, though it is in fact a specific title, coveted and respected.

Naturally there have been calls, loud calls, for her firing. Some conservatives have spoken against that and I think they're right. In truth Jarrar seems like a rather pathetic person, who would be more pitiable than anything else if she didn't have such a mean streak.

*

Here is Kevin Williamson's summary of the role of Twitter in his firing from The Atlantic.  

Where my writing appears is not a very important or interesting question. What matters more is the issue of how the rage-fueled tribalism of social media, especially Twitter, has infected the op-ed pages and, to some extent, the rest of journalism. Twitter is about offering markers of affiliation or markers of disaffiliation. The Left shouts RACIST!, and the Right shouts FAKE NEWS! There isn’t much that can be done about this other than treating social media with the low regard it deserves.

You may be able to read his side of the whole story here at The Wall Street Journal. (Thanks to Grumpy for the link.) I found that from my phone that link was blocked as being subscriber-only, but from my laptop it wasn't.

*

Speaking of people shouting "RACIST!": a month or two ago I stepped briefly into a discussion on Facebook in which Rod Dreher was being taken to task for having a "race problem." This seems to be what you say when you want to call someone a racist but don't have enough evidence to justify that word and are not, as someone like the professor above is, willing to use it anyway. Dreher had linked, in one of his many, many columns, to an article in which a former Peace Corps volunteer who had once spent a year in Ethiopia talked about how much she had disliked it. The context was Trump's alleged remark about not wanting immigrants from "s**thole" countries; Dreher was thinking aloud, as he often does, about whether some countries are in fact very messed-up places whose emigrants might not be desirable.

I didn't think Dreher's link was evidence of racism, especially considering that Dreher has written often and sympathetically about the situation of black people, and said so, remarking that the charge of racism is becoming meaningless. That was immediately taken to mean that I was denying the existence of racism. I of course intended for "meaningless" to apply to the making of the charge, not to the existence of the thing itself. It rankled me that I was being misunderstood, but there were at least a couple of people in the conversation who seemed very eager to make the charge, and it looked to me like any attempt to clear things up would only result in my acquiring a "race problem," if in fact I hadn't already, if only because I was defending Dreher. So I made a sort of quick intuitive cost-benefit, risk-reward assessment and dropped out of the discussion.

In our current polemical environment, to attempt to defend yourself from a charge of racism is one of the dumbest things you can do. It's like blood in the water for the attackers, and you cannot win. The more you protest your innocence, the guiltier you will look to them. Best to just walk away.

Why am I even relating this little incident? Because it did rankle, and it gives me some sort of satisfaction to say publicly and clearly what I meant. Why didn't I say it then and there? Because of the factors I've already mentioned, and because it was Facebook, and I don't personally know any of the people involved. I have no reason to think any of them reads this blog, but if they want to argue with me here, I can require a level of respect and decorum that I can't on someone else's Facebook post. In all the years I've been blogging I don't think I've deleted more than two comments, but I like having the tool available. 

*

And speaking of Dreher: I have often criticized his overly-agitated approach, and in fact have for long periods not read him at all. Lately I've gotten into the habit of reading him regularly. That's not altogether a good thing, because he does focus on all the bad things that are happening, and specifically on the deteriorating situation of Christianity in this country and in Europe. But though those are not the whole story, they are happening, and it's well to keep an eye on What Is Actually Happening.

*

I don't know whether this is referring to some of the German bishops or what.

[Photo: PulverizePests]


Sunday Night Journal, January 7, 2018

How, how is it possible that this is the year 2018 A.D.?  I have a clear memory of sitting in Mrs. Bruce's 6th grade class, which means it was 1959 or '60 and I was eleven or twelve years old, and wondering for the first time (as far as I remember) how old I would be in the year 2000. I remember doing the arithmetic to find out...borrow one, ten minus eight is two, nine minus four is five.

 2000
-1948
   52

Astonishing! Inconceivable! I truly had no way to conceive of that length of time, much less what it would be like to be that age.

A few years later I was reading science fiction, much of which was set in that far-distant time. As late as 1968 Arthur C. Clarke and Stanley Kubrick thought it reasonable to set their drama of the next step in human development (guided by those substitute gods, super-intelligent extraterrestrials) in 2001. They furnished their vision with a space travel infrastructure comparable to that which existed for air travel at the time. Well, that has strikingly failed to appear, along with many other visions of a future set in or just beyond the beginning of the millennium. The bright sleek shiny ultra-modern future has in general failed to appear (as has the dystopia that apparently became the preferred vision sometime after the 1960s). The only really significant technological development since 2000 has been the emergence and spread of the pocket-size supercomputer, otherwise known as the smart phone. Arguably the web of information and communication, and the computers which constitute and give access to it, are the only really significant technological development since 1970--at least if by "significant" we mean something that changes daily life in some important way for almost everyone.

And culturally--well, some would say we've advanced, some that we've declined, and I favor the latter view, but all in all the change has not been so dramatic as science fiction writers expected. We've been pretty well stalled for most of the past fifty years, actually: the cultural revolution of the late 1960s happened, and things were rather different in 1975 from what they had been in 1965. But since then they've been fairly static, really, at least domestically--the culture war started and the antagonists have been locked in a struggle ever since, neither side winning a clear and decisive victory. 

Most striking to me is the swiftness of the passage of the eighteen years since 2000. It's a tiresome thing for an old person to say, I know, but it keeps occurring to me because it's so astonishing. The turn of the millennium is roughly as many years in the past for us now as the end of the Second World War was at the arrival of the Beatles. This year, children born in 2000 will be finishing high school. They've gone from being newborns with everything to learn to having at least some level of education and ability to manage for themselves. But for me it's mostly been a sort of plateau, and I've traversed it very quickly.

For someone my age the phrase "the year 2000" once had the aura of distant, exciting, hardly imaginable futurity. Now it's just an ordinary and rather dull bit of the past, and for me not a very far distant past, as it must be for someone who will be, say, just old enough to vote and buy alcohol this year. 

And fifty-two seems fairly young. Or at least not old.

I'm happy to say, by the way, that I never was much worried about the Y2K disaster. There was a guy who gained a fair amount of fame for himself at the time by predicting, with a great deal of confidence and a fair amount of irrelevant or invented data, that civilization as we know it could not survive. I thought he was nuts, unless he was lying. A few years after the day had come and gone with barely a ripple, I went looking online to see what he had to say about having been proved wrong. I couldn't find any trace of him.

*

I'm also happy to say that I've managed pretty well to keep to my resolve to stay out of the controversy surrounding Amoris laetitia, and in general the controversies surrounding the papacy of Pope Francis. I've not only stayed out of them but have gotten pretty good at ignoring them. But yesterday someone recommended to me this piece by Christopher Altieri at Catholic World Report as being "the most balanced explanation" of the situation he'd read. Well, to be honest, he actually said "of the current mess." And it does seem to be something of a mess. Maybe it shouldn't be, maybe it's really much ado about nothing. But in that case the fact that it seems so important would have to be counted as part of the mess. At any rate, Altieri's piece does seem a pretty good explanation of why it isn't crazy (pharisaical, etc.) to have concerns about what the pope seems to want to do.

It strikes me, and it is not a happy stroke, that what's going on in the Church looks a lot like what's going on in American politics:  bitter factions hurling invective at each other, "Pope Francis can do no wrong" vs. "Pope Francis can do no right." As with the political situation, though, I don't really see that much effect in my own life and in my own place. My diocese, the parishes within it, and my Ordinariate group muddle along as usual, trying to practice the faith to the best of our ability. The Pope is far away and the fight doesn't directly affect us (not counting the clergy, for whom it may have immediate practical import in their ministry). I think I'll be glad if the era of the celebrity pope fades away. Much as I loved John Paul II, I was always a little concerned about that. 

One happy effect of all this, as Altieri notes (though not happily), is that the folks at America magazine, and the similarly-minded, now have the opportunity to denounce their opponents as "dissenters." It's a pretty silly charge, but it must feel pretty good to be able to make it, after several decades of digging in their heels against the last two popes. So let them have their fun, I say.

*

I've been meaning to mention that Francesca Murphy is now writing a bi-weekly blog post at First Things. The posts are excellent, and I think this piece, "The Secrets of the Confessional," is my favorite so far, a reflection on the fact that "Feeling great after confession is probably the most widespread experience in Catholicism, a religion not founded on religious experience as such."

*

I never was very good at learning the rules of grammar, and am only bothered by the mistakes which I don't usually make. One such has been bothering me lately. Isn't it the distinction between "transitive" and "intransitive" verbs that describes whether the subject is doing something or having something done to it? Whatever the correct terminology is, the distinction seems to be breaking down. I keep running across sentences which seem to be misusing transitive verbs, if that's the right way to describe it. For instance, I was about to subscribe to a certain magazine a few days ago, and the web site referred to "the Spring issue which releases in March." Shouldn't that be "is released"? 

And this: if you do this or that on your computer, you're told, "a message displays." Shouldn't that be "is displayed"?

And this: "It transforms into...." Shouldn't that be "is transformed into"?

Like I said, I don't really even know the rules. I just play by ear. But these are like off-key notes to me.

I'm giving up on "I'm going to lay down" and "He said to John and I." Those battles are hopelessly lost.

But I reserve the right to laugh when I hear individual persons referred to as "they" and "them" in deference to gender-bending fads. As in "My boyfriend has recently come out as transgender, and they are saying I'm a bigot because I'm thinking of breaking up with them." I actually read something like that a few days ago, though I can't find it now. As I've mentioned before, those who think Donald Trump is inaugurating the reign of Orwellian Newspeak and thoughtcrime are barking up the wrong tree entirely.  

*

We had an actual cold snap last week. By "actual" I mean that the temperature was low enough (mid-20s Fahrenheit), and the wind from the north strong enough (mid-20s MPH), to qualify as being, so to speak, objectively cold. Sometimes in winter, or in a hurricane that hits at the right angle, a strong north wind can blow much of the water right out of the bay. That was the case when this picture was taken on New Year's morning. Under normal conditions the only thing you would see besides water and sky in this picture would be the top of that furthest post. This debris, by the way, is partly trees that fell years ago and partly the remains of the city's original sewer line. I really should try to get the city to remove it.

[Photo: WaterBlownOutOfBay2]

Banana trees do not take the cold very well. This was taken the following morning.

[Photo: BananaTreeAfterFreeze]


Sunday Night Journal, December 17, 2017

One of the things that make being a pessimist less enjoyable than it might be is that the pleasure of being proven right about the impending collision with the oncoming train is substantially diminished by the discomfort of experiencing the collision itself. On balance, I would rather be wrong than right about many of my predictions and expectations.

Recently someone remarked that the culture of Catholicism on the Internet seemed to be dominated by those whose motto was taken from Flannery O'Connor's Misfit ("A Good Man Is Hard to Find"): no pleasure but meanness. You can certainly see his point, and of course it's not only the Catholic subculture that has this problem. It's everywhere. I'm not on Twitter, and I don't think I ever will be, so my impression may be distorted, but that impression is that some very large portion of what goes on there consists of people savaging each other, and sometimes of a large number of people attacking one person in what can fairly be described as a verbal lynch mob. 

Almost twenty-five years ago, in the Fall 1993 issue of Caelum et Terra, I published a short piece in which I considered the potential for the world of online communication to increase the level of hostility in the world rather than to bring people together. I don't think I had heard the term "World Wide Web" at that point. It did exist, but if its existence was known to me at all I'm fairly sure I hadn't yet experienced it, as Mosaic, the first widely popular browser, had not yet been introduced to the world at large. But I'd had five or six years of experience in pre-Web online forums, including Usenet, the text-only discussion groups on the Internet, and at least one of the commercial services which provided similar capabilities by subscription. (America Online, or AOL, you may remember, was pretty big at the time.) And I'd seen the way people could turn on each other with startling viciousness.

That piece was called "Global Metropolis," and I've thought about it often since the Web changed our lives so strikingly. But I don't think I'd read it since it was published, so I decided to dig the magazine out of the closet and, if it still seemed relevant, spend some of this evening's writing time typing it in instead. Well, I did find it interesting as a view from 1993, and at least part of it seems pretty accurate. The next-to-last paragraph is the one that seems a fairly accurate prediction of the Internet's potential for exacerbating rather than pacifying conflicts.

I sometimes think it might be more accurate to say that we are creating a global metropolis, a violent one like those of the United States, where people perceive each other socially either as naked individuals in isolation from family and community, connected, if at all, by financial ties, or as anonymous components of a class or race. In these great warrens, people live in close physical quarters but without much sense of belonging to the same community; with, in fact, a dangerous sense of having tribal enemies everywhere. What this combination of proximity and anonymity has done for the great cities, television, along with increasing economic interdependencies, may perhaps do for the world at large: to increase both the level of tension and the degree of isolation.

You can read the whole piece here (it's not very long, about a thousand words). I certainly didn't imagine anything like Twitter, which as far as I can tell is an active deterrent to substantial discussion, a medium actually intended to reward the quick and superficial remark. The glimpses I get of it make me think of an enormous number of dogs barking hysterically and ineffectively at each other. And if I had imagined it I would never have dreamed that we would have a president who sits in the White House and uses such a ridiculous medium to bark at anyone among the citizenry who annoys him.

I'm not saying that the Internet in general and "social media" in particular are altogether bad--I mean, here I am. And I even have a Facebook account. Still, it seems that the "bringing people together" effect of it is very often to bring them into tribes united by their detestation of another tribe, and thereby to purify and concentrate their anger. The net did not cause the political divisions in this country, but it is certainly exacerbating them.

Still, when I look at the people around me going about the ordinary business of their lives, I don't see this division, and that's a hopeful thing.

*

A couple of weeks ago on the first Sunday of Advent our priest made the point in his homily that Advent is as much about the Second Coming as it is about the Nativity. "We await the birth of a sweet little baby who will one day destroy the world," was more or less the way he put it. 

*

While watching The Crown on a chilly evening last week I discovered that my old dog, who is too blind and deaf to notify us of strangers in the vicinity, still is capable of service as an all-natural toe warmer. 

[Photo: ToeWarmer]

I enjoyed The Crown considerably, by the way, and I think my wife enjoyed it even more. It's wise to keep in mind that they filled in some historical blanks with their own speculations and that it's a mixture of fact and fiction, not a documentary. Claire Foy's performance as Elizabeth is extremely convincing. I must say, however, that I remain puzzled to say the least about the function of the British monarchy.

Have I mentioned that we also recently watched the third series of Broadchurch? I suppose it's not surprising that I didn't like it as much as the first two, but it's still very good. There is apparently little to no chance of a fourth series, which might be a good instance of quitting while you're ahead, but this one left an important matter unresolved, which is a little frustrating. Well, maybe it isn't unresolved--maybe that was just the end of...well, if you've seen it you know what I'm talking about, and if you haven't I shouldn't say any more.


Sunday Night Journal, November 19, 2017

One of the milder vexations of getting old is that I find my hands and feet, especially my feet, feeling cold at times when they would not have in the past. It seems disconcertingly old-man-like. It's especially noticeable when I spend several hours sitting at the computer, which I do almost every day. In summer I turn up the thermostat as soon as my wife, who is still going off to work every day, leaves the house, and let the temperature get into the upper 70s. In winter I sometimes bring a little electric space heater into my office, just like those women in offices whom I've teased for many years. All this annoys me, and I'm a little embarrassed to admit it. The only good thing about it is that getting those feet and fingers warm when they're chilled feels pretty good.

This morning, for instance, it was a little cool--the outside temperature was 50-ish, not exactly cold, but there was a strong and chilly north wind. I went out to walk the dog wearing a light jacket over a t-shirt, and when we got out onto the beach where there's nothing to get in the way of the wind I was quickly cold enough to be uncomfortable, especially as I spent a few minutes dragging some newly-washed-up pieces of Hurricane Nate debris away from the water's edge, thus getting my hands wet and well-chilled.

Back at the house it was time to have a shower, and that warm water felt really good on my hands, and started me reflecting on what a luxury the hot shower is. I remember a conversation with my father some years ago when I was talking about people voluntarily giving up the conveniences of modern life in order to be more self-sufficient or environmentally conscious, or something--I don't remember exactly what the context was, but he remarked that "Not many people are going to want to give up their hot shower."

Well, that's certainly true. I certainly don't. And this morning was one of those moments when I realize just how soft life is for most of us in the industrialized world compared to what it has been for most people in most times and places throughout human history. We take it for granted most of the time, and we tend to assume that what has now been invented and produced and widely distributed will remain available. Not all of us by any means: the post-apocalyptic return to pre-industrial conditions is a staple of fiction and film, and often the subject of very serious and dire warnings from environmentalists. We may deplore this or that aspect of technological civilization--I deplore a number of them, for instance the dominance of the automobile. But almost nobody wants to give it all up. And most of us most of the time don't anticipate that we--humanity at large if not ourselves--will lose it, that what we now regard as basic necessities will disappear.

That's partly because we don't really appreciate the expenditure of time, effort, intelligence, and labor that produced our material way of life, and, more importantly, sustains it. All over the world millions of people are doing millions of things that keep this whole machine operating. Now and then I run across some small business that makes a living by supplying some little thing or service--some type of fastener, maybe--that's a component of some bigger thing and that in the normal course of events no one apart from the people who provide it and the people who make the bigger thing ever gives a second's thought to. And yet the bigger thing, and the bigger thing or things which depend on it, may be something which the rest of us expect to be there whenever we need it, and would be sorely missed if it didn't exist, which it wouldn't do without that little thing and many other little things.

A complex web of law and custom is an equally essential part of this apparatus. And so my train of thought about material progress soon switched toward the question of moral and spiritual progress, and to that "arc of history" which is "long, but bends toward justice" and is sometimes invoked as a guarantor that in the long run progress, as conceived by the person invoking the principle, will prevail. Maybe it will. Christians believe that it will, though the trajectory runs right out of the far end of history into another realm. But Christians have a pretty clear and definite explanation for this movement, and for its source. What's striking about many of those who seem to take it as an article of faith is that they don't seem to ask the obvious questions it raises. It's a faith, but a very vague one: toward what is it directed? What is it faith in? The Force? What? Those of us who have a more traditional faith and, after two thousand years, a very well-thought-out one, but which is frequently treated as irrational, are entitled to question this one; to interrogate it, as the post-modernists like to say.

To say that history has a moral arc implies that it is being directed by something or someone. Who or what is this? The direction must be conscious--surely no one can seriously maintain that matter and energy alone are imbued with moral convictions and the power to impose them. There is certainly no evidence anywhere outside of human affairs that life is cognizant of justice or aspires to it, much less that the concept of justice even has any meaning to the cosmos at large.

And what is that thing called "justice" anyway? Does it exist in some abstract but nevertheless objectively real sense? Is it an absolute and intrinsic good, and its opposite, injustice, an absolute and intrinsic wrong? If so, there is at least one moral absolute, and there may be others. But if not, if it is not an absolute, and can change, how can history be aiming for it? 

The whole idea seems to point necessarily to a moral intelligence that guides, if not the entire cosmos, then at least human history. But my impression is that most of those who cite this inevitable direction of history don't inquire about its source. If you press them on how they know what justice is and why it's a good thing, they scoff and say that it's obviously good, and they don't need anyone to tell them that. And they don't see that the very fact that they hold that belief is in itself a very interesting phenomenon and one that does not, in fact, explain itself, at least not very satisfactorily. What is conscience? Why do we have one and where does it come from, and where does it get its principles? The answers are not obvious. If you don't want to dig for them, that's fine; not everyone is obliged to. But if you shrug off the questions you shouldn't strike a pose of intellectual superiority, and act as if you have the answers.

At any rate, I don't see that the moral structure of our civilization is any more necessary and inevitable and permanent than its material structure, and any less dependent on a complex web of small but significant things, in this case principles, and prejudices in favor of those principles. Contemporary developments pretty well confirm that, in fact. Christian writers have been saying for more than a hundred years that modern civilization is far more indebted to Christian and Judeo-Christian morality than it knows. Those who like the "arc of history" figure also tend to think that the progress it describes will include the abandonment of primitive delusions such as Christianity. It may, at least for some time, but if it does the arc may bend in a direction rather different from the one expected by contemporary progressives. Concern for the weak and vulnerable is hardly an historical constant. 

The idea, by the way, seems to have originated with a New England Transcendentalist, Theodore Parker. That's fitting. Matthew Rose, writing at First Things, has an extensive and interesting piece on all this, "Our Secular Theodicy." Also by the way, on the question of moral progress, I lean toward C.S. Lewis's idea that good and evil progress simultaneously. 

*

Well, once again I've gone on about one thing rather longer than I had intended, and now I don't have time to say much about the other thing I had meant to write about: Stranger Things 2. I'll just say that I have watched it, and I liked it. Neither my wife nor I could remember very much about the first series, so we watched it again before the new one. I liked it better this time, and as of now I'd say I like the first one a little better, but they're both good. The second one is more spectacular, with more (and more disturbing) monsters, and more gore. It's closer to being a horror film than the first, I think. What makes them both so engaging is the principal characters, most of all "the kids," the middle-school group at the center of the story, and some of the adult characters, especially Winona Ryder's portrayal of the slightly nutty but passionately devoted mother, and David Harbour as the sheriff. And in spite of all the terror and violence, there is an underlying sweetness about both series, with love playing a major role. That makes them very different from some other popular series, such as Breaking Bad, or the dark detective shows like The Killing.

I hereby open Janet's Undead Thread v3.0 for spoilers-allowed discussion of Stranger Things.

*

[Photo: SunBehindTree]


Black Mirror: Informing, not Recommending

I recently watched the six available episodes, plus a sort of holiday special, of a British TV series called Black Mirror. It's a sort of sci-fi production, with each episode being a complete story, and no connection other than a thematic one among the episodes. The theme is the extent and effect of our immersion in electronic media and other technology, and the uses to which more advanced developments might be put. I didn't make the connection till I read it somewhere after I'd already seen the show, but the title refers to the glossy black screen of a smart phone, and is the sort of brilliant stroke that any writer would hope to come up with for a title, combining a direct and specific reference with symbolic import.

The reason I'm telling you about it is that it's very well done, very insightful about its theme, and very effective in drawing out its darker possibilities. The reason I'm only informing you about it, not actively recommending it, is that it's all too successful in that effort, and is very disturbing. Just to give you one example: the first episode involves the prime minister of Great Britain being placed in the position of having to perform a truly disgusting act on live television in order to prevent an assassination. 

I was somewhat shaken by that episode, and you will naturally wonder why I watched any more. Well, part of the reason is that I thought surely the other episodes wouldn't be so disturbing. (Why did I think that? I don't know.) And part of the reason is that it did present important insights in a clever and powerful way. None of the other episodes involved anything as disgusting as the first, but every one was disturbing in one way or another. There is little physical violence, but there is a lot of psychological violence, even torture.

So: watch at your own risk. In fact, having written this post, I'm thinking of discarding it. A compromise with my better judgment: I'll post this, but warn you that the episode called "White Bear" and the 90-minute "holiday" special (some holiday!), "White Christmas," involve what seemed to me the most extreme, almost unbearable, psychological torture. (Interesting that they both have the word "white" in the title.) "White Bear" is the 5th episode. Had it been the first, I definitely would not have continued.

There is another series in the works. I think I'll definitely skip it.