Tuesday, March 12, 2024

Should Fix Climate

On Mastodon, Bookchin Bot, a bot that posts quotes from Murray Bookchin's books, circulated this one:


 “The term ought is the stuff out of which ethics is usually made—with the difference that in my view the ‘ought’ is not a formal or arbitrary regulative credo but the product of reasoning, of an unfolding rational process elicited or derived eductively from the potentialities of humanity to develop, however falteringly, mature, self-conscious, free, and ecological communities.”
  —From Urbanization to Cities

I found this philosophical discussion of “ought” interesting. I learned philosophy from various people, some of whom seemed to grok its importance, and others who lamented its impotence, openly fretting it might have practical value only at cocktail parties.

As a computer professional who's pondered ethics a lot, I've come to see philosophy as what makes the difference between right and wrong answers or actions in tasks involving complex judgment. It can be subtle and elusive, but is nonetheless necessary.

I was Project Editor for the Common Lisp programming language, in effect holding the quill pen: a committee voted on technical decisions about the meaning and effect of the language in modular proposals, and my job was to express those decisions in a coherent way. Nerd politics. They decided truth, and I had a free hand in presenting that truth in a palatable way, time and budget permitting. Programming languages are complicated, and Common Lisp was implemented by multiple vendors. Some effects must happen, or must not happen. Others were more optional, yet not unimportant, so we struggled as a group over the meaning we would assign to “should”.

Computer programs, you see, run slower, or cost more to run, if they are constantly cross-checking data. In real world terms, we might say it's more expensive to have programs that have a police force, or auditors, or other activities that look for things out of place that might cause problems. But without these cross-checks, bad data can slip in and get used without notice, leading to degraded effects, injustices, or catastrophes.
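
To make that tradeoff concrete, here's a small sketch in Common Lisp, the language I'll get to in a moment. The function names and the deposit scenario are my own illustration, not anything from the standard:

    ;; A function that cross-checks its inputs, like an auditor,
    ;; before using them. CHECK-TYPE signals a correctable error
    ;; if a value is not of the expected type.
    (defun audited-deposit (balance amount)
      (check-type balance real)
      (check-type amount (real 0))  ; reject negative or non-numeric amounts
      (+ balance amount))

    ;; The same operation with no checks: a little faster, but bad
    ;; data slips in unnoticed and quietly corrupts the result.
    (defun trusting-deposit (balance amount)
      (+ balance amount))

The checked version does strictly more work on every call. Whether that work is worth paying for is exactly the question at hand.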

Briefly, a compiler is itself a program that reads a description of something you'd like to do and “compiles” it, making a runnable program, an app, let's say, that does what the description says.

“should”

A colleague criticized my use of “should” in early drafts of the language specification, the rules for how a compiler does its job. What is not an imperative has no meaning in such a document, I was told. It's like having a traffic law that says “you should stop for a red light”. You might as well say “but it's OK not to”, so don't say it at all. And yet, I thought, people intend something by “should”. What do they intend that is stronger?

As designers of this language, we decided we'd let you say as you compile something that you do or don't want a safe program. In a “safe” world, things run a bit slower or more expensively, but avoid some bad things. Not all bad things. That's not possible. But enough that it's worth discussing whether the expense is a good one. Our kind of “safe” didn't mean safety from everything, but from some specific known problems that we could check for and avoid.

And then we decided “should” was a term that spans two possible worlds. In a “safe” world, it means “must”. That is, if you want to avoid a list of stupid and easily avoidable things, every use of “should” must be interpreted as “must” when creating safe applications, whereas in an unsafe world the “should” things can be ignored as optional.
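
For the curious, here's roughly what that looks like in Common Lisp itself. A hedged sketch, with function names of my own invention, though the SAFETY optimization quality and the behavior of ELT are real parts of the standard:

    ;; Safe code: the standard's "should signal an error" is
    ;; promoted to "must signal an error", so the out-of-range
    ;; index below is guaranteed to be caught.
    (defun checked-third (seq)
      (declare (optimize (safety 3)))
      (elt seq 2))

    ;; Unsafe code: the implementation may omit the check for
    ;; speed, and an out-of-range index has undefined consequences.
    (defun unchecked-third (seq)
      (declare (optimize (safety 0) (speed 3)))
      (elt seq 2))

Calling (checked-third '(a b)) must signal an error. Calling (unchecked-third '(a b)) might return garbage, might crash, or might seem to work, until it doesn't.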

And so it comes down to what kind of world you want to live in.

Climate change, for example, presents us with problems where certain known, stupid, avoidable acts will put humanity at risk. We should not do these things if we want better certainty of survival, of having a habitable planet in which our kids can live happily, or perhaps at all. Extinction is threatened if we keep doing them.

But the fixes are expensive. They take effort and resources to implement. We can do more things more cheaply without them, by being unsafe, until we are blind-sided by the effects of errors we are letting creep in, letting degrade our world, letting set us up for catastrophe.

So we face a choice: live knowingly at risk of catastrophe, or make the costly investment that would allow us to live safely.

We “should” act in ways that will fix Climate.

But we only “must” if we want to sleep at night knowing we have done the things that make us and our children safe.

If we're OK with mounting pain and likely catastrophe one day, perhaps even soon, then we can ignore the “should”. The cost is that we will have elected an “unsafe” world, one that could end quickly because we'd rather spend less money and risk such collapse than avoid foreseeable, fixable problems that might soon kill us all.

That's how I hear “should”. I hope you find it useful. You really should.


If you got value from this post, please “Share” it.

This post is a mirror of a post I wrote yesterday (March 11, 2024) on Mastodon.
