Friday, July 14, 2023

Lying to Ourselves

My friend David Levitt posted this hypothesis on Facebook:

Humans are so mentally lazy and emotionally
dishonest about what they know, soon AI will
be much better leaders.

I responded as follows. Approximately. By which I mean I've done some light editing. (Does that mean I lied when I said this is how I responded?)

I think the notion of honesty here is a red herring. There are a lot of human behaviors that do actually serve a purpose, and if you're looking for intellectual honesty, it's as much missing in how we conventionally summarize our society as in how we administer it or ourselves.

Of course we lie sometimes.

  • We lie because not all answers are possible to obtain.
    What is an approximation to pi but a lie?
  • We lie because it comforts children who are scared.
  • We lie because it's more likely to cause success when you tell people your company is going to succeed than if you say "well, maybe" in your pitch to rally excitement.
  • We lie because telling people they are blameless saves face for those who tried very hard, or who never had a realistic chance of affecting the outcome.
  • We lie because some things are multiple-choice and don't include the right choice.
  • We lie because it protects people from danger.
  • We lie because some things happen so fast that abstractions like "now" are impossible to hold precise.
  • We lie because we are imprecise computationally and could not compute a correct truth.
  • We lie because not all correct truth is worth the price of finding out.
  • We lie because papering over uninteresting differences is the foundation of abstraction, which has allowed us to reason above mere detail.
  • We lie because—art.

So when we talk of machines being more intellectually honest, we'd better be ready for what happens when all this nuance that society has built up for so long gets run over.

Yes, people lie for bad reasons. Yes, that's bad and important not to do.

But it is naive in the extreme to say that all lies are those bad ones, or that of course computers will do a better job, most especially computers running programs like ChatGPT that have no model whatsoever of what they're doing and that are simply paraphrasing things they've heard, adding structural flourishes and dropping attribution at Olympic rates in order to hide those facts.

Any one of those acts which have bootstrapped ChatGPT, by the way, could be called a lie.

Author's Notes:

If you got value from this post, please “share” it.

Laziness is also misunderstood and maligned, but that is a topic for another day. For now, I refer the ambitious reader to an old Garfield cartoon that I used to have physically taped to my door at my office, back when offices were physical things one went to.
