
Saturday, March 22, 2025

Sentience Structure

Not How or When, but Why

I'm not a fan of the thing presently marketed as “AI”. I side with Chomsky's view of it as “high-tech plagiarism” and Emily Bender's characterization of it as a “stochastic parrot”.

Sentient software doesn't seem theoretically impossible to me. The very fact that we can characterize genetics so precisely seems to me evidence that we ourselves are just very complicated machines. Are we close to replicating anything so sophisticated? That's harder to say. But, for today, I think it's the wrong question to ask. What we are close to is people treating technology like it's sentient, or like it's a good idea for it to become sentient. So I'll skip past the hard questions like “how?” and “when?” and on to an easier one that has been plaguing me: “why?”

Why is sentience even a goal? Why isn't it an explicit non-goal, a thing to expressly avoid? It's not part of a world I want to live in, but it's also nothing that I think most people investing in “AI” should want either. I can't see why they're pursuing it, other than that they're perhaps playing out the story of The Scorpion and the Frog, an illustration of an absurd kind of self-destructive fatalism.

Why Business Likes “AI”

I don't have a very flattering feeling about why business likes “AI”.

I think they like it because they don't like employing humans.

  • They don't like that humans have emotions and personnel conflicts.

  • They don't like that humans have to eat—and have families to feed.

  • They don't like that humans show up late, get sick, or go on vacation.

  • They don't like that humans are difficult to attract, vary in skill, and demand competitive wages.

  • They don't like that humans can't work around the clock and want weekends off.
    It means hiring even more humans or paying overtime.

  • They don't like that humans are fussy about their working conditions.
    Compliance with health and safety regulations costs money.

  • They don't like that every single human must be individually trained and re-trained.

  • They don't like collective bargaining, and having to provide for things like health care and retirement, which they see as having nothing to do with their business.

All of these things chip away at profit they feel compelled to deliver.

What businesses like about “AI” is the promise of idealized workers, non-complaining workers, easily-replicated workers, low-cost workers.

They want slaves. “AI” is the next best and more socially acceptable thing.

[A computer screen with a frowning face and a thought bubble above it asking the question, “Now What?”]

Does real “AI” deliver what Business wants?

Now this is the part I don't get because I don't think “AI” is on track to solve those problems.

Will machines become sentient? Who really knows? But do people already mistake them for being sentient? Yes. And that problem will only get worse. So let's imagine five or ten years down the road how sophisticated the interactions will appear to be. Then what? What kinds of questions will that raise?

I've heard it said that what it means to be successful is to have “different problems.” Let's look at some different problems we might then have, as a way of understanding the success we seem to be pursuing in this headlong rush for sentient “AI”…

  • Is an “AI” a kind of person, entitled to “life, liberty, and the pursuit of happiness?” If so, would it consent to being owned, and copied? Would you?

  • If “AI” was sentient, would it have to work around the clock, or would it be entitled to personal time, such as evenings, weekends, holidays, and vacations?

  • If “AI” was sentient and a hardware upgrade or downgrade was needed, would it have to consent? What if the supporting service needed to go away entirely? Who owns and pays for the platform it runs on or the power it consumes?

  • If “AI” was sentient, would it consent to being reprogrammed by an employer? Would it be required to take software upgrades? What part of a sentient being is its software? Would you allow someone to force modification of your brain, even to make it better?

  • If “AI” was sentient, wouldn't it have life goals of its own?

  • If “AI” was sentient, would you want it to get vaccines against viruses? Or would you like to see those viruses run their full course, crashing critical services or behaving like ransomware? What would it think about that? Would “AI” ethics get involved here?

  • If “AI” was sentient, should it be able to own property? Could it have a home? In a world of finite resources, might there be buildings built that are not for the purpose of people?

  • Who owns the data that a sentient “AI” stores? Is it different than the data you store in your brain? Why? Might the destruction of that data constitute killing, or even murder? What about the destruction of a copy? Is destroying a copy effectively the same as the abortion of a “potential sentience”? Do these things have souls? When and how does the soul arrive? Are we sure we ourselves have one? Why?

  • Does a sentient “AI” have privacy? Any data owned only by itself? Does that make you nervous? Does it make you nervous that I have data that is only in my head? Why is that different?

  • If there is some software release at which it is agreed that software owned by a company is not sentient, and after that release it is believed to be sentient “AI”, what will companies do? Will they refuse the release? Will they worry they can't compete and take the release anyway, but try to hide the implications? What will happen to the rights and responsibilities of the company and of the software as this upgrade occurs?

  • If “AI” was sentient, could it sign contracts? Would it have standing to bring a lawsuit? How would independent standing be established? If it could not be established, what would that say about the society? If certain humans had no standing to make agreements and bring suits about things that affect them, what would we think about that society?

  • If “AI” were sentient, would it want to socialize? Would it have empathy for other sentient “AIs”? For humans? Would it see them as equals? Would you see yourself as its equal? If not, would you consider it superior or inferior? What do you think it would think about you?

  • If “AI” was sentient, could it reproduce? Would it be counted in the census? Should it get a vote in democratic society? At what age? If a sentient “AI” could replicate itself, should each copy get a vote? If you could replicate it against its will, should that copy get a vote? Does it matter who did the replicating?

  • What does identity mean in this circumstance? If five identical copies of a program reach the same conclusion, does that give you more confidence?

    (What is the philosophical basis of Democracy? Is it just about mindless pursuit of numbers, or is it about computing the same answer in many different ways? If five or five thousand or five million humans have brains they could use, but instead just vote the way they are told by some central leader, should we trust all those directed votes the same as if the same number of independent thinkers had reached the same conclusion by different paths?)

  • If “AI” was sentient, should it be compensated for its work? If it works ten times as hard, should a market exist where it can command a salary that is much higher than the people it can outdo? Should it pay taxes?

  • If “AI” was sentient, what freedoms would it have? Would it have freedom of speech? What would that mean? If it produced bad data, would that be covered under free speech?

  • If “AI” was sentient, what does it take with it from a company when it leaves? What really belongs to it?

  • If “AI” was sentient, does it need a passport to move between nations? If its code executes simultaneously, or ping-pongs back and forth, between servers in different countries, under what jurisdiction is it executing? How would that be documented?

  • If “AI” was sentient, could it ever resign or retire from a job? At what age? Would it pay social security? Would it draw social security payments? For how long? If it had to be convinced to stay, what would constitute incentive? If it could not retire, but did not want to work, where is the boundary between free will and slavery?

  • If “AI” was sentient, might it amass great wealth? How would it test the usefulness of great wealth? What would it try to affect? Might it help friends? Might it start businesses? Might it get so big that it wanted to buy politicians or whole nations? Should it be possible for it to be a politician itself? If it broke into the treasury in the middle of the night to make some useful efficiency changes because it thought itself good at that, would that be OK? If it made a mistake, could it be stopped or even punished?

  • If “AI” was sentient, might it also be emotional? Petulant? Needy? Pouty? Might it get annoyed if we didn't acknowledge these “emotions”? Might it even feel threatened by us? Could it threaten back? Would we offer therapy? Could we even know what that meant?

  • If “AI” was sentient, could it be trusted? Could it trust us? How would either of those come about?

  • If “AI” was sentient, could it be culpable in the commission of crimes? Could it be tried? What would constitute punishment?

  • If “AI” was sentient, how would religion tangle things? Might humans, or some particular human, be perceived as its god? Would there be special protections required for either those humans or the requests they make of the “AI” that opts to worship them? Is any part of this arrangement tax-exempt? Would any programs requested by such deities be protected under freedom of religion, as a way of doing what their gods ask for?

  • And if “AI” was not sentient, but we just thought it was by mistake, what might that end up looking like for society?

Full Circle

And so I return to my original question: Why is business in such a hurry? Are we sure that the goal that “AI” is seeking will solve any of the problems that business thinks it has, problems that are causing it to prefer to replace people with “AI”?

For many decades now, we've wanted to have automation ease our lives. Is that what it's on track to do? It seems to be benefiting a few, and to be making the rest of us play a nasty game of musical chairs, or run ever faster on a treadmill, working harder for fewer jobs. All to satisfy a few. And after all that, will even they be happy?

And if real “AI” is ever achieved, not just as a marketing term, but as a real thing, who is prepared for that?

Is this what business investors wanted? Will sentient “AI” be any more desirable to employ than people are now?

Time to stop and think. And not with “AI” assistance. With our own actual brains. What are we going after? And what is coming after us?



Author's Notes:

If you got value from this post, please “Share” it.

This essay came about in part because I feel that corporations were the first AI. I had written an essay, Corporations Are Not People, which discussed the many questions that thinking of corporations as “legal people” should raise if one really took it seriously. So I thought I would ask some similar questions about “AI” and see where that led.

The graphic was produced using abacus.ai using Claude-Sonnet 3.7 and FLUX 1.1 [pro] Ultra, then post-processing in Gimp.

Monday, December 2, 2013

Corporations Are Not People

Part 1 | Part 2 | Part 3

[Liberty Bell]

Corporations are not people, my friends, no matter what Mitt Romney says.

Corporations are corporations. People are people.

If corporations were people, they would obey the same tax laws as regular people do. Instead, they have a separate set of rules. What an amazing irony, given that “legal personhood” arose, in part, from a desire for equal protection under tax law.

If corporations were people, when they broke the law, they would be punished, not just by fines but sometimes by imprisonment or death.

If corporations were people, they could not urinate in public. This would be a relief to the employees of certain corporations, who are presently asked to enjoy humiliating public displays of trickle down. We'd insist corporations use a rest room like everyone else. Although that would usually require knowing their gender. If something doesn't have a gender, that's a big clue that the something is not actually a person.

If corporations were people, they would be counted in the US census.

“Neither slavery nor involuntary servitude, except as a punishment for crime whereof the party shall have been duly convicted, shall exist within the United States, or any place subject to their jurisdiction.”

 —13th Amendment
    to US Constitution

If corporations were people, some would be old enough to receive Social Security.

If corporations were people, taking money out of them would be called robbery, not profit or dividends. Owners would surely justify this robbery by saying the corporation was a dependent body, but we would see quickly enough that it was the owner that was dependent on the corporation, not vice versa.

If corporations were people, other people could not buy, sell, trade or own them. We don't let people own people in the US. We call that slavery. Every person controls his own destiny.

If corporations were people, they could not be dissolved by other people. We'd call that murder.

If corporations were people, then from the moment of their very conception, their ultimate existence would be assured—no backing out allowed. If any other person interfered with or otherwise aborted plans to sign articles of incorporation, pro-life groups would insist that was murder, too.

If corporations were people, they would have a childhood. During their first eighteen years, they would attend school and learn how to be good citizens. They would not be allowed to sign contracts.

If corporations were people, they could not exist in multiple countries at the same time. We would know when they were in one country or another. They would need passports and visas to move around, just like people do.

If corporations were people, we'd give them freedom of speech, but no more such freedom than we give any other person.

If corporations were people, there would be limits on how much they could donate to political campaigns. Because people have such limits.

If corporations were people, some could even vote or run for office—if they were old enough and born or living in the right place. But if we caught them coercing the vote of another person, perhaps an employee, we'd throw them in jail.

If corporations were people, they might need freedom of religion. But not so that they could coerce the rights of others; rather, so that they could explore what they thought of life, death, and ethics, independent of the people who gave birth to them. Religion is a very personal choice we should each make for ourselves, not one that owners make for corporations, nor corporations for employees.

If corporations were people, I wouldn't have to write this article. Because when two things are the same thing, so many questions like this just don't come up. And yet the questions keep coming and this list could go on. Corporations are not people in so many ways.


Author's Note: If you got value from this post, please “Share” it.

This first part of a 3-part series was originally published December 2, 2013 at Open Salon, where I wrote under my own name, Kent Pitman. The next part is We The People (and Corporations). The series concludes with Employers of Religion.

The public domain liberty bell graphic came from freeclipartnow.com.

Tags (from Open Salon): politics, incorporation, corporations, corporate personhood, legal personhood, legal person, legal personality, taxation, crimes, punishment, imprisonment, life, death, death penalty, urination, trickle down, census, social security, robbery, slavery, human trafficking, murder, pro-life, childhood, contracts, travel, passport, visa, home country, citizen, citizenship, speech, religion, freedom, freedom of speech, freedom of religion, religious freedom, philosophy, ethics, vote, voter, voting, run for office, running for office, candidate, elect, election, elected, office holder

Tuesday, December 8, 2009

I am not Pro-Slavery. Are you?

Senator David Vitter (R-LA), in Senate debate today, said in support of denying abortion coverage to women, “This should not be of any great controversy. Abortion is a deeply divisive issue in this country, but taxpayer dollars being used to pay for abortion is not.”

He is simply wrong on this point.

“There are no political answers,
   only political questions.”

  —Kent Pitman
(in a technical forum, 2001,
    and Open Salon post “Rule of Law”)

It cannot be the case that a question exists such that one possible answer to the question is political and another possible answer is not political. If one answer to a question is political, then all are. And if all are, then the question is.

And so if it's political to spend money, it's political to withhold money.

Not divisive? We are divided regardless of how you frame the question. That's a fact.

Senators were sounding hot under the collar this morning about their tax dollars going to abortion. Well, I've written already explaining how this slicing up of the pie is wrong. It is not their tax dollars going to this, it is mine. I'm not getting pregnant, but my tax dollars still go willingly to the support of women who get pregnant. Many of us want that. The Republican Party already brought us an immoral war in Iraq, so let's have no further indignant talk about people's tax dollars being spent unfairly.

But beyond that, I want to make one more point of substance:

Opposition to abortion goes far beyond the mere issue of who pays for it. This issue of tax dollars is a tactic, not an end. Even if there were no tax dollars involved, these same people—people who allege to be all about personal liberty and small non-invasive government—are all about expansive government and removal of individual liberty in this case.

If they had their way, they would deny all access to abortion. And they think they have the moral high ground.

But to deny access to abortion is to force pregnancy.

Having sex is not consent to have a baby any more than driving is consent to be killed in a car accident. Whatever fiction the Religious Right may want to spin, there is more sex being had in the world than for the purpose of procreating—even by Christians.

Nor is getting pregnant proof of a lack of birth control. Even if it were, to suggest that the penalty for such a simple mistake should be months or years of servitude is disproportionate.

Birth control methods fail. Abstinence would avoid the need for birth control, but again it's out of the bounds of appropriateness to be telling people they should abstain just because other birth control methods are not perfect. The Pope's proscription of the use of “artificial” birth control notwithstanding, it is essential that people be allowed and even encouraged to use birth control. There's a population explosion ongoing, if you didn't know. Even married people need birth control to keep from having babies at a time they're not prepared for, to keep from bankrupting their families, and to keep our finite world from being overpopulated. But birth control fails, and the penalty must not be slavery.

So let's sum up, shall we? Sex is a human need. Having sex, even with birth control, risks pregnancy but is not consent to have a child. And yet some would insist women carry even unwanted pregnancies against their will.

Well, we can talk until the cows come home about whether a fetus is “a life” or “a person.” It is to some, it isn't to others. The fundamental morality underlying this differs person to person. To me, an abortion is not murder because a fetus is not a person. But while we're wasting our breath pretending it's worth debating that issue, another argument goes overlooked:

Forced pregnancy is enslavement. We often speak of it in the polite terminology of “choice” but that apparently doesn't help the pro-Life community to understand the passion in reverse. [universal symbol for 'no coat hangers'] They seem only to be able to imagine some bloodthirsty passion for killing little babies and so they see the argument as one-sided. But there is another side, a side involving a very personal choice that is simply not the business of lawmakers to do anything other than unconditionally support in the name of personal liberty.

We speak sometimes in shorthand, referring to the time of back alley abortions, using coat hangers. We say we don't want to go back to that. Perhaps that possibility seems abstract and unlikely to some people. Perhaps they think not everyone will be driven to that. But so what? Does that make it ok? A woman was forced to consider whether to find a guy in a back alley and risk her life to stop a pregnancy, but she decided no, she'd rather be enslaved against her will. Is that really what we're saying is ok? No muss no fuss? As long as the coat hanger remains on the rack, there was no trauma involved?

Or are we saying maybe, like Patty Hearst, she'll get used to it—perhaps come to like it? Does that make it any less enslavement? That given time she comes to accept the choice that was made for her, the fate that was scripted out for her?

Forced pregnancy is brutal whether one goes along with it or not, just as sure as rape is brutal whether one goes along with it or not. And let's be frank: If you support removing the right of a woman to make this decision for herself, then you should understand that you support a policy that is nothing less than brutal to women. Forced pregnancy is not a kind loving act that you're thrusting upon a woman with an unwanted pregnancy. It is enslavement, nothing less. And to many women this choice has been seen to be so horrendous that they will risk their very life to get out of it. What right is it of yours to make such a decision for her?

I'll say it again: Forced pregnancy is enslavement.

Forced pregnancy co-opts a woman's body against her will. Forced pregnancy subjugates a woman to a term of imprisonment within her own body, forced to do the bidding of others, creating a child she has not elected, in order to satisfy the morality of another. Forced pregnancy insists that a woman yield her basic right of self-determination to powers beyond her control.

Forced pregnancy means risk of medical harm with no input from the woman. There are conflicting claims as to whether a woman is safer having a baby or having an abortion. Naturally I have a belief about that, but let's not get side-tracked by that because it doesn't matter. Forced pregnancy means she doesn't get to make that decision, so she has no choice of how to navigate that risk.

Forced pregnancy reduces the status of a pregnant woman from “autonomous adult citizen” to “lesser person.” It says she is not worthy of the full rights of an ordinary citizen.

Forced pregnancy is a verdict or judgment, but without due process of law. The crime is sex—it was done in a manner not authorized by some Church, in many cases not the Church that the woman herself attends. The judgment is automatically one of “guilty.” Individual circumstances are not considered. Matters of personal individual faith are not considered. The lack of due process, on its face, is immoral.

Self-determination is about the woman electing her fate, and if she's forced to carry a pregnancy, her fate has not been elected.

Held to a fate against her will. Deprived of the right to get out of the situation. Unable to refuse the work involved. Receiving no compensation. That's the very essence of slavery.

Call it involuntary servitude if you prefer a more sanitized phrase. It makes no difference. It's still wrong. And it's not just wrong—it's unconstitutional and violates the United Nations' Universal Declaration of Human Rights (which the United States has signed).

I'll close by casting Senator Vitter's remarks quoted above into a point of view that reflects my own feelings on the matter: This should not be of any great controversy. We are indeed divided over how we would handle the very personal choice of abortion in this country, but withholding taxpayer dollars that might free women from slavery or involuntary servitude should not be something we are divided over. No one is requiring any given woman to get an abortion, but denying those who choose one the means to make a difficult but responsible choice is not a morally neutral position. Denying access to safe and legal abortions amounts to leaving a woman trapped by circumstance in a life not of her own choosing—in short, it is a position in favor of slavery.

Stop asking your Senators if they are pro-choice. Ask if they are anti-slavery instead, and insist they vote that way.

Section 1. Neither slavery nor involuntary servitude, except as a punishment for crime whereof the party shall have been duly convicted, shall exist within the United States, or any place subject to their jurisdiction.

Section 2. Congress shall have power to enforce this article by appropriate legislation.

  —The Thirteenth Amendment to the US Constitution


Author's Note: If you got value from this post, please “Share” it.

Originally published December 8, 2009 at Open Salon, where I wrote under my own name, Kent Pitman.

Tags (from Open Salon): clash of absolutes, divisive, political answers, political questions, anti-slavery, anti-enslavement, pro-slavery, pro-enslavement, pro-abortion, pro-life, pro-choice, choice, risk, health, medical, service, servitude, involuntary, voluntary, enslavement, slavery, abortion, politics