Slaves & Robots
Robot (definition): from the Czech "robota", forced labour; i.e. a serf/slave
Serf/slave: worker with no rights, forced to work, no pay, no benefits. The condition passed down from parents to children.
Please note: No race has been spared from slavery. There were white slaves as there were black, yellow, and brown slaves.
If the usage of color for races seems derogatory kindly provide the appropriate terms and I'll make the necessary corrections.
My question: Will/Should the descendants of slaves (basically all of us) use robots?
Comments (61)
I was just wondering if we might see a little bit of our painful history as slaves in the way we treat (mistreat?) robots and if that might make us at the very least think twice about using robots. We are, after all, going to do to them the exact same thing we did to each other back in the heyday of slavery. :grimace: No?
Sentience seems to be key to your position but, if I may say so (I may be completely wrong, of course), the problem with slavery wasn't a deficiency/absence of sentience in the slaves but actually in their masters.
Why not start the question with the enslavement of humans (or animals) instead of robots? We treat classes of humans, who are far more likely to be sentient than whatever constitutes a "robot", like shit.
Should they use washing machines and cars?
Quoting TheMadFool
We already do. It won't be bad for the robots, unless they develop the ability to suffer.
Bearing in mind we enslave animals for our taste pleasure, it would not be unreasonable to assume we would enslave sentient robots for our pleasure too.
:up:
What I want to bring to your attention is a rather simple fact: back when slavery was the norm, slavemasters treated human slaves the same way we intend to treat robots in the future. The sentience of human slaves was completely ignored, i.e. human slaves were treated as if they weren't sentient. In other words, human slaves were equivalent to robots for all intents and purposes.
Thus, I was just curious about how all of us - white, yellow, black, and brown - having a family history of slavery would feel about using robots because there's no difference between slaves and robots. The fact that slaves were/are sentient human beings is irrelevant because they were treated as if they weren't. That's the whole point of slavery and robotics - in the latter case, sentience is absent and in the former case, sentience is deemed absent.
Quoting Nils Loc
Read my reply to 180 Proof above.
Quoting Down The Rabbit Hole
Indeed the "ability" ( :chin: disability?) to suffer is key to the ethics of slavery, but my point is that those involved in the slave trade closed their eyes to the suffering of the slaves, which boils down to treating slaves as robots.
All I'm saying is that when slavery was all the rage, if someone had invented reasonably functional robots, slaves would've had the same rights as robots i.e. no rights at all!
Thus, if I were an emancipated slave, I would see my past (slavery) in a robot and would, for that reason, feel uncomfortable around robots to say nothing of using them.
Do folks born with the inability to feel pain suffer as much psychologically as normal folks? I wonder how much the "ouchie" kind of pain shapes the ability/capacity to feel embarrassment or guilt, or whether such pathways to suffering are more functionally independent in the human person, related to an absence of desire. Are the "pains" of hunger completely unconditioned by/unrelated to "ouchie" pain?
Assuming a person was indestructible, more like a robot built to be super tough, and could not feel pain, like a toaster, they would not suffer at all compared to a normal person.
Maybe we as humans are to be the nociceptors and sex organs of the machine world.
:ok: I won't pursue the matter further.
Quoting Nils Loc
Interesting!
If robots become too human then we will be forced to treat them morally for our own sakes.
Of course, we will do so. It will depend on our incomes and the purpose we put the robot to. But, who knows? Probably these robots will have some rights in the future. I guess we cannot hire them under an agreement or according to civil law :chin:
Slaves are human and yet we treated them like robots!
Sure, but it was at a very high cost to ourselves.
Yes, losing our "humanity" is what some would call it, and that too for a "few" bucks! The slavemasters' luxurious and comfortable lives were tainted with the blood, sweat, and toil of slaves. They didn't realize that. Hopefully it wasn't that they just didn't give a shit!
Well, there are benefits and drawbacks to both cooperation and competition.
Society now considers slavery to have more drawbacks than benefits.
I wouldn't view slavery that way. It's not about pros and cons. It's about what's right and wrong, good and bad. What's good is good no matter how many or how severe the drawbacks. Bad is bad no matter how many or how great the advantages. Perhaps this is the dreamer in me speaking, because I've noticed that as of late, being bad is a huge drawback - people immediately and viciously call one out on it. On the flip side, being good gets you all the best deals there are. Mind you, this is what it looks like. Smoke and mirrors?!
Evil has benefits in the short term. Drawbacks in the long term.
Take more spacetime into account and being good has more benefits most of the time.
Taking more spacetime into account requires more intelligence and experience.
But why do we need benefits? Because we are needy.
Fascinating! Well put. However, this may not necessarily be true: it's possible that evil has far-reaching positive effects. That, however, is not a reason to be bad. I think those who resort to evil on the maxim that the ends justify the means lack imagination!
Nevertheless, good & bad aren't about gain/loss and if you insist they are then you'll have to concede the value inversion that takes place: loss is good & gain is bad. Of course I'm talking of personal gain and loss. As soon as an other is involved, gain is good & loss is bad, the world is right-side-up again.
I'm having trouble equating good with benefit and bad with drawback. It's as if we're monetizing morality, and while I don't see an actual issue with it, my gut instinct is to resist such an interpretation. Maybe it's just word play in the end.
Quoting hope
Again, let's not reduce morality to economics.
Quoting hope
Spacetime? Indeed, it's all about space and time, both are scarce resources. Robots can save a lot of time and will also be able to take out more from a given space than a human can. As for slaves, they were merely substitutes - replacing a freeman with a slave meant that the former had time to do other things and also could own vast tracts of land (space).
Quoting hope
See above.
Right and wrong
Good and bad
Good and evil
Are three totally different things. Although related.
Expand and elaborate...please.
Update
Imagine if I tell you that I have an entity X and I make X work from daybreak to nightfall in my house, I bought X from the market, I don't pay X for the work X does, and if X is unable to do the work assigned to X, I simply abandon X and go to the market and find a replacement. With the information I've provided, only the clues given, can you tell whether X is a slave or a robot? :chin: There's something robotish about slaves and, conversely, there's something slavish about robots.
"I was just wondering if we might see a little bit of our painful history as slaves in the way we treat (mistreat?) robots
Are you serious?
Quoting khaled
Is it me? I'm not sure, but my point is there's an overlap in the traits that define slaves and robots. Commonalities tend to elicit a sense of oneness, unity, camaraderie, brotherhood, kinship between categories that relate in this way. For instance, we call primates our cousins, and once we view apes in this light, we immediately become reluctant to do bad things to them, things we have no qualms over doing to other animals.
Let it not be forgotten, though, that this is not a hard and fast rule - chimps are test subjects in the dangerous phases of drug trials, bush meat sells, etc. - as the following character in Shakespeare's play Macbeth laments,
[quote=Donalbain]The near in blood, The nearer bloody.[/quote]
Setting the exceptions to the more or less general rule of acknowledging our genetic closeness aside, it's safe to say that we do mind harming/hurting our less-evolved cousins.
If so, we, as scions of slaves ourselves, should feel, if our heart is in the right place, some degree of distress when using robots. That's all.
Yes. Are you serious?
Read above.
Do you really think we treat robots like slaves? That presupposes robots have experience. But they don't. You could ask how I know, but I know...
Then this is where we part ways...Good luck, fellow traveller.
I can smash every robot in the world without feeling remorse.
Please, please, don't do that. Take this as a plea, an earnest, heart-felt entreaty. :pray:
If that hurts you I won't do that. Unless the robots become too many, threatening Nature. :wink:
There is light at the end of this tunnel :sweat:
"There is light at the end of this tunnel :sweat:
"
No! It's another train... :gasp:
Thanks for the laugh.
I don't know whether to laugh or cry.
:smirk:
Why? "Robots" (e.g. electric can-openers, department store escalators, clocks, vaccines, seeds) are not sentient in any functional, or recognizable, manner.
On target, 180 Proof - not something new to you. I guess I'm trying to draw a rather disturbing similarity between machines - the most popular term for AI (The Matrix/Terminator/others) - and slaves. We treated one (slaves) as we treat the other (machines).
Of course, there's a very good reason why we do this despite our own slave heritage (black/white/brown/yellow) - machines aren't (as of yet) sentient. Nonetheless, this :point: sex dolls suggests that once we have humanoid robots, it's going to get pretty hard not to get emotionally attached to them, sentient or not. If I can engage in coitus with a sex doll and experience pleasure, even if only submaximally, I am still treating the rather unfortunate sex doll as a person; if this isn't true, what happened to perfectly reliable old-fashioned handjobs?
(Btw, I hope to live long enough to "see" fully functional, nonsentient / p-zombie & customizable (adult-form only!) sex dolls sold at an affordable price on Amazon. :party: :yum:)
Point made, point taken!
Nonetheless, sex dolls probably give a man an experience superior to wanking - as if in bed with a real woman - and that's its selling point, if I'm anywhere near the truth. Therein lies the rub.
Quoting 180 Proof
May your dreams come true, 180 Proof, may they be,
[quote=Bernardo Kastrup (on psychedelic experiences)]Realer than real (Hyper-reality).[/quote] :up: :lol:
You ARE serious!? :lol:
@180Proof :lol:
As long as you don't have to blow (them up) yourself...
:lol:
[quote=Confucius]Serious, bad sometimes, good sometimes. Seriously sick, always bad.[/quote]
:smile:
The saving grace:
If Aristotle were alive today, in the age of automation, there is no reason to believe that he would defend slavery.
[quote=Aristotle]If every instrument could achieve its own work, obeying or anticipating the will of others, like the statue of Daedalus...if, likewise, the shuttle could weave and the plectrum touch the lyre, overseers would not want servants nor would masters slaves[/quote]
Perhaps, instead, Aristotle would deplore automation (à la Heidi's 'ontological ludditism') as even more dehumanizing – contra the "telos" of the "zoon politikon" – than (what he calls "natural") slavery.
:point: "Commerce is our goal here at Tyrell. 'More human than human' is our motto." ~Dr. Eldon Tyrell, Los Angeles, 2019
Yes.
What I'm seeing here are two things:
1. You display more sympathy and empathy for other people than the average human does.
2. Your line of reasoning seems to work on the premise that people (should) internalize the identity as ascribed to them by others.
E.g. that if a slave owner believes that slaves are in some essential way subhuman, and expects his slaves to believe this about themselves, that the slaves will or should believe it.
To what extent is one what other people claim that one is?
A person's identity is not autonomous, and is to an extent determined by other people's claims about said person's identity. One cannot escape other people's ideas about who one is.
Is it indeed dehumanizing if there is disagreement about what is going on?
The slave owners didn't think they were dehumanizing the slaves. Probably many slaves didn't view their treatment as dehumanizing either, but, depending on the particular system of slavery, as a punishment, or "just the way things are".
This question makes no sense to me.
What a stupid thing to say, baker. So fucking what? Millennia of 'devout Christians' didn't think marital rape was "dehumanizing" either just as many 'devout Hindus' and 'devout Muslims' still don't think honor killings are "dehumanizing". What the slave owners thought – rationalized – they were doing doesn't mean shit in light of what they knew – what we know – they were actually doing: forcibly, violently, rapaciously enslaving other human beings.
Fair to infer from this statement you also believe that "probably many slaves" weren't as human as the "slave owners". :shade:
It's a type of reasoning you're well familiar with and do not shy away from practicing.
Look at yourself, the disparaging things you say about me. You surely expect me to believe them, to see you as the person who defines me.
There you go. Exactly like the slave owners, religious people, etc. etc.
You define me.
Objective reality is on your side.
I am whatever you say that I am.
You speak The Truth.
Always use you-language.
Millennia of philosophy down the drain.
I'm not sure how to respond to this comment. I may have my quirks, though.
Quoting baker
Nope, that's not the way I see things. However, I'm not denying that that's the way it is: our self-worth seems tied to how others view/regard us. I think, despite how annoying it is, there's a really good reason why it's like that. Speaking for myself, hypotheses non fingo.
:up: You really can cut through all the noise!
I recall a discussion we had before on the nexus between money and slavery, and I believe I've hit upon an idea of how to make people good or, if that's not possible, less evil. Make evil expensive and/or good cheap. Money has its own logic - people always seem to understand, and are more reasonable, once money's involved. Simple.