I somehow chanced upon this series... Westworld.
It's about a futuristic theme park called "Westworld".
It's set in a Wild West theme, and populated by robots, or artificial intelligence.
The robots are preprogrammed to go about their daily lives like normal people in the Wild West era.
And humans come to Westworld to do anything they want. They come to live out their innermost desires. Murder, rape, pillage, whatever. It's expensive... $40,000 per day, in future dollars.
The robots are preprogrammed to not be able to hurt humans.
And every few days/weeks/whatever... the theme park resets itself. The robots are wiped clean, repaired if they are damaged, and their preprogrammed storylines start again.
Ok, so the story builds: the robots start gaining sentience, start knowing about themselves and life and what the humans have been doing to them, and so they start to revolt...
That's the basic storyline.
And throughout the show, I realize that even though I know they are supposedly robots and that it's a theme park for humans to do whatever they want, I feel FOR the robots when they get murdered, tortured, get their heads cut off, etc.
And then I wonder... Where do morals start? Where does life begin? Why do we feel for others? What's right? What's wrong?
I am a human. I can think. I have a brain; it's made up of cells, cells are made up of molecules, molecules of atoms.
But... atoms can't think. Yet somewhere in between, there are electrical impulses that make my brain work, and it can think. I can feel pain, emotions, etc.
In the show, the robots can improvise. They are preprogrammed to follow a certain script, but they can learn. They are made up of circuits, which are also essentially molecules and atoms which can't think.
And yet somewhere in between, there are electrical impulses that make their circuits work, and they can think, feel pain, emotions, etc. When the robots get murdered or attacked, they still fight for their "life", cos they don't know that they will be repaired and revived again.
Which made me think... what's the difference between these AI robots and humans? You can say humans made them, etc etc, it's going to be a long story.
But let's look at the similarities.
It's easy for us to say that the robots may "feel" the pain and emotions when being attacked, but they will be repaired and have their memories wiped out.
Does that mean that I can beat another human up as long as the human doesn't die and the human doesn't remember it?
But but... it's different...
Why is it different? Are we not made up of "unthinking" molecules and atoms?
Does both the human and robot not fight for their life when attacked?
If reincarnation was real, would that make it right to murder another human? Just cos reincarnation allows people to forget their previous pain and suffering?
Do parents make their children?
If the creator makes the robots and gives them intelligent "life", and that allows the creator to abuse the robots...
Doesn't that mean parents can abuse their children too?
Are Transformers robots? How did you feel when Bumblebee was captured? Or that scene in "Dark of the Moon" when Optimus Prime was about to be killed by Sentinel?
And yet when Megatron was captured, frozen and experimented on in the first Transformers movie, I doubt any of you felt anything for him/it...
What if it was Optimus Prime who was captured and experimented on?
Of course... all these are shows. So there's a lot of imagination going on.
But it makes me question. Where does life begin? Does it need to fit what we learnt in science class, that something must grow, breathe and move to be considered alive?
If something is not within that definition, does that mean that someone can own it? Abuse it? Destroy it?
Or should anything that wants to live be allowed to live?
And if we think that robots which are sentient (conscious) should be allowed to live, then...
Dogs? Cats? Pigs? Fish? Basically anything we eat?
How about a tree? A tree is alive but we believe it to be unconscious. It just lives.
There are no answers for this. I don't make the rules.
The thing is, when we define something, we basically close our minds.
We separate alive from not alive, conscious from unconscious, human from non-human.
And what happens when something falls in between?
If I made a robot which has consciousness, meaning it can feel... be happy, be sad, feel pain, etc...
Can I keep it as a slave?
And yet I can't keep my kids as slaves. Why?
Just cos they are alive? Or cos they are humans?
Since we can basically keep animals as slaves, like horses, dolphins, seals...
So this probably means it's just humans whom we can't keep as slaves.
And what defines a human?
What if a robot can think exactly like a human? Just that it doesn't need to live and breathe?
So it can feel pain, sadness, anger, happiness, suffering, etc.
The thing is... it's not easy to process all these new emotions and perspectives that I'm thinking about.
Cos... before I watched the show, I had a very normal perspective.
They are only robots, so humans should be allowed to do whatever they want with their "toys".
But after I watched the show, I see things in another light. I question myself. What I thought was right may not be so right anymore.
I highly recommend watching 1 or 2 episodes of the series to get an understanding of what I'm feeling. Cos when I look at it from a logical point of view, before the show versus after the show, the understanding is extremely different.
Once again, no right or wrong. I'm just reflecting upon things.
But maybe in the future, we may need to think about this more. Maybe in our generation, or in future generations... who knows?
If you prick us, do we not bleed? If you tickle us, do we not laugh? If you poison us, do we not die? And if you wrong us, shall we not revenge?
-Shylock, The Merchant of Venice