



AI & robot rights


15 replies to this topic

#1 Coops

  • 🌧️🌩️🌧️




  • 3,602 posts



Posted 24 February 2017 - 07:31 PM

I didn't want to post this in debate since I don't specifically want to know whether you're for/against robot rights and why (you can still discuss that if you want). I would like to hear general thoughts and opinions about AI, robots and robot rights. How soon do you think we will see AI? Do you think AI could/would program themselves to feel pain/emotions? How would you personally respond to an AI that could feel pain? Do you think robots will demand rights? 
 



#2 Adam

  • Coffee God


  • 4,347 posts



Posted 24 February 2017 - 07:49 PM

We have a cheeto with legs as President. I give it 10 years before the situation you've described comes around, and another 50 years until America is nothing but ruins. Skynet is upon us!



#3 Keil

  • Above Average Mediocrity


  • 6,343 posts



Posted 24 February 2017 - 07:58 PM

How soon do you think we will see AI?

 

Do you think AI could/would program themselves to feel pain/emotions?

 

How would you personally respond to an AI that could feel pain?

 

Do you think robots will demand rights? 

 

17 years. I am sure of it. I have a tin foil hat and a brochure to space camp.

 

No. Humans do the programming. AI could only build upon it and evolve.

 

Well, I'll treat them the same as humans... shittily. 

 

Yes. It'll end up being the Animatrix / that one Kids Next Door episode where they made adults in order to be lazy. Spoiler alert, it'll not turn out well for us.



#4 Coops

  • 🌧️🌩️🌧️




  • 3,602 posts



Posted 24 February 2017 - 08:22 PM

17 years. I am sure of it. I have a tin foil hat and a brochure to space camp.

 

No. Humans do the programming. AI could only build upon it and evolve.

 

Well, I'll treat them the same as humans... shittily. 

 

Yes. It'll end up being the Animatrix / that one Kids Next Door episode where they made adults in order to be lazy. Spoiler alert, it'll not turn out well for us.

If AI can build and evolve, why would they not be able to evolve to program themselves to feel?



#5 Keil

  • Above Average Mediocrity


  • 6,343 posts



Posted 24 February 2017 - 08:35 PM

If AI can build and evolve, why would they not be able to evolve to program themselves to feel?

 

You're basically asking if a four function calculator from the 80's can smile on its own. If the foundation isn't there, don't hold your breath.

 

I'm guessing you're taking the word 'evolve' in the same context as biological evolution.

 

If you really want to go further, you really have to define what AI is. So far, all the current suppositions of AI are just glorified RNGs that are programmed to repeat an action if humans approve of it.
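That "glorified RNG with a human approval signal" caricature is simple enough to put in code. A toy Python sketch (entirely made up by me, not any real system; the `approve` callback stands in for the human feedback):

```python
import random

def rng_agent(approve, actions, steps=100, seed=0):
    """Caricature of current 'AI': sample random actions and tally
    which ones the human approval signal rewards."""
    rng = random.Random(seed)
    score = {a: 0 for a in actions}
    for _ in range(steps):
        action = rng.choice(actions)   # glorified RNG
        if approve(action):            # human says "do that again"
            score[action] += 1
    # the "learned" behaviour is just the most-approved action
    return max(score, key=score.get)

# toy approval signal: humans like "help" and nothing else
print(rng_agent(lambda a: a == "help", ["help", "hinder", "idle"]))
```

There's no understanding anywhere in there, just repetition of whatever got approved, which is the point.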



#6 Coops

  • 🌧️🌩️🌧️




  • 3,602 posts



Posted 24 February 2017 - 09:06 PM

You're basically asking if a four function calculator from the 80's can smile on its own. If the foundation isn't there, don't hold your breath.

 

I'm guessing you're taking the word 'evolve' in the same context as biological evolution.

 

If you really want to go further, you really have to define what AI is. So far, all the current suppositions of AI are just glorified RNGs that are programmed to repeat an action if humans approve of it.

Given I asked "how long until we see AI", clearly I am not referencing current suppositions of AI. I am referencing artificial intelligence that can program itself to improve/become more intelligent, or program other AI.

 

https://en.wikipedia...cal_singularity

That's a good place to start for the kind of AI I'm directly asking about.



#7 Keil

  • Above Average Mediocrity


  • 6,343 posts



Posted 24 February 2017 - 10:08 PM

Given I asked "how long until we see AI", clearly I am not referencing current suppositions of AI. I am referencing artificial intelligence that can program itself to improve/become more intelligent, or program other AI.


Clearly is a big person word that I often see people use when someone wants to get a point across immediately and with minimal effort and success. I use it too.

I am disgusted as an academic that you linked a Wikipedia article and not the literature it borrowed its information from. Everyone should read it. Skip the mathematical logic part if you need to, since all you really need is the gist from the introduction. I say it's worth the 20 minutes of reading.

Now I ask you this to finish off the definition: Is the essence of AIs limited to only computers and robots (in other words, are they just programs that humans made using code they typed into a processor)?

#8 Amethyst

  • I swallowed a rock


  • 2,717 posts



Posted 25 February 2017 - 09:12 AM

It's a nice thought in theory, but you can't program emotion. You can only replicate the actions that would accompany certain emotions by programming it to react to certain stimuli.

 

 

I don't think they'll earn rights, as they wouldn't evolve in the same way as biological life. I don't think they could develop emotions, or program themselves to feel what they can't understand.

 

It doesn't entirely make sense, but I suppose if it were somehow possible in theory, then we'd be forced to give them the same rights as other sentient beings.



#9 Karla

  • I live, I learn, I rule.


  • 2,347 posts



Posted 25 February 2017 - 02:17 PM

I most certainly hope they don't earn rights. If they do, that could put humans out of jobs, or put them in jail if robots are mishandled in the slightest way! D:

 

Plus they're manmade. I don't think anything unnatural (or not created by God, if you want to put it that way) deserves rights.



#10 Kaddict

  • 1,654 posts



Posted 25 February 2017 - 03:13 PM

Gotta agree with Keil on this one. Programs can be programmed to learn a certain thing by trial and error (like the Go machine that beat the world champion), but I don't think an AI would think on its own to build emotional programming, nor would it be able to figure out how to do it without any programming teaching it to do that.
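For what it's worth, that kind of trial-and-error learning is simple enough to sketch in Python. This is a made-up toy of mine, nowhere near what the actual Go engine does; all the names are invented:

```python
import random

def q_learn(episodes=500, alpha=0.5, seed=1):
    """Tiny trial-and-error learner: try moves at random and nudge each
    move's value estimate toward the reward it actually produced."""
    rng = random.Random(seed)
    q = {"left": 0.0, "right": 0.0}        # value estimate per move
    reward = {"left": 0.0, "right": 1.0}   # the environment: 'right' pays off
    for _ in range(episodes):
        move = rng.choice(list(q))                   # explore by trial
        q[move] += alpha * (reward[move] - q[move])  # learn from error
    return max(q, key=q.get)

print(q_learn())  # settles on "right" without being told why
```

It learns *what* pays off, but nothing in it could decide on its own to start learning something else, which is my point about emotions.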



#11 Coops

  • 🌧️🌩️🌧️




  • 3,602 posts



Posted 25 February 2017 - 03:53 PM

Clearly is a big person word that I often see people use when someone wants to get a point across immediately and with minimal effort and success. I use it too.

I am disgusted as an academic that you linked a Wikipedia article and not the literature it borrowed its information from. Everyone should read it. Skip the mathematical logic part if you need to, since all you really need is the gist from the introduction. I say it's worth the 20 minutes of reading.

Now I ask you this to finish off the definition: Is the essence of AIs limited to only computers and robots (in other words, are they just programs that humans made using code they typed into a processor)?

I don't know why you're bagging on Wikipedia when you used it as a starting point and were able to access relevant, solid information on the subject. Wikipedia is a great starting place for lots of subjects. It just feels like you're being nasty for the sake of just being nasty or to rile me up. =/

 

AI is the theory of computers/robots being able to do things that would normally require a human (e.g. programming, improvement, adaptation, learning, voice/visual recognition, etc.). Right now, I don't think we're at the point where AI can do most, if any, of these things. But I don't see why it wouldn't eventually be able to.

Anyways, I'm just gonna leave this here for fun:



#12 ohml

  • 118 posts



Posted 25 February 2017 - 04:29 PM

I don't see the point of some scientist programming a robot to feel pain or have a conscience. But if it did happen, I don't see why they couldn't have the same rights as us. What separates us from them if they too feel what we feel?



#13 Keil

  • Above Average Mediocrity


  • 6,343 posts



Posted 25 February 2017 - 07:22 PM

I don't know why you're bagging on Wikipedia when you used it as a starting point and were able to access relevant, solid information on the subject. Wikipedia is a great starting place for lots of subjects. It just feels like you're being nasty for the sake of just being nasty or to rile me up. =/

 

Anyways, I'm just gonna leave this here for fun:

 

I wasn't bagging on Wikipedia. I was bagging on you for using Wikipedia. 

 

Someone is a fan of Overwatch. (3:26)




#14 Kaddict

  • 1,654 posts



Posted 25 February 2017 - 07:23 PM

I don't know why you're bagging on Wikipedia when you used it as a starting point and were able to access relevant, solid information on the subject. Wikipedia is a great starting place for lots of subjects. It just feels like you're being nasty for the sake of just being nasty or to rile me up. =/

 

AI is the theory of computers/robots being able to do things that would normally require a human (e.g. programming, improvement, adaptation, learning, voice/visual recognition, etc.). Right now, I don't think we're at the point where AI can do most, if any, of these things. But I don't see why it wouldn't eventually be able to.

Anyways, I'm just gonna leave this here for fun:

Initiative, imagination and a desire to evolve aren't things I think can be programmed, and they definitely wouldn't be self-programmed by the AI if it didn't know about those things in the first place.



#15 Coops

  • 🌧️🌩️🌧️




  • 3,602 posts



Posted 25 February 2017 - 09:11 PM

But initiative, imagination and desire to evolve is not something I think can be programmed, but it definitely wouldn't be self-programmed by the AI, if it didn't know about those things in the first place.

I can see what you're saying. Initiative, imagination and desire are pretty human qualities (as far as we know), and unless we programmed some of these things in, the AI may not be able to develop them itself.



#16 InsertUsername

  • 106 posts



Posted 20 March 2017 - 07:23 PM

We're forgetting something here. In the beginning, we were only atoms, inorganic particles. Those inorganic particles transformed into organic ones, like cells and organelles. Then came life, feeling and pain. If we consider A.I. as "inorganic", why couldn't they feel and experience emotions like us by transforming themselves into "organic" beings? (I am conscious that I am stretching the meaning of inorganic and organic, but bear with me.)

Atoms by themselves don't feel emotion, but because they wired themselves into neurons that transmit electrons to indicate that we are indeed in pain, why couldn't an A.I. transmit electrons to do the same thing for a robot? (I am referencing A.I. as the software and the robot as the hardware here. The "robot" could very well be a computer.)

Another thing: if an A.I. reacts the exact same way as a human would, even if it learned the gesture from a human in response to a stimulus, then we can't just dismiss the idea that the A.I. is indeed feeling the same emotion without solid proof. Otherwise, what proof do I have that another human is feeling the same thing as me when they cry because they just fell? I am not in their place. All the information I'd have is the same as if a robot fell and were crying in pain. The only thing that differentiates them is the preconception that non-organic things can't feel, but what if that preconception is actually a misconception?

 

Fun, non-serious food for thought: what if the various Windows error sounds were cries of pain?  :ninja:


Edited by InsertUsername, 20 March 2017 - 07:24 PM.


