AI - What is Mankind Actually Doing?

I agree smart people settle things without war, however it's also a naive thought. You're wanting a Utopia, and let's face it... of all the races, religions, mental states, etc in the world, that will NEVER happen. "A smart person does not WANT war" may be a better saying, because say you have a homicidal maniac on the other side of the table who really only wants you dead. No matter what you say, no matter how smart you are, there is always a percentage of people out there who DON'T care. They will trump your want for peace every time.
 

so you're saying that a robot-led utopia will never work because of the small percentage that resist it?

i really don't know where you got that from, no one was suggesting that. i was only suggesting it in a combat situation, making decisions on the battlefield for troops to follow.

however a robot-led utopia would lead to robots dominating man, which is exactly our fear. if a robot smarter than humans had to choose between the rights of man and machine, which would it choose? its own kind or humans?

leading on from that, we would not be able to program a robot smarter than us which still had the same morals. sure, you can make the hardware more complex and advanced than the human brain, but then the programming would have to be left to its own learning abilities. after all, how can you program something to think like you but be smarter? can you imagine how you'd see the world if you were smarter?

besides that, the small minority that would oppose a robot-led utopia (if any) would soon dissipate. do you still see Luddites around opposing the industrial revolution?
 
here are my questions to anyone who still can't see it the way i do:

what happens when robots begin to question our decisions, including the three laws protecting us from them?

what happens if robots gain the knowledge to reproduce and create more robots, maybe even the ability to re-design and improve upon themselves?

what happens when they no longer need humans to provide them with power, and can collect energy without any human device?

what if robots feel a need to survive?
 
Just to reinforce TheEnd's point, the human brain will never be paralleled by any artificial intelligence we can create. What it can do is simply amazing; it's just hard to realise because we don't usually compare what a human brain can do against a computer.

The way I see it the human brain is a computer that keeps upgrading itself while it runs, creating new connections (synapses) all the time.

At the very least I certainly don't believe that in the next 100 years we will have anything close to our intelligence.
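The "computer that keeps upgrading its connections" analogy can be sketched as a connection whose strength grows with use, loosely in the spirit of Hebbian learning ("neurons that fire together wire together"). This is just an illustration; the function name, learning rate, and numbers are all made up:

```python
# Toy Hebbian-style update: a connection weight between two
# "neurons" gets stronger every time both are active together.

def hebbian_update(weight: float, pre: float, post: float, lr: float = 0.1) -> float:
    """Strengthen the connection in proportion to joint activity."""
    return weight + lr * pre * post

w = 0.0
for _ in range(5):          # both neurons active together 5 times
    w = hebbian_update(w, pre=1.0, post=1.0)
print(round(w, 2))          # 0.5 -- the connection has strengthened with use
```

The point of the analogy: unlike a fixed program, the "wiring" itself changes as a side effect of activity, which is roughly what real brains (and artificial neural networks) do.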
 
computers will never be able to take over humans.
Sure... they might be faster, more efficient, but who made them that way!

The human brain still hasn't been figured out and probably never will be.
Humans only use about... i might be wrong on this, about 28 percent of their brain.
 
Can robots be programmed to learn stuff? I mean actually learn by reading a book etc. Can they identify what is right and wrong? Can they make decisions and plan out certain objectives? Do they have an aim in existence?

Lol, i don't know anything about programming, but wouldn't they follow certain instructions like...

if missile_approaching:
    run_away()
else:
    stay_and_do_nothing()

lmao.

No, but seriously, how would you make them think for themselves?
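That instinct is basically right: the simplest kind of robot controller is just a table of condition → action rules. A minimal sketch of the idea above (the names `decide` and `missile_approaching` are invented for illustration):

```python
# A toy rule-based "robot" that maps situations to actions.
# The rules are fixed by the programmer; the robot does not
# think for itself, it only looks up what it was told to do.

def decide(situation: dict) -> str:
    """Pick an action from hard-coded condition -> action rules."""
    if situation.get("missile_approaching"):
        return "run away"
    return "stay and do nothing"

print(decide({"missile_approaching": True}))   # run away
print(decide({"missile_approaching": False}))  # stay and do nothing
```

"Thinking for themselves" would mean replacing those hard-coded rules with behaviour learned from experience, which is roughly what machine learning is about.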
 
here are my questions to anyone who still can't see it the way i do:

what happens when robots begin to question our decisions, including the three laws protecting us from them?

what happens if robots gain the knowledge to reproduce and create more robots, maybe even the ability to re-design and improve upon themselves?

what happens when they no longer need humans to provide them with power, and can collect energy without any human device?

what if robots feel a need to survive?

don't wanna be harsh, but there are no 3 laws, you just saw that in movies and that book. THE 3 LAWS ARE FICTIONAL. of course the US will think of something; heck, we control space, why not the world's robots?
 