Nanotech is gonna kill us?

Seriously guys... it does seem like you're thinking in science fiction rather than the real world. The world's leading experts in this field have been focusing on AI for a good few decades now, and we're still not at the point where anything they've created, however complex, can hold a simple conversation without giving away the fact it's actually an automated bot underneath.

It's a slow, slow slog, and our intelligence and knowledge as the human race are increasing far more rapidly than those of any of the bots out there.

Personally, I think if you're going to worry about something being dangerous, there are far more pressing things around...!

You're right...

...BUT: there will come a time when AI reaches human-level intelligence. When it does (unless it is prevented in some way), it has the potential to keep getting smarter, which could then go bad, but not necessarily. I'm just saying that, later on, it's something you have to be wary of. At the moment, it would be very advantageous to have AI.
 
It's probably only a matter of time before we all start living as if we're in Terminator Salvation.

Hasta la vista, baby!!!

 
There's a scientific term for this: the technological singularity.

http://en.wikipedia.org/wiki/Technological_singularity

At that point, AI becomes as smart as humans. At the rate we're going, this could happen within the next 10 to 15 years (my opinion - backed up by the amazing advances in technology over the past 10 years alone, particularly CPU strength.)
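For scale, the "CPU strength" point can be put into rough numbers. A back-of-the-envelope sketch, assuming a Moore's-law-style doubling of compute every 18 months (an assumption about hardware trends only - it says nothing about AI capability itself):

```python
# Rough compute-growth arithmetic, assuming an 18-month doubling period
# (a Moore's-law-style assumption; not a claim about AI progress).
years = 15
doubling_period_years = 1.5
doublings = years / doubling_period_years  # 10 doublings in 15 years
growth_factor = 2 ** doublings             # each doubling multiplies compute by 2
print(f"{doublings:.0f} doublings -> roughly {growth_factor:.0f}x more raw compute")
```

Of course, as the replies below point out, raw compute growing 1000x is not the same thing as intelligence growing at all.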
 
At the rate we're going, this could happen within the next 10 to 15 years (my opinion - backed up by the amazing advances in technology over the past 10 years alone, particularly CPU strength.)

I'm not so sure. Computing capacity is one thing; AI is another. Computers have already surpassed us in terms of how quickly computations can be done. However, we're still the ones who decide how they're going to compute something, and what they're going to compute.
 
I'm not so sure. Computing capacity is one thing; AI is another. Computers have already surpassed us in terms of how quickly computations can be done. However, we're still the ones who decide how they're going to compute something, and what they're going to compute.

Yes, but it's like a virus. There only needs to be ONE free-thinking computer made by some brilliant person to screw everything up... it will spread.
 
The idea of creating a computer that could create code, or even another computer, is a fascinating hobby of mine.

I have put a lot of thought into the idea, and I would like to see this happen.
You would of course need some type of system to control this "race".

If they were to suppress us... I don't know what would happen, because in theory the 'controllers' could then be controlled by them, since they would be smarter.
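There is one small, classic instance of a program producing a program: a quine, which prints an exact copy of its own source code. A minimal Python sketch of the standard construction (a party trick, not actual machine self-improvement):

```python
# The two lines below, run on their own, print an exact copy of themselves:
# %r substitutes the string's own repr back into itself.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

It only reproduces itself, though - getting from "a program that copies itself" to "a program that improves itself" is the hard part.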
 
I'm not so sure. Computing capacity is one thing; AI is another. Computers have already surpassed us in terms of how quickly computations can be done. However, we're still the ones who decide how they're going to compute something, and what they're going to compute.

Completely agree - higher clock speeds are no measure of how well AI is progressing; they're not really related at all on the research side of things. If someone had a program that could behave 100% like a human given a 128-core 3GHz processor then yes, but at the moment hardware is not the limiting factor.

The idea of creating a computer that could create code, or even another computer, is a fascinating hobby of mine.
There is a branch of computer science that deals with "evolving" code - code that quite literally changes itself as it runs, as it gets to know its environment - but it's very much in its early stages (not least because it's so damn difficult to debug: as soon as you start tracing a particular problem, you've got no guarantee the code that caused it is still there... or if it is, there's no guarantee it was there when you first ran the program!). Fascinating stuff though :)
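The mutate-and-select loop behind that kind of evolutionary computation can be sketched in miniature. A toy example, assuming a made-up target string in place of a real fitness measure - this evolves data rather than code, as a deliberately simplified stand-in for the real thing:

```python
import random

random.seed(0)  # fixed seed so the run is repeatable

TARGET = "EVOLVE"  # hypothetical goal standing in for a real fitness measure
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def fitness(candidate: str) -> int:
    """Count positions already matching the target."""
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate: str) -> str:
    """Randomly replace one character - the 'code change' step."""
    i = random.randrange(len(candidate))
    return candidate[:i] + random.choice(ALPHABET) + candidate[i + 1:]

# Simple (1+1) evolutionary loop: keep the mutant only if it's no worse.
current = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
generations = 0
while fitness(current) < len(TARGET):
    mutant = mutate(current)
    if fitness(mutant) >= fitness(current):
        current = mutant
    generations += 1

print(current, "reached after", generations, "generations")
```

The debugging pain mentioned above is visible even here: the winning individual carries no record of the hundreds of rejected mutants that shaped it.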
 
I don't think you'll ever have a situation where machines are smarter than humans or take over the human race. Period! Machines make either/or, yes/no decisions.

You'll never get computers - or this 'grey goo' stuff that you talk about - making complex decisions like us.
 