Robot Evolution

http://discovermagazine.com/2008/jan/robots-evolve-and-learn-how-to-lie/

By the 50th generation, the robots had learned to communicate—lighting up, in three out of four colonies, to alert the others when they’d found food or poison. The fourth colony sometimes evolved “cheater” robots instead, which would light up to tell the others that the poison was food, while they themselves rolled over to the food source and chowed down without emitting so much as a blink.
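
For a sense of how this kind of result can emerge, here is a minimal toy sketch, assuming a caricature payoff: each robot carries two genes, how often it lights up at food and how often it lights up at poison, and honest signaling helps the rest of the colony while costing the signaler, whereas lying does the reverse. The genes, weights, and selection scheme are invented for illustration only; the actual experiment evolved real robot controllers and is far richer than this.

    import random

    POP, GENS, MUT = 40, 50, 0.05       # colony size, generations, mutation step
    B, C, D, E = 0.6, 0.2, 0.5, 0.8     # made-up benefit/cost weights

    def fitness(me, colony):
        # Each genome is [p_light_at_food, p_light_at_poison].
        p_food, p_poison = me
        others = [g for g in colony if g is not me]
        mean_food = sum(g[0] for g in others) / len(others)
        mean_poison = sum(g[1] for g in others) / len(others)
        return (1.0                  # baseline foraging success
                + B * mean_food      # others' honest lights help me find food
                - C * p_food         # my honest light draws a crowd to "my" food
                + D * p_poison       # my lying light lures rivals toward the poison
                - E * mean_poison)   # others' lying lights send me to the poison

    def evolve():
        colony = [[random.random(), random.random()] for _ in range(POP)]
        for _ in range(GENS):
            ranked = sorted(colony, key=lambda g: fitness(g, colony), reverse=True)
            parents = ranked[:POP // 2]                  # truncation selection
            colony = [[min(1.0, max(0.0, gene + random.gauss(0, MUT)))
                       for gene in random.choice(parents)]
                      for _ in range(POP)]
        return colony

    final = evolve()
    print("mean p(light at food):   %.2f" % (sum(g[0] for g in final) / POP))
    print("mean p(light at poison): %.2f" % (sum(g[1] for g in final) / POP))

With selection acting on individuals and these made-up weights, the lying gene tends to drift upward, which is roughly the "cheater colony" story in miniature.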

Very interesting indeed.

Discuss.
 
lol, the 4th colony is funny and scary at the same time. I can't say I didn't expect this, though; I mean, we are already making semi-conscious robots. There is a man, I can't remember where he is from (Japan, I think), who created a robot that can "feel" pain and discomfort. It's weird, but not out of the question anymore.
 
WALL-E. MEGAMAN. I, ROBOT.

Actually, this is really interesting. Robots evolving makes the technology a lot scarier and "robots taking over the world" actually possible at some point :D
 
Robots will "evolve" or "learn" according to the objectives you give them. If you place robots in an environment where they must "feed" themselves as efficiently as possible, they will develop behavior/intelligence that is similar to ours, because they are in a situation that is similar to ours. If we do that, yes, they could eventually outsmart us and "take over the world", because there is nothing that steers their development towards serving humans.

On the other hand, if you train a killing machine to hit the targets you tell it to take down and to avoid those you tell it are allies, at no point in its development will it become "self-conscious". It will just get better and better at hitting targets; maybe it will become the very best, using insanely clever schemes to kill people. But it will always kill who you tell it to and it will always protect who you tell it to, because doing what you want is akin to getting food and not doing what you want is akin to hurting itself. And you define that.

Intelligence and consciousness are not correlated. Eventually, it's pretty clear that machines will outdo us in every single intellectual or physical task. But machines with consciousness and/or empathy are just a particular class of machines that we can produce using particular techniques and do not serve any particular purpose other than being like us. If you just want a machine that "does what you want", obviously, you'll be using other techniques and you'll be fine.

Robots will never take over the world unless somebody is controlling them, somebody is seriously incompetent or somebody is crazy. None of these options is out of the question, unfortunately :(
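
A tiny sketch of that first point, with toy genomes that are just lists of numbers rather than anything like a real robot controller: the search loop below has no goals of its own, and the behavior that comes out is determined entirely by whichever scoring function the designer plugs in. Both objective functions here are invented for illustration.

    import random

    def evolve(fitness, genome_len=5, pop=30, gens=200, mut=0.1):
        # A generic evolutionary loop: it simply maximizes whatever `fitness`
        # function it is handed -- it has no objectives of its own.
        population = [[random.uniform(-1, 1) for _ in range(genome_len)]
                      for _ in range(pop)]
        for _ in range(gens):
            population.sort(key=fitness, reverse=True)
            parents = population[:pop // 3]
            population = [[g + random.gauss(0, mut) for g in random.choice(parents)]
                          for _ in range(pop)]
        return max(population, key=fitness)

    def feed_yourself(genome):
        # "Feed yourself as efficiently as possible": gene 0 stands for food
        # found, gene 1 for poison eaten. Selection serves the agent itself.
        return genome[0] - genome[1]

    ORDERS = [1.0, -1.0, 0.5, 0.0, -0.5]   # a target profile the designer wrote down

    def obey_orders(genome):
        # "Do what the operator says": score is how closely the genome matches
        # the orders. Same loop, same code; only the objective has changed.
        return -sum((a - b) ** 2 for a, b in zip(genome, ORDERS))

    print("self-serving genome:", [round(x, 2) for x in evolve(feed_yourself)])
    print("obedient genome:    ", [round(x, 2) for x in evolve(obey_orders)])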
 
This is kind of a funny read right after watching The Matrix.

But yeah, who knows how far robots will have developed 20 years from now? The technology keeps getting better and better.
 
I was hoping this might attract more posts like Brain's, not just "OMG ROBOTS ARE GONNA KILL US AND RULE @!?!?#?" ):
 
'Prey' was a good book.

This is very interesting; also loling at the 'cheater' robots.


I think the cheater robots are the most interesting aspect of this. I don't know enough about any of these subjects to say anything remotely educated, but if cheating arose through evolution, what does that say about it and related "evils"?
 
I don't really have anything to add beyond the mention that I don't think I've ever found a Brain post that didn't entertain me, and I mean that in the least condescending way possible.
 
So often I click on a thread, read the first post, and a good idea for a response immediately pops into my head... but then I scroll down and discover that Brain has already posted a beautifully worded reply that encompasses pretty much everything you need to know about the topic.

But then you still get the gloriously insightful follow-ups from posters like Mr. "alterior motives" here.
 
This is interesting, and slightly scary at the same time. However, Brain sums it up pretty nicely.

And Prey was an awesome book.
 