Region: Lazarus

Leonism wrote: A very good post. Once AI reaches this point where it can modify itself (maybe not physically but in its programming) it should be capable of learning in the human way and thus display a hallmark of sentience.

This is more of a philosophical topic though, and thus not my area of expertise. I don't even know what the formal definition of sentience is.

Electronic systems are already capable of modifying and reprogramming themselves. They can't physically grow more transistors, of course, but they can rewrite their own memory and therefore their own programming. That follows from the stored-program design, in which code and data share the same memory, and it is the same idea a universal Turing machine relies on when it reads another machine's program off its tape as data.
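To make that concrete, here is a minimal toy sketch in Python of a stored-program interpreter. It is purely illustrative: the instruction names (set, add, print, halt) and the three-word layout are invented, not any real machine. The point is only that because the program and its data sit in the same list, one instruction can overwrite another before it runs.

def run(memory):
    """Execute three-word instructions (op, a, b) held in the same list as the data."""
    pc = 0
    while memory[pc] != "halt":
        op, a, b = memory[pc], memory[pc + 1], memory[pc + 2]
        if op == "set":          # memory[a] = b  (can target an instruction word)
            memory[a] = b
        elif op == "add":        # memory[a] += memory[b]
            memory[a] += memory[b]
        elif op == "print":
            print(memory[a])
        pc += 3

program = [
    "set", 6, "print",   # addresses 0-2: overwrite address 6 before it is executed
    "add", 12, 12,       # addresses 3-5: double the data word at address 12
    "add", 12, 12,       # addresses 6-8: rewritten to "print" by the time pc gets here
    "halt", 0, 0,        # addresses 9-11
    42,                  # address 12: a plain data word
]

run(program)   # prints 84, because the machine edited its own code mid-run

Run as written, it prints 84: the "set" turns the second "add" into a "print" before the program counter reaches it, so the program that finishes is not the program that started.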

Personally, I think the difference lies in desire: a creature wants things, and wants different things as time passes. A machine does not want. But then again, just because something is not a machine does not make it a person.

