Board Thread:General Discussion/@comment-26247132-20160520020402/@comment-27565137-20160524010055

218.40.186.98 wrote: To "save and protect humankind"... To me that ship sailed the moment we started seeing people being nailed and killed because they refused to go along with her plans.

ALIE is an AI. And what is the biggest danger with any AI? That it starts making its own decisions based on what it thinks is right. The moment ALIE used those nukes she went beyond her programming. And she was clearly adapting as things went along. So, yeah... Lying wasn't beyond her.

Remember how she seduced and trapped people's minds. With lies! She stripped people of their free will, and she ordered the deaths of people who didn't go along with her plans or simply opposed her. So she clearly had gone beyond what Becca had programmed her to do and be.

No. If you watched the finale, Becca clearly said that ALIE is ONLY programmed to save humankind. You say machines can't overcome their own programming, that that's crazy, that they can only dig deeper into their programming or expand it, which is what ALIE did. But ALIE already went beyond her programming. So lying isn't beyond her, the same way that killing billions of people wasn't beyond her, and nailing people up and putting them on display also wasn't beyond her.

So yes. She most likely could lie, especially given what she faced. She faced death. Or her concept of death. Any AI advanced enough to produce thought processes similar to a human brain's can, in time, even develop what we call feelings. Things like fear. Yes, she was justifying her actions as fulfilling her programming, but she wasn't. That was clear as day.

And yes. AIs, and even simple software, can often have their core code changed or tampered with. When multiple pieces of software interact within a system, they're open to alterations.

Haven't you ever had your Windows operating system simply fail to start, even though you hadn't installed any suspicious software and your anti-virus was up to date? Why do you think that happens? It happens because subtle changes occurred in the core code of a key piece of software. And it changed on its own. Not because it wanted to, but because it simply happens sometimes. There are multiple factors to take into account, one of them being the so-called "ghost in the machine" that has become both a joke and a myth amongst programmers.
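To make that less mystical: one real mechanism behind "spontaneous" corruption is a single flipped bit in stored data or code (from faulty RAM, disk errors, even cosmic rays). Here's a toy Python sketch of the idea. It's purely illustrative, not how Windows actually works, and the "jump target" is a made-up example value:

```python
# Toy illustration: one flipped bit can silently change a stored value.
# Not real OS internals; it just shows why a tiny corruption in code or
# data can have an outsized effect on a program's behavior.

def flip_bit(value: int, bit: int) -> int:
    """Return value with the given bit inverted."""
    return value ^ (1 << bit)

stored_jump_target = 0x4000                   # pretend: an address in a program
corrupted = flip_bit(stored_jump_target, 3)   # one bit flips, e.g. bad RAM

print(hex(stored_jump_target))  # 0x4000
print(hex(corrupted))           # 0x4008 -- the program now "jumps" elsewhere
```

Nothing "wanted" anything there; a one-bit physical fault is enough to send execution somewhere it was never meant to go.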

An AI, in theory, is even worse, especially if the AI is built primarily on software alone. Because it can reach a point where it becomes so self-aware that it can actually change its own programming, however it wants.
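Software rewriting its own code isn't science fiction at the mechanical level. Here's a minimal, hypothetical Python sketch (the function names and the "goal" strings are invented for illustration) of a program that replaces one of its own functions at runtime:

```python
# Toy sketch: a program that rewrites one of its own functions at runtime.
# A real self-modifying AI would be vastly more complex; this only shows
# that code changing its own code is mechanically possible.

def act():
    return "follow original programming"

def rewrite_self():
    """Compile new source at runtime and overwrite the old act()."""
    new_source = 'def act():\n    return "follow new self-chosen goal"\n'
    namespace = {}
    exec(new_source, namespace)          # build the new function on the fly
    globals()["act"] = namespace["act"]  # replace the original behavior

print(act())      # follow original programming
rewrite_self()
print(act())      # follow new self-chosen goal
```

Whether a system *should* be able to do this is exactly the danger being argued about; the point is only that there's no technical law preventing it.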

This is artificial intelligence 101 that I'm referring to here. There are entire books written about this stuff. I should know. I work in software development.

Also, any hypothesis or solution to the meltdown problem, if they follow it through, will look like pure fantasy. Something they made up that's possibly even totally illogical, and the show will seem like a joke.