
Can we build an AI without losing control over it?

In this video, Sam Harris talks about a “failure of intuition” that most of us experience when we think about Artificial Intelligence.

If the following three assumptions are true:

  1. Intelligence is the product of information processing.
  2. We will continue to improve our intelligent machines.
  3. Humans are nowhere near the summit of possible intelligence.

… then it’s almost inevitable that we will build superhuman machine intelligence at some point.

The danger lies in doing this before we solve the problems associated with creating something that may treat us the way we treat ants.

Read more here.

And here.
