This is part 1 in a multipart series exploring the nature, the promise, the dangers and the future of AI and its implications for humanity.

With all the hoopla surrounding artificial intelligence, there are so many misconceptions floating around that I think it’s necessary to set some things straight.

The most important thing you need to know about computers is that they are incredibly stupid.

Let me repeat that: computers are incredibly stupid.

Oh, did I mention it already? Computers are incredibly stupid.

Since it was clear very early in my youth that I was going to have a hard time realizing my childhood dream of becoming an astronaut, I got busy with the other thing that fascinated me just as much: computers.

Since then I have been intensively involved in computer programming, first as a hobby and, for more than twenty years now, as a professional programmer/software developer.

The computers I was dealing with, and that I’m still dealing with, were and are completely different from the computer of the starship Enterprise.

They’re absolutely stupid and you have to tell them to the smallest detail what to do. They know nothing and think nothing. They don’t even know what numbers or letters are. They are just machines that carry out one instruction after another. They only do what they are told to do. They have no will of their own.

Just as an excavator is the extension of human extremities, a computer is the extension of the human brain.

Both machines do nothing by themselves.

The excavator is absolutely motionless and the computer does nothing without human command.

What is the greatest strength of a computer?

It does exactly what it is told.

What is the greatest weakness of a computer?

It does exactly what it is told.

The computer does exactly what you SAY, not what you MEAN.

If you or I find a note with the following text, we know exactly what to do: “Get me a bottle of Diet Cole, please”.

Not so the computer. Since the computer knows nothing and thinks nothing it cannot possibly do what you mean.

It either does nothing or the wrong thing.

Now, you might say that when you type the above sentence (without the double quotes) into a search engine, it automatically performs the search you wanted.

Yes, that’s true but that’s only because some programmer told the computer how to search more intelligently.

Without that programming the computer would fail spectacularly.

Now, if I wanted to teach you how to program computers, I would show you how to write code in a high-level programming language like C++ or Visual Basic, but to show why computers are stupid, it is better to take the bottom-up approach, i.e. to start at the primitive level and work our way up.

At bottom, computers are electronic circuits consisting mainly of transistors. Modern microprocessors (CPUs) have millions or billions of transistors.

Remember how I told you that computers don’t know anything about numbers?

How can a computer then do calculations with numbers?

The answer is as simple as it is ingenious.

## How computers handle numbers

Numbers are converted into their digital representation which the computer can then process in primitive steps that require no knowledge of numbers.

Let’s look at the number 215.

215 is the decimal representation of 2×10^{2} + 1×10^{1} + 5×10^{0}.

The decimal number system is the one we all know and love because it is intuitively understandable due to the fact that we have 10 fingers.

The number 215 written in digital/binary form is

11010111

which is the binary representation of

1×2^{7} + 1×2^{6} + 0×2^{5} + 1×2^{4} + 0×2^{3} + 1×2^{2} + 1×2^{1} + 1×2^{0}.

The binary system requires only the two digits 0 and 1, and it is primitive enough to be used by computers.

One binary digit, also called a bit, can be stored in a circuit called a flip-flop which consists of 6 transistors and which can have two states. One state represents 0, the other 1.

8 such bits are called a byte and can represent a number from 0 to 255.
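To make the two expansions above concrete, here is a small sketch in Python (my choice for the examples; nothing about the idea is specific to Python) that rebuilds 215 from its decimal and binary digits:

```python
# Decimal: 215 = 2*10^2 + 1*10^1 + 5*10^0
decimal_215 = 2 * 10**2 + 1 * 10**1 + 5 * 10**0

# Binary: 11010111 = 1*2^7 + 1*2^6 + 0*2^5 + 1*2^4 + 0*2^3 + 1*2^2 + 1*2^1 + 1*2^0
binary_215 = (1 * 2**7 + 1 * 2**6 + 0 * 2**5 + 1 * 2**4
              + 0 * 2**3 + 1 * 2**2 + 1 * 2**1 + 1 * 2**0)

print(decimal_215, binary_215)   # both expansions give 215
print(int("11010111", 2))        # parse a binary string: 215
print(bin(215))                  # and convert back: 0b11010111
```

Both sums come out to 215, which is the whole trick: the same quantity, written with different digits in a different base.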

Your computer’s memory, aka RAM, is usually organized in bytes. Unlike the CPU, it uses capacitors instead of flip-flops to store bits.

## How computers do math and everything else …

Now that we know how computers store numbers, let’s do some math.

Almost everything that a computer does is based on logical operations which are applied to either 1 or 2 bits. Complex tasks are sequences of such logical operations.

The basic logical operations are NOT, AND, OR and XOR. Just like the storage of bits each operation can be done by a dedicated electronic circuit.

The NOT operation is applied to only one bit and its result is the negation of that bit.

The AND operation is applied to two bits. If bit A and bit B are both 1, the result is 1, otherwise it’s 0.

The OR operation is applied to two bits. If bit A or bit B is 1, the result is 1, otherwise it’s 0.

The XOR operation works like the OR operation, except that the result is 0 if BOTH bit A and bit B are 1.

**Operation NOT**

Bit A | Bit B | Result |
---|---|---|
0 | – | 1 |
1 | – | 0 |

**Operation AND**

Bit A | Bit B | Result |
---|---|---|
0 | 0 | 0 |
1 | 0 | 0 |
0 | 1 | 0 |
1 | 1 | 1 |

**Operation OR**

Bit A | Bit B | Result |
---|---|---|
0 | 0 | 0 |
1 | 0 | 1 |
0 | 1 | 1 |
1 | 1 | 1 |

**Operation XOR**

Bit A | Bit B | Result |
---|---|---|
0 | 0 | 0 |
1 | 0 | 1 |
0 | 1 | 1 |
1 | 1 | 0 |
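These four truth tables map one-to-one onto Python’s bitwise operators, so we can verify them mechanically. A small sketch (note that Python’s `~` produces a negative integer, so we mask with `& 1` to keep a single bit):

```python
# Check all four truth tables using Python's bitwise operators on single bits.
for a in (0, 1):
    assert (~a & 1) == 1 - a                                  # NOT: flips the bit
    for b in (0, 1):
        assert (a & b) == (1 if a == 1 and b == 1 else 0)     # AND
        assert (a | b) == (1 if a == 1 or b == 1 else 0)      # OR
        assert (a ^ b) == (1 if a != b else 0)                # XOR: 0 when both bits are 1
print("all four truth tables confirmed")
```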

Armed with that knowledge, let’s look at how a computer adds numbers.

Numbers are added by adding their individual bits.

Two bits are added thusly:

**Addition of bit A and bit B**

Bit A | Bit B | Result | Carryover (Carry Flag) |
---|---|---|---|
0 | 0 | 0 | – |
1 | 0 | 1 | – |
0 | 1 | 1 | – |
1 | 1 | 0 | 1 |

The last row deserves special attention. It shows what happens when the result is greater than a single digit.

It’s like adding the decimal numbers 13 and 19.

When you use the time-honored method of adding two numbers on paper, the following happens:

Adding the number 3 and 9 gives the result 12. Since the result is two digits wide you write the number 2 down and you note the number 1 as a carryover.

Then you move one digit to the left and add the numbers 1 and 1 plus the carryover, which gives 3, for a final result of 32.

We can do the same with binary numbers. Let’s add the number 2 and 3.

The binary form of 2 is 10 and that of 3 is 11. Adding the rightmost bits gives 0 + 1 = 1 with no carryover. The next column gives 1 + 1 = 0 with a carryover of 1, which becomes the new leading digit. The result is 101, which is the binary form of 5.

So, that’s in theory the way to perform additions on a computer, but how can we do that if the computer knows nothing about addition or numbers?

Our logical operations can do the magic. You may have noticed that the XOR operation does exactly what we need to add two bits.

We can add two bits by performing the following logical operations:

((bitA XOR bitB) XOR carryFlag) gives the result bit.

(bitA AND bitB) OR ((bitA OR bitB) AND carryFlag) gives the carryover/carry flag for the next bit.
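Put together, the two formulas form what hardware designers call a full adder, and chaining one per bit gives a so-called ripple-carry adder. Here is a sketch in Python (the function names are my own; in real hardware this is wired circuits, not code):

```python
def full_adder(bit_a, bit_b, carry):
    """Add two bits plus an incoming carry using only AND, OR and XOR."""
    result = (bit_a ^ bit_b) ^ carry                          # the sum bit
    carry_out = (bit_a & bit_b) | ((bit_a | bit_b) & carry)   # the carry flag for the next bit
    return result, carry_out

def add_bits(a_bits, b_bits):
    """Ripple-carry addition of two equal-length bit lists, least significant bit first."""
    carry = 0
    out = []
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    if carry:
        out.append(carry)  # a leftover carry becomes a new leading digit
    return out

# 2 is 10 in binary and 3 is 11; least significant bit first: [0, 1] and [1, 1].
print(add_bits([0, 1], [1, 1]))  # [1, 0, 1] read backwards is 101, i.e. 5
```

Note that the machine never “knows” it is adding numbers; it just pushes bits through the same four logical operations, over and over.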

Now you know how computers work on a very basic level.

Of course, we could go even deeper and look at how the different circuits, registers etc. work, but that would hardly be feasible here, and not even I am crazy enough to do that.

If you have understood 80% of what you’ve read so far, you know more than 90% of people.

You also understand why computers are incredibly stupid.

In the next part we go level by level up and explore how AI works.

What you have learned and what you will learn will enable you to understand the nature, the promise, the dangers and the future of AI and its implications for humanity.

Instead of calling computers stupid, I call them high-speed morons, but I suppose that is insulting to morons. Still, calling computers stupid is insulting to computers. Most of the people we call stupid don’t have to be stupid, do they?

I like “high-speed morons” a lot better than “stupid”.

Machines cannot be insulted. Or can you insult a vacuum cleaner?

Machines cannot be anything but stupid. Only humans have the privilege of being willfully stupid.

Cases where humans are born morons who have to stay morons for biological reasons are extremely rare.

For the overwhelming majority of humans, being stupid is a choice.

BTW. If you don’t mind giving me some feedback, was this post too technical? Understandable, or not understandable?

My goal is to make clear what computers/AI are and what they are not, in order to give a realistic understanding of the potential of AI, both positive and negative.

The explanation was extremely clear, clearer than I could have put it, but I wonder if I am your target audience. I am retired. My last job involved the administration of virtualized Windows servers. So, I understand something about the difficulties of turning computers into sentient beings.

I regard the expression “artificial intelligence” mostly as sales babble aimed at management. Generally, AI systems use statistical algorithms to “decide” what they should “learn.” That is really fancy data processing, but not AI.

The headache with what you wrote is getting people to read it. Most people don’t understand that learning is fun. Public schools are that atrocious.

Thank you very much.

Agreed, AI is no real intelligence. Computers can do things that require some level of intelligence and they can do them fast.

IMHO there will never be machines that are sentient or that have consciousness or that can understand meaning.

Yes, it’s not sexy stuff but I think it’s necessary to fully understand the subject and to not fall victim to widespread myths which are spread in the media, SciFi literature and even in science.

I try to find a balance between oversimplification and information overkill.

The big problem with educating people is making a subject relevant enough that people want to know it badly enough to make the effort to understand it. Here is an example. When we teach mathematics as a subject unto itself, the average student spends a lot of time wondering why they are learning math. Yet mathematics is how scientists model the cause-and-effect relationships we find in the natural world, and mathematics is how engineers use scientific models to design the things we make.

Unfortunately, too many high school teachers don’t know enough about their subjects to make them interesting to their students. So, they cannot gain the interest of their students.

If I were a good teacher I would be teaching, but I’m not. That’s not my calling.

Anyway, thanks for your suggestion.

I’ll try to better explain why it is relevant in the next part.

Looking forward to your next part. Thanks for taking on such a difficult subject so ably.

You’re welcome.