Human ^ AI

This has been the year of AI hype, and I wanted a little firsthand experience with what AI could do, or not do. I decided to let an AI (for legal reasons, I will not say which one) do a little coding for me.

Before long, I was looking at a very peculiar function. I’ll skip the nitty-gritty details of the function. Here’s all you need to know:

uint16_t calculate_crc(uint8_t* bytes, size_t byteCount) {
    uint16_t crc = 0;
    // I'll skip the polynomial loop to calculate "crc"
    crc = crc ^ 0;
    return crc;
}

The question that came to my mind was: why add the next-to-last line of code?

  crc = crc ^ 0;

I started writing software back when dirt had just been invented. I have written software in more than a dozen languages – and that includes binary machine language, for a microsequencer of my own design. I will humbly say that I know a thing or two about it. It took me something less than one second to determine: this line of code does not do anything at all.

As I looked suspiciously at this code, my first thought was: does this AI really know something I don’t know? So I asked. The AI’s response, paraphrasing just a bit, was:

This exclusive-OR operation is a very efficient method to set a variable to zero. It is especially efficient when the variable is already in register storage.

This plausible-sounding bit of techno-babble is, quite simply, bullshit in so many ways that it’s just not funny. But the big four that come to mind are:

  1. Unless you are writing a very, very targeted piece of code, you should let the compiler decide on the best way to optimize for efficiency.
  2. If it’s such an efficient method to zero a variable, why didn’t the AI use it in the very first line of the function, when the variable was first declared?
  3. You just went to a lot of trouble to calculate a CRC. Why do you want the last step to erase the CRC you just painstakingly calculated and replace it with zero?
  4. Most importantly of all: this response is just plain WRONG. That line of code does not set ANYTHING to zero. As (hopefully) any freshman CS student can tell you, a bitwise XOR of any value with all zeroes gives back the very same value you started with. 0 ^ 0 = 0. 1 ^ 0 = 1. End of line.

I spent the better part of a day in a deep dive on why this particular AI acted as it did. At the end of that day, I reached a simple conclusion. Those who trust AI are doomed, and the rest of us are probably not far behind.
