

Kind of, actually.
I’m reminded of an early model that was trained to detect whether tanks were hiding in pictures of forests / jungles. It was doing great on the training data, but when given new images it seemed to be guessing wildly.
Turns out that in the training data, all the pictures with tanks were taken on cloudy days.
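The failure mode in that story is often called a spurious correlation or "shortcut learning": the model latches onto an easy feature (overall brightness) that happens to correlate with the label in training but not in the real world. A minimal toy sketch of the effect, with hypothetical synthetic data where each "image" is reduced to a single brightness value:

```python
import random

random.seed(0)

def make_dataset(n, spurious):
    """Each 'image' is just its mean brightness (0 = dark, 1 = bright).
    In the spurious training set, every tank photo was taken on a
    cloudy (dark) day; in the realistic test set, lighting is
    unrelated to whether a tank is present."""
    data = []
    for _ in range(n):
        has_tank = random.random() < 0.5
        if spurious:
            brightness = random.uniform(0.0, 0.4) if has_tank else random.uniform(0.6, 1.0)
        else:
            brightness = random.uniform(0.0, 1.0)
        data.append((brightness, has_tank))
    return data

def train_threshold(data):
    # "Learn" the brightness cutoff that best separates the labels:
    # predict "tank" whenever brightness < threshold.
    best_t, best_acc = 0.5, 0.0
    for t in (i / 100 for i in range(101)):
        acc = sum((b < t) == y for b, y in data) / len(data)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

train = make_dataset(2000, spurious=True)
test = make_dataset(2000, spurious=False)
threshold = train_threshold(train)

train_acc = sum((b < threshold) == y for b, y in train) / len(train)
test_acc = sum((b < threshold) == y for b, y in test) / len(test)
print(f"train accuracy: {train_acc:.2f}")  # near 1.0 on the biased set
print(f"test accuracy:  {test_acc:.2f}")   # near 0.5 (chance) on new photos
```

The "classifier" aces the biased training set by learning cloudiness, then collapses to coin-flipping on photos where the lighting no longer gives the answer away.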
Because the AI doesn’t know what it’s being asked; it’s just an algorithm guessing what the next word in a reply should be. It has no understanding of what the words mean.
“Why doesn’t the man in the Chinese room just use a calculator for math questions?”