Look How Far AI Has Come in 20 Years! (Funny Read)
In the earliest days of what we can best describe as “the modern internet,” when Twitter and Reddit were barely a year old and Facebook had just entered “toddlerhood” (in other words, about twenty years ago), there was a humorous blog called “The Daily WTF.” (In fact, it’s still around: www.thedailywtf.com.)
The Daily WTF was a blog where IT professionals and software developers could laugh at other people’s screw-ups and bad ideas. Names were always fictionalized to protect the innocent, but the stories were, and are, real. Or so we’re told.
As a young and budding software engineer in 2005, I read The Daily WTF religiously. It was a tradition at the office. It sparked the everyday water cooler conversation just as much as the latest episode of Lost, The Sopranos, or Battlestar Galactica. And yet, after 20 years, only one Daily WTF article has stuck with me to this day. It’s called “No, We Need a Neural Network” (https://thedailywtf.com/articles/No,_We_Need_a_Neural_Network), and it’s from 2006. Read the whole article, especially the part at the bottom, and marvel at how far we’ve come!
Reading this again in mid-2025, it struck me how much has changed (back then, neural networks were seen as a silly academic exercise, not something real or practical… hence that day’s “bad idea”). It also struck me how much hasn’t changed (namely, the comments). Most of the comments on this 20-year-old article could have appeared yesterday on Reddit or Hacker News, and no one would have noticed. I guess Nerd Stuff is circular.
The other thing that really hasn’t changed is the science. In 2005, people were talking about all the different ways to do natural language processing and text generation, such as Markov Chains, Support Vector Machines, and the like. Interestingly, you’ll see commenters promoting all of these models instead of a neural network, which back then would have been inefficient and taken up far too much RAM. Oh well.
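For younger readers who never lived through that era: here’s a minimal sketch of the kind of Markov-chain text generator those commenters had in mind. The tiny corpus, the order-2 choice, and the function names are my own illustration, not anything from the original article.

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each sequence of `order` words to the words observed after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, order=2, length=20):
    """Walk the chain, picking a random successor word at each step."""
    key = random.choice(list(chain.keys()))
    output = list(key)
    for _ in range(length):
        successors = chain.get(tuple(output[-order:]))
        if not successors:
            break
        output.append(random.choice(successors))
    return " ".join(output)

# Toy corpus; a 2005-era hobbyist would have fed it forum posts or chat logs.
corpus = (
    "the daily wtf was a blog for developers to laugh at bad ideas "
    "and the daily wtf is still a blog for developers to enjoy today"
)
chain = build_chain(corpus, order=2)
print(generate(chain, order=2))
```

No training, no GPU, barely any RAM: it just counts which words follow which, then stumbles forward at random. That’s roughly the bar a neural network was being compared against in those comments.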
What’s changed? Compute power, of course, and the ability to use GPUs for machine learning (especially neural network training). In 2005, GPUs were just for playing computer games with enhanced graphics. (CUDA, the NVIDIA API that turbocharged their transformation into machine learning powerhouses, was still two years away from public release.)