By: Amir Tadrisi

Published on: 5/26/2025

Last updated on: 5/28/2025

Is AI Here to Stay?

‘If a machine can answer as a man, let us call it thinking.’ Thus was born our quest to turn wires and code into mind.
Inspired by Alan Turing

1. Can Machines Think? Birth of an Idea

This was the question Alan Turing posed in his 1950 paper “Computing Machinery and Intelligence,” laying the groundwork for what we now call artificial intelligence.

His answer was simple: if a computer can carry on a conversation indistinguishable from a human’s, we should call it intelligent. That criterion is now known as the Turing Test.

2. AI Is Here to Stay for a Long Time

To estimate how long AI will endure, we can turn to Lindy’s Law: for non-perishable things like ideas and technologies, if something has survived X years, you can expect it to survive roughly another X years.
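In symbols (a loose paraphrase of the heuristic, not a precise statistical law):

$$\mathbb{E}[\text{remaining lifetime} \mid \text{age} = x] \approx x$$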

The founding thought experiment of AI dates back to 1950 with the Turing Test. Since AI has already thrived for more than 70 years, Lindy’s Law suggests it has at least another 70 ahead. Don’t regard AI as a passing fad; it’s only going to accelerate. But why?

3. What Are Models?

To make computers “think,” we train them on existing data. For example, to build a dog-vs-cat classifier, you feed the system hundreds of labeled images of dogs and cats. The program that learns predictive patterns from those examples, essentially a function with parameters tuned during training, is called a model. There are many types of models; one of the most powerful today is the LLM (Large Language Model).
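To make this concrete, here is a minimal sketch of such a classifier. It assumes scikit-learn is installed and uses synthetic pixel statistics as a stand-in for real labeled photos; a real project would load actual images.

```python
# Minimal dog-vs-cat classifier sketch (synthetic data, scikit-learn).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Pretend each 8x8 grayscale image is flattened into 64 pixel values.
# The two fake classes differ slightly in average brightness.
n = 200
dogs = rng.normal(loc=0.55, scale=0.2, size=(n, 64))  # label 1
cats = rng.normal(loc=0.45, scale=0.2, size=(n, 64))  # label 0

X = np.vstack([dogs, cats])
y = np.array([1] * n + [0] * n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The "model" is the set of weights LogisticRegression learns from the data.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)  # training: fit weights to labeled examples

print("test accuracy:", model.score(X_test, y_test))
```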

4. Vectors

Computers only understand numbers, so every image, word or document must be converted into a numeric form called a vector. In our dog-vs-cat classifier, each image becomes a vector of pixel-intensity values. The model “learns” textures, edges and colors by adjusting its internal weights on those numbers. Similarly, LLMs convert text into embedding vectors that place similar tokens close together in a high-dimensional space, enabling the model to learn syntax, semantics and patterns.
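The idea that “similar things end up close together” can be shown with plain numpy. The tiny 3-D vectors below are made up for illustration; real embeddings have hundreds or thousands of dimensions learned from data.

```python
# Toy embedding vectors: similar words point in similar directions.
import numpy as np

embeddings = {
    "dog":    np.array([0.90, 0.80, 0.10]),
    "puppy":  np.array([0.85, 0.75, 0.15]),
    "banana": np.array([0.10, 0.20, 0.90]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: close to 1 means 'similar'."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["dog"], embeddings["puppy"]))   # high, ~0.99
print(cosine_similarity(embeddings["dog"], embeddings["banana"]))  # low, ~0.30
```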

5. Hardware Limitations

Training on more data makes a model smarter, but processing millions or billions of tokens (chunks of training data) requires massive compute power. CPUs are general-purpose and run only a handful of operations in parallel, which makes large-scale training painfully slow. GPUs, by contrast, excel at parallelism: hundreds or thousands of cores working simultaneously on the same calculations. Historically, GPUs were prohibitively expensive, so models remained small. Now that GPUs are affordable and widely available, training times have plunged and models have grown dramatically.
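You can see the difference with a quick experiment. This sketch assumes PyTorch is installed; the GPU branch runs only if CUDA hardware is available, and the exact timings depend entirely on your machine.

```python
# Time the same large matrix multiplication on CPU vs. GPU (if present).
import time
import torch

n = 4096
a = torch.randn(n, n)
b = torch.randn(n, n)

start = time.perf_counter()
_ = a @ b                           # runs on a few CPU cores
print(f"CPU: {time.perf_counter() - start:.3f}s")

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()        # wait for the transfer to finish
    start = time.perf_counter()
    _ = a_gpu @ b_gpu               # thousands of GPU cores in parallel
    torch.cuda.synchronize()        # wait for the kernel to complete
    print(f"GPU: {time.perf_counter() - start:.3f}s")
```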

6. AI Is Here to Stay and Rapidly Accelerating

AI isn’t just a passing fad—it’s become foundational infrastructure across every industry, and three forces are now turbo-charging its pace of innovation: 

  • Enduring Momentum – By Lindy’s Law, AI’s 70-plus years of survival signal at least another 70 ahead.
  • Data Abundance – Billions of connected devices, apps, and sensors continuously produce labeled and unlabeled data.
  • Specialized Compute – GPUs, TPUs, and custom AI chips now slash training times from months to hours.