It’s rather funny that virtually anything we do in computer science has already been done and dealt with before, in some shape or form.
The most prominent example is probably artificial intelligence: the concept of neural networks dates back to 1943. “Backpropagation”, the method that makes training huge neural networks feasible, arguably dates back to 1960.
Even programming itself follows this pattern: consider the rising popularity of functional programming for solving modern concurrency problems, or, as Paul Butcher puts it: “Functional programming is like a futuristic car, not yet widely used, but it’s what we will all rely on in twenty years.”1 Functional programming derives from the lambda calculus, a formal system in mathematical logic introduced in the 1930s.
There is one thing that is rapidly advancing, though: computational power. As Moore’s law suggests, it grows exponentially, and progress and advancement follow suit. It’s as if computational power is merely catching up to what could’ve already been possible. Return to our first example, artificial intelligence: backpropagation only came back into fashion around 2010, once it could harness massive computational power through GPUs.
One might think it rather futile to dwell on this old epiphany, but only if you stop there.
Maybe, to predict the technologies and trends of the near future, we only have to look back to see what’s ahead.
As for me, though, I don’t feel like I’m standing on the shoulders of giants, but merely at their bare feet.
Paul Butcher, Seven Concurrency Models in Seven Weeks ↩