TL;DR
Artificial Intelligence (AI) is not a stand-alone solution, but another tool in the Data Science Shop toolbox.
The output of any AI application is a Data Product too.
Leveraging AI at scale follows the same blueprint laid out by the Data Science Shop: you need the same technical people, the same technology and the same process to build Data Products powered by AI.
Artificial Intelligence (AI) is all the rage these days. Scroll through enough news stories, and you come to believe that AI is the ultimate disrupter: it solves every problem under the sun – and better than any human could! Everyone wants a piece of it, and no one wants to miss out on the AI Gold Rush.
But if everything can be done with AI, doesn’t that make Data Science obsolete?
Did the rise of AI kill the Data Science star?
Before answering this question, let’s make one thing clear: the output of any process that uses AI is a Data Product. So, AI is just another engine that powers Data Products. The best-kept secret is that AI has been a part of the Data Science toolbox from the very beginning, but under the cloak of a different name: machine learning.
Now we can answer the question: no, AI did not kill Data Science. If anything, it made it even more relevant. Leveraging AI at scale follows the same blueprint laid out by the Data Science Shop: you need the same technical people, the same technology and the same process to build Data Products powered by AI. That realization alone makes the Data Science Shop roadmap even more important.
If you are making decisions about applying the latest Large Language Model (LLM) or Generative AI (GenAI) tool in your company, consider this:
if you are only looking at a single application of AI – like using an LLM to summarize documents from meetings – then you may want to outsource all technical aspects of that Data Product;
if you are looking to apply AI at scale to solve many different problems across your company, then the Data Science Shop approach is the roadmap to follow.
But remember, none of this works without data. None of this works without powerful computing. None of this works without properly defining the problem to solve. And all of it requires specialized people to power the formulation of the problem, to operate the computing, and to make data flow.
So what is AI, anyway?
An interesting historical fact about the term “Artificial Intelligence” is that it was effectively a “marketing” tactic by an academic seeking funding for a conference in the 1950s. The common term at the time for the field was “automata studies”, so John McCarthy needed a catchy term to differentiate his funding proposal to the Rockefeller Foundation. The AI star was born!
The original field of Artificial Intelligence aimed to decode the mysteries of the human brain. If we could mimic cerebral functions in a computer – the thinking went – we would inch closer to unraveling the enigmas of human intelligence and consciousness.
However, what we call Artificial Intelligence today traces its roots to a different source and objective. It sprouted from a series of algorithms funded by the US Department of Defense during the 1960s and 1970s to recognize patterns in data. Think of automating the identification of military activity in aerial photographs to prevent the next Cuban Missile Crisis. The generic name given to these pattern recognition algorithms is machine learning, as they empower computers to “learn” patterns from data.
The aspect of AI that has seized the media spotlight is a subset of machine learning dubbed Deep Learning, which draws inspiration from the synaptic connections in the human brain. This neural blueprint equips machines with the ability to discern complex patterns, enabling them to execute astonishing feats that captivate and entertain us.
Even though we use human terms to describe what deep learning does, machines do not yet “learn”, “understand”, “think”, “reason” or even “hallucinate” in the human sense. All machines can do is identify, extract and replicate patterns in data, at least for now.
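To make that concrete, here is a minimal sketch of what “learning a pattern from data” looks like in practice. The library (scikit-learn), the toy flower dataset and the model choice are illustrative assumptions, not something this article prescribes: the point is only that the machine fits numeric parameters to labelled examples and then replicates that pattern on examples it has never seen.

```python
# Minimal sketch: "learning" a pattern means fitting parameters to labelled data,
# then replicating that pattern on unseen data. Dataset and model are illustrative.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# A small, well-known dataset: flower measurements paired with species labels.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The "learning" step: adjust numeric parameters to map measurements to labels.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# The learned pattern is replicated on examples the model has not seen before.
print(f"Accuracy on unseen examples: {model.score(X_test, y_test):.2f}")
```

No understanding, reasoning or consciousness is involved anywhere in this sketch: the machine simply extracts a statistical pattern and applies it again.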
AI’s Rorschach moment
Artificial Intelligence has become the Rorschach inkblot for technology. The term suggests that computers have or may achieve the same "intelligence" that we humans recognize in one another, so we all think we know what AI is. Yet, we struggle to explain it to a 6-year-old. Even if we cannot explain it, we are certain that we know AI when we see it. So, everyone projects onto the term Artificial Intelligence what they wish it could be or do, and rarely what it actually is or can do.
Did we spark your interest? Then also read:
A primer on AI for non-technical readers - our annotated reader to become fully conversant in all things AI, tailored to everyone (no technical background needed!) [coming next Sunday – March 31st]