Difference between a Business Analyst (BA) and AI?
What's the difference between a Business Analyst (BA) and AI?
Well… a BA is probably less likely to say you are "brilliant" and more likely to ask a follow-up question (nothing personal, it's just part of the job!)
It sounds like an obvious and simple question, but answering it means digging down to what defines human intelligence.
For this exercise, let's also consider the particular habits of our 'less friendly,' inquisitive BA when analyzing a situation or context.
To start off, let's simplify things (another classic BA technique!)
At its core, AI can be both:
- A set of pre-defined rules
- A program excellent at pattern recognition
We've seen with the latest GenAI technology that this can bring AIs to an impressive level of performance, but they tend to be notoriously bad at a few things:
- Generalizing and abstracting information to form a logical mental model of the world
- Counting and spatial reasoning (although this seems to be improving)
- Coming up with true novelty
… and humor, of course!
For the purposes of this post, I would like to focus on the first item:
The mental model of the world.
I think of a mental model as a built-up representation of how something works. When new information is presented, the model is either validated or broken down and adjusted/rebuilt to incorporate the new data (a rough sketch of that loop follows below).
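To make that loop a bit more concrete, here is a minimal, purely illustrative sketch in Python. The MentalModel class and its beliefs/predict/update pieces are hypothetical stand-ins I made up for this post, not any real library or formal definition:

```python
# Purely illustrative sketch of the "validate or rebuild" loop described above.
# MentalModel, predict() and update() are hypothetical stand-ins, not a real library.

class MentalModel:
    def __init__(self, beliefs):
        self.beliefs = beliefs  # current representation of how something works

    def predict(self, situation):
        # Use current beliefs to anticipate an outcome
        return self.beliefs.get(situation, "unknown")

    def update(self, situation, observed_outcome):
        predicted = self.predict(situation)
        if predicted == observed_outcome:
            # New information confirms the model: keep it as-is
            return "validated"
        # New information contradicts the model: adjust/rebuild it
        self.beliefs[situation] = observed_outcome
        return "rebuilt"

# Example: each new observation either confirms or reshapes the model
model = MentalModel({"feature launch": "adoption rises"})
print(model.update("feature launch", "adoption rises"))  # validated
print(model.update("feature launch", "adoption drops"))  # rebuilt
```

The point is simply the two branches: information that confirms the model leaves it intact, while contradicting information forces an adjustment.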
A BA with a strong mental model is able to ask the right questions and extract the most insight from new information by constantly challenging that model.
The model can then be leveraged to analyze risk, recommend strategies, anticipate the impact of changes, you name it!
I like to see it as the BA's secret weapon 😎
In other words, it is a way to acquire true understanding, not just familiarity or the ability to apply.
A recent study by Harvard and MIT found that when trained on planetary orbits, a transformer-based AI model could predict orbits very well but couldn't uncover Newton's laws.
It could predict (pattern recognition), but not truly understand (mental world model).
As AI becomes more and more prevalent in our lives and work environments, I believe it is those who are most curious and have the most powerful mental models who will differentiate themselves and reap the greatest rewards from AI.
Come to think of it, humor is often about challenging the expectations our model of the world sets up, in an unexpected, silly way.
No wonder AIs can't seem to get it right!
I'd love to hear: does anyone have interesting/funny recent examples of AI failing at a task due to the lack of a world model?