ChatGPT/Google Can’t Eliminate Human Understanding

Why do we need to remember things when everything is just one Google search (or, these days, one ChatGPT query) away?

In fact, this was the logic once proffered for why we didn’t need to learn multiplication tables — or for that matter, anything factual. The mantra was, ‘Just look it up!’

This view is fundamentally flawed. Here is why.

Factual information, even when easily Googleable, serves two purposes:

The first is the standalone value of the knowledge itself. For example, knowing that 20 – 13 = 7 is useful if you are a cashier who has to return change when someone pays for a 13-rupee (or dollar) item with a 20-rupee bill. Google or a calculator can do this for you.

But factual information serves another purpose — it is a raw ingredient that goes into building your mental models.

For instance, once you have internalized adding and multiplying numbers, you have mental models that intuitively tell you what those operations mean. And that unconscious competence is what helps you do higher-level math.

That is why a terrifying-looking equation reads like plain English to an engineer.

Take another example. If you are studying the political problems of the Middle East but don’t have the basic facts (e.g., the British Mandate for Palestine and the subsequent events), you can’t understand the mindset of each player in the region and why they behaved the way they did.

I can give many more examples, but the obvious conclusion is that while we don’t have to be walking encyclopedias, knowing facts, formulas, and information is valuable.

If we plan to look up EVERYTHING on Google, then we will have no understanding of anything.

And I have seen many well-respected and intelligent people profess that we don’t need to remember anything. Sorry, that’s not true.

Neither ChatGPT nor Google obviates the need for human understanding. Not yet.

– Rajan
