TechEcho
A tech news platform built with Next.js, providing global tech news and discussions.


Ask HN: Why can ChatGPT make math errors?

3 points by stunt about 2 years ago
I was working on a report to calculate the monthly CPM of a SaaS product and decided to ask ChatGPT to do the calculation for me, just to get a feel for what it would be like to have AI-powered spreadsheets in the near future.

Instead of pasting a large amount of comma-separated values, I decided to first give ChatGPT a sample record. It replied with the formula for calculating CPM, as well as the result of its calculation for my sample data.

While the formula ChatGPT gave me was 100% correct, the result differed from what I had on my sheet. At first I assumed I had made a mistake on my end, but after double-checking, I realized that wasn't the case.

So I told ChatGPT that I was getting a different result when I calculated it. It immediately apologized (apparently that's a thing it does when you point out a mistake) and explained the formula again, but gave me the same incorrect value.

This went on for a few rounds, with ChatGPT even making small changes to the formula, but still giving me the wrong answer. Only on the fourth try, and only after I gave it the correct result, did it finally produce the correct value.

The whole experience was unexpected because every time, ChatGPT explained the correct formula and even placed the correct values in their correct places in the formula, yet the outcome was incorrect.

That led me to wonder: how exactly does generative AI perform math? Can anyone with more knowledge in this area explain how something like this can happen?
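For context, the CPM calculation the poster describes is simple arithmetic: cost per mille is total cost divided by impressions, times 1,000. A minimal sketch (the sample values here are invented for illustration, not the poster's actual data):

```python
def cpm(total_cost: float, impressions: int) -> float:
    """Cost per mille: cost per 1,000 impressions."""
    return total_cost / impressions * 1000

# Hypothetical sample record: $250 spent for 100,000 impressions
print(cpm(250.0, 100_000))  # 2.5
```

The arithmetic is trivial for a spreadsheet, which is exactly why an LLM stating the correct formula but producing a wrong number is so surprising.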

2 comments

verdverm about 2 years ago
GPT is just printing text based on probabilities; it doesn't do math or calculations, and it does not reason or understand logic.

Note: this will change with plugins / chaining, but that is still an external system. In the end, these LLMs are just predicting the next best token to output. The "magic" is in our minds and perceptions.
nikonyrh about 2 years ago
GPT is "only" capable of generating text; as of now it cannot do any explicit calculations. But if you ask it what 1 + 1 is, maybe it can answer "2" since it has seen that example so many times. It still didn't do any math, though, only text / token processing.