examples where saying it doesn't understand the text seems completely implausible.

there's no "it" there; it's a very complicated lookup table. it has no experience of the world, no goals, and no reflection on itself, because there is no sentience behind it. and at no point in scaling up the algorithmic complexity, training data, etc. do you get something akin to actual intelligence.
do you even scaling hypothesis, bro?