Does anyone in tech use this crappy method?
GMM is useless

"uncertainty estimation" lol
GMM is a teaching device; nobody applies it outside academia.
In ML they use variational inference when they care about uncertainty estimation.
Unfamiliar term has the reg monkey chuckling?
https://paperswithcode.com/paper/theperilofpopulardeeplearning
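For concreteness, here is a minimal, hypothetical sketch (assuming NumPy; the model, data, and step sizes are all made up for illustration) of what "variational inference for uncertainty estimation" looks like in the simplest conjugate case: data x_i ~ N(mu, 1) with prior mu ~ N(0, tau2), a Gaussian variational family q(mu) = N(m, s^2), and plain gradient ascent on the closed-form ELBO. In this conjugate case the exact posterior is also Gaussian, so we can check the answer.

```python
import numpy as np

# Toy model: x_i ~ N(mu, 1), prior mu ~ N(0, tau2).
# Variational family: q(mu) = N(m, s^2); optimize (m, u) with u = log s.
rng = np.random.default_rng(1)
tau2 = 10.0
x = rng.normal(loc=1.0, scale=1.0, size=20)
n = x.size

def elbo_grads(m, u):
    """Exact gradients of the closed-form ELBO with respect to m and u = log s.

    ELBO(m, s2) = -0.5*sum((x - m)**2) - 0.5*n*s2
                  - (m**2 + s2) / (2*tau2) + log(s) + const
    """
    s2 = np.exp(2.0 * u)
    grad_m = np.sum(x - m) - m / tau2
    grad_u = 2.0 * s2 * (-0.5 * n - 0.5 / tau2) + 1.0
    return grad_m, grad_u

m, u = 0.0, 0.0  # start from q = N(0, 1)
lr = 0.01
for _ in range(5000):
    gm, gu = elbo_grads(m, u)
    m += lr * gm
    u += lr * gu

s2 = np.exp(2.0 * u)
# Exact conjugate posterior, for comparison:
post_var = 1.0 / (n + 1.0 / tau2)
post_mean = x.sum() * post_var
print(m, s2)  # should land on (post_mean, post_var)
```

Because the variational family contains the true posterior here, the optimized q recovers it exactly; the "uncertainty estimate" is the posterior variance s^2. In non-conjugate models the same recipe applies, but the ELBO gradients are usually estimated by Monte Carlo rather than computed in closed form.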

You're all saying that there are issues with convergence depending on initialization; please, tell me something we didn't already know. In a sense, it may be locally but not globally stable. But we typically have a strong idea of the priors on the parameters in the orthogonality conditions we're trying to estimate.
I personally think it's an elegant way of estimating equations, and the people bashing it are just trying to sound smart, which is cringey.
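For readers outside econometrics: GMM here is the Generalized Method of Moments, where you choose parameters so that the sample averages of the orthogonality (moment) conditions are as close to zero as possible under some weighting matrix W. A minimal, hypothetical sketch in Python (assuming NumPy and SciPy; the data and moment conditions are a textbook toy, not anyone's actual application), estimating a mean and variance from two just-identified moment conditions with an identity weighting matrix:

```python
import numpy as np
from scipy.optimize import minimize

# Simulated data with true mean 2.0 and variance 1.5**2 = 2.25.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=5000)

def gbar(theta):
    """Sample averages of the moment (orthogonality) conditions
    E[x - mu] = 0 and E[(x - mu)**2 - sigma2] = 0."""
    mu, sigma2 = theta
    return np.array([(x - mu).mean(), ((x - mu) ** 2 - sigma2).mean()])

def criterion(theta):
    # GMM objective gbar' W gbar with W = identity. The model is
    # just-identified (2 moments, 2 parameters), so the choice of W
    # does not affect the point estimate.
    g = gbar(theta)
    return g @ g

# Convergence can depend on the starting values in nonlinear problems;
# this one is well behaved, so a rough guess is enough.
res = minimize(criterion, x0=np.array([0.0, 1.0]), method="Nelder-Mead")
mu_hat, sigma2_hat = res.x
print(mu_hat, sigma2_hat)  # close to the sample mean and sample variance
```

In over-identified problems (more moments than parameters) the weighting matrix does matter, and the efficient two-step version re-estimates W from the first-stage residuals, which is where the initialization sensitivity people complain about tends to show up.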

Does anyone in tech use this crappy method?
Maybe consider that tech generally doesn't try to infer causal effects from nonexperimental data. Or just continue to follow the tech bandwagon because it seems cool and you don't have the discipline or ability to learn some real math and econometrics.

Does anyone in tech use this crappy method?
Maybe consider that tech generally doesn't try to infer causal effects from nonexperimental data. Or just continue to follow the tech bandwagon because it seems cool and you don't have the discipline or ability to learn some real math and econometrics.
I did not know that GMM is real math

Does anyone in tech use this crappy method?
Maybe consider that tech generally doesn't try to infer causal effects from nonexperimental data. Or just continue to follow the tech bandwagon because it seems cool and you don't have the discipline or ability to learn some real math and econometrics.
I did not know that GMM is real math
The math underlying LH's research is more real than applying a black-box ML function in Python and pretending to understand statistics.

Does anyone in tech use this crappy method?
Maybe consider that tech generally doesn't try to infer causal effects from nonexperimental data. Or just continue to follow the tech bandwagon because it seems cool and you don't have the discipline or ability to learn some real math and econometrics.
I did not know that GMM is real math
Define GMM; it's not some meme you can throw around here.

Does anyone in tech use this crappy method?
Maybe consider that tech generally doesn't try to infer causal effects from nonexperimental data. Or just continue to follow the tech bandwagon because it seems cool and you don't have the discipline or ability to learn some real math and econometrics.
I did not know that GMM is real math
The math underlying LH's research is more real than applying a black-box ML function in Python and pretending to understand statistics.
Did anyone say that ML in Python is real math?

Maybe consider that tech generally doesn't try to infer causal effects from nonexperimental data. Or just continue to follow the tech bandwagon because it seems cool and you don't have the discipline or ability to learn some real math and econometrics.
I did not know that GMM is real math
The math underlying LH's research is more real than applying a black-box ML function in Python and pretending to understand statistics.
Did anyone say that ML in Python is real math?
I suppose the point is that GMM's usage in tech is not indicative of its merits, and that practitioners in tech are not the godlike wizards OP seems to think they are; in fact they often don't use or understand statistics at all, given that they're primarily engineers. Whatever you define to be "real", there is probably more technical sophistication involved in GMM than in most of what data scientists in tech generate these days.

MLE, GMM = high-quality research
SMM = vague, mediocre research
ML = total junk

No need to stoke the fire.
They're just different tools for different tasks. If you don't understand that, and you don't have specific criticisms of each, you're probably not in a place to comment at all.