I am in consulting and I can see entire teams, apart from the good salesmen, getting fired.
GPT-4 is insane. How can I insulate myself from the oncoming layoffs?
-
Obviously it's a large increase in capability, but it still has problems with hallucinations and out-of-sample generalization. Generalization has become significantly better than with GPT-3, but I don't think it's good enough to automate many jobs.
It's not clear to me how long this will take to fix. Memory-augmented RNNs seem much better at generalization than transformers, but there isn't a clear path to training them on the entire web (way too costly). Although maybe that won't be necessary: humans don't train on all that much information. 18 years of compressed video is only a few terabytes of data.
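The "few terabytes" figure is easy to sanity-check with a back-of-envelope calculation. The ~100 kbps average bitrate for heavily compressed video is my assumption, not something stated in the thread:

```python
# Back-of-envelope: how much data is 18 years of compressed video?
# Assumption: ~100 kbps average bitrate (aggressively compressed, low-res).
SECONDS_PER_YEAR = 365.25 * 24 * 3600          # ~3.156e7 seconds
BITRATE_BPS = 100_000                          # 100 kbps, assumed

years = 18
total_seconds = years * SECONDS_PER_YEAR       # ~5.68e8 seconds
total_bytes = total_seconds * BITRATE_BPS / 8  # bits -> bytes
terabytes = total_bytes / 1e12

print(f"{terabytes:.1f} TB")                   # on the order of single-digit TB
```

At this assumed bitrate the total comes out around 7 TB, consistent with "a few terabytes"; at higher-quality bitrates it would be tens of terabytes, so the claim depends on how aggressive the compression is.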
-
GPT is only good at sounding correct, not so much at being correct. You should therefore choose a profession where the delta between sounding correct and being correct is large.
Marketers are done.

You are thinking about the current version. 20 years ago people were fascinated by Akinator. 10 years ago Siri/Alexa were unbelievable. 5 years ago Tesla's image-identification software was out of this world.
-
Alexa is garbage.
-
"it still has problems with hallucinations and out of sample generalization."
And people don't? We are not talking about GPT replacing the entirety of human jobs. But you can bet that a good 50% of non-manual jobs could be replaced with the current state of the technology.
-
Not to mention the amount of time we are going to save on menial tasks. I just asked it to write a recommendation letter for a student. I only told it his GPA and name. It gave me a four-paragraph, well-structured letter in 5 seconds. I can now edit it, add some information I didn't tell it, and make the letter more personal. That shouldn't take much more than 3-4 minutes.
So around 5 minutes total to write a rec letter. I can do the same for emails, letters to editors or admin staff, etc. If I don't like it, I can just ask a little differently and get a completely different version.
We are soon going to be able to do the same for graphs, tables, document structuring, etc.
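The draft-then-edit workflow above can be scripted. A minimal sketch, assuming the official `openai` Python client (v1.x); the prompt wording, model name, and `draft_recommendation` helper are my illustrative choices, not from the thread:

```python
# Sketch: generate a first-draft recommendation letter, then edit by hand.
# The prompt template and helper are illustrative, not from the thread.

def build_prompt(student_name: str, gpa: float) -> str:
    """Assemble the minimal prompt described above: just name and GPA."""
    return (
        f"Write a four-paragraph recommendation letter for my student "
        f"{student_name}, who has a GPA of {gpa:.1f}."
    )

def draft_recommendation(student_name: str, gpa: float) -> str:
    # Assumes the `openai` client is installed and OPENAI_API_KEY is set.
    from openai import OpenAI
    client = OpenAI()
    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": build_prompt(student_name, gpa)}],
    )
    return resp.choices[0].message.content

print(build_prompt("Jane Doe", 3.8))
```

The point of keeping the prompt minimal is exactly what the commenter describes: the model produces a generic but well-structured draft, and the personal details get added in the human editing pass.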
-
The hallucination failure mode is definitely stronger in LLMs than in humans.
Generalization is harder to evaluate. It seems like GPT-4 still struggles with logic problems, which indicates it hasn't fully internalized logic. Note its poor performance on AMC problems, which are less plug-and-chug. I would expect less adaptability than desired for many white-collar jobs.
I strongly suspect these problems will be fixed within a few years, but for now I think things are unlikely to change dramatically.