The cell sizes are inconsistent in a way that's instantly recognizable to anyone who teaches grad experimental methods or has taken them; it's exactly the kind of thing you'd be expected to catch on a graduate exam. E.g., the number of admins they claim to have recruited is impossible and directly self-contradictory: in one place they say roughly 500, in another roughly 700, and the two figures sit close together on the page. That's just one example. These aren't small errors.
I’m gonna bet that Gender & Society will have one of these articles.
I don’t think the issue is refereeing per se, and I agree it isn’t possible for referees to catch all fraud. I certainly don’t think it’s reasonable to expect referees to replicate the work or inspect the actual data. But just seeing the response rate the article claims makes it obvious something isn’t right. So basically I think this article shows how susceptible some fields are to work that fits their political beliefs.
I can’t decide whether I’d be surprised to see an example at an econ journal. THIS paper couldn’t make it into an econ journal, but could a more cleverly done one? At a lower-ranked journal like Economics Bulletin or something, maybe. That said, the more cleverly it’s done, the weaker it is as an example. If it isn’t a glaring example, the point isn’t made nearly as well.