Field | Before | After |
---|---|---|
Study Withdrawn | | No |
Data Collection Complete | | Yes |
Final Sample Size: Number of Clusters (Unit of Randomization) | | 30 papers |
Was attrition correlated with treatment status? | | No |
Final Sample Size: Total Number of Observations | | 480 paper-rater observations |
Final Sample Size (or Number of Clusters) by Treatment Arms | | 240 paper-rater observations for original papers, 240 paper-rater observations for edited papers |
Public Data URL | | https://osf.io/dgfxy/ |
Program Files | | Yes |
Program Files URL | | https://osf.io/dgfxy/ |
Is data available for public use? | | Yes |
Field | Before | After |
---|---|---|
Paper Abstract | | For papers to have scientific impact, they need to impress our peers in their role as referees, journal editors, and members of conference committees. Does better writing help our papers make it past these gatekeepers? In this study, we estimate the effect of writing quality by comparing how 30 economists judge the quality of papers written by PhD students in economics. Each economist judged five papers in their original version and five different papers that had been language edited. No economist saw both versions of the same paper. Our results show that writing matters. Compared to the original versions, economists judge edited versions as higher quality; they are more likely to accept edited versions for a conference; and they believe that edited versions have a better chance of being accepted at a good journal. |
Paper Citation | | Feld, J., Lines, C., & Ross, L. (2023). Writing matters. |
Paper URL | | https://www.iza.org/publications/dp/16571 |