Fields Changed

Registration

Field | After
Study Withdrawn | No
Data Collection Complete | Yes
Final Sample Size: Number of Clusters (Unit of Randomization) | 30 papers
Was attrition correlated with treatment status? | No
Final Sample Size: Total Number of Observations | 480 paper-rater observations
Final Sample Size (or Number of Clusters) by Treatment Arms | 240 paper-rater observations for original papers, 240 paper-rater observations for edited papers
Public Data URL | https://osf.io/dgfxy/
Program Files | Yes
Program Files URL | https://osf.io/dgfxy/
Is data available for public use? | Yes

Papers

Field | After
Paper Abstract | For papers to have scientific impact, they need to impress our peers in their role as referees, journal editors, and members of conference committees. Does better writing help our papers make it past these gatekeepers? In this study, we estimate the effect of writing quality by comparing how 30 economists judge the quality of papers written by PhD students in economics. Each economist judged five papers in their original version and five different papers that had been language edited. No economist saw both versions of the same paper. Our results show that writing matters. Compared to the original versions, economists judge edited versions as higher quality; they are more likely to accept edited versions for a conference; and they believe that edited versions have a better chance of being accepted at a good journal.
Paper Citation | Feld, J., Lines, C., & Ross, L. (2023). Writing matters. IZA Discussion Paper No. 16571.
Paper URL | https://www.iza.org/publications/dp/16571