WELCOME!
I am a PhD student at the Berlin School of Economics and Humboldt-Universität zu Berlin, where I work on applied microeconomic theory. I am also a fellow at the Berlin Centre for Consumer Policies (BCCP). In my free time, I contribute to the public good of ensuring the reproducibility and replicability of economic research.
Find my CV here and feel free to contact me at paul.rosmer@hu-berlin.de.
PUBLISHED PAPERS
Reproducibility in Management Science (with Fišar, Greiner, Huber, Katok, Ozkes, and Management Science Reproducibility Collaboration). Management Science (2023).
Note: Member of the Management Science Reproducibility Collaboration.
Abstract | Paper
With the help of more than 700 reviewers, we assess the reproducibility of nearly 500 articles published in the journal Management Science before and after the introduction of a new Data and Code Disclosure policy in 2019. When considering only articles for which data accessibility and hardware and software requirements were not an obstacle for reviewers, the results of more than 95% of articles under the new disclosure policy could be fully or largely computationally reproduced. However, for 29% of articles, at least part of the data set was not accessible to the reviewer. Considering all articles in our sample reduces the share of reproduced articles to 68%. These figures represent a significant increase compared with the period before the introduction of the disclosure policy, where only 12% of articles voluntarily provided replication materials, of which 55% could be (largely) reproduced. Substantial heterogeneity in reproducibility rates across different fields is mainly driven by differences in data set accessibility. Other reasons for unsuccessful reproduction attempts include missing code, unresolvable code errors, weak or missing documentation, and software and hardware requirements and code complexity. Our findings highlight the importance of journal code and data disclosure policies and suggest potential avenues for enhancing their effectiveness.
WORKING PAPERS
Mass Reproducibility and Replicability: A New Hope (with Brodeur, Mikola, Cook et al.)
Submitted to the Review of Economic Studies (ReStud).
Abstract | Paper
This study pushes the boundaries of understanding research reliability by mass reproducing and replicating claims from 110 papers in leading economic and political science journals. The analysis involves computational reproducibility checks and robustness assessments. It reveals several patterns. First, we uncover a high rate of fully computationally reproducible results (over 85%). Second, excluding very minor errors like missing packages or paths, we uncover coding errors for about 25% of studies, with some studies containing multiple errors. Third, we test the robustness of the results to 5,511 re-analyses. We find a robustness reproducibility of about 70%. Robustness reproducibility rates are relatively higher for re-analyses which introduce new data and lower for re-analyses which change the sample or the definition of the dependent variable. Fourth, 52% of re-analysis effect size estimates are smaller than the original published estimates and the average statistical significance of a re-analysis is 77% of the original. Lastly, we rely on a “many-analysts” approach to answer eight additional research questions on the determinants of robustness reproducibility. Most analyst teams find a negative relationship between replicators’ experience and reproducibility, while finding no relationship between reproducibility and the provision of intermediate or even raw data combined with the necessary cleaning codes.
WORK IN PROGRESS
Maximum Matching in Sparse Matching Markets