Scientists are increasingly concerned with making their work easy to verify and build upon. Associated practices include sharing data, materials, and analytic scripts, and preregistering protocols. This has been referred to as a “credibility revolution”. The credibility of empirical legal research has been questioned in the past due to its distinctive peer review system and because the legal background of its researchers means that many are not trained in study design or statistics. Still, there has been no systematic study of transparency and credibility-related characteristics of published empirical legal research. To fill this gap and provide an estimate of current practices that can be tracked as the field evolves, we assessed 300 empirical articles from highly ranked law journals, including both faculty-edited and student-edited journals. We found high levels of article accessibility (86% could be accessed without a subscription, 95% CI = [82%, 90%]), especially among student-edited journals (100% accessibility). Few articles stated that a study’s data are available (19%, 95% CI = [15%, 23%]), and only about half of those datasets are reportedly available without contacting the author. Preregistration (3%, 95% CI = [1%, 5%]) and availability of analytic scripts (6%, 95% CI = [4%, 9%]) were very uncommon. We suggest that empirical legal researchers and the journals that publish their work cultivate norms and practices to encourage research credibility.
Jason Chin, Kathryn Zeiler, Natali Dilevski, Alexander Holcombe, Rosemary Gatfield-Jeffries, Ruby Bishop, Simine Vazire & Sarah Schiavone,
The Transparency of Quantitative Empirical Legal Research (2018–2020)
Available at: https://scholarship.law.bu.edu/faculty_scholarship/1330