Robust and reproducible processes are critical to maintaining confidence in the research enterprise and to producing the reliable results on which science builds. The growing use of large-scale, complex data across disciplines has brought this challenge to the forefront: the development of reproducibility standards for research is itself becoming a foundation of data science.
Standards for open, reproducible research with big data must be clearly articulated and broadly integrated into research activities at U of T and internationally. Reproducibility requires standards for fundamental research activities, such as reproducible pipelines for processing, cleaning, and sharing data and code, and the application of analytic procedures according to best practice.
This theme focuses on developing widely adoptable methodology, processes, and infrastructure for sharing data and code locally and in privacy-compliant ways, and on developing the infrastructure, methods, and models that support reproducible and reliable research.
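As a concrete illustration of what a reproducible pipeline can look like in practice, the sketch below scripts one small cleaning step end to end: it records a checksum of the raw input, applies every cleaning rule in code rather than by hand, and writes a manifest so the run can be audited and repeated. This is a minimal, illustrative example only; the file names, column names, and Python/pandas tooling are assumptions, not a DSI-prescribed workflow.

"""A minimal, illustrative sketch of a reproducible cleaning pipeline.

File names and column names are hypothetical; the point is that every
step is scripted and logged so the run can be repeated exactly from the
raw inputs.
"""
import hashlib
import json
from pathlib import Path

import pandas as pd

RAW = Path("data/raw/survey.csv")          # hypothetical raw input
CLEAN = Path("data/clean/survey.parquet")  # hypothetical cleaned output


def file_hash(path: Path) -> str:
    """Checksum the input so the exact source file can be verified later."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def clean(df: pd.DataFrame) -> pd.DataFrame:
    """Apply every cleaning rule in code, never by hand-editing the file."""
    df = df.dropna(subset=["id"]).drop_duplicates(subset=["id"])
    df["response"] = df["response"].str.strip().str.lower()
    return df


def main() -> None:
    raw = pd.read_csv(RAW)
    cleaned = clean(raw)
    CLEAN.parent.mkdir(parents=True, exist_ok=True)
    cleaned.to_parquet(CLEAN)
    # A small manifest makes the run auditable: input checksum, row counts, versions.
    manifest = {
        "input_sha256": file_hash(RAW),
        "rows_in": len(raw),
        "rows_out": len(cleaned),
        "pandas_version": pd.__version__,
    }
    Path("data/clean/manifest.json").write_text(json.dumps(manifest, indent=2))


if __name__ == "__main__":
    main()

Running the script from a clean checkout regenerates the cleaned data and the manifest, so a second researcher can confirm they started from the same raw file and arrived at the same result.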
Stay tuned for upcoming news and events.
Co-Leads
Rohan Alexander
Assistant Professor, Faculty of Information and Department of Statistical Sciences, Faculty of Arts & Science
Jason Hattrick-Simpers
Professor, Materials Science & Engineering, Faculty of Applied Science & Engineering
Benjamin Haibe-Kains
Senior Scientist, University Health Network; Medical Biophysics, Temerty Faculty of Medicine