Much of my work in science policy has centered on research reproducibility and effective altruism. Explore my projects below!


Collaboration with Addgene and Harvard Science Policy Group

After a short stint at a start-up with a reproducibility-driven mission, I wanted to create an event that opened a conversation between academic scientists and the forces outside academia (journals, funding agencies, industry, policy groups). The idea for an afternoon symposium took shape after I reached out to the scientists at Addgene about co-organizing. Once we formed that partnership, I spent several months securing speakers and panelists for the event presented below. I also moderated the event and the panel, as seen in the video.

Event schedule: 


  • 3:10 - 3:30 - Reproducibility Overview - Jeffrey S. Flier, researcher at Harvard Medical School and former Dean of the Faculty of Medicine at Harvard University
  • 3:35 - 3:55 - Reagent Sharing - Susanna Bachle, Addgene (the nonprofit plasmid repository)
  • 4:00 - 4:20 - Reagent Development - Steven C. Almo, Institute for Protein Innovation


  • 4:25 - 4:55 - Panel discussion
    • Alex Tucker (Ginkgo Bioworks)
    • Pamela Hines (Senior Editor at Science)
    • Edward J. Hall (Professor of Philosophy at Harvard University)
    • Tony Cijsouw (Neuroscience postdoc at Tufts University)

Consulting Project on Research Reproducibility

During the fall of 2016, I worked on a philanthropic consulting project with Harvard's Effective Altruism group. Our client was the Laura and John Arnold Foundation (LJAF), which wanted to learn how to encourage research integrity in the biomedical space. We examined research reproducibility and the many levers, both within and outside academia, that encourage its practice. Over the course of ten weeks, our team worked with the client and delivered a report of our findings. The full report can be found here.

Summary of Recommendations

In this report, we considered four broad interventions: post-publication peer review, quantitative metrics for transparency (e.g., a "transparency index"), tenure criteria, and characterization of dependency relationships between pieces of research. Each can be implemented in different ways and can improve reproducibility through multiple channels.

  • Post-publication peer review makes it easier to learn how published results hold up (e.g., through comments on PubMed Commons), and it could reduce bias by giving scientists a disincentive to manipulate results, for fear of facing greater scrutiny.
  • Quantitative metrics for transparency could shift scientific culture toward more open practices, which would both invite more scrutiny of results (again discouraging bias) and make replications easier to produce (e.g., by providing access to data and code).
  • Tenure criteria could create a similar culture shift by requiring professors to perform replications and/or share data, but it seems difficult for philanthropy to have a direct effect in this area.
  • Characterizing relationships between articles can make research more efficient and accurate by allowing researchers to track unintuitive, tertiary effects of a particular finding in a paper.




Dear Mister President: SITN Science Policy Article Collection

In the buildup to the election, my science outreach group, Science in the News, decided to publish a collection of articles explaining the science behind science policy issues. I was one of the contributing editors for this special edition of the SITN blog.

Harvard Science Policy DC Trip

Thirteen students were selected for the 2016 Washington, D.C. trip from over 30 applicants, reflecting keen interest in science policy among graduate students. Some were already set on pursuing a career in science policy; others sought a general understanding of how decisions about science are made in government, or how scientific information is communicated among legislators, staff, scientists, agency leadership, and voters. We wrote a report detailing what we learned during the trip.

We had the opportunity to visit a number of federal institutions to learn how science policy is made, including the National Academies, the NIH, the EPA, the Department of State, the National Science Foundation, and the Department of Defense, among others.

Additional Training

Introduction to Network Medicine, Harvard Clinical and Translational Science Center, Spring 2015

Engaging Scientists and Engineers in Policy (ESEP) member and event organizer

AAAS member

Union of Concerned Scientists member