Tuesday, June 23, 2020

Need for Speed: Documenting the Two-Week Systematic Review

In a recent post, we summarized a 2017 article describing the ways in which automation, machine learning, and crowdsourcing can be used to increase the efficiency of systematic reviews, with a specific focus on making living systematic reviews more feasible.

In a new publication in the May 2020 edition of the Journal of Clinical Epidemiology, Clark and colleagues incorporated automation tools in an attempt to complete a systematic review in no more than two weeks, from search design to manuscript submission, for a moderately sized review whose search yielded 1,381 deduplicated records and eight ultimately included studies.

Spoiler alert: they did it. (In just 12 calendar days, to be exact).

Systematic Review, but Make it Streamlined

Clark et al. utilized some form of computer-assisted automation at almost every point in the project, including:
  • Using the SRA Word Frequency Analyser to identify key terms that would be the most useful inclusions in a search strategy
  • Using hotkeys (custom keystroke shortcuts) within the SRA Helper tool to screen items more quickly and search pre-specified databases for full texts
  • Using RobotReviewer to assist in risk of bias evaluation by searching for certain key phrases within each document
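As a rough illustration of the word-frequency idea (not the SRA tool itself), candidate search terms can be surfaced by counting how often terms appear across titles and abstracts from a preliminary search. The records and stopword list below are hypothetical:

```python
import re
from collections import Counter

# Hypothetical titles/abstracts from a preliminary search;
# in practice these would be exported from a database search.
records = [
    "Antibiotic prophylaxis for surgical site infection",
    "Surgical site infection rates after prophylactic antibiotics",
    "Antibiotics and infection prevention in surgery",
]

# Minimal stopword list for the example; a real analysis would use a fuller one.
STOPWORDS = {"for", "after", "and", "in", "the", "of"}

def term_frequencies(texts):
    """Count how often each non-stopword term appears across the texts."""
    counts = Counter()
    for text in texts:
        for token in re.findall(r"[a-z]+", text.lower()):
            if token not in STOPWORDS:
                counts[token] += 1
    return counts

# The most frequent terms become candidates for the search strategy.
top_terms = term_frequencies(records).most_common(5)
```

Terms that recur across known-relevant records (here, "infection" and "surgical") are strong candidates for inclusion in the final search string.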
However, machines were only part of the solution. The authors also note the decidedly more human-based solutions that allowed them to proceed at an efficient clip, such as:
  • Daily, focused meetings between team members
  • Blocking off “protected time” for each team member to devote to the project
  • Scheduling deliberation periods, such as resolving screening conflicts, to occur immediately after screening, reducing the time and energy spent on “mental reload” and reviewing one’s previous decisions for context
Figure: Time Distribution of 12-Day Systematic Review by Task.

All told, the final accepted version of the manuscript took 71 person-hours to complete – a far cry from a recently published average of 881 person-hours among conventionally conducted reviews.

Clark and colleagues also discuss key facilitators of and barriers to their approach, and suggest technological tools that could further improve the efficiency of SR production.

Clark, J., Glasziou, P., Del Mar, C., Bannach-Brown, A., Stehlik, P., & Scott, A.M. A full systematic review was completed in 2 weeks using automation tools: A case study. J Clin Epidemiol, 2020; 121: 81-90.

Manuscript available from the publisher's website here.