The higher education consulting firm Ithaka has produced a whole series of case studies on innovative practices in higher education, providing food for thought for folks in the sector as they think about how to change. I've had several conversations with Martin Kurzweil there about the kind of "learning engineering" work we're doing at Kaplan: applying learning science at scale by training instructional designers on what cognitive science is finding about how expertise, learning, and various media interact; investing in valid and reliable probes of learning outcomes; and checking carefully (e.g., with randomized controlled trials) whether the expected improvements in learning outcomes actually show up for our students across various programs.
Ithaka was particularly interested in the work we've been doing at Kaplan University (KU), led by its president, Betty Vandenbosch, to use randomized controlled trials (RCTs) as a systematic way to check "what works" (or doesn't) before deployment. As you'll see in this case study by Ithaka, KU's Research Pipeline team has set up and run more than a hundred RCTs in the last few years to look for impact. As a consequence, they've had to build solid tools and processes to monitor these pilots along the way: managing dozens of trials at once, each at a different stage of development, is very different from running an occasional trial or two.
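For readers curious what the statistical core of such a trial looks like, here's a minimal sketch in Python. This is not KU's actual tooling or data, just a generic illustration of the basic check an RCT enables: randomly assign students to a control or treatment experience, then test whether the outcome measure differs between the groups before deciding to deploy. All the numbers below are made up for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)

# Hypothetical data: assessment scores (0-100) for students randomly
# assigned to the existing course design (control) or a revised one
# (treatment). A real trial would pull these from the learning platform.
control = rng.normal(loc=72, scale=10, size=200)
treatment = rng.normal(loc=75, scale=10, size=200)

# Welch's two-sample t-test: is the treatment group's mean score
# reliably different from the control group's?
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)

effect = treatment.mean() - control.mean()
print(f"Mean difference: {effect:.2f} points, p = {p_value:.4f}")

# A team would deploy the change only if the improvement is positive and
# statistically reliable (e.g., p < 0.05), ideally after replication.
```

Running a hundred of these means tracking enrollment, randomization, and readouts for many trials in parallel, which is exactly why the monitoring tools and processes mentioned above become necessary.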
Ithaka thinks higher education institutions of all kinds can learn something from KU's experience. I think so too!