In the run-up to the dot-com boom and bust, Cindy Miles and I published an article called "Are You Dying to Use Technology?" In the piece we made the case that many faculty members' adoption of technology could be tracked along Elisabeth Kübler-Ross's stages of death and dying. Denial, anger, bargaining, depression, and acceptance were everywhere as e-mail, PowerPoint, and the Web drove toward the mainstream.
Casey Green’s annual survey of higher education’s adoption of technology—the Campus Computing Survey—is pretty clear that these basic technologies have arrived. Indeed, over the last fifteen years, we have moved technology and technology leaders from the basement to the boardroom (literally, in some cases). Most campuses now employ a chief technology officer and devote an increasing share of their infrastructure budgets to technology.
This mainstreaming notwithstanding, conversations about technology are changing. Folks today are not wondering whether their colleges will leverage technology, but whether they are getting a true return on investment: is student learning really improving with all these bits and bytes? And it’s not just about technology use. There are national, state, and local dialogues about all of our administrative and academic strategies and their effectiveness.
Enter a whole new phase of “dying to use” adoption. This time, it’s about research and analytics—the use of data to inform decisions about reaching and teaching students, or evidence-based education. The beginnings of this movement can be traced back to the days of total quality management and then the learning revolution. Terry O’Banion’s A Learning College for the 21st Century is the seminal work in the latter movement. In this book and in speeches nationally and internationally, he argued that colleges in the future would need to consistently and doggedly answer two key questions: (1) Do our policies, procedures, and practices improve and expand learning? And, more important, (2) How do we know?
As I’ve noted here before, groups like the Community College Survey of Student Engagement (CCSSE) have begun to collect data directly from students and use national benchmarks to try to answer these questions. Projects like the Lumina Foundation’s Achieving the Dream are pulling colleges together to develop common data definitions (e.g., what is a full-time student, a part-time student, course success?) so they can use common data across multiple institutions to really get a handle on what works in driving access and success in community colleges. The Educational Testing Service (ETS) recently held a summit in Charlotte called “Building a Culture of Evidence from the Ground Up” that explored these and other programs in community colleges. The presenters noted that this is hard but incredibly substantial and rewarding work. It really is making a difference.
CCSSE, Achieving the Dream, and the ETS summits are happening in a context in which accrediting agencies demand plans from institutions to define and measure learning outcomes as part of their accreditation reviews. Moreover, the federal government is beginning to poke—and poke hard—at our use of data with reports like the one recently published by the Spellings Commission on the Future of Higher Education. Not surprisingly, we are beginning to see institutional research tools and personnel move from the shadows to center stage.
Because of the push from these programs and the powers that be, folks are once again feeling like they are “dying to adopt” something new—this time, analytics and data use. All the stages can be seen, sometimes in different people, sometimes in the same person over time: (1) Denial: “These data cannot be right!”; (2) Anger: “Don’t these **#$@ administrators have anything better to do?!”; (3) Bargaining: “We can measure passing, but we’ll never really measure learning.”; (4) Depression: “I think it may be time to retire.”; and (5) Acceptance: “I wonder what these data really mean?”
However, as was the case with technology, we’re always better off if we don’t drive this as a fad or a top-down directive. We really don’t have to slam people through the stages of death and dying. The folks at CCSSE, ATD, and ETS note that we’re much better off bringing faculty and staff to the table and inviting them to help shape the process.
For example, Stephen Mittelstet at Richland College in Dallas, Texas—which recently became the first community college in the country to receive the Malcolm Baldrige National Quality Award—says that his college focused first on building a culture that valued learning together. He cautions colleagues that when beginning work on analytics, outcomes, and data, you have to resist the natural tendency toward a culture of blame (e.g., “Who is failing these students?”) and instead promote a culture of wonder (e.g., “I wonder what we can do together to turn developmental math around?”). Talented faculty members are trained to wonder. Give them good data, invite them to the table, and be willing to start the conversation. When we can substantially improve student learning in the process, it becomes a compelling conversation indeed!
Learning together is never easy. But it’s at the heart of quality work in the academy. To do our part in the process, NISOD will host an analytics summit at our annual conference in May 2007. We set this kind of stage because we know that talented teachers strongly believe in the CASE method: Copy And Steal Everything. Folks who care deeply about reaching and teaching students have always enjoyed learning together, always enjoyed tackling tough problems with talented people. Now, with better information at our fingertips, we’ll have more tools in the toolkit than ever before.
This time, let’s be clear: we’re not “dying to adopt” a new trend; we’re dying to learn together, so we can help our students learn for a lifetime.