Friday, November 24, 2006
A special thanks to Gerald Napoles, a doctoral student in the College of Education at the University of Texas at Austin, who sent us a link to an interesting story about college presidents: College Presidents’ Blogs Opens Door to Controversy: Some Get More than They Bargained For.
What’s striking about the article is how it portrays the blog as a new phenomenon in making information more public, allowing anonymous attacks, and stirring up controversy. I hate to break it to the author, but these are the same challenges that group e-mail, listservs, and bulletin boards—or just basic web sites—have posed for more than a decade. Indeed, folks have been using web tools to attack administrators and faculty for years. Whether it’s “Corruption at LaGuardia Community College” (an attack that has been going on so long it’s almost comical) or http://www.myprofessorsucks.com/, the online world is not shy about ganging up on folks. The folks at the Cluetrain Manifesto have long posited that this postmodern pattern will be a pernicious day-to-day affair for organizations trying to serve any clientele.
This always-on, often-attacked phenomenon is one of the major arguments for extreme authenticity: in a connected and transparent world, you’re not going to hide anything for long. What we’re getting more used to, however, is what folks in small towns have known for a long time. In tight communication circles, some stories are true and others are just really interesting or really inappropriate.
I am concerned that the presidents in this story aren’t more careful about moderating their blogs. While moderation slows down the conversation a bit, it’s the only fair thing to do for your online participants. There are plenty of other open-air communication vehicles for folks to vent; why would you allow your blog to be hijacked by the hyperbolic? Good online teachers have known this for years. It’s one of the reasons many choose moderated threaded discussions over chats for online class discussions.
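For what it’s worth, the hold-then-review pattern is simple enough to sketch. Below is a minimal, hypothetical illustration in Python (not any real blog platform’s API) of why moderation slows the conversation: nothing a participant submits goes live until a moderator approves it.

    # Minimal sketch of a moderated comment queue. Hypothetical design,
    # not any particular blog platform's API.
    from collections import deque

    class ModeratedBlog:
        def __init__(self):
            self.pending = deque()   # comments waiting on a moderator
            self.published = []      # comments visible to readers

        def submit(self, author, text):
            # Nothing goes live immediately: the delay is the point.
            self.pending.append((author, text))

        def review(self, approve):
            # A moderator works through the queue, publishing or discarding.
            while self.pending:
                author, text = self.pending.popleft()
                if approve(author, text):
                    self.published.append((author, text))

    blog = ModeratedBlog()
    blog.submit("anon", "THE PRESIDENT IS A CROOK!!!")
    blog.submit("student", "Could you clarify the new parking policy?")
    blog.review(lambda author, text: not text.isupper())  # screen out shouting
    print(blog.published)  # only the civil question appears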
In short, let’s not flog the blog. Let’s instead learn how to leverage it more effectively as an ongoing communication vehicle for communities committed to learning together.
Tuesday, November 14, 2006
Are Students More Engaged Online or In-Class?
The National Survey of Student Engagement (NSSE) just released its 2006 Annual Report. This survey of 260,000 randomly selected students from 523 colleges and universities had some interesting findings, not the least of which was that online students report the same or higher overall engagement scores when compared to in-class students. The data actually make sense if you think about the high numbers of students in large-lecture classrooms in the US who at best feign engagement throughout the semester en route to taking two multiple-choice tests (midterm and final exams) that measure their “learning.” However, online students did report lower active and collaborative learning scores than their in-class counterparts.
Given that most students will experience a blend of teaching and learning methods—online and in class—we need to explore these data carefully to see what works best in which context to achieve specific learning objectives. However, NSSE is being criticized for the private nature of much of its data. Unlike its sister survey, CCSSE, which demands public reporting from all participating institutions, many NSSE institutions are able to keep their data from students. I guess some institutions don’t want to engage their students or us in conversations about engagement!
Saturday, November 04, 2006
Dying to Learn Together
In the run-up to the dotcom boom and bust, Cindy Miles and I published an article called “Are You Dying to Use Technology?” In the piece we made the case that you could track many faculty members’ adoption of technology along Elisabeth Kübler-Ross’s stages of death and dying. Denial, anger, bargaining, depression, and acceptance were everywhere as e-mail, PowerPoint, and the Web drove toward the mainstream.
Casey Green’s annual survey of higher education’s adoption of technology—the Campus Computing Survey—makes it pretty clear that these basic technologies have arrived. Indeed, over the last fifteen years, we have moved technology and technology leaders from the basement to the boardroom (literally, in some cases). Most campuses now employ a chief technology officer and target an increasing share of their infrastructure budgets toward technology.
This technology mainstreaming notwithstanding, conversations about technology are changing. Folks today are not wondering whether their colleges will leverage technology, but whether they are getting a true return on investment: is student learning really being improved by all these bits and bytes? And it’s not just about technology use. There are national, state, and local dialogues about all of our administrative and academic strategies and their effectiveness.
Enter a whole new phase of “dying to use” adoption. This time, it’s about research and analytics, the use of data to inform decisions about reaching and teaching students, or evidence-based education. The beginnings of this movement can be traced back to the days of total quality management and then the learning revolution. Terry O’Banion’s A Learning College for the 21st Century is the seminal work in the latter movement. In this book, and in speeches nationally and internationally, he argued that colleges of the future would need to consistently and doggedly answer two key questions: (1) Do our policies, procedures, and practices improve and expand learning? and, more important, (2) How do we know?
As I’ve noted here before, groups like the Community College Survey of Student Engagement have begun to collect data directly from students, and use national benchmarks, to try to answer these questions. Projects like the Lumina Foundation’s Achieving the Dream are pulling colleges together to develop common data definitions (e.g., what counts as a full-time student, a part-time student, or course success) so they can start using common data across multiple institutions to really get a handle on what works in driving access and success in community colleges. The Educational Testing Service (ETS) recently held a summit in Charlotte called Building a Culture of Evidence from the Ground Up that explored these programs and others in community colleges. The presenters noted that it is hard but incredibly substantial and rewarding work. It really is making a difference.
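To make the “common data definitions” idea concrete, here is a small sketch in Python. The grade scale and the C-or-better cutoff are my own illustrative assumptions, not definitions taken from Achieving the Dream; the point is simply that once colleges agree on one rule for “course success,” their numbers become comparable across institutions.

    # Hypothetical sketch of a shared "course success" definition.
    # The C-or-better cutoff and the grade records are illustrative
    # assumptions, not Achieving the Dream's actual definitions.
    SUCCESS_GRADES = {"A", "B", "C"}

    def course_success_rate(grades):
        """Share of enrollments earning a success grade.

        Withdrawals ("W") and failures count against success, so any
        college applying the same rule produces a comparable figure.
        """
        if not grades:
            return 0.0
        return sum(g in SUCCESS_GRADES for g in grades) / len(grades)

    # The same definition applied at two hypothetical colleges:
    print(f"College A: {course_success_rate(list('AABBCCDFW')):.0%}")  # 67%
    print(f"College B: {course_success_rate(list('ABCDFFWW')):.0%}")   # 38%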
CCSSE, Achieving the Dream, and the ETS summits are happening in the context of accrediting agencies demanding plans from institutions to define and measure learning outcomes as part of their accreditation reviews. Moreover, the federal government is beginning to poke—and poke hard—at our use of data with reports like the one recently published by the Spellings Commission on the Future of Higher Education. Not surprisingly, we are beginning to see institutional research tools and personnel move from the shadows to center stage.
Because of the push from programs and the powers that be, folks are once again feeling like they are “dying to adopt” something new—this time it is analytics and data use. All stages can be seen, sometimes in different people, sometimes in the same person over time: (1) Denial: “These data cannot be right!”; (2) Anger: “Don’t these **#$@ administrators have anything better to do?!”; (3) Bargaining: “We can measure passing, but we’ll never really measure learning”; (4) Depression: “I think it may be time to retire”; and (5) Acceptance: “I wonder what these data really mean?”
However, as was the case with technology, we’re always better off if we don’t drive this as a fad or as a top-down directive. We really don’t have to slam people through stages of death and dying. The folks at CCSSE, ATD, and ETS note that we’re much better off inviting faculty and staff to the table and asking them to help shape the process.
For example, Steve Mittelstet at Richland College in Dallas, Texas—a college that recently became the first community college in the country to receive the Malcolm Baldrige National Quality Award—states that his college focused first on building a culture that valued learning together. He cautions colleagues that when beginning to work on analytics, outcomes, and data, you have to stop the natural tendency to adopt a culture of blame (e.g., “Who is failing these students?!”) and promote a culture of wonder (e.g., “I wonder what we can do together to turn developmental math around?”). Talented faculty members are trained to wonder. Give them good data, invite them to the table, and be willing to start the conversation. When we can substantially improve student learning in the process, this can be a compelling conversation indeed!
Learning together is never easy. But it’s at the heart of quality work in the academy. To do our part in the process, NISOD will be hosting an analytics summit at our annual conference in May 2007. We set this kind of stage because we know that talented teachers strongly believe in the CASE method—Copy And Steal Everything. Folks who care deeply about reaching and teaching students have always enjoyed learning together, always enjoyed tackling tough problems with talented people. Now, with better information at our fingertips, we’ll have more tools in the toolkit than ever before.
This time, let’s make it clear: we’re not “dying to adopt” a new trend; we are dying to learn together, so we can help our students learn for a lifetime.