In keeping with the theme of using technology to improve access to education, I thought I'd share some thoughts about some reading I've done recently on why ICTs haven't caught on more, and how that fits into general theories about how people learn - or in some cases, how we don't learn.
One of my favorite blogs out there right now is Creating Passionate Users. Kathy Sierra writes about how designers and marketers of software and other technological products can do more to help people use their products successfully. However, there is plenty in her site that can be transferred to education, whether or not you're using the latest information technology. A post that Kathy wrote some time back that turned me into a regular reader is titled Most classroom learning sucks. That doesn't sound like something a teacher will want to read, but here's how it starts:
The best learning occurs in a stimulating, active, challenging, interesting, engaging environment. It's how the brain works. The best learning occurs when you move at least some part of your body. The best learning occurs when you're actively involved in co-constructing knowledge in your own head, not passively reading or listening... Forcing people to sit in a chair and listen to (or read) dry, formal words (with perhaps only a few token images thrown in) is the slowest, least effective, and most painful path to learning.
So why do we all do it that way? I recently found an article by Grandon Gill, 5 (Really) Hard things about using the Internet in Higher Education. Grandon discusses some general obstacles to adopting new technologies in education, but in my opinion none of the excuses he gives (lack of models, having to keep up with changes, not being understood by others) is exclusive to ICTs in education; they're the same old excuses used by anyone who resists innovation. By the end of the article you realize that Grandon is actually in favor of being a pioneer, but I think he paints a picture of innovators and early adopters working in isolation from each other and fighting alone against the current, when that really isn't the case.
Usually with innovations (and not just the ICT kind), you get a few enthusiastic early users, who might enjoy using the innovation for all kinds of purposes not intended by its creator. These early users tend not to be good examples for the ones who follow, those who try to implement those innovations either out of coercion or a misunderstanding of the innovation's purpose. As I've hinted before on this site, innovations are best adopted when they help us do something that already needed to be done.
In Grandon's case, he teaches in an MIS department, and found that the new ICT tools available for teaching create opportunities for different ways of learning. Student-to-student interaction becomes easier and more creative, and the role of the teacher changes into that of a facilitator. Of course, if a teacher isn't ready to become a facilitator and co-learner, technology integration in the classroom is going to be a rough ride.
Lastly, David Pollard wrote on why collaboration tools are so underused. As with Grandon Gill, several of the reasons he gives have more to do with how people think communication, group work, and learning are supposed to happen, and how they then have trouble with tools that create opportunities for new paradigms. I'm very enthusiastic about the potential of wikis, podcasts, and other cool web 2.0 tools for helping people learn and create knowledge. But if people don't know how to work well together in the same room, then the best wikis, intranets, and telecom systems in the world won't help.
So why don't we learn more, or better? I believe we too often try to learn the wrong thing: a tool that doesn't fit the job we're trying to do, a skill that impresses but doesn't satisfy, more information rather than wisdom. It's always easier to figure out how to get somewhere if you first know where you're going.