Below is my conversation with Ken Koedinger, Director of the Pittsburgh Science of Learning Center (PSLC) and Professor at the Human Computer Interaction Institute at Carnegie Mellon University. This is part two of my report on the PSLC conference. Part one is here.
What is the purpose of the PSLC?
To leverage technologies and use the scientific approach to understand what makes student learning robust, so that we get long-term performance outcomes that matter. We want to find the small things that are really effective and that make a difference: in interactive instruction, in materials, in multimedia. By leverage I mean that we already have the technology infrastructure to do research on instruction and the science of learning; we just need to use it better.
Just like Google is doing experiments to constantly improve their user experience, we can and should be doing the same in education; we can make slightly different assignments, collect data on the educational outcomes, and select the best practices and techniques.
Sometimes there is a chasm between science and practice. There are findings in cognitive psychology that are not being connected to educational practice; sometimes we know more in the scientific world than is being applied. Other times the problems of education are not being addressed by cognitive science. What we know in our theories of learning comes both from academics and practice, and also from what it takes to get machines to learn. We want to apply these different sources of information, and use technology both to teach and to measure the results.
How did you start?
On the one hand, we had a growing feeling that we had been successful in creating a cognitive tutor for math courses (that became the company Carnegie Learning). But there was so much else that needed to be done. We needed to be better at communicating what we were doing from a scientific and educational standpoint.
On the other hand, there was funding available from the NSF. I was one of those people who said we ought to go for that; there was a cluster of folks from psychology, computer science, human-computer interface, and language technologies, at Carnegie Mellon University and the University of Pittsburgh, and we thought we could put together a competitive proposal around robust learning which would take advantage of the technologies we’d been developing to do experiments, acquire data, and perform instruction.
Our model was a combination of a research hospital and the Hubble telescope. A research hospital treats patients, and also measures and reports on what works best. Those are twin parts of its mission. Our LearnLab does the same for education.
So much of astronomy was driven by the technology of the Hubble telescope, which allowed us to see areas of the universe we could never see before. Technology allows us to get much richer data about instruction and results; today we are much better able to measure and analyze, to know what is working.
Somehow we all think that because we each have a mind that learns, we know how we learn, but that's not enough. There are fundamental scientific questions about the way the mind works and how people learn, and the only way to answer them is to get more data and analyze it.
What was the purpose of the PSLC conference that you held today (February 18, 2009)?
We want to reach out to industry and make a connection to what is going on there, to push it forward and be proactive instead of relying on trickle-down research. Another reason is that the NSF is saying that we need a way to make the center self-sustaining, beyond the next 5–10 years of funding that they will provide. Engineering centers at universities have done that through corporate affiliate programs, and we thought this conference might be a way to start up a similar program for the PSLC.
The attendees came to see an atmosphere of scientifically based research in education. They face pressure to improve, and the science and technology we employ might make sense for them. Some of our tools and course offerings might be attractive. They could learn from the science we use to determine principles of effective instruction:
Many current texts and online tutors use examples, but PSLC research is showing the ratio of examples to problems should be much higher than it is now. More generally, we are discovering methods that help students understand the deeper underlying concepts and ignore the irrelevant details in the areas they have to learn.
Here is a great example of deep underlying concepts versus irrelevant details. If you put together a bunch of physics problems and ask novice students to sort them by group, they’ll organize the problems based on the way they look: all the pulley problems, all the inclined plane problems, etc. If you ask experts, they will group the problems based on the physics principles: conservation of energy problems, conservation of momentum problems, etc.
There are various techniques or principles of learning that help students get that robust understanding and learning, and we are presenting many today. These principles are also documented in our wiki.
What are your take-aways from the conference?
I was very pleased with the engagement of the participants, both the number and the level of involvement; people traveled here from all over the country.
There was a lot of learning and exchange of information that was productive from both sides. I know that folks from the companies are going back with new ideas about what they should do with their companies. They are going to copy some of these things, just as it should be. We don’t expect to make money from all of our ideas, and we are very happy if things get copied.
The researchers here also learned a lot from these interactions, sometimes about products that already exist and are quite powerful, sometimes better than what we thought was our own great work. We also learned about the needs of the industry folks, which is causing us to rethink our tools and whether they can become products.
We hope that we got people to start thinking that they’ve got to start getting data, or start analyzing the data they have, to see how people are learning and what is effective.
I know a lot of the industrial people are thinking very hard about how they can take advantage of some of these tools or research.
We had a lot of students here at various stages, and there has already been a summer internship arranged. Other students are talking about positions as well.
What do you feel are some of the PSLC’s biggest accomplishments?
Most important is the creation of the DataShop, which is an online repository of learning interactions and a set of analysis and reporting tools. This is a great opportunity to push science forward, and it continues to grow.
Having the DataShop allows us not just to run a study, but also to look at the students in the study to see what happens in the weeks after. We did one study using an online help-seeking tutor, an automated tutor that prompted students to seek out help when they needed it. Weeks later, we could see that students' help-seeking skills had increased. We've also had at least 40 papers on secondary analyses of the data in DataShop, apart from the original primary research.
Another accomplishment is the idea of in vivo education experiments: we can use courses to test instructional treatments, just as we test medical treatments in research hospitals. We've done 170 different experiments. We've learned a lot of lessons even if we don't have the full answer yet. We have to change the education viewpoint, to make research part of what schools do, to make it part of their charter. We do better with that at the college level, because there is some commitment to research. But it's hard to get, say, a chemistry researcher to admit that you can do research on chemistry instruction.
If there were some discretionary budget in K12 institutions, it might help make them appreciate that doing research, improving instruction, is part of their reason for being. Possibly the for-profits will lead by showing how better techniques decrease their costs or improve their product.
We’ve had many great insights. One quick one: in physics, we contrasted whether a video should have the professor explain the steps of solving a problem, simply demonstrate the steps without explanation, or prompt the student to explain to themselves what is going on and make connections to their real-world knowledge. The first surprising finding was that when the professor explains the steps, it does no good at all. Students who received explanations did no better on later robust-learning tests than students who simply saw the professor demonstrate the steps in the solution without explanation.
But prompting students to provide their own explanations of the steps has a profound effect. That was the second surprise. Students who were prompted to self-explain not only learned better in the electricity unit where the prompts were given; those students also got better at learning physics. Even though the prompts were no longer present in the next unit on magnetism, students who had been prompted to “self-explain” in the prior unit learned magnetism more quickly and more effectively.
Another insight: you can’t just give one example and then have students solve problems. You need to provide one worked example per critical decision point, per skill, or per concept. When we give students problems that require multiple skills, students often flounder; they don’t have enough information to learn to solve these complex problems. Students need numerous repeated examples of worked-out problem solutions. The ideal relationship between worked examples and problems to solve is one to one. This is completely at odds with what we do in homework now in math and science: ninety percent of what is on a homework assignment is problems to solve. Half of the activity should be to study solutions that illustrate each skill, and possibly ask students to explain them. This result is relevant to industry training, too, and applies to paper-based homework as much as it does to computer-based homework.
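To make the one-to-one recommendation concrete, here is a minimal illustrative sketch of assembling a homework set that pairs each practice problem with a worked example of the same skill. The function, item structure, and skill names are my own invention for illustration, not PSLC software.

```python
def build_assignment(skills):
    """For each skill, pair one worked example with one practice problem,
    following the one-to-one example-to-problem ratio described above."""
    assignment = []
    for skill in skills:
        assignment.append(("worked_example", skill))  # study this solution first
        assignment.append(("problem", skill))         # then solve a similar one
    return assignment

# Example: a two-skill physics homework set
hw = build_assignment(["conservation of energy", "conservation of momentum"])
# Half of the four items are worked examples, half are problems to solve.
```

The point of the sketch is simply that the example-to-problem ratio is a design parameter of an assignment, one that current homework sets tend to leave at roughly one in ten rather than one in two.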
We’ve been able to detect things in computer tutor data streams that you wouldn’t think you can detect. If there is a time gap in what a student was doing, we can tell if the student was on-task or off-task; we can predict if a gap was productive or not. It turns out that there are patterns. If there is evidence of struggle like lots of errors, a pause, and then a quick resolution, the student was very likely to be talking to a teacher or fellow student about the content. If there is a pause, and then the same normal pattern of behavior, the student was not on-task – perhaps talking about weekend plans. So now, if we want to assess student effort, we can count just the gaps that were off task. We found that off-task time gaps are correlated with slower learning, but the on-task time gaps are not.
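As a purely illustrative sketch (not the PSLC's actual detector), the heuristic described above might look like the following, where the data structure, field names, and struggle threshold are all assumptions of mine:

```python
from dataclasses import dataclass

@dataclass
class Gap:
    """A pause in a student's tutor interaction log (hypothetical schema)."""
    errors_before: int      # errors observed in the window before the pause
    resolved_quickly: bool  # was the step solved right after the pause?

def classify_gap(gap: Gap, struggle_threshold: int = 3) -> str:
    """Heuristic from the interview: struggle (many errors) followed by a
    pause and a quick resolution suggests on-task help-seeking, e.g. talking
    to a teacher; a pause with no surrounding struggle suggests off-task time."""
    if gap.errors_before >= struggle_threshold and gap.resolved_quickly:
        return "on-task"
    return "off-task"

# To assess effort, count only the off-task gaps, since those are the ones
# the interview reports as correlated with slower learning.
gaps = [Gap(5, True), Gap(0, False), Gap(4, True)]
off_task_count = sum(1 for g in gaps if classify_gap(g) == "off-task")
```

The real analyses surely use richer features and fitted models; the sketch only shows how a pattern over timing and errors can be turned into an on-task/off-task label per gap.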
There is some chance we can determine emotional state (confusion, boredom, or frustration) based on the timing and quality of student interactions. This data is just starting to come out. If I know that a student should know the answer, but they are making certain types of error responses, perhaps they are gaming the system. If we can detect the student’s disposition, we can intervene to make the learning time more productive.
What if someone wanted to learn more or get involved with the PSLC?
We have the LearnLab site, which has links to the Principles of Learning wiki and the DataShop. Companies can contact us and become sponsors, gaining access to our research, tools, and researchers. We intend to repeat this conference next year. And during the summer, we conduct one-week intensive courses that incorporate our tools and findings.
Probably the best person to contact is Michael Bett, the PSLC Managing Director at mbett@cs.cmu.edu.
Additional Announcement from Farimah and me
Join us at the SIIA Ed Tech Government Forum, March 17-18, Washington, DC, an event that will translate the Stimulus and other federal funding into actionable market intelligence.
This annual Forum offers an excellent opportunity for you to learn from government, education, and policy leaders how public policies, programs and legislation will affect the education business, schools, and districts. Confirmed speakers include Ilene Berman, NGA; Alice Cain, House Ed & Labor Committee; Mike Cohen, Achieve; Dan Domenech, AASA; Chris Minnich, CCSSO; Andy Rotherham, Ed Sector; Gene Sanders, Cleveland Schools CEO; and the most knowledgeable person on US government Education policies we know, the SIIA's Mark Schneiderman.
You can obtain a $100 discount if you enter “PRMFF9” when you register online here: http://siia.net/etgf/2009/register.asp