Thursday, December 19, 2013

What We Know About How Children Learn Math - And How It Can Help Us Close the Achievement Gap

(This week's post is the first half of a two-part article I wrote for Footnote1.com.)

When it comes to math, American students lag behind their counterparts in many European and Asian countries, as do American adults. Our nation’s fourth graders are outperformed in math by students from Singapore, Korea, Japan, Northern Ireland, and Hong Kong, while the U.S. ranks 19th in adult math skills among advanced democracies. These problems exist despite the fact that we spend $1.3 trillion a year – nearly 9% of the American GDP – on education. Why is such a well-funded system failing its students?

Read the full article...

Thursday, September 12, 2013

What's Holding Education Back? (Hint: Check Your Assumptions about Learning & Teaching)


The World Has Moved On, But Education Hasn’t 

The world has changed dramatically during the past century. Most domains have changed right along with it, thanks in part to major advances in science and technology. Architecture, for example, has evolved with the invention of new kinds of materials, powerful computer-based design tools, and more sophisticated models of environmental impact.

Modern medicine, with innovations like brain scanners, artificial hearts, and gene therapy, would be virtually unrecognizable to a physician from the 1900s. We could tell similar stories about the transformation of engineering, agriculture, and even business administration. Take a practitioner from the 1900s in any of these fields, bring them into the present day, and they would be thoroughly bewildered and unable to do the job.

But what about education? If we transported a teacher from the 1900s and put her in front of a classroom today, she would be able to take over the class without much trouble. There have been some changes, of course. Whiteboards and projectors have replaced blackboards, for example, and students now use laptops and iPads instead of slates. In World History, there’s over a century’s worth of new material to cover. But changes like these are largely cosmetic. Superficial differences aside, the way we educate today is fundamentally the same as it was one hundred and fifty years ago.

What’s Holding Education Back? 

Why has education stayed basically the same for so long while other domains like architecture and medicine have been completely transformed?

One possible explanation is that there’s simply no room for improvement – that education is already as good as it gets. We’ve explored this in previous posts, though, and based on the evidence I’d say this can’t possibly be true (for example, read this, this, and this).

A second possibility is that we don’t know enough yet – that we are still waiting for the big breakthroughs in science and technology that will enable education to advance the way architecture and medicine have. But the fact is that we already know far more about effective learning and teaching than we actually apply in mainstream educational practice. The root problem does not seem to be a lack of good ideas or proven methods.

I’d like to suggest a third possibility. What if education is being held back by a number of common assumptions about learning and teaching that seem completely obvious to most people but are nonetheless utterly wrong? What if these assumptions are so obvious and so deep-seated that many people aren’t even aware they are assumptions, and what if education can’t move forward until we surface these assumptions, examine them critically, and get people to revise them?

If this sounds far-fetched, consider these cases from the history of science:

  • Geography & Navigation: People assumed the earth was flat because if you look around it’s obvious to anyone that it is flat. Because of this assumption, sailors wouldn’t sail out of sight of land for fear of falling off the edge. Challenging this assumption freed people up to sail anywhere – it opened up the whole world to humanity.
  • Astronomy: People assumed the earth is stationary and sits at the center of the universe. Just look up in the sky – the earth is obviously holding still and everything else is circling around it (otherwise we’d all feel pretty dizzy, right?). Even after this assumption was challenged and evidence collected to demonstrate how wrong it was, it took a few centuries to bring everyone around. Changing it opened up the heavens to humanity. Space exploration and communications satellites are just two technologies that would not be possible today under the original (obvious but erroneous) assumption.
  • Biology & Medicine: Quite recently – at least as late as the 19th century - people generally assumed that a disease epidemic like cholera or the Black Death could be caused by a miasma – a cloud of toxic air released by rotting material. After all, if there is a bad smell in the air where a lot of people are getting sick, the most obvious explanation is that the air causing the bad smell must also be causing the bad illness. Once again: obvious, but wrong. Public health has improved greatly since people stopped trying to avoid miasma and started trying to avoid physical contact with people who are carrying disease-causing viruses and bacteria.

The list goes on and on…

So - what about education today? Is it possible that humanity is at this very moment living with some assumptions about learning and teaching that are so obvious and so deep-seated that they are not even recognized as assumptions but taken as incontrovertible facts?

I believe we are.

And not just one such assumption – loads and loads of them. And I propose that these obvious, virtually universal, and yet entirely misleading assumptions are a major reason education has stalled while nearly every other major domain of human endeavor has raced ahead. The same way that the flat earth assumption left most of the world unexplored, these assumptions lead us to educate students in ways that leave most of the subject matter unlearned.

These are bold claims. Let me provide a specific example.

It’s obvious to most people that engagement drives learning. It’s a very widespread assumption. In fact, it’s what leads people to take boring materials like math drills or chemistry flashcards and routinely attempt to inject “fun” into them by adding unrelated cartoons, competitions, sticker prizes, and the like.

But what if that obvious and deep-seated assumption is wrong? What if the learning actually drives the feeling of engagement instead of the other way around? Moreover, what if trying to artificially inject fun into the mix only gives the illusion of successful education – while actually degrading the quality of learning? There are reasons to believe that this is, in fact, the case.

Do Our Assumptions Really Make a Difference? 

You might well ask, "Does it really matter which is true - whether learning drives engagement or engagement drives learning?"

Yes, it matters a lot. To see why, let’s pose a similar question about one of our historical examples: “Does it really matter whether we assume the earth is flat or round?”

Consider:

If the earth is flat, then we should stay close to shore.
If the earth is a sphere, then we can sail anywhere.

Similarly:

If engagement drives learning, then we should be able to produce high-quality learning even from low-quality material simply by over-compensating with fun.

On the other hand…

If learning drives engagement, then we actually have to start with high-quality learning experiences if we expect to produce high-quality learning outcomes. Instead of “injecting” fun to make the learning happen, we’ll know the learning is happening when we see students engaging deeply with the subject matter itself. In this view, “fun” (or engagement) is not something one puts into the teaching so much as something one expects to see coming out of the learning.

The two different assumptions lead to two contradictory conclusions about how to educate effectively. Assumptions are important because they determine the strategies we use to pursue our goals, and some strategies work much better than others.

As the historical examples cited above illustrate, one way to change the world is to change widespread assumptions that seem obvious to everyone but in fact are simply wrong. It may be that easy – and that difficult – to start bringing education into the twenty-first century.

What do you think?

Wednesday, August 28, 2013

Teaching Math with Minecraft (An Impromptu Education Experiment)

In a previous post, we explored the following question:

How can less studying produce more learning?

We considered examples from chemistry, arithmetic, and foreign language studies where small differences in approach could produce large differences in learning outcomes - on the order of twice the learning in half the time (or better).

This week, I wanted to push this thinking one step further to explore the related question:

How difficult is it to design experiences that reliably produce more learning with less studying?

To do this, I set up an impromptu experiment. I chose some core math concepts as the subject matter and Minecraft as the learning medium. I spent about an hour preparing. I invited a second-grader known as "Swifty7777" (his Minecraft handle) to join me for a conversation, which I recorded. The full conversation (excerpted below) lasted more than half an hour and covered a lot of ground - from the definition of "number" to the commutative property of addition to the relationship between addition and multiplication. We explored some topics, like square roots, that were not part of my original plan but that emerged during the conversation. We both had fun and the time flew by.

The video below is a shorter excerpt of our conversation in which we explore five questions:
* What is a number?
* What is an even number?
* What is an odd number?
* What is a prime number?
* What is a square root?



What do you think of this experiment?
What do you think we can learn from it?

If you liked this "learning with Minecraft" video, check out these others:
Minecraft Math #2: Understanding Addition, Multiplication & Commutativity
Minecraft Scientists Ep. #1: Fishin' In The Rain

Monday, August 19, 2013

Children and Technology: How Should We Manage Kids' Screen Time?

(Originally posted at nativebrain.com)
Is technology good or bad for my child?
This question is on a lot of people’s minds.  If you’ve ever seen a child with a touchscreen computer like a smartphone or an iPad, it’s easy to understand why.  The devices seem to enchant kids like few things that have come before – reliably absorbing them for a surprisingly long time.  And good luck taking one away!
Many parents experience conflicting feelings about their kids’ powerful attraction to touchscreen computers.  On one hand, it can be challenging (and exhausting) trying to keep a child content all day long – especially during long car trips or waits at restaurants and the doctor’s office.  Having a reliable “high tech pacifier” sometimes comes in very handy.  And the fact that kids can engage with interactive apps instead of just passively viewing videos means that they might even benefit somehow, by learning problem solving skills through games, for example, or expressing themselves through digital finger painting.
On the other hand, many parents worry about the opportunity costs of “screen time” – that is, time when kids aren’t exercising their bodies, interacting with other people, or experiencing the “real” world.  Others fear that the devices may in fact be too engaging – that once a child has visited the world of Angry Birds and Fruit Ninja they might never want to come back…
The apparent paradox of digital technology
Parents are all over the map on how to manage their kids’ access to touchscreen computers.  At one extreme, some kids have unrestricted, unmonitored use of their own personal devices and spend tens of hours each week with them.  At the opposite extreme, some families try to keep their kids completely “screen-free” for as many years as possible.
Parents frequently ask some version of the following question:
How should I manage my child’s time in the digital world so it doesn’t interfere with their understanding and appreciation of the “real” world?
In this post, I want to explore how this question sheds light on the conflict many parents experience concerning their children’s use of technology, and how we might reframe the issue in a way that can help us move beyond that conflict.
I find it interesting that at the heart of this question is a kind of paradox, in that the “digital” world is at the same time seen as somehow less real yet more compelling than the offline (or “real”) world.
Think for a moment: what else in our lives is both less real and also more compelling than the alternatives?  Junk food and Ponzi schemes come to mind.  Junk food is less nutritious than whole food, but when given a choice, people – especially kids – often find the junk food more appealing.  Ponzi schemes are financially disastrous compared to legitimate financial investments and yet many people are lured by their false promise of quick riches.  If these are the kinds of associations that come to mind for people when they think about children and touchscreens, then it’s no wonder they experience ambivalence and uneasiness regarding children’s use of the technology!
If we stop for a moment and reflect, though, we realize that such comparisons can’t possibly be appropriate.  Touchscreen computers are simply a means for distributing content, like dinner plates or printer paper.  Dinner plates can deliver either junk food or whole food.  Printer paper can deliver a contract for shares in a Ponzi scheme or a U.S. government bond.  Similarly, touchscreen computers can deliver effective, developmentally appropriate learning experiences or “chewing gum for the eyes.”  In all three cases, to label the plate, the paper, or the touchscreen computer as “good” or “bad” in absolute terms is to confuse the delivery medium with the contents delivered.
In short: it stands to reason that touchscreen computers are not inherently good or bad for children, any more than dinner plates or printer paper are inherently good or bad for them.   It doesn’t, for example, make sense to compare the devices directly to junk food or to whole food; they can be used to serve up the digital equivalent of either type.  It all depends on how we choose to use them.
So what’s a parent to do?
While this shift in perspective does not provide hard guidelines for how to manage kids’ access to digital technology, it can help shift the questions that are generating conflict in parents’ minds.  In particular, the either-or question “Is digital technology good or bad for my child?” causes ongoing stress for parents because there appear to be big consequences for getting the answer wrong – but the question stated that way doesn’t actually have an answer.  The result is that parents constantly agonize over whether they are doing the right thing for their child, with no relief in sight.
A variation of that question asks, “How much screen time is OK for my child?”  This question certainly makes it easier to provide specific guidelines – various organizations have come out with clear recommendations such as “no screen time through age two,” or “limiting screen time to one hour per day is OK,” etc.  But this is like asking “how long should my child spend at the table with a dinner plate in front of her during the day?”  Setting an arbitrary time limit doesn’t make a whole lot of sense. The important question is: What are they consuming from that dinner plate and how much of it?
Similarly, a more useful question regarding technology for parents is: “What are your kids doing on the touchscreen computer, and how much of each type of activity is appropriate?”  If all the child wants to do is watch nonsensical cartoons on an iPad, then a parent might reasonably decide to limit the daily amount of time spent on that activity. But consider the case of a three-year-old boy I know who became so completely engrossed in learning all the countries of the world, their capitals, and where to place them on a map that he rapidly mastered them all.  Ask yourself: would you allow the child to spend hours – even an entire day – studying geography using a paper atlas or a globe?   Now ask yourself: do you have a principled rationale for arbitrarily limiting his time engaged in the same activity on an iPad?  If you do, then well and good.  If you don’t, then imagining how you would manage the activity off the device can be a good guide for deciding how to manage it on the device.
Summary and Take-Aways
Parents want the best for their kids, and they experience unpleasant stress when they don’t know what course of action is best.  Here’s a quick summary on this issue with regard to children’s use of touchscreen computers:
  • Avoid the question, “Is technology good or bad for my child?”  It’s a trap with no way out.
  • Move beyond the question, “How much screen time is OK for my child?”  It’s like asking how long your child should sit at the dinner table – not very meaningful.
  • Let this question be your guide: “How much do I value what my child is doing and learning from a particular experience (whether they are doing it online or offline)?”  Each parent is in the best position to answer that question for their child and to make a judgment about how much time they think is appropriate based on their values.  It may not be quite as easy as setting arbitrary time limits for your child based on third-party recommendations, but at the end of the day it should leave you feeling more empowered to make good decisions on your child’s behalf and less stressed about whether you are doing the “right thing” for them.

Wednesday, August 7, 2013

Why a touchscreen (tablet or iPad) is better than a mouse and keyboard for young learners

(This article was originally posted at http://www.nativebrain.com/blog.)
The Idea, In Brief
As schools acquire significant numbers of tablets (Androids and iPads), administrators face questions regarding how these devices relate to the other learning technologies they already own.  In particular, should tablets be thought of as a less expensive and more portable version of desktop and laptop computers?  Or should they be thought of as a new type of educational technology altogether?
In this article, I argue that although today’s tablet computers are probably incremental innovations when it comes to adult productivity (“getting work done”), they are better thought of as major innovations when they are used as learning technologies, especially for children.  I highlight two issues in particular:
  1. Touchscreen computers enable many children as young as one year old (and in some cases even younger) to engage in independent learning activities on the computer for the first time, and
  2. Although there is not yet much empirical data on the relative benefits of touchscreen computers compared to desktop and laptop models, learning theory suggests we can expect faster learning and greater knowledge transfer from learning on a touchscreen computer compared to one equipped with a keyboard and mouse.
A Brief History of Educational Technology
Major technological innovations have arrived in waves over the past century, each bringing with it new opportunities and challenges for educators.
The Broadcast Epoch
Early technologies included radio and television.  These broadcast media opened up new possibilities for educators, but were highly inflexible and therefore difficult to incorporate into lessons.  The invention of recording and playback technologies – especially videocassettes and laser disks – gave teachers control over the time and place of presentation so that they could incorporate them systematically into their teaching plans.  Despite some clever and heroic efforts to make these technologies interactive, however, their capacity for interaction and adaptation was very limited, and consequently they were still used mostly to “push” the same set of canned programs to all learners.
The Interactive Epoch
The emergence of affordable personal computers changed the game.  First, the desktop computer arrived in the late 1970s, followed a few years later by smaller and more portable laptop models.  Computers were different from broadcast technologies in many ways, perhaps most notably – from an educational standpoint – in their ability to respond to user input.  That is, unlike broadcast television and audio programs, computer programs could change their behavior in response to different user actions, opening the door to more interactive, individualized, and dynamic learning experiences.
The Networked Epoch
The World Wide Web became mainstream in the 1990s, ushering in the era of massively networked communication.  The Web created opportunities to aggregate data across users and analyze patterns to provide a more social, customized, and targeted experience – consider, for example, the targeted book recommendations from Amazon.com, the free self-study courses available via iTunes University, and the massive information sharing among educators and other stakeholders taking place on blogs, Facebook, and Twitter.
In each technology epoch there have been major shifts – such as the rise of personal computing and the emergence of ubiquitous social networks – that have created qualitatively new kinds of experiences and opportunities, and there have been more incremental advances – such as the laser disk following VHS and the laptop following the desktop computer – that have not so much enabled truly new experiences as they have expanded the availability, usefulness, and flexibility of existing opportunities in more modest ways.
Currently, in the second decade of the twenty-first century, we are experiencing the rapid global adoption of touchscreen devices like smartphones and iPads.  Which raises the question…
Are touchscreen computers incremental extensions of desktop and laptop computers or a qualitatively new category of technology?
I would suggest they are both, but for different audiences.
For many adults, in particular, the jury still seems to be out on this question. Tablet sales are growing fast, but few adults who use their computers for production work such as word processing, software development, or video editing are replacing their laptop or desktop computers with tablets. The devices are still underpowered and the touchscreen interface tends to make them awkward to use for those types of tasks.  Most such people are evidently buying tablets in addition to their primary computer and using them for lighter, more consumption-oriented tasks like surfing the web, reading email, and playing games.  From an adult’s perspective, touchscreen computers appear to be more like incremental extensions of desktop and laptop computers.
And that’s the trap: because adults experience tablet computers as less powerful but more portable versions of laptop computers, they are likely to be blind to the fact that for young children, touchscreen computers are truly revolutionary innovations – in the sense that they have the potential to enable entirely novel kinds of experiences.  Two important benefits of touchscreen computers as learning technologies are that they give some groups of learners access to independent learning activities for the first time, and that they support faster learning that transfers more readily to the “real world.”
In terms of accessibility, touchscreen computers represent the first personal computer revolution for children
For the youngest children, who can’t effectively use a keyboard or mouse, the arrival of tablet computers is analogous to the arrival of the first personal computers for adults in the 1970s. Tablets make personal computing – with all of its interactivity, adaptiveness, and dynamism – fully accessible to millions of young children for the first time.  Why?  Because the tap and swipe gestures used to manipulate virtual objects on a tablet computer can be made very similar to the kinds of gestures that children would spontaneously use on physical objects in the real world.
Using a keyboard or mouse requires learning a new skill, and that skill is both counter-intuitive and idiosyncratic to the domain of computer use.  Where else in life do we touch something in one place (the left mouse button, for example) while looking in a completely different place for the effect of that action (a point on the computer monitor that is three feet away from the hand doing the pressing)?  This situation is not only unnatural – it also involves applying a skill that is unrelated to the task at hand and – for young children at least – is generally more complex than the task they are actually trying to accomplish in the first place.
Think about that for a second.  If the game we are playing with a child is to identify which of three objects is a particular color (“which one is blue?”), then the response we want from them is not terribly complex – they should, for example, point to or grab the one blue object out of the three different colored objects arrayed in front of them.  Such a response generally requires only very gross motor control, and the action is very intuitive – they look at the object and then grab at the same spot where they are looking.  Our brains evolved to handle this kind of task almost reflexively very early in life.
If we expect the child to respond using a mouse, however, suddenly the task requires not only fine motor control – placing the tiny cursor on the virtual blue object using the mouse – but also a complex coordination of counterintuitive fine motor actions: look straight ahead at a cursor moving on a vertically oriented screen while moving a mouse sitting off to your side across a horizontally oriented table, and then hold the mouse perfectly steady while pressing the correct button on it to indicate your selection.  The child can know the right answer while being completely unable to express it using this complex interface, which can make a straightforward, enjoyable learning activity both puzzling and frustrating.
For children, the touchscreen is the first computer interface that does not automatically add a high barrier to entry for engaging in an activity that is completely unrelated to the activity itself.  Consequently, many more children can engage in a greater range of independent learning activities than ever before.
Touchscreen interfaces support learning that is far more transferable to the “real world” than keyboards and mice
Beyond making the interactive capabilities of computers fully accessible to many children for the first time, however, there is an even more important educational benefit of touchscreen computers compared to their desktop and laptop cousins.  Learning theory suggests that we can expect the natural gestures used with the touchscreen interface to support faster learning and greater knowledge transfer from computer to real world than an otherwise identical activity accessed on a desktop or laptop computer via keyboard and mouse.
To illustrate why this is true, imagine a boy who dreams of being a great chef but grows up in a small apartment with limited facilities.  The only kitchen appliances are a fridge/freezer and a microwave oven.  The boy helps his parents prepare food every day – a combination of frozen meals and dishes improvised from fresh ingredients.  Over time, his skill grows, so that he can successfully prepare a variety of dishes with hardly a glance at recipes or recommended preparation instructions. The boy loves to cook and he has received a lot of positive feedback for his skill at operating the microwave.
Now imagine that this same boy lands his first job – as a sous chef in a restaurant.  He shows up for his first day eager to build on his skills only to find that he doesn’t know how to perform even the most basic tasks like slicing and dicing vegetables or pre-heating the conventional oven – and he is alarmed to see people putting metal containers into it!  What is going on?  The boy’s knowledge of food preparation is all mixed up with the peculiarities of the tool he uses for cooking – the microwave oven.
Similarly, when children engage in certain types of learning via keyboard and mouse, the peculiarities of the interface can become mixed up with the subject matter they are learning.  For example, imagine a child is working on developing her visual spatial skill by playing a “Tangrams” type puzzle game that involves dragging and rotating pieces into position.  She uses the “Tab” key to select a puzzle piece, the space bar to pick it up, the arrow keys to move it into position, and then the “[” and “]” keys to rotate the piece left and right.  The girl might become very proficient at this type of task.  But how much will proficiency in this activity transfer to physical challenges in the real world where the actions required to drag and rotate pieces into place are completely different, and where there are no space bars or arrow keys?  The child in this scenario is like the boy who steps into a fully equipped kitchen for the first time – she is likely to find that her keyboard-and-mouse knowledge does not apply readily to real-world situations.
The advantage of the touch screen is that the actions required to complete a task can be made much more similar to actions taken on physical objects – dragging a virtual object requires touching the object and dragging it with one or more fingers, for example, and rotating the object can be done by rotating one’s fingers on the screen.
The actions are of course not exactly the same as they would be with physical objects, but the point is that they are much more similar and that should support both faster learning and greater transfer.  In fact, by combining the benefits of real-world interaction with the adaptive affordances of computer technology, in some cases touchscreen computers like iPads can provide entirely novel learning experiences that would not be possible in either the real world or on a computer with keyboard and mouse.
But don’t take my word for it – check out some examples of young children using iPads:
You won’t see them doing that on a laptop computer!
The iPad has only been around for a little over two years.  As of this writing, empirical research on children’s use of tablet computers is scant.  As one can plainly see from videos like those above, however, touchscreens have made computers accessible to a large number of children in unprecedented ways.  Even more exciting, learning theory suggests that tablet computers like the iPad can support higher quality learning than desktop or laptop computers.
Adults would do well to keep these considerations in mind when making choices on behalf of the children in their care.  Despite appearances, all learning technologies are not created equal.

Monday, August 5, 2013

How Can Less Studying Produce More Learning?

In a previous post, I introduced two notable examples of applied learning science: SHERLOCK and RightStart. They demonstrate that the difference between an average instructional design and an optimal design can be huge - much bigger than most people realize. In this post, I use those examples as a jumping off point to explore two questions:

1) What could cause such large differences in learning outcomes if the basic "subject matter" being taught doesn't change?

2) How unusual are the SHERLOCK and RightStart results? In particular, should we only expect those kinds of results from formal, long-term, well-funded research studies?

But first, I need to introduce the following key insight.

Key Insight: Don't Mistake Subject Matter for Knowledge

When I was in high school, I ran a tutoring business. I loved helping my peers learn about subjects like math, science, and writing. As a bonus, not only did I deepen my own understanding of these subjects, but I also learned a ton about how other people come to understand them.

One of the more interesting insights I gleaned was this:
Subject matter taught is not the same as subject matter learned.  

Let me illustrate with an example based on actual events. 

One of my tutoring clients was a good friend named Susan. She was a diligent student in all subjects and a strong writer, but she struggled with math and science. One day we were studying math at her house. As I was packing up to leave at the end of the hour, Susan groaned, "Now I have to study for tomorrow's chemistry test. I hate chemistry."

"Why?" I asked.

"There's too much to remember!" she complained.

"Show me," I said.

She pulled out a stack of homemade flashcards and flipped the first one onto the table. Written on it was the formula for the "Ideal gas law" in chemistry:

PV = nRT

"There are all these formulas," Susan said, "and I can't keep them straight in my head because they all look alike." Then she dealt out a bunch more cards to illustrate her point:

P = nRT / V

V = nRT / P

n = PV / (RT)

R = PV / (nT)

T = PV / (nR)

I stared at the cards. "Susan, that's algebra." 

"No, this is for my chemistry test," she insisted.

"No, I mean those aren't six different formulas," I explained, "It's the same formula written six different ways. If you start with any one of them you can get to all five of the others using simple algebra. In fact, you just used the same algebra in some of your math homework during the last hour."

Her eyes widened with dawning recognition. "Oooohhhhhh - I never realized you could use algebra in another subject like that!"

No wonder chemistry was so hard for her! She had to memorize about five times as much subject matter as the student sitting next to her who realized he could use algebra there. Note that both students would have been exposed to the same subject matter - algebra and chemistry. It would have been their knowledge (or understanding) of the two subjects that was organized slightly differently. But the implications for future learning were not slight at all - they were quite huge.  Huge on the scale of SHERLOCK and RightStart, in fact. The picture below illustrates this scenario.


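If you'd like to check the algebra for yourself, here is a minimal sketch using Python and the sympy library (the tooling is my choice for illustration, not part of the original story); it starts from PV = nRT and derives Susan's five "other" formulas automatically:

    from sympy import symbols, Eq, solve

    # The five quantities in the ideal gas law.
    P, V, n, R, T = symbols("P V n R T", positive=True)
    ideal_gas_law = Eq(P * V, n * R * T)

    # Solving the single equation for each variable reproduces every
    # "different" formula on Susan's flashcards.
    for variable in (P, V, n, R, T):
        print(variable, "=", solve(ideal_gas_law, variable)[0])
    # prints, e.g.: P = R*T*n/V, V = R*T*n/P, n = P*V/(R*T), ...

One equation plus a little algebra, not six separate facts - that is the difference between the two students' knowledge.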
This example also suggests a way to think about how SHERLOCK and RightStart could produce such large gains compared to other curricula covering similar subject matter. In particular, if we imagine two students taking the exact same classes at the exact same time, we can see how one student could easily spend twice as much time as the other student to learn half as much material with less understanding. It stands to reason that a curriculum designed to ensure that every student has mastered key concepts and skills before moving on could produce dramatically better learning outcomes than a curriculum that leaves it up to each student to find their own way - even if the subject matter is ostensibly the same in both designs.

And we should note that even though Susan eventually made the connection between algebra and chemistry, she had already suffered (unnecessarily) through years of tedious studying just to make a passing grade in science while watching some of her peers seem to breeze through with top marks. How often does a single, critical misstep like this prevent a student from pursuing - or even exploring - entire categories of career? The stakes are very high in education - people's life outcomes hang in the balance.

Example: Arithmetic

The chemistry example is not unique - far from it. Consider a similar example from arithmetic - memorizing the times tables from 1x1 up to 12x12.

Memorizing all of these multiplication facts would involve 144 flashcards: 1x1, 1x2, 2x1, 2x2, and so on, all the way up to 12x12 (as shown in the left panel of the next figure).

But if the student knows the commutative property of multiplication (which means, for example, that 1x2 = 2x1) then suddenly there are only 78 facts to remember (plus one rule), as shown in the right panel in the figure below. The student who doesn't understand the commutative property has to memorize nearly twice as much information as the student who does. The same observation applies to learning the addition tables.
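As a quick sanity check on those counts, here is a minimal Python sketch (the code is purely illustrative):

    # Every multiplication fact from 1x1 through 12x12.
    all_facts = [(a, b) for a in range(1, 13) for b in range(1, 13)]

    # If a student knows that a x b = b x a, each pair only needs to be learned once.
    with_commutativity = [(a, b) for a, b in all_facts if a <= b]

    print(len(all_facts))            # 144 flashcards
    print(len(with_commutativity))   # 78 facts (plus the one commutative rule)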



Example: Foreign languages

The problem is not limited to math and science, either. Consider foreign language studies. Linguists use the term "cognate" to describe words in different languages that derive from the same origin. For example, "university" in English and "universidad" in Spanish are cognates, as are "city" / "ciudad" and "accident" / "accidente." The Spanish student who recognizes the general patterns by which cognates are related (for instance: "-ty" in English often becomes "-dad" in Spanish, and vice versa) will have quite a bit less to learn than the student who doesn't pick up on those patterns, as the sketch below suggests.
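To make the pattern concrete, here is a minimal Python sketch; the word list and the crude "-ty" to "-dad" rule are mine for illustration, and real cognates have plenty of exceptions:

    # A rough version of the "-ty" -> "-dad" pattern mentioned above.
    english_words = ["university", "electricity", "capacity", "velocity", "curiosity"]

    def guess_spanish(word):
        # Apply the crude pattern; real cognates don't always follow it
        # exactly (e.g. "city" -> "ciudad", not "cidad").
        return word[:-2] + "dad" if word.endswith("ty") else word

    for word in english_words:
        print(word, "->", guess_spanish(word))
    # university -> universidad, electricity -> electricidad, capacity -> capacidad, ...

One small rule stands in for a whole family of vocabulary items - which is exactly the kind of compression the student who "gets the pattern" enjoys.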

Just the tip of the iceberg...

As these examples from familiar school subjects illustrate, knowing what (or how much) subject matter is being taught doesn't tell us what (or how much) subject matter is being learned. The differences are not on the order of 1% or 10% either - even in these simple cases the swings are closer to 200% to 500%. 

But these simple cases represent just the tip of the iceberg. As Susan's example illustrates, the differences accumulate and compound as multiple subject areas interact (or not), and as new knowledge is layered on top of old.

At still deeper levels of analysis, where we apply insights from Cognitive Science, we find that problems arise when knowledge that should really be stored in one type of memory system in the brain ends up stored in a completely different type of memory system.  For example, a child could memorize the steps involved in tying one's shoes as declarative facts (Step 1: hold the shoelaces apart near the tips, Step 2: cross the right lace over the left and exchange the tips between hands, ...), but to actually tie their shoes effortlessly they will need to transfer that information to procedural memory. That should be obvious in the case of tying shoes, but it applies equally well to the difference between being able to recite the "six key features of a persuasive essay" (declarative facts) and the procedural knowledge required to actually compose an essay that influences people.

What do you think?

In this post we have explored the question of whether the dramatic learning gains documented in projects like SHERLOCK and RightStart are likely to be rare - perhaps only discoverable and accessible through systematic, long-term, and expensive formal research projects - or whether they are more commonplace and readily accessible by teachers in regular classrooms. I think the examples above provide pretty compelling evidence for the latter. 

What do you think? What experiences or examples can you share to push the conversation forward?

Monday, July 29, 2013

How Can Education Experiments Make Education Better?

In a previous post, I suggested that to make education better we should “Stop arguing about what people think will work in education and start experimenting to determine what actually works.”

There were a number of reader comments (thanks to Jeff, Stacie, and Suzanne in particular) that helped me to understand that the word “experiment” may carry some extra baggage that wasn't part of what I had in mind here.  In this post, I’ll try to clarify what I mean and offer an example.


How not to run education experiments


First, a few things I don’t mean when I talk about “education experiments”:
  • I don’t mean we should bring scientists in white lab coats into the classroom so that they can better understand how our children’s brains work – that is cognitive science, not education science.
  • I don’t mean we should bring academics in white lab coats into the classroom to test their speculative or radical new math curriculum for the very first time on our children for the benefit of future generations.
Don’t get me wrong – I think both basic and applied academic research are extremely important and have their place (with appropriate safeguards and informed consent all around). In particular, these kinds of research can benefit society greatly over the long term. But academic research is not what I had in mind here for two reasons:
  1. I am more interested in research that has a good chance of benefiting the participating students in the short term, and
  2. I am interested in research that leverages the teacher’s special knowledge and insight about the capabilities and needs of those particular students.

A different take on education experiments (with an example)


What I am proposing is more akin to action research, or design-based research.  To clarify what I have in mind, I’ll give an example from real life.

In my work as an educational designer and consultant, I have the pleasure of talking with some extremely dedicated teachers. Sometimes I meet them when they send a message, out of the blue, with a specific idea or question about how to do more for their students. Below is an example of one such message I received about a year ago.



Key points to note about the situation:
  • This teacher has identified a problem: Something isn’t working right, and my students are struggling in math.
  • This teacher has a hypothesis about the root cause of the problem: I believe some of my students lack number sense, which prevents them from understanding the more complex material in the math curriculum.
  • This teacher has done some research (most likely on her own time) and identified an option that she believes could address the root cause better than the current curriculum alone: I have found an iPad math curriculum that I believe will solve the problem by developing my students’ number sense.
  • This teacher has encountered a barrier that prevents her from trying the option: I don’t have all the technology I need (wifi) to use the app in my classroom.
  • This teacher is actively seeking creative ways to overcome the barrier: Is there a way I can use the app-based curriculum without wifi? (In my experience, teachers are willing to go to great lengths to make something work if they believe it will help their students.)
This situation has all the trappings of a good old-fashioned experiment. What happens next?

Here are two common outcomes:
  • Either: The teacher gets no support to overcome the final barrier, she can’t even try out the option she believes will help her students, and her students continue to struggle with math – probably for life (because math misunderstanding – like math understanding – is cumulative). Sadly, this is probably what happens to proactive teachers and their students most of the time. The teacher carries the ball 99 yards, and for any number of reasons can’t carry it the final yard alone, so it sits there. The students miss out. The dedicated teacher gets a little more burnt out – maybe this is even the last straw and she leaves the profession. Everyone loses.
  • Or: The teacher finds a creative way around the barrier, tries out the math app with her students this year, and based on her results she decides whether to use it again (or try something else) with her class next year.  The teacher goes to great lengths to get her students more of what they need. Her twenty-five students benefit. She feels empowered - maybe she can make a difference through diligence, resourcefulness, and a lot of hard work. But…did her idea work? How well? What’s the evidence that it did or didn't? Did it work for some kids and not others? If so, which ones? What did she do to make it work? That teacher has a lot of valuable, actionable insight as a result of her experiment, but it was a private experiment and so the world will never know about it.
The answers to those last questions would have been handy six months later when I received this message from another teacher:
It’s déjà vu all over again.

Key points to note about this situation:
  • This teacher has independently identified the same problem: Something isn’t working right, and my students are struggling in math.
  • This teacher has independently produced the same hypothesis about the root cause of the problem: I believe some of my students lack number sense, which prevents them from understanding the more complex material in the math curriculum.
  • This teacher has done some research (most likely on her own time) and independently identified the same option that she believes could address the root cause better than the current curriculum alone: I have found an iPad math curriculum that I believe will solve the problem by developing my students’ number sense.
  • This teacher has encountered a slightly different barrier that prevents her from trying the option: I don’t have all the technology I need (iPads) to use the app in my classroom.
  • This teacher is also actively seeking creative ways to overcome the barrier: Is there a way I can adapt the app to a whole-class format so I can make it work with the technology I do have available?
The teacher from California observed a pattern, had a hypothesis, came up with a test of that hypothesis, and – if she was persistent and lucky enough – was able to run the experiment.  But no one except perhaps a few colleagues will ever hear about it, whether it was successful or not. As a result, the teacher in Tennessee has to do all of the same work over again, from scratch, without guidance or data to help her make informed decisions. Which raises a few questions:
  • If these two teachers face the exact same problem with teaching math, how many others are there? Tens of thousands? Hundreds of thousands?
  • How much teacher time, energy, and good will are we wasting with all of this duplicated effort?
  • How many more teachers would use an effective option that had already been vetted by another teacher if they knew about it and didn’t have to re-invent it from scratch or take the risk of "being first"?
  • Looking beyond first grade math, how many students would benefit if every teacher didn’t have to reinvent the wheel like this on every big and little problem they encounter in the classroom? Presumably every single student. That includes your children, and mine.
Here’s the punch line:
This is going on all the time.  Millions of such private experiments are presumably being conducted by teachers every day in classrooms across the country and around the globe. The only thing that’s missing to capture the value of that activity is a bit of systematic record keeping and a way to share results.

Where do we go from here?

I hope that helps clarify what I meant when I said we need to stop arguing about what people think will work in education and start experimenting to determine what actually works.

In fact, I now realize that I misspoke when I first said that. We don't need to start experimenting - that part is already happening. As a first step we simply need to start sharing what teachers have discovered about what works and doesn't work as they try – sometimes desperately and at significant personal cost – to give our children more of what they need to succeed in school and in life.

Monday, July 22, 2013

Can We Make Education Better, or Is This Really As Good As It Gets?


I attended a summer conference years ago at Brown University. The conference theme had to do with making brain research more useful in practice. I was a doctoral student at the time, working in the emerging domain of Educational Neuroscience. The field was so new it wasn't even formally established yet, but it was starting to generate some buzz, and this was one of the first conferences of its kind. I was pretty excited to be there.

One talk was by a cognitive neuroscientist renowned for his brain imaging studies of memory and perception (how we see, hear, etc.). This particular presentation, though, was about a study he was doing on learning. Learning and motivation have been my primary interests for a long time, and so I was keenly interested in the talk. During the Q&A following the presentation I asked, "Do you have ideas about how this kind of brain research might help us make education better?"

Based on prior experience, I expected one of two answers.

Either:
"Yes, I have several ideas about how this might be useful down the road.  For example..."

Or:
"No, this laboratory work is too preliminary and too far removed from educational practice for me to speculate about that."

His actual answer was far more interesting (and surprising).  He said:
"I think we do pretty well in education already. I don't really think it's possible to do much better."

I was taken aback. First, he dismissed my question instead of answering it. But, more importantly – could he really be suggesting that education is about as good as it can possibly be? And that a deeper understanding of the biology and psychology of learning won't lead to better education? At the time it hadn't occurred to me that people - especially people who study learning - could seriously hold that view.

But he got me thinking…

Why should anyone believe that education could be better than it is?  And how much room for improvement is there, really?

These are quite important questions.  After all, if the best-case scenario were, say, a 10% improvement over what we do now, then it probably wouldn’t be worth the extra cost.  But if, on the other hand, we could double the return on our educational investment (100% improvement), then that would really be something to get excited about!

How much more efficient could education be?
First, let’s think about educational efficiency.  One way to determine that is to compare how long it takes students to master a specific amount of content under different instructional designs. One telling example comes from the SHERLOCK Project, in which cognitive scientists demonstrated that they could replace four years of on-the-job training (in troubleshooting electronics) with 20-25 hours of computer-based training to produce the same expert-level performance. Four years of training compacted into less than three days through the systematic application of Learning Science.  The authors of this article point out that “SHERLOCK achieves this stunning result in two ways, by affording the opportunity for extensive practice and by creating educationally effective instructional conditions” (p. 9).
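To get a rough sense of the ratio involved, here is a back-of-the-envelope sketch; the hour counts are illustrative assumptions on my part, not figures reported by the SHERLOCK researchers:

    # Rough, illustrative assumptions (not data from the SHERLOCK study):
    total_work_hours = 4 * 2000      # four years at roughly 2,000 work hours per year
    practice_fraction = 0.3          # suppose ~30% of that time involves relevant troubleshooting
    sherlock_hours = 25              # upper end of the reported 20-25 hours

    ratio = total_work_hours * practice_fraction / sherlock_hours
    print(round(ratio))              # ~96, i.e. on the order of 100x less instructional time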

So, based on this example, we can estimate that in at least some domains education could be made not just 10% or even 100% more efficient than it currently is through optimal instructional design, but on the order of 100x – or 10,000% – more efficient!

Wow. I’d say that puts the score at:
Me: 1
Cognitive Neuroscientist: 0

How much more effective could education be?
Now let’s think about educational effectiveness. One way to determine that is to compare levels of student achievement in a subject area like math resulting from different instructional designs. The cognitive scientists behind the RightStart project did just that.  In one study, they followed three groups of children for about five years (from preschool through third grade), measuring their math achievement each year.

  • Group 1 was a high-resource group – children from middle- and upper-income homes who attended a magnet school with an enriched math curriculum (not specifically informed by Cognitive Science).
  • Group 2 was a high-ability group – children from lower-income homes who were identified as having superior math achievement when they arrived at school. They received no special or enriched math instruction.
  • Group 3 was the experimental (RightStart) group – typical children from lower-income homes with no special aptitude for math.  In addition to their regular school curriculum, these children received about 20 hours of intensive supplementary instruction in “number sense” (the conceptual foundations of math).  The special number sense curriculum was explicitly informed by Cognitive Science.

The RightStart group started out with the lowest math achievement of the three.  Over the course of five years, however, they reached higher levels of achievement than either the high-resource or the high-ability group.  Evidently, optimal instructional design based systematically on Learning Science can take the children who would normally sit at the bottom of the class and reliably put them beyond what is currently the top of the class – and they’ll stay there for years.

Wow again.  I’d say that brings the score to:
Me: 2
Cognitive Neuroscientist: 0

Cognitive Neuroscience is not Education Science
Based on available evidence, it seems the Cognitive Neuroscientist was just plain wrong in saying that education today is about as good as it gets. But he’s a senior scientist studying learning – so where’s the disconnect?

I think Clarke’s first law applies here:
When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.

What I have further come to understand in the decades since (thanks in no small part to this book by Dewey) is that there is a vast difference between a Cognitive Neuroscientist who studies learning and an Education Scientist who studies education. Much could be said about the differences between them. For now, I’ll simply point out that the former applies systematic methods of inquiry to describe neurological and psychological processes, while the latter applies systematic methods of inquiry to improve educational processes. The difference is analogous to that between the chemical sciences and the medical sciences.

I have spent much of the past two decades helping people produce the learning outcomes they desire (education) by applying systematic methods of inquiry (science). I am an Education Scientist. In retrospect, I think the Cognitive Neuroscientist was not wrong to dismiss my question. I think I was wrong to ask it of him - in the same way you wouldn't ask a chemist (even a brilliant one) to prescribe your medication. That's simply not his area of expertise.

Long experience in the field, supported by evidence from programs like SHERLOCK and RightStart, has led me to believe that we can make education better across the board. Not a little better - a whole lot better. Imagine, for example, that our schools could produce twice the learning in half the time at a fraction of current cost. I believe this is feasible today, in existing schools, with our current students, teachers, and facilities.

The way we can make education substantially better is with a straightforward three-step process. This process builds on the one applied successfully by the scientists behind both SHERLOCK and RightStart:

  • Base educational designs on scientific models of human learning.  There's over a century's worth of relevant research (including work in Cognitive Neuroscience) that is hardly being used in educational practice. We need to start using it. 
  • Use technology to dramatically expand access to high-quality learning experiences while lowering cost. Today, we could basically create for every learner a "personal-tutor-in-a-box." Students could work at their own pace on a personalized curriculum. The system could collect rich data on student performance. Teachers could spend more time actually teaching.
  • Stop arguing about what people think will work in education and start experimenting – observing, acting, and sharing more systematically – to determine what actually works. Simply apply scientific thinking to educational processes the same way we do to every other major domain of human endeavor like engineering, medicine, agriculture, and economics.

Wash. Rinse. Repeat.

Today, we have all of the components we need to make education much better. There is definitely a way. The question is, do we have the will to make it happen?

Monday, July 15, 2013

I Believe We Can Make Education Much, Much Better Than It Is Today

Welcome to The Education Scientist. The purpose of this blog is to provide a forum for discussing how we can make education systematically better – especially in the short term – through methods proven to work in other applied domains like medicine and agriculture.

I value openness and transparency. This first post is intended to tell you where I stand and where I am headed so you can decide if you are interested in walking with me for a while. The summary below is a 50,000-foot overview of the landscape. I plan to expand on each of these points in future posts.

Here, in a nutshell, is my point of departure:
  • I believe that learning may well be the ultimate Good.
  • I believe that the purpose of formal education systems (like schools) is to facilitate specific learning outcomes desired by a community.
  • I believe that compulsory education can be a powerful public Good, but that it brings with it certain moral obligations on the part of those who provide it.
  • I believe that our systems of education (public and private alike) produce less than 50% of the learning that even the average student is capable of. In particular, I believe that we have the capability to produce twice the learning we are currently producing in half the time at a fraction of current cost.
  • I believe that if we chose to we could accomplish this today, for the average student, in our existing public and private educational institutions.
  • I believe that the stakes in this area are very, very high. In particular, children’s life courses are being determined powerfully by the quality of their education. The fate of entire societies will in turn be determined by their citizens’ life courses.
  • Because the stakes are what they are, I believe that knowing how to provide better education for every child creates a moral obligation to do so – for every child – and to act with all possible haste. Every year another cohort of children – our children, grandchildren, nieces, nephews, and friends – advances through the system without getting what they need to survive and thrive in the world we have created for them.
Education Science is the systematic application of methods of inquiry to drive the educational outcomes that a society desires. I believe that Education Science can enable us to produce dramatically better learning over time, in the same way that Medical Science, Agricultural Science, and Political Science enable us to produce (respectively) better health, global nutrition, and political stability over time.

I believe it is within our capability to make education much, much better than it is today. If you believe this too, then I hope you will walk with me a while by joining in the conversation here.