On the reductionism of analytics in education

I had the great pleasure (and distinct discomfort) of listening to Virginia Tech’s Gardner Campbell speak on learning analytics this week, through my haphazard participation in the Learning Analytics 2012 MOOC. Haphazard, I say, because I am so busy at work I can hardly spare any time to connect outside of it, whether through more structured means like the Learning Analytics course or less structured ones like Twitter and Facebook. Discomfort, I say, because Campbell launched some pointed criticisms of the current reductionist approach to learning analytics that prevails in education today. Yes, it prevails at Pearson too, not because we have bad motives, but because the process of education and learning is so complex that we feel compelled to simplify it in some way to make any sense of it.

M-theory vs. the x-y plane

Campbell drew an analogy to cosmology, contrasting 11-dimensional M-theory with the planar (two-dimensional) Cartesian coordinate system. He suggested that current work in learning analytics is like working in the x-y plane when we know that education and learning take place in at least 11 dimensions.

Learning analytics, as practiced today, is reductionist to an extreme. We are reducing too many dimensions into too few. More than that, we are describing and analyzing only those things that we can describe and analyze, when what matters exists at a totally different level and complexity. We are missing emergent properties of educational and learning processes by focusing on the few things we can measure and by trying to automate what decisions and actions might be automated.
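
To make that complaint concrete, here is a minimal sketch, assuming nothing about Campbell’s actual examples: eleven hypothetical per-student measures (the feature count and names are mine) get projected down to a single x-y point per student, and most of the variation is simply discarded.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    # 200 students x 11 hypothetical measured features (logins, posts, quiz scores, ...)
    students = rng.normal(size=(200, 11))

    # Project everything onto a two-dimensional "dashboard" plane
    pca = PCA(n_components=2)
    plane = pca.fit_transform(students)

    print(plane.shape)                          # (200, 2): each student is now one (x, y) point
    print(pca.explained_variance_ratio_.sum())  # share of the variance the plane keeps; the rest is gone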

As I was writing this post, Simon Phipps (@webmink) tweeted about his post “leaving room for mystery”, in which he proposed that some problems will remain unsolved and some systems unanalyzed:

The real world is deliciously complex, and there will always be mysteries – systems too complex for us to analyse. It seems to me that one of the keys to maturing is learning to identify those systems and leave room for them to be mysteries, without discarding the rest of rational life.

Then Simon shared a definition of reductionism with me, and it echoes exactly what Campbell said in his presentation:

My fear is that computers as they are used in learning analytics mean that people will work on simpler questions. They may be complicated in terms of the scale of the data but they’re not conceptually rich. They won’t be trying more concepts or playing with new ideas.

We’ll have a map that makes the territory far simpler than it truly is and we’ll design school to that, not to the true complexity.

Reductionism in analyzing online discussion threads

Last week in a meeting one of my colleagues pointed out the inherent reductionism of our approach to the problem of measuring and characterizing student interactivity and learning via discussion threads. He pointed this out not as a criticism but as recognition and acknowledgement. We are applying a custom-developed coding scheme to threaded discussion posts. We code each post into one of four categories based on the pattern of topics discussed in each post and across the thread. We capture what topics were introduced, how they relate to topics in previous posts, and how they relate to the main discussion topic. We cannot capture all the details and complexity of what people have written and how they have interacted. We certainly aren’t paying any attention to the broader experiences and connections that individual students bring to the discussion. But we are trying nevertheless to capture some important kinds of meaning and interaction in the posts via our coding scheme.
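
For the curious, here is a minimal sketch of the shape such a coding scheme might take in code. The four category labels, the keyword-overlap rule, and the example thread are all invented for illustration; our actual scheme is applied by human coders, not by anything like this.

    from dataclasses import dataclass
    from typing import List, Optional, Set

    # Hypothetical category labels; the real scheme's categories are not described here.
    CATEGORIES = ("returns_to_main", "extends_previous", "new_topic", "off_topic")

    @dataclass
    class Post:
        author: str
        topics: Set[str]  # topics identified in the post (by a human coder or an upstream NLP step)

    def code_post(post: Post, previous: Optional[Post], main_topics: Set[str]) -> str:
        """Reduce a whole post to a single code based on how its topics relate to the thread."""
        if post.topics & main_topics:
            return "returns_to_main"
        if previous and post.topics & previous.topics:
            return "extends_previous"
        if post.topics:
            return "new_topic"
        return "off_topic"

    # Example thread: each post, however rich, becomes exactly one label.
    main_topics = {"assessment", "feedback"}
    thread: List[Post] = [
        Post("A", {"feedback", "rubrics"}),
        Post("B", {"rubrics", "grading"}),
        Post("C", {"weekend plans"}),
    ]
    print([code_post(p, thread[i - 1] if i > 0 else None, main_topics) for i, p in enumerate(thread)])
    # ['returns_to_main', 'extends_previous', 'new_topic']

Even in this toy form the loss is plain: whole paragraphs of human argument and connection collapse into one of four strings.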

This is, at heart, the analytics endeavor: to take very messy humanly-meaningful information and transform it into numbers that a computer can manipulate. It can be done in more sophisticated and subtle ways or more crude and careless ways, but it is always reductionist. It does not fully capture the human experience of learning. We can’t model learning in all its complexity.
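
As a toy example of that transformation (mine, not drawn from our work), consider about the crudest reduction possible, a bag-of-words count of a student’s post: word order, tone, and what the student actually meant all disappear, and a pile of integers remains.

    from collections import Counter
    import re

    post = "I finally understood recursion after my classmate's example. Thank you!"
    tokens = re.findall(r"[a-z']+", post.lower())
    vector = Counter(tokens)  # the numeric representation a computer can manipulate

    print(vector.most_common(3))
    print(f"{len(tokens)} words reduced to {len(vector)} counts; the gratitude is not in the data")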

The math is not the territory

I see it as critical in data analysis to remember that our numbers are useful shorthand — easy to manipulate, summarize, visualize, and report upon — but they are not the thing we are interested in. We use them because there is something else non-quantitative we are interested in, something human (at least in social sciences like education).

Campbell said,

We tend to believe the math is the territory and we tend to organize ourselves around just what we’re able to measure instead of organizing ourselves around creating better measurements of what we know to be nearly unimaginably complex.

The math is not the territory — the codes and numbers we use to represent human understanding and action and connection are not the territory — the visualizations are not it either.

Learning as delicious mystery?

Simon suggested some things are too complex to be answerable and should be left as mysteries. Is learning something that should be left unanalyzed? Certainly not, although aspects of it are mysteriously wonderful and not amenable to quantitative or qualitative analysis. There’s too much at stake — for individual students who benefit from success defined in many different ways, for the government that funds or subsidizes much of their education, for the citizenry that benefits from an educated populace.

I believe analytics can help, but I feel humble about its possibilities, more so than ever after listening to Campbell speak. I used to call my stance “cynicism,” but I think I will reframe it as “humbleness,” which makes it seem like there is some chance of success. As uncomfortable as it was, I’m glad I sat in on Campbell’s talk and listened to it again this morning to think about it further.


4 responses to “On the reductionism of analytics in education”

  1. So one day I went looking for the things I thought should be there by now, and they hadn’t been made yet. Instead I found elaborate parasitic systems that fed from the human subconscious and its panoply of skills: intuition, information processing, and scenario-evaluation to name a few. This man-made parasite touted itself in grand style as the epitome of civilization, the enabler of higher function; indeed, the builder of modern man. But as I looked under its crusty crust, as I peeled off the layered layer, I found it merely relied on man’s mind to take form. It was trying to pass itself off as the master, relegating the source of its own power to anecdotal importance, and posing as that which made the beast that is man into the man that is civil.
    Woe is me! I had killed another little god!
    But then as I thought more, all was good. The brain, as we all know, controls the body. The masses of bacteria and other organisms in our gut perform magnificently; and they form intricate and complex systems that enable Life itself. I would not be surprised to learn that some of the cells that first arose with life itself are still present in our gut.
    So likewise the system is but an artifact of our current civilization. And what is that compared to the power of man?

  2. Pingback: LAK11 | Pearltrees

  3. All I can say is “perfectly stated”. We attempt to reduce things, not out of ill intent, but so that we have something we can analyze at scale to hopefully identify big trends, big wins, and so on. I think the issue is that over time, many seem to forget that the “bottom line” numbers are a very gross, generalized summary rather than the truth and full scope of all activities. THAT is the key issue I see repeating in these types of exercises. The higher up you go in an org, the more they want every complex issue summarized to a bullet point or elevator pitch and “can’t be bothered with details.” But isn’t that exactly where the devil lives? ;)

  4. Jordan Carswell

    Behaviorism vs. Cognitivism – Learning analytics as it is used now seems to align with Skinner’s insistence that learning be measured according to what can be studied as external behavior. This is the basic assumption of behaviorism. Gardner seems to posit learning within the individual, which more closely aligns with cognitivist theories. It seems like analytics can be useful for analyzing simple activities that can be measured, like class attendance, turning in assignments, etc. These can be valuable markers for whether a student is in a position to learn. But unless you are a strict behaviorist, you probably will not believe that there is any data analysis that can actually represent whether or not learning has taken place.