Tag Archives: learning analytics

But what about learning? Recognizing the signaling function of higher education

I spent just a couple of days at the Learning Analytics and Knowledge 2012 conference this past week in Vancouver. Because of personal stuff going on I couldn’t attend the whole conference, but by watching some live streaming, following the conference Twitter channel, reading some live blogs, and browsing some of the papers afterward, I put together a decent understanding of the topics covered and the issues that came up. Of course, I had far less time than I wanted to connect with people there, but I have hopes that I can address that over the next year online and then go all out for LAK13 in Belgium.

Some attendees felt a disconnect: many vendors and researchers are looking at retention (students re-enrolling across terms) and course completion rather than learning itself. Aren’t we supposed to be doing learning analytics, they asked?

At Pearson, I am working on exactly the problem of retention and course success. I work mainly with data from Pearson’s LearningStudio hosted learning management system. I sometimes have access to learning outcomes data but only very rarely; I can’t count on having that.

The data scientist is not drunk

Is my work a case of the drunk looking for her keys under the streetlight? I don’t think so, and it’s not just to do with the profitability of institutions or their need to demonstrate adequate program completion rates and year-over-year retention to accreditation bodies.

Students enroll in higher education programs for many reasons, not all of which have to do with learning. We would all hope that after two or four or more years taking coursework a student will have better skills and cognitive capacity than when she started. But bachelor’s and associate’s degree programs today require students to take many classes that are not relevant to them or to their future work, that merely serve as hurdles to jump over to acquire the degree. The learning is not enough, because education serves a signaling purpose in addition to working to improve a student’s cognitive capacity.

As people working in education, we have to recognize the reality of higher education today, that it is only partially about learning. I’m talking from a purely U.S.-based perspective, as that’s my focus. In the U.S. today, bachelor’s degrees are a basic entry ticket to a decent job in many cases. In fact, credential inflation means that now a master’s degree is required for entry into many professions.

Some economists think higher ed degrees primarily function as signals to employers, that they are not mainly about increased learning or cognitive capacity. On this theory, a student’s degree shows that he has qualities valued by employers: conformity, conscientiousness, a willingness to defer gratification. If the signaling theory is true in some situations or in some respects, then it’s not enough for a student to take some classes and merely learn what he needs to use on the job. He needs to get through an entire program, arbitrary requirements and uninteresting or useless classes included.

There’s good evidence that signaling theories do not tell the whole story about higher education, that returns to education do indeed reflect the additional skills that graduates in various disciplines bring to the job market. The signaling theory may be true in part, but, as we might hope, education is also about actual preparation and learning; I don’t mean to say it is not.

I also don’t mean to reduce higher education to job market preparation, though I do question degrees that don’t represent a sound economic investment. The cost of post-secondary education today is such that we can’t divorce it from its role in linking people with economic opportunity.

Are learning and course completion orthogonal?

One presenter called completion and learning orthogonal outcomes. At least one attendee in that session took issue with that. We would all hope this is not the case — we hope that students don’t successfully complete classes without learning anything — but can’t everyone think of a class they were required to take but took nothing away from? I just completed a Ph.D., and virtually all of the cognate classes were worthless, annoying, and time-consuming efforts that I had to complete if I wanted my degree. There was almost no learning taking place in those courses, and I am a highly motivated and engaged learner who chose courses I thought would be interesting. Sometimes students do just need to complete a course, learning or no.

Certainly we should work toward making every course a worthwhile learning experience for students, but in the real world there are always going to be some classes that fall short for one reason or another. Assuming that completing a particular program is a good thing for a particular student (a somewhat questionable assumption in this era of heavy student loan debt and low-value degrees), helping them get through all their courses successfully, learning or no, is a good in itself.


Links for March 4, 2012

Who’ll have the means to analyze our learning? [Tony Searl/Neoteny]. In response to Pearson and INITE’s announcement of plans to open an online university for Mexicans, Searl asks “who will have the means to analyse our learning in the near future?” and “Will a few dominant learning data companies emerge, or can learning analytics remain an in house cottage industry?”

Social Learning Analytics: Five Approaches [PDF] [Rebecca Ferguson and Simon Buckingham Shum]. Five categories of social learning analytics: social network analytics, discourse analytics, content analytics, disposition analytics, context analytics. All things I want to learn more about.

4chan’s Chris Poole: Facebook & Google are doing it wrong [Jon Mitchell/ReadWriteWeb]. Google and Facebook have a crude notion of identity.  4chan’s Chris Poole says we are like multi-faceted diamonds.

“The portrait of identity online is often painted in black and white,” Poole said. “Who you are online is who you are offline.” That rosy view of identity is complemented with a similarly oversimplified view of anonymity. People think of anonymity as dark and chaotic, Poole said.

But human identity doesn’t work like that online or offline. We present ourselves differently in different contexts, and that’s key to our creativity and self-expression. “It’s not ‘who you share with,’ it’s ‘who you share as,'” Poole told us. “Identity is prismatic.”

Permission to be horrible and other ways to generate creativity [Suzanne Axtell interview of Denise R. Jacobs/O’Reilly Radar].

“… there’s such a limited definition of creativity in our culture. People treat artists as if they’re off in their own world or put them on a pedestal. But it’s a misconception that technical people aren’t creative. Developers and coders and database architects are extremely creative, just as scientists are. They have to come up with solutions and code that have never been written before. If that’s not creativity, I don’t know what is.

I’m reading “A Whole New Mind” by Daniel H. Pink, which explores how right-brain is the new wave. We’re entering a new conceptual, high-touch era whereas before we were in a very analytical era. Our industry, the technical industry, is actually a perfect in-between point of left brain and right brain. You have to have both, a whole-brain approach, to be successful in our industry.”

Colleges misassign many to remedial classes, studies find [Tamar Lewin/NY Times]. This is something learning analytics ought to be able to fix.

On the reductionism of analytics in education

I had the great pleasure (and distinct discomfort) of listening to Virginia Tech’s Gardner Campbell speak on learning analytics this week, through my haphazard participation in the Learning Analytics 2012 MOOC. Haphazard, I say, because I am so busy at work I can hardly spare any time to connect outside of it, whether through more structured means like the Learning Analytics course or less structured like Twitter and Facebook. Discomfort, I say, because Campbell launched some pointed criticisms of the current reductionist approach to learning analytics that prevails in education today. Yes, it prevails at Pearson too, not because we have bad motives, but because the process of education and learning is so complex that we feel compelled to simplify it in some way to make any sense of it.

M-theory vs. the x-y plane

Campbell drew an analogy to cosmology, contrasting 11-dimensional M-theory with the planar (two-dimensional) Cartesian coordinate system. He suggested that current work in learning analytics is like working in the x-y plane when we know that education and learning take place in at least 11 dimensions.

Learning analytics, as practiced today, is reductionist to an extreme. We are reducing too many dimensions into too few. More than that, we are describing and analyzing only those things that we can describe and analyze, when what matters exists at a totally different level and complexity. We are missing emergent properties of educational and learning processes by focusing on the few things we can measure and by trying to automate what decisions and actions might be automated.

As I was writing this post, Simon Phipps (@webmink) tweeted about his post “Leaving Room for Mystery,” in which he proposed that some problems will remain unsolved, some systems unanalyzed:

The real world is deliciously complex, and there will always be mysteries – systems too complex for us to analyse. It seems to me that one of the keys to maturing is learning to identify those systems and leave room for them to be mysteries, without discarding the rest of rational life.

Then Simon shared a definition of reductionism with me, one that echoes exactly what Campbell said in his presentation:

My fear is that computers as they are used in learning analytics mean that people will work on simpler questions. They may be complicated in terms of the scale of the data but they’re not conceptually rich. They won’t be trying more concepts or playing with new ideas.

We’ll have a map that makes the territory far simpler than it truly is and we’ll design school to that, not to the true complexity.

Reductionism in analyzing online discussion threads

Last week in a meeting one of my colleagues pointed out the inherent reductionism of our approach to the problem of measuring and characterizing student interactivity and learning via discussion threads. He pointed this out not as a criticism but as recognition and acknowledgement. We are applying a custom-developed coding scheme to threaded discussion posts. We code each post into one of four categories based on the pattern of topics discussed in each post and across the thread. We capture what topics were introduced, how they relate to topics in previous posts, and how they relate to the main discussion topic. We cannot capture all the details and complexity of what people have written and how they have interacted. We certainly aren’t paying any attention to the broader experiences and connections that individual students bring to the discussion. But we are trying nevertheless to capture some important kinds of meaning and interaction in the posts via our coding scheme.
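To give a feel for how reductive this kind of coding is, here is a toy sketch of a topic-overlap classifier. The four category names and the rules are my own invention for illustration; they are not the actual coding scheme we use.

```python
def code_post(post_topics, prior_topics, main_topics):
    """Assign one of four hypothetical codes to a discussion post based on
    how its topics relate to earlier posts in the thread and to the main
    discussion topic. (Illustrative only -- not the real coding scheme.)"""
    post, prior, main = set(post_topics), set(prior_topics), set(main_topics)
    if not post & (prior | main):
        return "UNRELATED"   # topics tied to neither the thread nor the prompt
    if post & prior:
        # Overlaps earlier posts: does it also add anything new?
        return "EXTENDS" if post - prior else "RESTATES"
    return "INITIATES"       # relates to the main topic but not to prior posts


# Walk a thread in order, accumulating the topics seen so far.
main = {"signaling", "credentials"}
prior = set()
codes = []
for topics in [{"signaling"}, {"signaling", "retention"}, {"cooking"}]:
    codes.append(code_post(topics, prior, main))
    prior |= topics
print(codes)  # ['INITIATES', 'EXTENDS', 'UNRELATED']
```

Even this caricature shows where the reduction happens: everything about a post that isn’t a topic label — tone, evidence, the student’s background — is thrown away before classification even begins.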

This is, at heart, the analytics endeavor: to take very messy humanly-meaningful information and transform it into numbers that a computer can manipulate. It can be done in more sophisticated and subtle ways or more crude and careless ways, but it is always reductionist. It does not fully capture the human experience of learning. We can’t model learning in all its complexity.

The math is not the territory

I see it as critical in data analysis to remember that our numbers are useful shorthand — easy to manipulate, summarize, visualize, and report upon — but they are not the thing we are interested in. We use them because there is something else non-quantitative we are interested in, something human (at least in social sciences like education).

Campbell said,

We tend to believe the math is the territory and we tend to organize ourselves around just what we’re able to measure instead of organizing ourselves around creating better measurements of what we know to be nearly unimaginably complex.

The math is not the territory — the codes and numbers we use to represent human understanding and action and connection are not the territory — the visualizations are not it either.

Learning as delicious mystery?

Simon suggested some things are too complex to be answerable and should be left as mysteries. Is learning something that should be left unanalyzed? Certainly not, although aspects of it are mysteriously wonderful and not amenable to quantitative or qualitative analysis. There’s too much at stake — for individual students who benefit from success defined in many different ways, for the government that funds or subsidizes much of their education, for the citizenry that benefits from an educated populace.

I believe analytics can help, but I feel humble about its possibilities, more so than ever after listening to Campbell speak. I used to call my stance “cynicism” but I think I will reframe it as “humbleness” which makes it seem like there is some chance of success. As uncomfortable as it was, I’m glad I sat in on Campbell’s talk and listened to it again this morning to think about it further.