Tag Archives: #lak12

But what about learning? Recognizing the signaling function of higher education

I spent just a couple of days at the Learning Analytics and Knowledge 2012 conference this past week in Vancouver. Because of personal stuff going on, I couldn’t attend the whole conference, but by watching some live streaming, following the conference Twitter channel, reading some live blogs, and browsing some of the papers afterward I put together a decent understanding of the topics covered and the issues that came up. Of course, I had far less time than I wanted to connect with people there, but I hope I can address that over the next year online and then go all out for LAK13 in Belgium.

Some attendees felt a disconnect: many vendors and researchers are looking at retention (students re-enrolling across terms) and course completion rather than learning itself. Aren’t we supposed to be doing learning analytics, they asked?

At Pearson, I am working on exactly this problem of retention and course success. I work mainly with data from Pearson’s LearningStudio hosted learning management system. I have access to learning outcomes data only rarely; I can’t count on having it.

The data scientist is not drunk

Is my work a case of the drunk looking for her keys under the streetlight? I don’t think so, and it’s not just to do with the profitability of institutions or their need to demonstrate adequate program completion rates and year-over-year retention to accreditation bodies.

Students enroll in higher education programs for many reasons, not all of which have to do with learning. We would all hope that after two or four or more years of coursework a student will have better skills and cognitive capacity than when she started. But bachelor’s and associate’s degree programs today require students to take many classes that are not relevant to them or to their future work, classes that merely serve as hurdles to jump over to acquire the degree. The learning is not enough, because education serves a signaling purpose in addition to working to improve a student’s cognitive capacity.

As people working in education, we have to recognize the reality of higher education today: it is only partially about learning. I’m speaking from a purely U.S.-based perspective, as that’s my focus. In the U.S. today, a bachelor’s degree is in many cases the basic entry ticket to a decent job. In fact, credential inflation means that a master’s degree is now required for entry into many professions.

Some economists think higher ed degrees function primarily as signals to employers, not mainly as evidence of increased learning or cognitive capacity. On this theory, a student’s degree shows that she has qualities valued by employers: conformity, conscientiousness, a willingness to defer gratification. If signaling holds in some situations or in some ways, then it’s not enough for a student to take some classes and merely learn what she needs to use on the job. She needs to get through an entire program, arbitrary requirements and uninteresting or useless classes included.

There’s good evidence that signaling theories do not tell the whole story about higher education, that returns to education do indeed reflect the additional skills that graduates in various disciplines bring to the job market. The signaling theory may be true in part, but, as we might hope, education is also about actual preparation and learning; I don’t mean to say it is not.

I also don’t mean to reduce higher education to job market preparation, though I do question degrees that don’t represent a sound economic investment. The cost of post-secondary education today is such that we can’t divorce it from its role in linking people with economic opportunity.

Are learning and course completion orthogonal?

One presenter called completion and learning orthogonal outcomes. At least one attendee in that session took issue with that. We would all hope this is not the case — we hope that students don’t successfully complete classes without learning anything — but can’t everyone think of a class they were required to take and took nothing away from? I just completed a Ph.D., and virtually all of the cognate classes were worthless, annoying, and time-consuming efforts that I had to complete if I wanted my degree. There was almost no learning taking place in those courses, even though I am a highly motivated and engaged learner who chose courses I thought would be interesting. Sometimes students do just need to complete a course, learning or no.

Certainly we should work toward making every course a worthwhile learning experience for students, but in the real world there will always be some classes that fall short for one reason or another. Assuming that completing a particular program is a good thing for a particular student (a somewhat questionable assumption in this era of heavy student loan debt and low-value degrees), helping them get through all their courses successfully, regardless of learning, is a good in itself.


On the reductionism of analytics in education

I had the great pleasure (and distinct discomfort) of listening to Virginia Tech’s Gardner Campbell speak on learning analytics this week, through my haphazard participation in the Learning Analytics 2012 MOOC. Haphazard, I say, because I am so busy at work I can hardly spare any time to connect outside of it, whether through more structured means like the Learning Analytics course or less structured ones like Twitter and Facebook. Discomfort, I say, because Campbell launched some pointed criticisms of the reductionist approach to learning analytics that prevails in education today. Yes, it prevails at Pearson too, not because we have bad motives, but because the process of education and learning is so complex that we feel compelled to simplify it in some way to make any sense of it.

M-theory vs. the x-y plane

Campbell drew an analogy to cosmology, contrasting 11-dimensional M-theory with the planar (two-dimensional) Cartesian coordinate system. He suggested that current work in learning analytics is like working in the x-y plane when we know that education and learning take place in at least 11 dimensions.

Learning analytics, as practiced today, is reductionist in the extreme. We are reducing too many dimensions into too few. More than that, we are describing and analyzing only those things that we can describe and analyze, when what matters exists at a totally different level of complexity. We are missing emergent properties of educational and learning processes by focusing on the few things we can measure and by trying to automate whatever decisions and actions might be automated.

As I was writing this post, Simon Phipps (@webmink) tweeted about his post “Leaving Room for Mystery,” in which he proposed that some problems will remain unsolved, some systems unanalyzed:

The real world is deliciously complex, and there will always be mysteries – systems too complex for us to analyse. It seems to me that one of the keys to maturing is learning to identify those systems and leave room for them to be mysteries, without discarding the rest of rational life.

Then Simon shared a definition of reductionism with me, one that echoes exactly what Campbell said in his presentation:

My fear is that computers as they are used in learning analytics mean that people will work on simpler questions. They may be complicated in terms of the scale of the data but they’re not conceptually rich. They won’t be trying more concepts or playing with new ideas.

We’ll have a map that makes the territory far simpler than it truly is and we’ll design school to that, not to the true complexity.

Reductionism in analyzing online discussion threads

Last week in a meeting one of my colleagues pointed out the inherent reductionism of our approach to the problem of measuring and characterizing student interactivity and learning via discussion threads. He pointed this out not as a criticism but as recognition and acknowledgement. We are applying a custom-developed coding scheme to threaded discussion posts. We code each post into one of four categories based on the pattern of topics discussed in each post and across the thread. We capture what topics were introduced, how they relate to topics in previous posts, and how they relate to the main discussion topic. We cannot capture all the details and complexity of what people have written and how they have interacted. We certainly aren’t paying any attention to the broader experiences and connections that individual students bring to the discussion. But we are trying nevertheless to capture some important kinds of meaning and interaction in the posts via our coding scheme.
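To make that concrete, here is a minimal sketch of what a coding pass like this could look like. The category names and overlap rules below are hypothetical stand-ins of my own; our actual scheme is richer than this:

```python
# A hypothetical four-category coding scheme for discussion posts, keyed
# on how a post's topics relate to the thread's main topic and to topics
# raised in earlier posts. Names and rules are illustrative only.

def code_post(post_topics, prior_topics, main_topic):
    """Reduce a post to one of four codes based on topic overlap."""
    on_topic = main_topic in post_topics
    builds = bool(post_topics & prior_topics)
    if on_topic and builds:
        return "extends_discussion"  # on topic and engages earlier posts
    if on_topic:
        return "on_topic_new"        # on topic, no link to earlier posts
    if builds:
        return "tangent_reply"       # engages earlier posts, drifts off topic
    return "unrelated"               # introduces unrelated topics

# Walk a thread in order, accumulating the topics seen so far.
thread = [
    {"topics": {"retention", "signaling"}},
    {"topics": {"signaling", "grades"}},
    {"topics": {"weekend plans"}},
]
seen = set()
for post in thread:
    print(code_post(post["topics"], seen, "retention"))
    seen |= post["topics"]
# prints: on_topic_new, tangent_reply, unrelated
```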

This is, at heart, the analytics endeavor: to take very messy humanly-meaningful information and transform it into numbers that a computer can manipulate. It can be done in more sophisticated and subtle ways or more crude and careless ways, but it is always reductionist. It does not fully capture the human experience of learning. We can’t model learning in all its complexity.
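As a tiny illustration of that transformation (a toy example, not our actual method), consider about the crudest reduction possible for a discussion post, a bag-of-words count, which discards word order, tone, and everything else a human reader would attend to:

```python
# Toy illustration: reduce humanly meaningful text to numbers a computer
# can manipulate. Word order, irony, and context are all thrown away.
import re
from collections import Counter

post = "Completing the course isn't the same as learning, is it?"
tokens = re.findall(r"[a-z']+", post.lower())
vector = Counter(tokens)

print(vector)
# Counter({'the': 2, 'completing': 1, "isn't": 1, ...})
```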

The math is not the territory

I see it as critical in data analysis to remember that our numbers are useful shorthand — easy to manipulate, summarize, visualize, and report upon — but they are not the thing we are interested in. We use them because there is something else non-quantitative we are interested in, something human (at least in social sciences like education).

Campbell said,

We tend to believe the math is the territory and we tend to organize ourselves around just what we’re able to measure instead of organizing ourselves around creating better measurements of what we know to be nearly unimaginably complex.

The math is not the territory — the codes and numbers we use to represent human understanding and action and connection are not the territory — the visualizations are not it either.

Learning as delicious mystery?

Simon suggested some things are too complex to be answerable and should be left as mysteries. Is learning something that should be left unanalyzed? Certainly not, although aspects of it are mysteriously wonderful and not amenable to quantitative or qualitative analysis. There’s too much at stake — for individual students who benefit from success defined in many different ways, for the government that funds or subsidizes much of their education, for the citizenry that benefits from an educated populace.

I believe analytics can help, but I feel humble about its possibilities, more so than ever after listening to Campbell speak. I used to call my stance “cynicism,” but I think I will reframe it as “humbleness,” which makes it seem like there is some chance of success. As uncomfortable as it was, I’m glad I sat in on Campbell’s talk and listened to it again this morning to think about it further.

Getting ready for connected learning

Here’s a cool idea: the web enables a connectivist learning style based on network navigation, where “learning is the process of creating connections and developing a network.” Seems to me before you can learn connectedly, though, you need to first learn in more socially and contextually constrained ways.
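To make the “network” metaphor concrete, here is a toy sketch (my own illustration, not anything from the quoted definition or the LAK12 materials) of knowledge as a graph that grows as connections are created:

```python
# A toy model of "learning is the process of creating connections and
# developing a network": knowledge as a graph, learning as adding edges.
from collections import defaultdict

network = defaultdict(set)  # concept -> set of connected concepts

def connect(a, b):
    """Learning step: create an undirected connection between concepts."""
    network[a].add(b)
    network[b].add(a)

connect("learning analytics", "retention")
connect("retention", "signaling theory")

# Network navigation: what is one hop away from "retention"?
print(sorted(network["retention"]))
# ['learning analytics', 'signaling theory']
```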

Background: Three generations of distance education pedagogies

In this week’s Learning Analytics 2012 (LAK12) web session, Dragan Gasevic pointed us at an interesting paper by Anderson and Dron (2011) describing three generations of distance education pedagogy: cognitive-behaviorist, social-constructivist, and connectivist.

Anderson and Dron did not claim that the connectivist model would replace the cognitive-behaviorist or social-constructivist models but said that “all three current and future generations of [distance education] pedagogy have an important place in a well-rounded educational experience.”

These three models co-exist online today

LAK12 is itself an example of a course built in the connectivist paradigm, but just because a course is massive, open, and online doesn’t mean that it’s connectivist. For example, the Stanford machine learning class offered last fall was a (very effective) example of a cognitive-behaviorist approach. Students watched videos on their own schedule. Regular quizzes and homework assignments checked understanding. Andrew Ng was content creator and sage on the stage. While there was a Q&A forum available, the course design did not rely on it; a student could use it or not.

Typical online college courses today are often built in the social-constructivist mode, with instructors seeking to design and run courses that encourage many-to-many engagement through discussion threads and group projects. Does the addition of social features drive learning? It seems to be an article of faith among instructional designers today that it does. I’m not up on the research so I can’t say — but I can say that in online courses I’ve reviewed and taken, I don’t see evidence that social features have been designed in such a way that they make a difference in learning.

When are the different approaches useful?

I am thinking that whether a cognitive-behaviorist, constructivist, or connectivist approach is best depends upon the preparation and goals of the learner. Maybe something like this:

- A novice in a subject: cognitive-behaviorist approaches, to build basic grounding and fluency.
- A learner with some grounding: social-constructivist approaches, to deepen and contextualize that knowledge with others.
- A learner with intermediate, contextual knowledge: connectivist approaches, to navigate the network outward from what she knows.

I suspect that a student needs to gain basic grounding and fluency in a subject before constructivist approaches will be useful. An elementary schooler needs to learn to read, write, and do arithmetic before she can do a group science project, for example. And it seems like a connectivist approach will be most effective once you already have some intermediate, contextual knowledge of a subject to navigate out from.

What do you think? When are cognitive-behaviorist vs. social constructivist vs. connectivist approaches to learning most useful? Do you think you need to have achieved a certain level of contextual and subject knowledge before connected learning is effective?