
You don't have to speak the same language to adopt badges

3 min read

Pine cone 

...you just have to adopt the same metadata standard.

 

I'm not sure where to start with this Inside Higher Ed article by Colin Mathews entitled Unwelcome Innovation.

Here are some assertions from the article:

  • "Early on, digital badges often used Boy and Girl Scout badges as an analogy, but the more direct precursor of the current generation of badge solutions is video games." (Nope.)
  • "Badge adherents aim to address the “value” and portability of badges by attaching proof of skills to the badges themselves. This is the same idea behind e-portfolios..." (No, e-portfolios are fundamentally different to badges)
  • "Credentials, in and of themselves, are a solved problem." (Ha! If only.)
  • "What’s clear is this: it’s far, far more important to simply document existing credentials than to invent new ones, or a new language to describe them." (No, that just makes it easier to preserve the status quo.)
  • "Connecting students’ skills and ambitions to the pathways to a career is a big deal, but it doesn’t require a new language that’s based on techno-solutionist fantasies." (Yes it does: words have power to describe new realities.)

It's unclear what point the author is trying to make. He assumes that Open Badges is, somehow, solely focused on Higher Education. This is far from the case. He also begins the article by saying that to "better communicate the value and variety of people’s skills to employers" is "very valuable". This is exactly what Open Badges offers! Oh, and I'm calling B.S. on his claim that he was part of the "biggest, most comprehensive badge experiment that no one has heard of". 
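
Since "metadata standard" can sound abstract, here's a minimal sketch, in Python, of the kind of data an Open Badges assertion carries. The field names roughly follow the Open Badges vocabulary; the URLs, badge details, and recipient below are invented purely for illustration.

    import json

    # A sketch of an Open Badges assertion. Field names roughly follow the
    # Open Badges vocabulary; all URLs, badge details, and recipient info
    # are invented for illustration.
    assertion = {
        "@context": "https://w3id.org/openbadges/v2",
        "type": "Assertion",
        "id": "https://example.org/assertions/123",  # hypothetical hosted location
        "recipient": {
            "type": "email",
            "hashed": False,
            "identity": "learner@example.org",
        },
        "issuedOn": "2016-09-12T00:00:00Z",
        "verification": {"type": "hosted"},
        "badge": {
            "type": "BadgeClass",
            "id": "https://example.org/badges/web-literacy",
            "name": "Web Literacy: Evaluating Information",
            "description": "Awarded for critically evaluating online sources.",
            "criteria": {"narrative": "Submit three evaluations of web sources."},
            "issuer": {
                "type": "Profile",
                "id": "https://example.org/issuer",
                "name": "Example Learning Co-op",
            },
        },
    }

    # Any badge platform that understands the same standard can parse,
    # verify, and display this, whatever human language the issuer and
    # earner happen to speak.
    print(json.dumps(assertion, indent=2))

It's the shared structure that makes the credential portable, not a shared language for describing learning.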

Ultimately, I don't think that Mathews, who looks like he's got skin in the game with a siloed competitor to badges, would know a metadata standard if one turned into a wet fish and slapped him in the face. Perpetuating what we've got in Higher Education isn't working in terms of the employability of graduates. And for everyone else outside of the ivory tower, what's the problem with creating a new learning currency?

Ordinarily, I would merely roll my eyes at this kind of article, as it's the type of thing you find on a startup's blog that no-one ever reads. But seeing as Inside Higher Ed saw fit to publish it on their site, here's what I can be bothered to provide by way of pointing out the flaws in a piece whose author, unlike the rest of us, is tearing down rather than building up.

Image via Nomad Pictures

 

General purpose computers give kids wide walls

3 min read

Rainbow light

I really like this article by Mark Wilson at Fast Co.Design, railing against how dumb supposedly 'educational' toys actually are. It reminds me of what I've seen with my own kids, who are often more interested in the box an educational toy was shipped in than in the toy itself, received as a Christmas present from a well-meaning relative.

The author goes on to cite Pokemon Go, which not only has 'invisible lessons' (saving up points, investing in the right power-ups) but actually gets kids out and about. In fact, video games are a great way to prepare kids for the adult world. And yes, I speak from experience...

Research has found that video games can spur problem solving, interest in history, social skills, and even exercise (when kids tried to mock the moves they saw virtual athletes make.)

Almost every off-the-shelf dummy video game you play requires a basic version of the scientific method to complete. Here you are, on Super Mario Bros. level one. The first time you play, you run right into the goomba mushroom and die. The second time, you hypothesize how to avoid death. "Maybe I can jump over him. Maybe I can jump on him!" You test. And correct theories are rewarded with progress in the game. So what if you’re not publishing a grand conclusion. Your conclusion is burning Bowser alive and partying with the princess at the last castle.

Meanwhile, all those coding apps that are all the rage? They aren’t proven to work. And they teach such baseline principles that there's not much gained. Meanwhile, handing a kid who is curious about coding real resources—maybe a plan, an Arduino, and some LEDs—could do the job better. And it would give them the opportunity to build something real they might actually want to play with when the lesson is done.

I'm not a big fan of silos. The best preparation for adult life is diversity and learning how to do things in unexpected new ways. Educational toys prescribe the outcome, set limits on what's possible, and aren't even very 'fun'. 

Bottom line: Life is short. Let’s not spend it with stupid educational toys and apps that won’t teach our kids much of anything they couldn’t learn somewhere else, while probably having more fun playing in the process.

We're far too keen to take the shortcut to what we think would be a good outcome for our little darlings. What they need is a broad education, one which has (what Mitch Resnick would call) 'wide walls'. Instead, we constrain what's possible, effectively conspiring with governments and large organisations to create what Cory Doctorow calls the war on general-purpose computing.

Image via Nomad Pictures

 

The Feedback Loop From Hell

An excerpt on Lifehacker from Mark Manson's new book, The Subtle Art of Not Giving a F*ck.

We joke online about “first-world problems,” but we really have become victims of our own success. Stress-related health issues, anxiety disorders, and cases of depression have skyrocketed over the past thirty years, despite the fact that everyone has a flat-screen TV and can have their groceries delivered. Our crisis is no longer material; it’s existential, it’s spiritual. We have so much fucking stuff and so many opportunities that we don’t even know what to give a fuck about anymore.

Because there’s an infinite amount of things we can now see or know, there are also an infinite number of ways we can discover that we don’t measure up, that we’re not good enough, that things aren’t as great as they could be. And this rips us apart inside.

True.

 

Shane Parrish | My Morning Routine

I’ve also stopped reading the newspaper. Looking at the opportunity cost of my time, I began to realize I was getting more value from reading other sources of information.

While I haven't completely given up scanning the newspaper every day (we subscribe to The i) my main reason for picking it up is to do the crossword. I highly recommend reading The News: A User’s Manual by Alain de Botton if you haven't come across it yet...

 

Jersey is lovely

I'm here for some work with a new client. And no, my American friends, not New Jersey. The island of Jersey is here:

Jersey

I'm not even going to try to explain the relationship and history of Jersey with regard to the UK, especially in these times of near-Brexit...

 

The 'unplugging' narrative is now a non-fiction genre unto itself

2 min read

Plants in tall grass

From the Sherry Turkle and Nicholas Carr school of 'these new-ish things cause us to act differently so they must be bad' comes this article by Andrew Sullivan. An example of his florid prose:

Think of how rarely you now use the phone to speak to someone. A text is far easier, quicker, less burdensome. A phone call could take longer; it could force you to encounter that person’s idiosyncrasies or digressions or unexpected emotional needs. Remember when you left voice-mail messages — or actually listened to one? Emojis now suffice. Or take the difference between trying to seduce someone at a bar and flipping through Tinder profiles to find a better match. One is deeply inefficient and requires spending (possibly wasting) considerable time; the other turns dozens and dozens of humans into clothes on an endlessly extending rack.

The author, who had a job that required him to be 'on' 24/7, reflects on his unplugging via a meditation retreat. This 'unplugging' story is almost a non-fiction genre unto itself these days. I'm not entirely sure how instructive it is, given that people can get addicted to pretty much anything.

I mean, for goodness' sake, perhaps the guy actually has psychological issues that mean he was over-compensating with his use of technology? This section would suggest so:

I was a lonely boy who spent many hours outside in the copses and woodlands of my native Sussex, in England. I had explored this landscape with friends, but also alone — playing imaginary scenarios in my head, creating little nooks where I could hang and sometimes read, learning every little pathway through the woods and marking each flower or weed or fungus that I stumbled on. But I was also escaping a home where my mother had collapsed with bipolar disorder after the birth of my younger brother and had never really recovered. She was in and out of hospitals for much of my youth and adolescence, and her condition made it hard for her to hide her pain and suffering from her sensitive oldest son.

That must have been awful. But let's not extrapolate from anecdotes and personal experiences to the whole human condition, eh?

Image via Nomad Pictures

 

Competency grids are not the future of HR

3 min read

Many thanks to Amy Burvall, who brought this Harvard Business Review article to my attention. The author, Michelle Weise, starts off well:

How can companies get a better idea of which skills employees and job candidates have? While university degrees and grades have done that job for a long time, they’ve done it imperfectly. In today’s rapidly evolving knowledge economy, badges, nanodegrees, and certificates have aimed to bridge the gap – but also leave a lot to be desired. While HR departments are eager for better “people analytics,” that concept is still fuzzy. And simply collecting data is not enough – to be used, data has to be presented usefully.

I agree. We need a better way to represent people holistically in the digital world. That's why I'm still an advocate for Open Badges. I don't think you can dismiss them with an assertion like "...but also leave a lot to be desired". How? In what ways?

Weise goes on to give examples from the world of GitHub, but fails to take into account that 'contribution streaks' tell you nothing about the content of what the person actually did. Similarly, behold the horror that is this competency grid heatmap from 'The Human Factor' by Burning Glass Technologies:

Competency grid heatmap

While I've got no problem with the 'soft skills' mentioned down the left-hand side, I can't think of a more demeaning way to represent a human being than as a list of numbers.

The problem is that we're trying to make a broken system more efficient, which is madness. We're trying to remove the human element at the same time as saying it's the thing we value the most.

Better people analytics – and better ways of visualizing and interacting with that data – will not only help managers and recruiters do a better job of matching people with jobs but will also help each of us develop a more accurate picture of our strengths and weaknesses. We’ll be able to send clearer signals to the market about all that we can do.

I think the best thing to do is to embrace the weird and wonderful world of alternative credentials like badges. A world where you still have to explain yourself, tell stories, and show evidence of you as a three-dimensional human being. The more we have to be accountable to algorithms, the worse the world gets for all of us.

 

The revolution will be gamified

3 min read

American flag draped on van 

In The Free-Time Paradox in America, Derek Thompson explores a very modern problem:

Twentysomething male high-school grads used to be the most dependable working cohort in America. Today one in five are now essentially idle. The employment rate of this group has fallen 10 percentage points just this century, and it has triggered a cultural, economic, and social decline. "These younger, lower-skilled men are now less likely to work, less likely to marry, and more likely to live with parents or close relatives,” he said.

So, what are these young, non-working men doing with their time? Three quarters of their additional leisure time is spent with video games, Hurst’s research has shown. And these young men are happy—or, at least, they self-report higher satisfaction than this age group used to, even when its employment rate was 10 percentage points higher.

The problem, just as pointed out in a recent episode of the 99% Invisible podcast, is that short-term gains can mask long-term losses. The examples given in the podcast were financial, but in this case they're psychological and sociological:

It is a relief to know that one can be poor, young, and unemployed, and yet fairly content with life; indeed, one of the hallmarks of a decent society is that it can make even poverty bearable. But the long-term prospects of these men may be even bleaker than their present. As Hurst and others have emphasized, these young men have disconnected from both the labor market and the dating pool. They are on track to grow up without spouses, families, or a work history. They may grow up to be rudderless middle-aged men, hovering around the poverty line, trapped in the narcotic undertow of cheap entertainment while the labor market fails to present them with adequate working opportunities.

The author goes on to look at the 'paradox' that the wealthier and more successful you are, the more you're likely to work. He outlines three theories explaining this:

  1. The availability of attractive work for poor men (especially black men) is falling, as the availability of cheap entertainment is rising.
  2. Social forces cultivate a conspicuous industriousness (even workaholism) among affluent college graduates.
  3. Leisure is getting “leaky.”

The last one is, for me, the most interesting, as I think it explains the causes rather than the symptoms. The work that can't (currently) be easily outsourced or automated is knowledge work. This kind of work isn't tied to a particular location, sometimes doesn't feel like work, and relies on the autonomy of the individual to be effective.

We're about to enter a time of 'moral panic' around virtual reality and augmented reality. Although decent VR/AR hardware is currently expensive, it, like everything technological, will come down in price. We may then have a very real problem.

Religion isn't the opiate of the masses. For better or worse, entertainment, including television and video games, and even the 24-hour news media, is the drug that stops societal progress.

Image via Nomad Pictures

 

Bikeshedding

2 min read

Email guidelines

When you join any new organisation, there are jargon terms you need to get up to speed on. These are often acronyms, or shorthand for people within a defined community. At Mozilla, a term I heard a lot was "bikeshedding". I didn't really know what it meant, so I asked.

In a nutshell, and as explained here, the concept comes from a book called Parkinson's Law:

Parkinson shows how you can go in to the board of directors and get approval for building a multi-million or even billion dollar atomic power plant, but if you want to build a bike shed you will be tangled up in endless discussions.

Parkinson explains that this is because an atomic plant is so vast, so expensive and so complicated that people cannot grasp it, and rather than try, they fall back on the assumption that somebody else checked all the details before it got this far. Richard P. Feynmann [sic] gives a couple of interesting, and very much to the point, examples relating to Los Alamos in his books.

A bike shed on the other hand. Anyone can build one of those over a weekend, and still have time to watch the game on TV. So no matter how well prepared, no matter how reasonable you are with your proposal, somebody will seize the chance to show that he is doing his job, that he is paying attention, that he is *here*.

In Denmark we call it "setting your fingerprint". It is about personal pride and prestige, it is about being able to point somewhere and say "There! *I* did that." It is a strong trait in politicians, but present in most people given the chance. Just think about footsteps in wet cement.

It's a useful concept to bear in mind, but even more useful would be an email program that implemented the prompts the author suggests — see the image at the top of this post!

 

Learning styles, heuristics, and employability skills

3 min read

 Two chairs in sunshine

There's nothing wrong with a recent article on the (excellent) site The Conversation. Nothing at all. With the descriptive, if slightly unwieldy title, Students are not hard-wired to learn in different ways – we need to stop using unproven, harmful methods, the article is one of a series.

In our series, Better Teachers, we’ll explore how to improve teacher education in Australia. We’ll look at what the evidence says on a range of themes including how to raise the status of the profession and measure and improve teacher quality.

They say this as if telling people the best ways to teach leads to better teaching. By the same logic, the books I've collected on my shelves should make me a more knowledgeable person. Unfortunately, it doesn't work like that.

The problem, as I see it, is that people expect both the ends and the means of a change to be pure and unsullied. Let's take learning styles as an example. The article is absolutely right to point out that there's no such thing as a fixed best way for each of us to learn.

If learning styles exist at all, these are not “hard wired” and are at most simply preferences. What we prefer is neither fixed for all time nor always what is best for us.

That's fine, but what the author seemingly fails to grasp is the importance of second-order effects. I've seen time and again examples of people exploring learning styles as 'different ways to teach the same thing' and realising that it's OK to mix things up a bit. 

In this sense, learning styles can best be thought of as a heuristic, an "approach to problem solving, learning, or discovery that employs a practical method not guaranteed to be optimal or perfect, but sufficient for the immediate goals". Another definition is "enabling a person to discover or learn something for themselves". We need to do this for teachers, who are, lest we forget, also learners themselves.

As I keep saying in my work on ambiguity, terms (like 'learning styles'!) are not fixed once and for all time. Terminology becomes, for an indeterminate amount of time, productively ambiguous before slipping off into dead metaphor.

I can see something similar happening at the moment with 'employability skills'. There's much to critique there, but if it's the term du jour, why not use it for something constructive? The notion of 'learning styles', at its most reductive, is harmful. But used metaphorically, and as a gateway to further self-discovery for teachers, it's potentially a useful term, I'd argue.

Image via Nomad Pictures