Bring it on…

Unless you’ve been living a) under a rock or b) in Europe, Asia or Africa, you no doubt are aware that this Sunday/Monday is the SUPERBOWL!!!! Yes! Huge men padded to look even bigger hurling themselves from one end of a green rectangle to the other! So the digital art I want to show you today is American football-related. And I have to say “American football” because to us here in Europe, football = soccer. And for the record, in soccer the contact between ball and foot is much more frequent than in the American namesake, so why the Americans couldn’t think of another name is beyond me. Just sayin’.

Now, I’m not a football fan (of either version of the game), but you’ll see that that’s not a prerequisite to enjoy this website: The Giferator, a collaboration between the NFL and Google. You create a moving gif with an aggressive message, and the results – which you can post, share or send – range from fun to outright hilarious. It’s pure pumped-up, bad-ass energy, the same reason why millions will be tuning in tonight at 6:30pm Eastern Time (12:30am for the UK, 1:30am for us here in Spain). Me, I’ll be tuning in for the ads.

the giferator


You can give it a try here. Have fun!

A question of choice

Long form vs short form, skimming vs reading

Andrew Sullivan is quitting blogging. “Who?”, you may ask (to which I will respond “Where have you been?”). Andrew Sullivan was one of the pioneer political bloggers before blogging was even a thing, back in 2000, and since then he has consistently been pushing boundaries. He has written for Time, The Atlantic, and The Daily Beast, and in 2013 left to start his own media company, The Dish. The business model he eventually settled on, the paywall, showed that good writing does not have to be no-cost, and that there is a demand for clean, ad-free design. He ended up with 30,000 subscribers, generating $1 million of income a year. Pretty good when you consider how often we’re told that paywalls don’t work because people don’t want to pay for content.

So if he was doing so well, why is he quitting? Here is where it gets really interesting. As he explains on his blog:

“…I am saturated in digital life and I want to return to the actual world again. I’m a human being before I am a writer; and a writer before I am a blogger, and although it’s been a joy and a privilege to have helped pioneer a genuinely new form of writing, I yearn for other, older forms. I want to read again, slowly, carefully. I want to absorb a difficult book and walk around in my own thoughts with it for a while. I want to have an idea and let it slowly take shape, rather than be instantly blogged.”

I was literally weeping by the end of the passage. Not only because he conveys so beautifully the emotion of this decision. Also because the decision itself is beautiful, and will, I have no doubt, instruct and inspire us all for many years to come.

As he has always done, Andrew shows us that we have a choice. We don’t like the information overload? Filter. Switch off. Whatever. We feel too much pressure to be “always on”? It really isn’t necessary to be “always on”. Yes, readers come to expect a certain standard. And yes, they will be disappointed when those standards are not met. But that is not the end of the world. Readers are more adaptable and forgiving than we are led to believe.

image by S. Zolkin via Unsplash


Andrew also highlights, in a very poignant way, the divide between the two mentalities. Much (although not enough) has been said about the change in our reading and learning habits that has resulted from information bombardment and the need for clicks. Nicholas Carr (I’ve written about him elsewhere) puts it beautifully in The Shallows:

“And so we ask the Internet to keep interrupting us, in ever more and different ways. We willingly accept the loss of concentration and focus, the division of our attention and the fragmentation of our thoughts, in return for the wealth of compelling or at least diverting information we receive.”

And while I don’t agree with him on quite a lot of what he implies, on this he is right. It is very, very common these days to lose the ability to focus and think. I don’t blame our current media society, though. And I strongly believe that too much access to information is preferable to not enough.

Obviously we lose some abilities as technology progresses, but we gain so much more. As Enrique Dans pointed out in a thoughtful article a few weeks ago, we don’t know how to use the sextant anymore, but does that make us worse off?

Blaming the technology for our decrease in ability is like blaming the oven because our cake didn’t turn out well. True, cakes have been spoilt by ovens (been there), but generally it’s us not knowing how to use the oven properly that’s the underlying cause. And while the technology we have in our hands does condition us to a certain extent (one of my current “thinking” research projects), ultimately we do still retain control over our faculties. If we want to, of course. For so many it’s easier to not even think about it and to “go with the flow”, or more accurately, “to try to keep up”.

That constant “running to stand still” generates stress. And that is what affects our ability to deep read. The deep reading that Mr. Sullivan and Mr. Carr refer to is an ability that needs regular exercising. Knowing how to read is not the same as knowing how to deep read. Deep reading takes practice and time, of which we seem to have less and less as our obligations multiply. We get married, we have children, we struggle to advance in our jobs or even just to hold on to them. I’ve lost count of the number of intelligent, cultured people in my age bracket who have said to me recently “I can’t remember the last time I read”. They don’t have the time, and if they find the time, they feel guilty because other things are deemed more morally imperative. I have so been there.

image from Death to the Stock Photo


At the end of 2013 I sold my e-commerce business, after an exceptionally stressful year and a half. I decided to take some time off to get my brain back, because not only could I not read, I couldn’t even think clearly. I started by grabbing a philosophy book off my shelf and working my way through it. It was a short book (The Problems of Philosophy by Bertrand Russell, if you’re wondering), and it took me ages. I had to re-read sections again and again and again, because I couldn’t concentrate. Then I moved on to history, then back to philosophy, and so on, until I started to make mental connections again. My brain had started working. But it took a while. I had let myself get “out of shape” mentally. Regular workouts got me back in form.

Now that I’ve started work on another business (I’ll tell you more later), I’m trying very hard to maintain my mental exercise regime. On most days I wake up early and spend a couple of hours with my nose in a book. Just like a regular physical workout leaves you feeling great physically, the mental kind does the same for your mind. It feels really good.

But obviously, it’s not for everyone. Many, many people are just not interested, even if they have the time. They’re thrilled with the always-on, quick fix of clickbait and short reads. But that’s the beauty of the age we now live in. We all, to a certain extent, have a choice. I really enjoy the immediacy and breadth of social media. I spend a lot of time on it. And, I’ve always loved reading about ideas and theory. I spend time on that, too. The rush when two concepts connect in the brain to open a window of understanding produces a shot of adrenalin that gets me out of my comfy armchair and sets me pacing around the living room. I love that feeling. Others don’t. For me, finding the time to work on both reading styles is worth the effort. I do not have to limit my choice to one or the other, just as others do not have to choose both. And there is no “correct” reading style. Whether we want to be contemplative, entertainingly superficial or both, with a bit of juggling and a lot of work, we can be.

Having shown us remarkable talent, persistence and personality in one area of communication, Andrew will no doubt now do the same in another. I love his phrase “walk around in my own thoughts”, that sounds like my idea of anti-social bliss. I completely get his motives for making this decision, and am perhaps slightly envious of the stimulating and life-affirming changes he has coming up. I have just programmed myself a reminder to re-read his post six years from now.

That one of the most successful and renowned bloggers of our time is signing off is huge. That it is for personal reasons, even more so. And it is inspirational. I really do believe in the trite saying that when one door closes, another one opens. Thank you so much, Andrew, for reminding us all of that.

Smart shoes, not just because they look good

We’ve talked about wearables before, and about how they are beginning to mix functionality with elegant design. Heartbeat monitors that seductively dangle around your neck. Temperature trackers that gracefully adorn your wrist. Today we’re going to look at a sector of “wearables” that has especially complicated design issues, but which can end up making a significant impact in the healthcare and quality-of-life field. A sub-group that is often overlooked, trodden upon, underfoot… Smart shoes.


Runscribe running tracker

Obviously, I mean smart as in “intelligent”, not smart as in “elegant”, the type you wear to your wedding. Type “smart shoes” in Google and you get a lot of offers for shiny brogues or dainty heels. I imagine that entries for intelligent, life-enhancing shoes will climb their way up (sorry) the ranks as this sector starts to take off, as more and more realize the potential for healthcare, fitness and even lifestyle. And until then, I shall continue to call them smart shoes, so that they fit happily with their smartwatch and smart glass brethren.

But what could a smart shoe do that a smart watch can’t, I hear you ask?

For example, Path helps people who have trouble walking, such as sufferers of Parkinson’s disease. It clips to the tip of the shoe, and projects parallel lines onto the next 50cm of the floor, thus laying down a guide for the opposite foot. Pressure sensors inserted into the soles of the shoes trigger vibrations when a foot hits the ground. This compensates for the loss of sensation commonly associated with the disease.

GPS Smart Shoe

Well, at least they look comfortable…

GPS Smart Shoe helps with the care of Alzheimer’s sufferers by transmitting the shoe’s (and its wearer’s, presumably) location to a dedicated website thanks to a tracker in the heel, although its battery life is only a couple of days. The GPS Smart Sole hides the tracker in the insole, which can be fitted into most walking shoes, and has a year-long battery life. Mercifully I don’t have experience with this, but it must be terrifying when someone you’re caring for wanders off and neither you nor they have any idea where they are. Now, they can be located on Google Maps. You even receive an alert when they leave or enter a room or a building.

Lechal smart shoe

image of Lechal shoe via Techcrunch

And, of course, there are smart shoes for fun. The casual shoe Lechal (which means “take me along” in Hindi, and “of the milk” in Spanish – let’s go with the Hindi interpretation, it is being developed in India, after all) links with Google Maps and guides you using haptics. It buzzes on the left foot if you need to turn left, on the right if you need to turn right. So no longer will you need to stop on a street corner to squint at a map. Imagine the implications for the sight-impaired. Oh, and while it’s at it, it counts your steps and the calories you’ve burned.

And if Kickstarter is anything to go by, more interesting innovations are on the way. Digitsole will not only track your activity, but will also heat up and keep your tootsies warm on those cold winter runs. The project raised well over twice the $40,000 it was hoping for on the crowdfunding platform back in November, and should hit the market in April. Runscribe is for serious runners, and clips onto your shoes to collect data on how you land, and to analyse your running style. After raising more than 5x their original goal of $50,000 on Kickstarter, the company plans to start shipping the product early in 2015.


Runscribe, to analyze your running technique

So, smart shoes for healthcare, and for fitness… What about smart shoes for fashion? Yes, I mean smart smart shoes, ones that could, perhaps, buzz when it’s time to go home. We could call it the Cinderella. 😉

— x —

For more on wearables, check out my Flipboard magazine!



For, or against?

The inexplicable question of taste…

Marmite, I mean. Do you like it, or do you hate it? It’s not something you can be ambivalent about. In case you don’t know what it is (which unless you’ve spent time in the UK is quite possible), it’s a savoury vegetarian spread made out of yeast extract. It’s black and shiny and sticky and very intense. It has occasionally been described as a cross between cheese and shoe polish. Me, I love it. My husband and kids hate it. And the cool thing is, the folks at Marmite embrace this fervent and irrevocable split and even incorporate “Love it or hate it” in the commercials. The latest one is a bit emotional (don’t watch it if you’re feeling weepy), featuring the rescue of neglected Marmite jars (check out the little boy’s face at the end, I’m guessing he’s not a fan):

I gave a speech last week on how the relentless focus on innovation in the business world is more empowering than threatening, and how it gives us all an incentive, an excuse if you will, to embrace our inner creative… Something which the Marmite product developers certainly seem to have done with their latest product: a Marmite-chocolate Easter egg. Seriously.

Marmite-flavoured Easter egg

image via Mashable

Me, I want to try one. I think it sounds good. Everyone else I know is hoping that it’s a joke. I tried a curry-flavoured chocolate once, and loved it. I’m not actually a huge chocolate fan, so a bit of Marmite flavour sounds like an improvement. It turns out that the combination is not that new: in 2010 a Marmite-chocolate bar hit the shelves, and it has actually sold quite well, in spite of being called “deeply nauseating” by chef Rowley Leigh.

Marmite chocolate bar

What does this have to do with technology and its impact on our lives, I hear you ask? Why am I even telling you about this? Because it has a lot to do with innovation. Which is more important than ever in the fast-paced business environment that technological development has led us into. An indirect connection, true, but so? I’m ok with that if you are…

Kinetic Poetry and the meaning of words

This installation by Australian group 313RGB is a curious example of kinetic poetry, or “poetry in motion”, or “interactive poetry”, or whatever you want to call it. You stand in front of a screen and “move” the words around with your hands. Without actually touching anything, of course.


It does make you re-think what a word is. Is it a manipulable thing? Or is it an intrinsic meaning? I’ve always believed it was both. In which case, is the word a tool that exists for the meaning to get out? Or is it a thing in its own right, that can be played with, handled, moved around? If the word changes shape, position or context, does the meaning change too?

And if a word is meaning, then how can it also have meaning? I believe that it depends on how the word is being used.

This train of thought reminds me of a passage in “Stop What You’re Doing and Read This”, by Tim Parks:

“Maybe we can start by reminding ourselves what a strange art form writing is.

For there is no artefact as such: unlike painting or sculpture, there is no image to contemplate, there is no object you can walk around and admire. No one is going to say you must not touch. No alarm will go off if you get too close.

You don’t have to travel to enjoy a piece of writing.

And there is no performance, either. Strictly speaking. Unlike concerts or plays, you don’t have to queue for tickets or worry whether you’re near the front.

You can’t take a photo.

Then a book has no fixed duration. Unlike music, you don’t have to respect its timing, accepting, along with others, an experience of the same length.

You can’t dance to it. You can’t sing along.

Instead, there are signs on paper. Or on a screen.

… Only the sequence of signs matters… The experience is the sequence.”

“The experience is the sequence.” (my italics). Without sequence, without context, the words are just things, objects, that have an intrinsic meaning. With context, the words are meanings represented by a thing.

After reading this passage for the first time, I had to put the book down and stare into space for a while as it dawned on me that I had just caught a glimpse of the power and the potential of words.

Kinetic poetry takes that power and puts it in our hands. We control the sequence. And it is the sequence that gives words a contextual meaning. But in controlling the sequence, we manipulate the words, which become more “things” than meanings. A word becomes a plastic object to be played with. But it is also laden with meaning, which changes according to the context.

Think about why we have words. To communicate. So, a word is a thing that was “invented” to convey meaning. Now, it is taking on a life of its own. Its “thingness” is becoming important. And yet, it is still impossible to completely divorce a word from its meaning (or its many meanings). But could we divorce a meaning from its word?

Deep thoughts for a Sunday morning. But worthwhile ones, that enhance the wonder of reading. Enjoy your Sunday! Read something beautiful!

The Internet and the World Wide Web are not the same thing. Please take note.

I’m not one to complain. Really. Well, hardly ever. But here goes: I’ve always been a fan of the magazine Time, but it’s going downhill so fast in terms of quality of reporting and design (its new website is so much harder to navigate than its old one) that I’m even thinking of cancelling the subscription that I have held for about 12 years… There, I got that off my chest, and it wasn’t too painful.

The cause of the most recent slump in my esteem for the publication is an article from back in December claiming that Sir Tim Berners-Lee is the inventor of the Internet. He’s not. He’s a brilliant man, who gave the Internet a usability and accessibility that powers the unimaginable amount of traffic that flows across it daily. But, I’m pretty sure that you got to this blog not through the Internet as such, but through the World Wide Web. And that, Sir Tim Berners-Lee did invent (along with several colleagues, as he would be the first to tell you).

click on image to go to the article at - image taken 22/1/15


See, here’s the thing: the Internet and the World Wide Web are not the same thing. And for a magazine of Time’s reputation to think they are is worrying. Maybe the article was written by an intern – but shouldn’t the tech editors pick up on that? And the thing is, it probably points to a misconception that is much more widespread than I realized.

Even such a brilliant thinker as Nicholas Carr seems to get the two terms confused. His book “The Shallows” (definitely worth reading, even if you don’t agree with everything he claims – he does provoke serious contemplation) carries the subtitle: “How the Internet is changing the way we think, read and remember” (or “What the Internet is doing to our brains”, if you have the American version). The main theme of the book is that the overload of information that the Internet gives us is hindering our brain’s ability to think deeply, and to absorb long-form content such as, you know, books. Like the ones he writes.

Some sample quotes:

“The Net seizes our attention only to scatter it.”

“On the Web, there is no such thing as leisurely browsing. We want to gather as much information as quickly as our eyes and fingers can move.”

“And so we ask the Internet to keep interrupting us, in ever more and different ways.”

“The Web… places more pressure on our working memory, not only diverting resources from our higher reasoning faculties but obstructing the consolidation of long-term memories and the development of schemas. The calculator, a powerful but highly specialized tool, turned out to be an aid to memory. The Web is a technology of forgetfulness.”

He appears to use the two terms interchangeably. And, whether you agree with him or not (I don’t), blaming the Internet for our misuse of the information overload is like blaming the oven in which we bake our cookies, for making us fat. It’s really not the Internet that distracts us. It’s what’s using it. What’s using it is the World Wide Web.

Let’s look at the differences between the two concepts, and settle this once and for all.

Imagine a book. A paper book, not an e-book. The paper, the cover and the binding, that’s the Internet. The print on the paper and the cover, that’s the World Wide Web.

The Internet is the hardware. The World Wide Web is the software.

The Internet is a network of computers, linked by cables or by beamed waves. The World Wide Web is a way of exchanging information.

Web pages, hyperlinks and browsers, that’s the World Wide Web. Your fibre optic cable, wifi modem or whatever it is that gets you connected, that’s the Internet.

The Web cannot exist without the Internet, but the Internet can exist without the Web. The World Wide Web runs on the Internet. It depends on the Internet. But the Internet does not need the World Wide Web. Other stuff is done on the Internet that does not involve the World Wide Web, such as emailing. The original Internet forums did not need the World Wide Web, they used Usenet, a different protocol. It’s still in use today, mainly for internal message boards and discussion groups. And the Internet uses the FTP protocol to directly transfer information between computers. If you’ve ever uploaded something to a server using Filezilla or something similar, you’re not passing through the World Wide Web. You are using the Internet, though.

If you can’t access any Web pages, it’s probably a problem with your Internet connection. Not your Web connection. You can’t say “the Web is down”, or “the Web crashed”, like you can with the Internet. It’s a bit like saying “the cake crashed” when it’s really your oven that won’t turn on.
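The split between the two layers can even be sketched in a few lines of code. Here is a minimal, self-contained Python sketch (both ends run on the local machine, so no real network is needed): the TCP socket plays the role of the Internet, simply moving bytes between two endpoints, while the decision to interpret those bytes as an HTTP request and response is the Web layer riding on top.

```python
import socket
import threading

def tiny_server(server_sock):
    # Internet layer: accept a TCP connection and receive raw bytes.
    # TCP itself has no idea what the bytes mean.
    conn, _ = server_sock.accept()
    data = conn.recv(1024)
    # Web layer: we *choose* to interpret those bytes as HTTP.
    if data.startswith(b"GET"):
        conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nhi")
    conn.close()

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

thread = threading.Thread(target=tiny_server, args=(server,))
thread.start()

# Client side: open a TCP connection (Internet)...
client = socket.create_connection(("127.0.0.1", port))
# ...and speak HTTP over it (World Wide Web).
client.sendall(b"GET / HTTP/1.1\r\nHost: localhost\r\n\r\n")
reply = client.recv(1024).decode()
client.close()
thread.join()
server.close()

status_line = reply.splitlines()[0]
print(status_line)  # HTTP/1.1 200 OK
```

Swap the HTTP exchange for SMTP, FTP or NNTP commands and the socket code would not change at all: you would still be on the Internet, just no longer on the Web.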

The first Web page, programmed and uploaded by Tim Berners-Lee


Right, enough geekiness for one day. Is the difference even important, I hear you ask? Yes, it is. Because how we see something very much affects the uses that we can come up with. If we understand that the Internet is separate, distinct, not the same thing, then we can start to imagine what else we can do with it. The World Wide Web is but one way of spreading information via the Internet. It’s convenient, flexible, colourful and fun, and it is definitely the flavour of the 21st century so far. But with the development and rollout of the Internet of Things, which is gearing up to be massive, data is increasingly being transferred via other protocols. And as the Internet of Things – in which data is beamed from one object to another (sensors, blood pressure bands that wirelessly beam information to your doctor’s smartphone, toasters connected to the TV…) – takes off, we will need a new form of connectivity. This is already happening. In no way will it replace the World Wide Web (and there will be some overlap, as some IoT information is communicated to us humans via Web pages). It can run in parallel, using the Internet, without which neither would be possible.

Separating our notions of the Internet and the World Wide Web is very important if we are going to create the knowledge economy that will continue to propel civilization to new technological heights. Separating them is also important if we are going to continue to innovate in the distribution of information and ideas. Who knows where else we could take the Internet? Who knows what else it could end up doing for us? What if the hyper-connected society that we live in, thanks to the World Wide Web and the IoT ecosystem, is just the beginning?

Human and digital and Beyoncé

Ok, confession time, what’s your favourite Beyoncé song? And don’t tell me you don’t have one, we all do. Or if not favourite, at least one that you find less irritating than the others. For me, “Halo”, not too much of a surprise there, I imagine. I’m not really into her style of music, I confess (I prefer Jay-Z, go figure), but I very much admire her talent and intelligence. Have you listened to the words of “Pretty Hurts”?

Pretty hurts, we shine the light on whatever’s worst
We try to fix something but you can’t fix what you can’t see
It’s the soul that needs the surgery

And as far as technology innovators go, she’s way up there. The scenography of her Billboard Awards performance is spectacular. Surprising. Do you remember the digital and human dance that I wrote about a couple of weeks ago? It’s similar to that, in that the boundary between human and digital is being blurred. But we’re talking about a lot of colour, a lot of movement and a lot of sass.

Beyoncé’s Billboard Awards performance

Check it out:

It’s worth watching several times (even on silent, if you must), just for the backdrop effects. The first three minutes, anyway, after that the troupe of dancers and the fireworks take over.

To produce this, Beyoncé worked with New York-based ThinkBreatheLive, the geniuses behind Roger Waters’ “The Wall” tour projections. And her SuperBowl 2013 performance, also engineered by them, had some impressive projections. But not at this scale. It is exciting to think what this could be the beginning of, and to realize how much digital technology is even influencing how we consume live entertainment…


The Expansion of MOOCs and the Meaning of Life

The FT earlier this week had an interesting report on the expansion of MOOCs into China. Not, surprisingly enough, the expansion of China into MOOCs, which, given the size of their market and the qualification-centric nature of their educational system, would make a lot of sense. No, for now, foreign (”western”) MOOCs are starting to make big inroads there. Which, given the size of their market and the qualification-centric nature of their educational system, actually makes even more sense.

Hong Kong, image via Wikipedia


Guokr MOOC Academy launched in July 2013 and currently hosts just over 2000 courses via partnerships with Coursera, edX and other US and UK platforms. Unsurprisingly, the majority of the courses deal with the sciences and technology. But, and this got my attention, there is increasing interest in humanities courses. Including history.

One of Harvard’s MOOCs is a course called ChinaX. Subtitled in Chinese, it can be seen in China on Youku, the “allowed” alternative to YouTube. It offers a 10-part history of China, including the Tiananmen Square massacre, and so far it has escaped censorship. Of the course’s approximately 60,000 students, about 10% are in China. That’s over 6,000 students, enough to fill over 100 large university-size classrooms. And since the high demand will lead to other, similar courses, with a similar or increasing percentage of Chinese students, this could end up attracting the attention of the authorities.

So, how long before censorship affects what MOOCs can offer? When does the “open” part of the acronym (Massive Open Online Courses) start to look less so? China is obviously not the only country that this could end up being an issue for. Will the fact that curious students have access to “alternative” views of history, change education systems? Politics? Nationalism?

Here is where the access vs quality debates get interesting.

I have on several occasions written (here, here and here) and spoken about the educational revolution that the MOOCs imply, how they open up worlds, distribute knowledge and empower everyone. I’ve taken many, and I’ve loved most of them. I enjoy the freedom of being able to come and go as I please, to sample new areas and to take courses I never expected to be able to take. I have gotten so much more out of MOOCs than I ever expected. I am a MOOC fan, because my expectations were realistic, or if you’d rather, low.

And I am relieved to see that the hype, both positive and negative, is beginning to die down. If you’ve ever taken a business course, you’re probably familiar with the Gartner hype cycle. An innovation takes off, captures headlines, and everyone is convinced that universities will close, the world will become smarter and education will never be the same again. Then, in an unreasonably short space of time, when the unrealistic expectations aren’t met, the innovation is deemed a complete failure, a wasted experiment, and the end of civilization as we know it. Then, as people start thinking, the innovation claws its way out of the trough of disillusionment, and starts really getting to work on shifting perceptions and processes.

MOOCs are there. You hardly ever hear of MOOCs replacing universities any more. Nor does anyone claim that they are useless. “Blended learning” is the catch-phrase of the edtech sector, and while schools are taking their time in incorporating this major efficiency (in which some lectures are delivered via video), it will become a standard feature of education in the years to come. Personally I love the idea of the “flipped classroom” in which students watch the lectures at home, as many times as they need, pausing and rewinding where necessary, while they do the exercises with the teacher and their peers in the classroom.

One aspect that the original MOOCs were criticized for was the lack of interaction with the professor. You watched the videos, you did the assignments, maybe you interacted via text with others in the class in the forums. But you didn’t get to enjoy first-hand the professor’s personality, you weren’t given the opportunity to catch his or her enthusiasm, you didn’t feel the validation of the professor knowing your name. And, you missed out on significant, dynamic, human debate, which is the backbone of most of the non-scientific higher education study.

Back to the China question: is it possible to learn history online? In what way is that different from reading a book? The multimedia effect brings the content a bit more to life, sure. But the value of the human connection and its role in inspiring students, opening their eyes and broadening their viewpoints should never be underestimated. Always, a human professor that cares can transmit and influence so much more than a well-designed screen.

But, the human touch is not always an option, either for economic, geographical, or – as in the case of ChinaX – political reasons. In this case, and countless others, access is enough. Those who claim that the human touch is essential are leaning on a limited definition of the concept of higher education. As are those who claim that the information is the only important part.

by Chris Brignola, via Unsplash – lovely photograph, and I thought you’d like the symbolism/reference to bridging worlds… ;)


One professor reaching tens or even hundreds of thousands of students, automatic grading, self-paced study… it’s easy to see why we all got very excited about the possibilities for developing nations and those who don’t have easy access, economically or geographically, to higher education. And it’s true that for a lot of the courses, a connection with the professor is not “necessary”. It is possible to absorb the information without it. But, to claim that MOOCs could replace traditional education is to misunderstand what education is for.

Understanding what education is for is actually almost impossible, as there is no single objective, or even a single definition. To say that it’s about “becoming a well-rounded citizen” is western-centric. For some, it’s about learning about life. For many, it’s about getting a well-paid job so that you can support your parents, spouse and children. As long as there are cultural and economic differences around the globe, there will never be agreement on this, and insisting that any one definition is the correct one is not very useful.

What is useful is that we are finally taking a good, long, hard look at the options. Going to a traditional university is still one of them, and it always will be. But it is becoming more and more of a luxury as costs rise and available time dwindles. Higher education is no longer the only road to success, if by “higher” education we mean a quality university with a wide breadth of subjects. “Further” education – a constant updating of skills and interests – is essential in today’s workforce, and will be even more so in the coming years as sectors grow and contract and as focus gives way to flexibility and creativity. MOOCs fill that need quite nicely. They are specialized, allowing for honing of skills, learning a new technical qualification or dabbling in something completely new. They are flexible, fitting around work and other life demands. They are frequently updated. And they are low-cost, or even free.

Is the human connection necessary? The answer to that depends very much on each individual’s goals and expectations. Many MOOCs are attempting to find an ideal connection/automation balance. Hangouts with the professors, group chats on Twitter, AMAs on Reddit… All these allow the students to feel “connected”, while at the same time allowing the professors unprecedented efficiency. Are these initiatives practical? They are more expensive than the full automation approach, in that they require more of the professor’s time, and a certain amount of set-up. But, and I speak from experience, even a brief contact with the professor is motivational, so much more so than speaking to his or her assistants, or the Community Managers on the discussion groups.

image from Deathtothestockphoto

Maybe we’ll end up with tiered pricing, some courses offering both higher and lower interaction paths. Or, some courses will be free, with other, more “human” ones requiring a fee. I, personally, would be willing to pay for a “deeper” understanding and connection in many of the courses I’m interested in. But others I would like to “dabble” in for free, to start with, anyway.

The technological revolution at university and further education level is well under way, and its benefits are being enjoyed by millions. However, the technological revolution at school level has not yet found its groove. And it’s not because of a lack of choice. The explosion of technology attempting to revolutionize schools is dizzying, chaotic and, at best, confusing. Over the next few years we will hopefully see a consolidation of this sector, so that we can focus on the important role of the teacher.

Personally, I believe that more resources should be spent on the human connection at primary school level. I’m sure that we all remember at least one school teacher that changed us, that showed us a path we were unaware of, that encouraged us and that opened doors to a future we didn’t know we wanted. If we could harness technology to make teachers’ workdays more productive, if tasks could be made more efficient and if time-consuming routine work could be automated, they would be able to focus on the ever-important human connection with their students. Not only could we finally set off a revolution in primary education similar to that experienced by higher education. We could also develop a generation that is more focussed, more optimistic and more motivated than any that have come before.

— x —

For more on online education, check out my Flipboard “Internet and Education”:

flipboard education

Multimedia reporting on a whole new level

Since we talked about journalism this week, I want to share with you the most beautiful example of multimedia reporting that I have seen so far. “Snowfall”, produced by The New York Times and written by John Branch, is surprising, breathtaking, and a seriously good piece of reporting on an avalanche, its victims and the aftermath.

From “Snowfall”, The New York Times (click for link)

The article is embellished with animations, slideshows, videos and sound recordings, and really shows how multimedia reporting can bring a story to life. You hear the 911 calls, you see animations of the avalanche, you watch interviews with the survivors, you browse through slideshows of photographs of the people and places involved. In-depth and absorbing, even for a non-skier like me. It’s not hard to see why the article won a Pulitzer, a Peabody and even a Webby award. Click here to see the article in all its enriched glory.

From “Snowfall”, The New York Times

The New York Times has published other multimedia articles since then, most notably “A Game of Shark and Minnow”, also quite spectacular and gripping. But for me, Snowfall is the most beautiful and the most haunting.

From “A Game of Shark and Minnow”, The New York Times

Is this the future of online long-form journalism? No and yes. No, in that these amazing effects are complicated and costly to produce: Snowfall took six months and 11 people to make, an outlay few outlets can afford. No, too, in that not all long-form journalism lends itself to this kind of visual treatment. But yes, in that costs are coming down all the time. Platforms such as Scroll Kit (recently bought by WordPress) make producing stories like these easier (I haven’t tried it yet, but I will, and I’ll let you know). HTML effects are getting more sophisticated. Soon, if not already, producing this kind of story will take a few hours and only a medium level of skill. And yes, in that the impact and excitement of this new form of storytelling leave you wanting more. The number of web publishers offering this type of reporting will increase, and their work will be easier thanks to The New York Times, ESPN and the other pioneers of the genre. We should thank those who got the ball rolling.

Citizen journalism – democratizing what, exactly?

Among the way-too-many Christmas presents we got this year was the boxed DVD set of The Newsroom, Season 2. It’s a controversial series, but I confess that I love it: the dialogue, the ethics, the insults, the transient victories… The image of the socially-conscious, caffeinated and relentlessly focussed news producers and anchorpersons (or is it anchorpeople?) inspires hope and optimism that, with such determined professionals dedicated to bringing us the truth, injustices can be fixed and the world will become a fairer place. The series reflects, especially in Season 3, how much the news business has changed over the past few years. Traditionally, newsrooms have relied on reporters on the ground for their information, perhaps seasoned official sources, occasionally a disgruntled leak. And The Newsroom features all of those. But an increasing role is being played by thematic channels, by page views, and especially by social media.

image from istockphoto

Enter the “citizen journalist”. These days, with high-quality cameras in most pockets and handbags, and with publishing media just a few taps away, the role of “journalist” is changing. Citizen journalism, user-generated content, social reporting… Whatever you choose to call it, the concept is changing how we consume news. No longer are we limited to a few reputable channels and publications. What we consider “news” is popping up on Twitter, blogs, YouTube, Facebook, content-curation sites, Tumblr, even Instagram… We have an almost infinite array of choices of where to get our information from, and a vast range of possible sources.

But with so much “news” out there, how do we find the relevant stories and sources? How do we know what information to trust? Most of us have social media feeds that we follow, as well as bloggers whose writing we like. But are they “citizen reporters”? Being a commentator is not the same as being a journalist. Curating is not the same as reporting. And news is not the same as speculation.

For our daily news consumption, we look not only for trustworthiness, but also reliability and dependability. We want to keep up with what’s going on in the world, and citizen journalists can’t do that for us. They can tell us about what’s happening around them, for a limited window of time. But once they start reporting continually, once they broaden their brief to include other areas, once they start to accumulate a regular readership, once they start to depend on that activity to make a living, they’re no longer “citizen journalists”. It becomes their job. They’re professional. Unaffiliated, maybe, but still professional.

image from istockphoto

However, the professionals are relying more and more on the amateurs, and news media are drawing more and more on “unofficial” sources for their content. Dozens of platforms collect and channel citizen reporting, making it easier for news outlets to find user-generated content that they can use. As for trust issues, it is easier than ever to verify information. Did that landslide really just happen, or is it last year’s footage recycled? Is that bomb-wrecked market in Gaza or in Syria? Geo-location, cross-referencing, reverse image searches and source confirmation are a few clicks away, and can verify or disprove where an image or a text came from.
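Reverse image search, incidentally, rests on a simple idea: reduce an image to a compact fingerprint and compare fingerprints, so that last year’s recycled footage matches even if it has been brightened or re-compressed. Here is a toy sketch of one common approach, “average hashing”, written in plain Python; the images are simplified to small grids of brightness values (real tools first decode and downscale actual image files), and all the names are my own:

```python
# Toy illustration of perceptual ("average") hashing, the idea behind
# many reverse-image-search tools used to spot recycled footage.
# Images are represented as plain 2D lists of grayscale values.

def average_hash(pixels):
    """Return a bit string: 1 where a pixel is above the mean brightness."""
    flat = [v for row in pixels for v in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if v > mean else "0" for v in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return sum(a != b for a, b in zip(h1, h2))

original = [
    [200, 190,  30,  20],
    [210, 180,  25,  15],
    [ 40,  35, 220, 230],
    [ 30,  25, 215, 225],
]
# The "re-uploaded" photo: same scene, uniformly brightened a little.
reuploaded = [[v + 5 for v in row] for row in original]
# An unrelated image: a simple brightness gradient.
unrelated = [[(r * 4 + c) * 16 % 256 for c in range(4)] for r in range(4)]

d_same = hamming_distance(average_hash(original), average_hash(reuploaded))
d_diff = hamming_distance(average_hash(original), average_hash(unrelated))
print(d_same, d_diff)  # the near-duplicate hashes close; the unrelated image, far
```

Because a uniform brightness change shifts the mean along with every pixel, the recycled image produces an identical fingerprint, which is exactly why this class of technique survives the re-uploads and re-encodes that plague verification work.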

But back to the main theme: what is “citizen journalism”? Much has been written about it, much has been claimed, and as with anything digital that changes the way we do things, much of it is hype.

Citizen journalism is not a “disruption” of news. News is still news. The fact that there seems to be more of it than ever is more a case for better filters and better criteria for deciding what to believe and which sources to trust.

Nor is it a re-definition of reporting. Re-shaping it, possibly. Broadening its scope, definitely. But reporting is still about informing and communicating events that impact the community, be it local or global. “Reporting” was never about who was doing it.

Nor is it about “democratizing the news”. Yes, anyone can now become an “amateur journalist”, but that does not make the distribution and reception of the news democratic. That is still very much controlled by big media. A tweet from CNN will get much more traction than a tweet from your neighbour. It would be “democratic” if we could choose from all available feeds. But we simply can’t scan everything. The “breaking news” alerts that we actually see still tend to come from the “professionals”, given their reach.

Where do they get their news from? Increasingly from “citizen journalists”.

And there you have it. Citizen journalism is about democratizing the sources. Anyone can now be a source of news-worthy information. And this benefits anyone looking for location-specific, event-specific updates or problems. And here, “anyone” includes the main news outlets, traditional or otherwise. Platforms such as social media or user-fed aggregators make it relatively easy to locate and verify stories. Your location-specific, event-specific report can hit the headlines. You can call attention to problems, or inform of surprising and important developments. You can trigger international interest.

But your main value will be as a source. The main channels will need to verify and edit, and will no doubt incorporate your material, if it is useful, into a larger coverage. You contribute to the news. But you are going to need an audience, which on your own is difficult to achieve if you are not a professional writer with a long track record and a large following. Even the “citizen journalism” websites depend on commitment, reliability, and quality of reporting. Once you demonstrate all three, and especially once you end up getting paid by the outlets that take up your story, I would say you are a “journalist”. No need to qualify it with “citizen”.

Which brings me to another point: I take issue with the term “citizen journalism”. So the professionals aren’t citizens? And anyway, citizens of where? “Social reporting” is also a weak term. It incorrectly implies that you limit yourself to posting on social media, or that you write about social events. “User-generated content” is just as misleading. It implies that you are a “user”, but of what? Of the Internet? I believe that the professionals are users, too. So what is a better name? “Amateur journalism” sounds a bit denigrating, even though it may be accurate.

I propose “civic reporting”, as it speaks to the motive, which I believe is one of the main differences between the professionals and the non-professionals. Professionals are supported, economically and often also logistically, by a media outlet. I’m not saying that their motives are purely economic, in fact I’m sure that they’re not, but journalism is how they make their living. Civic reporters aren’t in it for the money. Obviously they wouldn’t say no, but their main motive is to report, to get the news out, to draw attention to a situation or a problem. A responsibility, a duty, a civic instinct. And something we should encourage. Just as we should encourage the traditional newsrooms to re-think their formats, to come up with simple collaboration tools, to refine their verification protocols, and to accept that the power of communication is being redistributed.