Tag Archives: teaching

Virtual Worlds and Technology Futures

Last week I was privileged to be invited to give the closing keynote at an event called ReLIVE 11 (Research and Learning in Virtual Environments) at the Open University. This was certainly a big deal for me as I was in the company of some brilliant academic minds and some tech celebrities – plus, the OU is an important and well-known institution (despite the fact that I heard Leo Laporte say that he’d never heard of it on the MacBreak Weekly podcast I was listening to as I drove to Milton Keynes last Tuesday evening!).

I’d previously explained to the organisers that I hadn’t spent as much time exploring virtual worlds lately as I had three or four years ago, at the height of IBM’s involvement with platforms such as Second Life and our own internal Metaverse. Having said that, I have spent more time with gaming platforms such as Xbox and the Nintendo 3DS since then, and more recently also Minecraft. Naturally I did have that business perspective and story to share… and, as the closing keynote speaker, I had the interesting task of pulling together the threads we’d covered during the breakout sessions at the conference, as well as attempting to look ahead to what trends might be important in the future.

The video is online via the Open University website and the talk with Q&A lasted for about an hour. More coverage of ReLIVE 11 is aggregated on Lanyrd.

Summary

As I noted in the opening and closing sections of the talk – predictions of the future are a hit-and-miss affair. We may now have tablet computers arguably even cooler than the Star Trek PADDs and communicators, but I’m still waiting on my hoverboard. Nevertheless, I tried to frame the story of IBM’s exploration of virtual worlds and 3D environments with some discussion of trends. It also gave me an excuse to talk about Back to the Future, and a cool ad that Nike recently released tying back into the movie.

I want to reiterate (as it may not have been clear from tweets that emerged during the event) that these were very much my own thoughts and not the views of my employer – in fact, I was attending the event in a personal capacity. So, per the presentation, my thoughts on trends to watch in the next five years:

  1. 3D Printing: I’ve seen RepRap and other 3D printers more often in the past couple of months than ever before, and it is clear that prototyping and fabrication are coming within financial and technical reach of more than just the early adopting minority. That’s not to say this is something I see going “mainstream” – but as access opens up, expect to see many more interesting things happening here.
  2. Social broadcast: I think “TV” is rapidly giving way to a more generalised broadcast media that is being consumed across multiple devices, remixed, shared, etc. I also think that social streams are adding to the experience of how these media are being consumed, as evidenced by hashtags broadcast on BBC programmes, and the ways in which conversations form online around events and video streams.  A nod to my friend Roo Reynolds too, a man constantly way ahead of his time…
  3. Touch and Gesture: we already know that the ways in which we interact with technology are evolving fast. Watch any child approach a large screen and attempt to press it, expecting their cartoon hero to become interactive. This is not going to stop – Microsoft have some amazing technology in this space with Kinect, and we should get used to and embrace the changes as they happen if we want to evolve.
  4. Big Data: a nod to my own organisation’s Smarter Planet story, and an acknowledgement that every one of the major tech firms is investing in ways to store, mine, slice and analyse the increasing amounts of data flowing in from the environment and our personal signals. This is just a continuing story, but we’re at a point where it is a red hot topic. It would have been a good point to mention Watson, if I’d thought on my feet quickly enough!
  5. Identity: this is not so much something where we will see technical progress necessarily, as an area I think will be a threat, and difficult to resolve. The nymwars of Google+ are one edge of the issue. I believe that there is a real tension between the freewheeling days of the earlier Internet, the desire of individuals to make their own choices about identity (often for valid social reasons, other times for vanity), and corporations and political entities that want to close this situation down. This is going to be a tricky one.

So what of virtual worlds? Three words: Not Gone Away. They may have morphed, lost their early shine, the bubble burst – but we have a range of immersive experiences (and social, but not necessarily immersive ones) through which we interact. I mentioned Minecraft and how that is being used for teaching. I talked through IBM’s work with serious gaming. I spoke about the IBM Virtual Center briefly, and that’s online and used today – in fact Jack Mason just posted a nice deck on that which carries some statistics, if you want to learn more.

Thoughts on education

I clearly was not the most experienced individual in the room when it came to discussions about teaching and education, and I particularly enjoyed hearing different presenters at ReLIVE11 talk about how they are using OpenSim, OpenWonderland and other platforms. However – after my recent post on Raspberry Pi and my exploration of the Brighton Mini Maker Faire I’ve been thinking increasingly about Maker culture and how we could bring technology teaching back around to practical matters.  I was disappointed to read the Government’s (lack of) response to John Graham-Cumming’s recent letter on the same subject, though.

One of the things that I called out as a barrier to the adoption of immersive worlds and new technologies at work is something I’m calling The Empty Room Problem – the fact that unless you build it and then populate it, they will not necessarily come. I’ll be writing about this some more shortly, prompted by Derek Jones’ great blog post.

During the Q&A session I gave an answer to one of the questions which contained some ideas I’ve had on a possible curriculum – I’ll try to expand on those in the near future as well.

Teaching technology in the future – Raspberry Pi

Before you dismiss this as TL;DR – it’s a subject dear to my heart, and I believe that there’s some cool content as well as some storytelling – do give it a chance!

A sad state of affairs

I believe that we have lived through the best period to teach and learn about computers and technology, and that over the past few years we have been creating a void, a vacuum, in which progress may be diminished.

Google’s Eric Schmidt recently called out the British education system as holding back or dismissing our technology heritage. According to a ZDNet article on his speech in Edinburgh:

Schmidt said the UK’s approach to technology in education — not making IT compulsory as a subject at GCSE level and not providing enough support for science students at colleges — meant the country was “throwing away” its computing heritage.

See also the BBC and Guardian coverage of the story.

I can’t say I think he’s wrong, and I can’t say I’m surprised. Anyone who has heard me ranting about the state of things in a side conversation at any event in the past couple of years will have heard me tell a similar story. When I was a lad – and I know that some of those who read this will be older, just let me reminisce without interrupting, OK? 🙂 – I grew up on an early Commodore PET with green screen, followed by BBC Micros, Acorn Electrons, etc. I’ve had a couple of occasions to look back on that era recently, with a visit to The National Museum of Computing at Bletchley Park, seeing the team from TNMoC visit the Brighton Mini Maker Faire, and through talking to folks at TransferSummit (of which more, in a moment). In my day, you plugged in the power, the machine made a satisfying BEEP! and you were presented with a black screen with the word BASIC and a > prompt. That’s just how things worked. To do anything else, you had to tell the computer to do it – and you learned a lot as you did so!

Without wanting to sound like some kind of old fogey – kids today have never had it so good! They have grown up in an era where all they have ever known is a world where every computer is connected to the Internet, a giant brain which appears to be all-knowing (and I know that this is how a 3 or 4 year old thinks: my own younger family members have said “we’ll just look it up on the Internet, it knows everything”, without understanding that it is humans who knew these things first – at a basic level, the computers just tell us what we’ve told them). They have fast, interactive machines which are dramatically more usable – and instead of bulky, noisy systems which were just about user-serviceable, ideally when you had an antistatic wrist strap to hand… they have magic, thin sheets of glass that can be controlled at the slightest touch.

That’s fantastic. It puts children today in a position where they can be more creative than ever before – I could barely edit low-quality digital scanned photos by the time I left school, let alone edit full HD video with a variety of awesome effects. So one thing we can teach them is how to use creative tools like… oh I don’t know… Office suites (capitalisation deliberate, sarcasm heavy).

The thing is – we don’t need to teach schoolchildren how to use a productivity tool like that. By the time they have sat watching us for 5 minutes aged 6, they intuitively “just get it”. Worse is the fact that we’ve nearly removed the ability to look under the covers at what makes the machines work – certainly in a hardware sense you’d need a very advanced knowledge of microelectronics to do anything with the innards of most smartphones, and software is often becoming more and more locked up to the whims of the hardware manufacturers (naming no Apples). Plus of course, everything is online. So what does this mean for the curiosity to take things apart either in hardware or software, see how they work, and build something new?

(the irony is not lost on me that as a History graduate, I’m an unusual spokesperson for this debate)

Makers and getting back to basics

(Pictured: KitTen, Uno and Nanode boards.)

One of the reasons I’m excited by the trend towards making things – what I’ll term the Maker movement, in a nod to the Brighton Mini Maker Faire and the magazine that has inspired the events – is that it reflects both our natural human curiosity and our interest in building things and making them work. I also think that is part of the reason behind our interest in prototypable electronics like Arduino – we have gone through a period of making things smaller, more compressed and proprietary, and the pendulum is swinging back towards open hardware, simple construction, and ease of learning. This is a huge, great and important step, in my opinion.

Enter – a Raspberry Pi

So how can we take advantage of that trend towards discovery and learning, and combine it with small cheap electronics, to really make a difference? Well, you may have heard of the Raspberry Pi Foundation – it has had a fair amount of coverage in the UK anyway, with the promise of a new low-cost computing platform which could theoretically replicate the success of the BBC-sponsored, Acorn-built, BBC Microcomputers from the 1980s (and backed by one of the most successful computer games authors of that era). Those BBC Micro systems were rolled out across schools all over the UK, and pretty much anyone in the 30-40 age bracket will have learned to write some kind of BBC BASIC or LOGO code at some point in their education, and have looked at fractals and played a variety of classic 8-bit games. My first home computer was an Acorn Electron, an affordable beige “keyboard box” that could be plugged straight into a home TV in 1984, with games and programs loaded off a (then) common cassette player.

The folks at Raspberry Pi believe that having a cheap computer which can be presented as an education device could be a success. At the TransferSummit last week, I met Eben and Liz Upton from the project, and had a chance to play with the system first hand. I also made a quick film of this amazing little computer playing full HD video – and the excitement is obvious in the fact that it has received nearly 50k hits on YouTube in just 4 days, probably helped by an appearance on the Raspberry Pi blog and also in a feature on Geek.com!

One of the things that Eben spoke about was the idea that it might be more interesting for these machines to boot to a Python prompt instead of a full Linux desktop (which the hardware is well capable of running), in order to ignite kids’ imaginations and force them into doing something more creative than simply doing what they would do with any other computer. I kinda like that suggestion!
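To give a flavour of why that idea appeals to me, here’s a purely hypothetical first session at such a Python prompt – not anything the Raspberry Pi Foundation has published, just the sort of immediate, tinker-and-see feedback a boot-to-prompt machine could offer a child in their first minute:

```python
# A hypothetical first session at a Python prompt - the sort of
# instant feedback a boot-to-Python machine could give a beginner.

# Ask the computer to do some arithmetic...
print(2 ** 10)          # 2 raised to the power of 10

# ...draw a simple picture out of characters...
for row in range(1, 6):
    print("*" * row)    # a growing triangle of stars

# ...and make it respond to you by name.
name = "Ada"            # any name you like
print("Hello, " + name + "! You are programming.")
```

Nothing here needs a desktop, a mouse, or an installed application – which is exactly the point: the machine greets you with a prompt, and the only way forward is to tell it what to do.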

Risks, and what else can we do?

I’m excited. As I said several times to Liz and others at the event this week – it’s a British organisation with vision, with an amazing idea, a product that works, and the desire to really reconnect children – particularly those in the developing world – with technology and how to drive it.

I can see a number of risks, but the last thing I want is to be a naysayer here – I really, really want these folks to succeed. However, just looking at the excitement amongst hobbyists like me, and reading some of the comments posted on my video already, I realise that there’s a danger that the supplies of these things will quickly be snapped up by those wanting to make funky small home systems for themselves, rather than by altruists wanting to help youngsters learn (heck, I want one! so I understand that!). Or, kids may see these as just another form factor of the kind of computer they are used to, plug it in, go online, and do nothing different from what they are already capable of. Another issue is that a bare board (the initial version won’t have a box, although that would be easy enough to fab) and a lack of instructions or a clear fixed “syllabus”, if you like, may discourage teachers now used to teaching desktop computing and productivity tools from embracing the potential to help students to create. It’s also entirely possible that these things will simply be cloned elsewhere. For all of these reasons, I’m determined to do what I can to promote the Raspberry Pi concept as an educational tool, and to support the team behind it. It’s important. It deserves to be a massive success.

So, what else can we do?

One thing is to go and sign the brilliant Emma Mulqueeny (aka @hubmum)’s e-petition on the UK gov website. She’s campaigning for an earlier entry for programming into the classroom, at primary level, particularly to encourage more girls to take an interest in technology. I think this is a brilliant step. Nik Butler has posted about the importance of teaching this stuff, too, and I encourage you to read his post – I particularly support the way in which he refutes the list of reasons why this sort of teaching is allegedly a “bad” idea. He’s also talked about the Raspberry Pi on the Social Media White Noise podcast #70.

Another thing is to visit and support The National Museum of Computing, preferably with some kids you know – help them to see where we have come from and where we are going.

It’s obvious to me that we need to change the way we think about teaching IT, computing, and technology. Earlier teaching of programming is important. I also think that a basic understanding of how a computer system fits together would help, as well as a high-level understanding of the way in which the Internet works. Importantly though – and this rolls into a whole other passion of mine which I won’t rant about today – as we increasingly come together online, I think it is ever more important to teach tolerance, understanding of other cultures, and good online community behaviour. How we collectively go about doing that, I’m not entirely sure – but it feels important.

Thanks for indulging me on this particularly long post – it really is a subject I care deeply about. And all that stuff about technology – from an historian and Arts student 😛

Historical perspectives

For those of you who have never read my About page, you may be surprised to know that as well as being a “techie”, I hold an MA in Modern History (the story of how I came to have a career in technology is possibly less interesting than it might outwardly appear). As such, I wanted to take a moment to comment on a couple of things that have come up in the past week.

History teaching in the UK

I don’t remember my first history lesson, how I became aware of my own cultural background, or when or why I fell in love with the study of history. I just remember, when I came to choose exam subjects at 13/14, that for me History was a no-brainer, something I thoroughly enjoyed and wanted to dive deeper into. Despite my affinity for and interest in science (I was working on some Chemistry software for RISC OS with a friend of mine at the time), it was also a natural study for me to pursue into A-level and, eventually, as my Degree subject.

I won’t claim that the transition to a technical career was straightforward. It’s true that while (in my opinion) a History graduate has a range of flexible and totally transferable skills, recruitment out of universities in the UK 15 years ago (and, I suspect, even more so today) was limited in outlook. Although I had a number of examples of technical knowledge and had my own business selling RISC OS software with a friend, many larger organisations simply wanted a science education, and I didn’t have one to show them. I was grateful to the UK Post Office for taking a broader view of my skill set and taking me on as an IT Graduate (or, one of the “Graduates in IT Services”… yes, you work out that acronym… charming!).

Back to the subject though. Academically, philosophically, politically, and in the pursuit of knowledge and understanding, I believe that History is vitally important. What did I gain from devoting a number of years of my life to that study? Strong analytical skills spanning multiple media; broad and, I believe, sensitive cultural awareness (yes, really – from a Brit!); and an understanding of how we became the human race we are today.

Facts about history education in the UK :-(

This past week, Professor Niall Ferguson published an editorial piece in the Guardian claiming that British history teaching was at a point of crisis.

[aside: Niall Ferguson is the best lecturer I ever had… I clearly remember his first lecture to my fellow students and me, which began with the clanging industrial noises of Wagner’s Ring Cycle, immediately capturing the attention of even the most feckless and uninterested mid-90s Oxford student (although my female colleagues seemed captured not so much by the audio, but by the visuals and voice…)]

I was disappointed to read about the state of affairs described in his piece, and the accompanying article describing the loss of cohesion in the UK History curriculum. Now let’s be clear – to an extent, I was always in a privileged position with regard to education generally and to History education as well. If things are really in such dire straits today, I do despair – I don’t get the same sense of ignorance from friends of other nationalities, and whilst I don’t advocate any kind of imperialist triumphalism in British History education, by ignoring trends, and what Niall Ferguson calls the “long arc of time”, our children clearly miss out. I’m not going to trot out clichés about how we have to understand past mistakes to avoid repeating them – we do that regardless, it’s part of the human condition and pride. The point is: there’s excitement and interest in our story. And honestly, how annoyed would you be if every story you ever heard, read, listened to or attempted to understand arrived in disjointed pieces that were impossible to lace together?

I hope the UK teaching profession, and the appropriate education authorities, listen to reason. And I hope that the apparent focus on science as the be-all-and-end-all of education learns to flex in favour of other subjects, too – speaking as a STEM Ambassador, myself.

History on the web

I’ve remarked before about the web as a historical source. The death of archive services like DejaNews (the archive for Usenet, eventually bought by Google, which turned it into Google Groups before burying / de-emphasising access to older content) was a terrible thing, even if it does mean that it is now very difficult to locate evidence of my embarrassing mid-teen and early-20s days online! The move to the real-time web, and the increasing focus on sites like Twitter and Facebook (through which historical search is both de-emphasised and technically virtually impossible), is increasingly reducing the value of the web as a historical resource.

Suw Charman has written about this issue this week, and it caught my attention particularly in the context of the other issues currently exercising my brain.

I return to a thought I’ve expressed previously: sites that revolve around EVENTS have an opportunity here. When I wrote about Lanyrd I said:

here’s what I think is a really cool feature. You can attach all kinds of “coverage” to an event, be it slides, audio, video, liveblogged information, blogged write-ups, etc etc. So your point-in-time event suddenly gains a social and historical footprint with an aggregation of all the content that grew up around it, which people can go back to.

The thing that really grabbed my attention this week was the seemingly-minor and gimmicky discovery that someone has created an entry for the 1945 Yalta meetings on Lanyrd. This is awesome – a demonstration of what it can provide, and what we need – the ability to tie content together and aggregate, link, and retain related information in the context of people and events. All of which is only really interesting if we have a population that understands where we (globally) have come from…