Project 1917 is a stunning example of social media bringing history to life. Social media is often characterised as a distraction for students, but in this case it illuminates one of the most turbulent episodes in recent European history: the two revolutions that took place in Russia 100 years ago and went on to profoundly influence the entire twentieth century.
Using diaries and other records from the time, Project 1917 retells events through social media. All of the major characters, from Lenin to Trotsky and from Tsar Nicholas to Beria, have been given social media accounts and post their thoughts, albeit with a 100-year delay. The effect is so compelling that some followers have mistaken the protagonists for real people and responded to their tweets and status updates. Project 1917 is a perfect way to bring important historical events to life and to update them for a modern audience.
Despite the attention that advances in artificial intelligence (AI) are receiving in the press at the moment, there has been very little comment on the implications of AI for the curriculum. To be fair, most of the press articles could be summarised as, “artificial intelligence/machine learning/deep learning is coming, be scared, let’s control it”. But deep down, we know that “let’s control it” isn’t a particularly valid response. History shows us that trying to control the spread of useful technology is time-consuming, labour-intensive and ultimately futile. With the possible exception of nuclear weapons, which have limited (but deadly) applications, controlling technological advances is a non-starter. We need a better plan. We need to avoid work that will be easily automated, but more importantly, we need to think of other ways in which humans can rise to the challenge of intelligent machines.
Despite all the aforementioned press attention, I have yet to see any consideration of how ‘human learning’ might respond to ‘machine learning’. Curriculum changes traditionally happen at a painfully slow pace, but we should be thinking about what we learn and when. To take one example, the rise of software agents that write and disseminate information on the internet, so-called information bots, has been denounced as a threat to democracy by some writers. There have been calls to ban the bots and create institutions that fact-check and arbitrate as the sole source of ‘the truth’. For my own part, I’m not sure that a Ministry of Truth is really the answer. Why make futile calls to control a technology when a constructive response would educate the population, making people more discerning about what constitutes the truth? Governments do have control over the curriculum, so a positive response would be to add critical reasoning to the core curriculum rather than ban the bots.
What this case illustrates once again is that the technologies that impact our lives and careers are developing at a much faster pace than the education system. It’s not enough to rely on university, much less school, to give us all the skills we need to prosper in a modern job market. Fortunately, there are a huge number of resources available to provide an introduction to a topic or fill in the gaps: resources like Tutorhub, iTunes U, Coursera, EdX and Khan Academy. We have to change the way we think about education. Fifty years ago, it made sense to get education over with at the start of your life, but from now on, education is never done.
I’m hearing a lot of talk in the media about one of the hottest areas of tech right now: artificial intelligence, or AI. As is usual at this stage in the technology (over)promotion cycle, AI is going to change everything. It’s going to make our lives better and it’s going to make our lives worse. Which, as with most things, is probably true depending on what your life looks like at the moment. If I can’t drive and AI-powered driverless cars become available, the world becomes a much better place. If I’m a legal assistant whose job disappears because AI does the job better and cheaper, then things don’t look so good.
One thing is for sure: if you’re starting out in a career or thinking of changing, it makes sense to think about how the technology will impact your career prospects in the medium term. If that current or prospective career involves assessing large amounts of data and drawing conclusions from it, be concerned. You might ask, how many jobs does that apply to? To which the response is: a lot. Driving, for instance. Or radiology. Or even general medicine. All of these jobs take a large number of data points and try to draw conclusions from them: is it safe to drive at this speed, is that car going to hit me, is that a tumour? It turns out humans are very good at processing large volumes of information and drawing conclusions, but for a lot of jobs, AI will be better. Where the technology is less successful is where a lot of human interaction is required and where there are unpredictable scenarios (fashion, anyone?). And as we’ve seen in the last 12 months, humans are less rational and less predictable than economics might suppose, so there are going to be many fields in which human ingenuity, powered by AI, will achieve breakthroughs.
Most of the prognostication about where AI will make its biggest impact is based on where the technology COULD be applied, but I’m cautious about making bold promises as to where it will and won’t make an impact, for one simple reason: money. Most forecasters try to understand the technology and think of what it can be applied to, but the reality is that it will have the biggest impact where people can make the most money, and we don’t know that yet. Any new technology requires huge amounts of capital to make it stick, and it will be a few years before we really know whether autonomous cars (for instance) are a mainstream application or just a niche one. The same is true for other applications. So I’ll avoid making predictions and instead settle for the easy way out of offering advice. Think of a career, chase your dream, but at least give a small amount of thought as to whether the perfect candidate for that job could be a computer.
Most people who participate in the sharing economy (of which Tutorhub is a part) do so because of its convenience, value or access to products and services that would otherwise be unavailable. But for some, it’s the reduction in waste that naturally follows from sharing things you aren’t using. I once rented an Airbnb from a guy in Bristol who was passionate about letting other people use his stuff when he wasn’t. He rented out spare rooms; he rented out his car when he wasn’t using it. And it wasn’t because of the money – he genuinely thought it was a waste for most cars to be parked up by the roadside for most of their lives. He may have a point: some estimates place the amount of time the average car goes unused at over 90%.
Whilst it’s not immediately obvious how, this concept of ‘unused capacity’ that goes to waste applies to the tutors who offer their skills on Tutorhub. An education acquired over many years is a valuable commodity, an asset that can ‘earn’ a return. When it’s not earning, it’s being wasted. I’m not aware of any studies that have looked at the economic value of skills and knowledge acquired and not put to use, but presumably it’s a big number. In general, education is valued ‘for its own sake’, but in a knowledge economy, learning drives earning.
There’s another twist in the case of education and learning, because unlike a car or a spare room, knowledge wastes away if we don’t use it. It ebbs away. Knowledge is more like a muscle than an object: it gets stronger with use.
What this means is that the sharing economy has particular power in education, offering real benefits for those willing to embrace it. Students get access to the knowledge they need in a way they want; they get access to rare and difficult skills; they get access at times they otherwise couldn’t, without the pressure of peers in attendance. Tutors earn a return on their hard-won knowledge at times and in a manner to suit them. They also get to use that knowledge, to keep it alive. And what could be worse, after all the effort that goes into becoming a graduate, postgrad or doctor, than to see that knowledge slowly drift away …