TEDxYouth@Manchester video live: What do your digital footprints say about you?

This is a very wee blog post/aside to share the video of my TEDxYouth@Manchester talk, “What do your digital footprints say about you?”:

You can read more on the whole experience of being part of this event in my blog post from late November.

It would appear that my first TEDx, much like my first Bright Club, was rather short and sweet (safely within my potential 14 minutes). I hope you enjoy it and I would recommend catching up with my fellow speakers’ talks:

Kat Arney

Ben Smith

VV Brown

Ben Garrod

I gather that the videos of the incredible teenage speakers and performers will follow soon.

 


European Conference on Social Media (ECSM) 2016 Day Two – LiveBlog

Today I am again at the European Conference on Social Media 2016 and will be liveblogging the sessions. Today is a shorter conference day and I’ll be chairing a session and giving a poster so there may be a few breaks in the blog. As usual these notes are being taken live so any corrections, questions, etc. are very much welcomed. 

Keynote presentation: Dr Sue Greener, University of Brighton Business School, UK – Unlearning Learning with Social Media

I want to talk to you this morning about my topic, which is learning – not just learning in Higher Education, but also learning in the workplace. I encourage you to tweet throughout – do tweet me @suegonline.

Life is about learned behaviours. We learn habits and, once learned, they are hard to unlearn. But at the same time we also love novel things – that’s why we love social media. You could call this a dichotomy – between habit and the new. A lovely aphorism from Maria de Beausacq: “The power of habit and the charm of novelty are the two adverse forces which explain the follies of mankind”. We see this dichotomy in psychology all the time.

Davis (1999) talks about habit as a barrier to creative thinking and innovation – the idea that “if someone did it this way, they must have had a good reason”. Glaveanu (2012) writes about “habitual creativity” – where expertise and mastery are brought about through the constant sharpening and adjusting of habitual practice to dynamic changes in context. As in learning a piano, or a language – practising all the time but gradually introducing flourishes and creativity.

Now, you may be wondering whether this talk is about learning, or about skills… But I think the two are very similar. Learning involves a whole range of skills – reading, note taking, evaluation, etc. Learning is a skill and has a degree of both routine and creativity. And learning is not just about those recognisable tasks… I want to talk about “unlearning”, something that Alvin Toffler discusses in Future Shock, where he talks about 21st century literacy in terms of learning and unlearning. I got started in elearning and the technology – the technology is what we fit around as habit, as mastery, but it’s all about learning. And when I was looking at Toffler’s work, during my PhD, I was worried about learning theories – they all seemed over-engineered, too formal, almost too linear, too structured as pedagogy. I know the idea of learning styles is still in the literature and research, but I think of myself as having a learning palette – which I can pick and choose from, picking the style of learning to suit the context. I personally learn best by watching, by modelling from other people… Yesterday Britt Allan talked about “advertising literacy” – I hadn’t heard that phrase before, but I loved it, it made sense to me, and that’s very much how I learn.

Bandura – triadic reciprocal determinism – here I found theories of learning I understood. He talked about behaviours, and learning from behaviours, and trialling ideas. That idea of not piling learning on learning, but instead of learning and unlearning, makes sense. Hedberg talks about organisational unlearning – giving it equal weight to learning. Yes, neural networks in the brain accumulate, but they also die away without use… And unlearning is something else.

So, what is unlearning? It is not negative. And it is not about forgetting or the unconscious giving up of knowledge – although it was seen that way in the 80s and 90s – we do forget things, but that is not what unlearning is. And it is not just replacing the obsolete. It is a purposive creative process, as important as learning. Unlearning is about taking apart the pieces and reconstructing the meaning; it means we can build the foundations of our knowledge. It is much under-researched as an area – and therefore enticing. Rushmer and Davies (2004) talk about three things: fading (over time); wiping (enforced from outside – this often happens in the workplace, and is not comfortable); and deep unlearning (from an individual surprising experience producing inconsistency, changes of values). That deep unlearning can be gradual, or can be about “falling off a cliff” – when we find something surprising and need to decide to change.

And now to social media. Now, I don’t know about you but much of my unlearning takes place on social media. But why? If we go back to 1997, to the early social network 6degrees.com… Since then we have learned to communicate, to exchange information, to assess and evaluate information differently. Information is all around us, and we absorb it in a very social context. So, how much of our learning is from formal courses, and how much is informal? Formal learning is the tip of the iceberg; informal learning is bigger and is about rich tacit understanding. As educators we can try to overly control learning, even in e.g. closed Facebook groups. But this is learning that benefits from space to work well.

So, informal learning is social and personal. Bourdieu (1992) talks about a habitus – a mindset – that is enduring but can be transformed by what takes place within and beyond it. Garrick (1998) and Boud (1999) suggested that informal interactions with peers are the predominant way of learning at work. Wenger (1998) talks about social participation in the community as key to informal learning. Boud and Middleton (2003) talk about informal learning as being about mastery of organisational processes, negotiating the political and dealing with the atypical – things we don’t always embed in the degree process, of course. So, how does this all fit with my idea of social media, this DIY media?

When this conference launched in Brighton in 2014 we had a Social Media Showcase with students, employers, academics, and school children. Last year we did a virtual showcase. And this year we did the Big Bang showcase – in a huge showground in Sussex, with over 8,000 school students – and we were able to have conversations, have vox pops. Out of hundreds of conversations only 10 students did not use social media. And those that were active could write a long list (e.g. 8 or 9) of sites they use. My sense is that for this age range these presences are a little like stickers. For these kids informal learning is massive – from peers, from others, from celebrities. At that age it is perhaps causing a great deal of unlearning. They encounter information in school but also from peers – which do they choose to trust and engage with? If ever there was a reason for teachers to understand social media, that is it.

At Brighton we have a “switch it on” policy – we ignore this stuff at our peril! You can always ask for devices to be switched off for a few minutes or for a task. By excluding those spaces you are turning away and excluding that valuable informal learning, that bigger context. If we want to help people learn, we have to teach them where they are comfortable. And we must help people to evaluate what they see on social media – that is a critical thing for teachers. And social media is not just for kids… Older audiences are increasingly joining Snapchat and WhatsApp – less trackable conversations appeal to them too.

So, back to Rushmer and Davies (2004) and fading… Snapchat is about forgetting, fading. Wiping will be in place in all organisations, but we have to think about how to deal more positively with resistance to change. Hislop, Bosley, Coombs and Holland (2013) – who I don’t totally agree with – have a helpful typology of unlearning. My thesis is that social media has some particular aspects: it is personal, ubiquitous and high speed. Data is transmitted by a hugely complex route, filtered through sites, through audiences… We have a huge dissemination of any single story. Speed and serendipity are vital features of social media in action. And the experience is intimate – staring into a screen makes it one-to-one even if in reality it is one-to-many. These interactions can change our minds. They can change our minds in referenda, they can change our minds in many ways. And they can be central to unlearning. That can be for good, and for bad. We will all have great examples of links, of ways we learn through social media. And it is less predictable than mainstream media – you can find surprises, you can catch enthusiasm – and I like to foster that even if I cannot control it!

So, can social media drive deep unlearning? I think all the signs are there. You should make up your own mind.

Q&A

Q1) I am not sure I totally understand what unlearning is… Learning is a process…

A1) I would relate this to the concept of cognitive dissonance – where there are two competing ideas that you must resolve and decide between. In social media I connect with people I like and trust, if they raise an idea that I didn’t previously agree or subscribe to, their raising of it has the ability to influence or change my mind, or at least means I reconsider that issue.

Q2) You talked about habitual creativity, and implied that as you get older you may forget/fade. I saw a presentation a few years back emphasising that you can learn by changing your habits – a walking route for instance.

A2) Absolutely. Things like changing your seat in a lecture theatre, changing a route etc. But social media can really shift your understanding.

Q3) I think you talked about two types of learning that don’t mix together. Many go to universities to gather formal skills for the workplace, that you call back on etc. But that requires some structure. And of course informal learning happens all the time. And the people I…

A3) I agree that media stacking and multi-tasking is not good… But at the same time in lectures, at conferences, I find it useful to reflect, to engage with topics, etc. – that is very valuable.

Comment) In high school I remember girls knitting and learning too and doing very well.

A3) It is possible, and it is skills that we are developing. I’d agree that it can work, and that it can be helpful for students to be active and engaging rather than passively receiving.

Q4) Thank you for your interesting keynote. How can social media make real change?

A4) I’m not a politician – I wish I was. We have a tool that can strike at the heart of people. It can help form and shape opinion, but that can be bad as well as good…

Introduction to ECSM 2017 

The next conference will be in Vilnius, the capital city of Lithuania. Lithuania is one of the three modern Northern European Baltic countries. We are part of the EU, NATO, the Eurozone, etc. Vilnius has around 550K people, and indeed Lithuania has 3 million. We have a lovely old town, listed by UNESCO. We have technology sectors that we lead in, particularly green tech, and we have the fastest public wi-fi in the world, and the third most affordable internet in the EU! People are lovely and well educated, and we speak many languages! We have 14 universities, we have research parks, etc. Our campus is on the outskirts of the city – but we have Uber and public transport – and the city centre is all walkable. Our campus has excellent facilities and you are very welcome there. We have many researchers working on social technologies, and a journal for social technologies. And, to end, a short video…

And, on that lovely video, I am pausing the liveblog as I’ll be giving my poster on the Digital Footprint MOOC. Normal service will resume afterwards.

Further details of sessions attended to follow.


Upcoming Events: Citizen Science & Media; PTAS Managing Your Digital Footprints Seminar

I am involved in organising, and very much looking forward to, two events this week which I think will be of interest to Edinburgh-based readers of this blog. Both are taking place on Thursday and I’ll try to either liveblog or summarise them here.

If you are based at Edinburgh University do consider booking these events or sharing the details with your colleagues or contacts at the University. If you are based further afield you might still be interested in taking a look at these and following up some of the links, etc.

Firstly we have the fourth seminar of the new(ish) University of Edinburgh Crowd Sourcing and Citizen Science network:

Citizen Science and the Mass Media

Thursday, 22nd October 2015, 12 – 1.30 pm, Paterson’s Land 1.21, Old Moray House, Holyrood Road, Edinburgh.

“This session will be an opportunity to look at how media and communications can be used to promote a CSCS project and to engage and develop the community around a project.

The kinds of issues that we hope will be covered will include aspects such as understanding the purpose and audience for your project; gaining exposure from a project; communicating these types of projects effectively; engaging the press; expectation management;  practical issues such as timing, use of interviewees and quotes, etc.

We will have two guest presenters, Dave Kilbey from Natural Apptitude Ltd, and Ally Tibbitt from STV, followed by plenty of time for questions and discussion. The session will be chaired by Nicola Osborne (EDINA), drawing on her experience working on the COBWEB project.”

I am really excited about this session as both Dave and Ally have really interesting backgrounds: Dave runs his own app company and has worked on a range of high profile projects, so has some great insights into what makes a project appealing to the media, what makes the difference to a project’s success, etc.; Ally works at STV and has a background in journalism but also in community engagement, particularly around social and environmental projects. I think the combination will make for an excellent lunchtime session. UoE staff and students can register for the event via Eventbrite, here.

On the same day we have our Principal’s Teaching Award Scheme seminar for the Managing Your Digital Footprints project:

Social media, students and digital footprints (PTAS research findings)

Thursday, 22nd October 2015, 2 – 3.30pm, IAD Resources Room, 7 Bristo Square, George Square, Edinburgh.

“This short information and interactive session will present findings from the PTAS Digital Footprint research http://edin.ac/1d1qY4K

In order to understand how students are curating their digital presence, key findings from two student surveys (1457 responses) as well as data from 16 in-depth interviews with six students will be presented. This unique dataset provides an opportunity for us to critically reflect on the changing internet landscape and take stock of how students are currently using social media; how they are presenting themselves online; and what challenges they face, such as cyberbullying, viewing inappropriate content or whether they have the digital skills to successfully navigate in online spaces.

The session will also introduce the next phase of the Digital Footprint research: social media in a learning & teaching context.  There will be an opportunity to discuss e-professionalism and social media guidelines for inclusion in handbooks/VLEs, as well as other areas.”

I am also really excited about this event, at which Louise Connelly, Sian Bayne, and I will be talking about the early findings from our Managing Your Digital Footprints project, and some of the outputs from the research and campaign (find these at: www.ed.ac.uk/iad/digitalfootprint).

Although this event is open to University staff and students only (register via the Online Bookings system, here), we are disseminating this work at a variety of events, publications, etc. Our recent ECSM 2015 paper is the best overview of the work to date, but expect to see more here in the near future about how we are taking this work forward. Do also get in touch with Louise or me if you have any questions about the project, or would be interested in hearing more about the project, some of the associated training, or the research findings as they emerge.


DiCE Seminar: Logging Students: Understanding Learning One Click at a Time – Gregor Kennedy, Melbourne University [LiveBlog]

This afternoon I am attending a seminar from Gregor Kennedy, University of Melbourne, organised by the Digital Cultures and Education research group at the University of Edinburgh.

As usual this is a liveblog so please let me know if you see any typos, have corrections to suggest, etc. 

My background is in social psychology and I decided to change fields and move into educational technology. And when I started to make that change in direction… Well, I was studying with my laptop, but I love this New Yorker cover from 1997 which speaks to both technology and the many ways in which academia doesn’t change.

I also do a lot of work on the environment, and the ways that technology effects change in the wider world – for instance the way that a library has gone from being about physical texts to a digital commons. And my work is around that user interface and the mediation that occurs. The first 15 years of my career were in medical technology, and in interfaces around this.

Now, the world of Digital Education is dominated by big platforms – from the early to mid-2000s, enterprise teaching and learning systems that provide, administer, etc. Platforms like Blackboard, Turnitin, Moodle, Echo. And we have tools like Twitter, blogging tools, YouTube, Facebook, Second Life also coming in. We also see those big game changers, Google and Wikipedia. And we have companies/tools like Smart Sparrow, which are small adaptive learning widgets with analytics built into them. And we see the new big providers – Coursera, EdX, FutureLearn – the mass teaching and learning platforms.

So, as educators we have these fantastic tools that enable us to track what students do. But we can also find ourselves in an Orwellian place, where that tracking happens all the time and can be problematic. But you can use all that VLE data in ways that really benefit education and learning. Part of that data enables us to see the digital footprints that students make in this space. And my group really looks at this issue of how we can track those footprints, and – crucially – how we can take something meaningful from that.

Two of the early influential theorists in this space are Tom Reeves and John Hedberg. Back in 2003 they wrote about the problematic nature of auditing student data trails, and the challenges of doing that. Meanwhile there have been other lines of work and traditions, from the Intelligent Tutoring Systems of the 1970s onwards. But part of the reason I think Reeves and Hedberg didn’t think meaningful interpretation would be possible is because, at their most basic level, the data we get out of these systems is about behaviour, which is not directly related to cognition.

Now we have to be a bit careful about this… Some behavioural responses can be imbued with a notion of what a student is thinking, for instance free-text responses to a discussion list; responses to multiple choice questions. But for much of what we collect, and the modern contemporary learning analytics community is talking about, that cognition is absent. So that means we have to make assumptions about students intent, motivation, attitude…

Now, we have some examples of how those sort of assumptions going wrong can be problematic. For instance the Amazon recommendation system deals poorly with gifts or one off interests. Similarly Microsoft Clippy often gets it wrong. So that distinction between behaviour and cognition is a major part of what I want to talk about today, and how we can take meaningful understanding from that.

So I want to start with an example, the Cognition and Interaction project, which I work on with Barney Dalgarno, Charles Sturt University; Sue Bennett, University of Wollongong. We created quite flat interactive learning objects that could work with learners who were put in an fMRI machine, so we could see brain activity. For this project we wanted to look at how learning design changed cognition.

So, we had an “Observation Program” – a page-turning task with content screens and an introduction with background terminology – in which students saw changes in parameters being made. And an “Exploration Program” where students changed parameters themselves and engaged directly with the material. Both of these approaches were trialled with two examples: Global Warming and Blood Alcohol. Now, which one would you expect to be more effective? Yup, Exploration. So we got the results through and we were pretty bummed out, as there was very little difference between the two. But we did notice there was a great deal of variation in the test scores later on. And we were able to use this to classify students’ approaches:

  • Systematic Exploration – trying a variable, seeing the result, trying another, etc.
  • Non-Systematic Exploration – changing stuff all over the place.
  • Observation group – observation only.

So we re-ran the analysis and found there was no difference between the Non-Systematic Exploration and the Observation group, but there was a difference between the Systematic Exploration and the other groups.

So, why is this interesting? Well firstly students do not do what they are supposed to do, or what we expect them to do. The intent that we have as designers and educators is not manifest in the way students engage in those tasks. And we do see this time and time again… the digital footprints that students leave show us how they fail to conform to the pedagogical intent of the online tasks we set for them. They don’t follow the script.

But we can find meaningful patterns in students’ behaviour using their digital footprints… interpreted through the lens of the learning design of the task. These patterns suggest different learning approaches and different learning outcomes…

Example 2: MOOCs & LARG

When we set up our MOOCs, we also set up the Learning Analytics Research Group (LARG), which brings together people from information technology, informatics, education, educational technology, etc. This work is with members of that group.

So, I want to show you a small snapshot of this type of work. We have two MOOCs to compare here. Firstly Principles of Macroeconomics, a classic staged linear course, with timed release of content and assessment at the end. The other course is Discrete Optimization which is a bit more tricksy… All of the content is released at once and they can redo assessments as many times as they want. There is a linear suggested path but they are free to explore in their own way.

So, for these MOOCs we measured a bunch of stuff and I will focus on how frequently different types of students watched and revisited video lectures across each course. And we used State Transition diagrams. These state transitions illustrate the probability of people transitioning from State A to State B – the footfall or pathways they might take…
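The counting behind a state-transition diagram can be sketched in a few lines. This is a minimal illustration only – the state names and the event log below are invented, not taken from Kennedy’s data – but the idea of estimating the probability of moving from State A to State B from an ordered log is the same:

```python
from collections import defaultdict

def transition_probabilities(events):
    """Estimate P(next state | current state) from an ordered event log."""
    counts = defaultdict(lambda: defaultdict(int))
    for current, nxt in zip(events, events[1:]):
        counts[current][nxt] += 1
    # Normalise each state's outgoing counts into probabilities
    return {
        state: {nxt: n / sum(nexts.values()) for nxt, n in nexts.items()}
        for state, nexts in counts.items()
    }

# Hypothetical log of one student's video activity in a MOOC
log = ["watch_v1", "watch_v2", "revisit_v1", "watch_v2", "quiz"]
print(transition_probabilities(log)["watch_v2"])
# From "watch_v2" this student moved to "revisit_v1" half the time, "quiz" half
```

In practice logs from many students would be pooled per cohort (browsed, did OK, did well, etc.) before comparing the resulting diagrams.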

We created these diagrams for both courses and for a number of different ways of participating: Browsed – did no assessment; Participated – did not do well; Participated – did OK; Participated – did well. And as outcomes improve, the likelihoods of these state transitions increase. And the Discrete Optimisation MOOC saw a greater level of success.

So, again, we see patterns of engagement suggesting different learning strategies or approaches. But there is a directional challenge here – it is hard to know if people who revisit material more, do better… Or whether those who do better revisit content more. And that’s a classic question in education, how do you address and help those without aptitude…

So, the first two examples show interesting fundamental education questions… 

Example 3: Surgical Skills Simulation 

I’ve been working on this since about 2006. And this is about a 3D immersive haptic system for e-surgery. Not only is the surgeon able to see and have the sensation of performing a real operation, but the probe being used gives physical feedback. This is used in surgical education. So we have taken a variety of metrics – 15 records of 48 metrics per second – which capture how they use the surgical tools, what they do, etc.

What we wanted to do was provide personalised feedback to surgical trainees, to emulate what a surgeon watching the procedure might say – rather than factual/binary feedback. And that feedback comes in based on their digital trace in the system… As they show novice-like behaviour, feedback is provided in a staged way… But expert behaviour doesn’t trigger this, to avoid that Microsoft paperclip type of experience.

So, we trialled the approach with and without feedback. Both groups have similar patterns but the feedback has a definite impact. And the feedback from learners about that experience is pretty good.

So, can we take meaningful information from this data? Yes, it’s possible…

I started with these big buckets of data from VLEs etc… So I have four big ideas of how to make this useful…

1. Following footprints can help us understand how students approach learning tasks and the curriculum more broadly. Not so much whether they understand the concept or principle they are working on, or whether they got something correct or not… But more their learning and study strategies when faced with the various learning tasks online.

2. If we know how students approach those learning tasks and their study, it gives us insight into their cognitive and learning processes… which we can link to their learning outcomes. This method is a wonderful resource for educational research!

3. Knowing how students approach learning tasks is incredibly useful for teachers and educational designers. We can see in fine detail how the educational tasks we create and design are “working” with students – the issue of pedagogical intent, response, etc.

4. Knowing how students approach learning tasks is incredibly useful for designing interventions with students. Even in open and complex digital learning environments we can use students’ digital footprints as a basis for individualised feedback, and to advise students on the approaches they adopt.

So, I think that gives you an idea about my take on learning analytics. There are ways we can use this in quite mundane ways but in educational research and working across disciplines we have the potential to really crack some of those big challenges in education.

Q&A

Q1) For the MOOC example… Was there any flipping of approaches for the different courses, or A/B testing? Was there any difference in attainment and achievement?

A1) The idea of changing the curriculum design for one of those well established courses is pretty difficult so, no. In both courses we had fairly different cohorts – the macroeconomics course… We are now looking at A/B testing to see how potential confusion in videos compares with more straightforward “David Attenborough, this is the way of the world” type videos, so we will see what happens there.

Q2) What…

A2) There is some evidence that confusion can be a good thing – but there is productive and unproductive confusion. And having productive confusion as part of a pathway towards understanding… And we are getting students from other disciplines looking at very different courses (e.g. Arts students engaging with chemistry courses, say) to cause some deliberate confusion but with no impact on their current college courses.

Q3) On that issue of confusion… What about the impact of learning through mistakes, of not correcting a student and the impact that may have?

A3) A good question… You can have false positives – we provide feedback but shouldn’t have – and false negatives – we don’t provide feedback but should have. With our system we captured our feedback and compared it with a real surgeon’s view on where they would/would not offer feedback. We had about 8% false positives and 12% false negatives. That’s reasonably good for a teaching exercise.
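To make the false positive/negative distinction concrete, here is a minimal sketch of how such rates could be computed, with the expert’s judgement as the ground truth. The helper function and the four-action sample are invented for illustration; the 8%/12% figures above come from the real system, not from this toy data:

```python
def feedback_error_rates(decisions):
    """decisions: (system_gave_feedback, expert_would_give_feedback) pairs.

    Returns (false_positive_rate, false_negative_rate) relative to the
    expert's judgement as ground truth.
    """
    fp = sum(1 for gave, expert in decisions if gave and not expert)   # fed back, shouldn't have
    fn = sum(1 for gave, expert in decisions if not gave and expert)   # silent, should have fed back
    neg = sum(1 for _, expert in decisions if not expert)              # expert would stay silent
    pos = sum(1 for _, expert in decisions if expert)                  # expert would give feedback
    return fp / neg, fn / pos

# Invented example: four trainee actions judged by the system and by an expert
sample = [(True, True), (True, False), (False, True), (False, False)]
fp_rate, fn_rate = feedback_error_rates(sample)
print(fp_rate, fn_rate)  # 0.5 0.5 on this toy sample
```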

Q4) How do your academic colleagues respond to this, as you are essentially buying into the neoliberal agenda about…

A4) It’s not a very common issue to come up – it’s surprising how little it comes up. In terms of telling teachers what they already know – some people are disheartened by you providing empirical evidence of what they already know as experienced teachers. You have to handle that sensitively, but many see it as reinforcement of their practice. In terms of replacing teachers… These are helper applications. The feedback side of things can only be done in a very small way compared to the richness of a human, and these tend to be more triage-like applications that form a small part of the wider curriculum. And often those systems are flagging up the need for a richer interaction or intervention.

Q5) Most students think that more time on task maps to more success… And your MOOC data seems to reinforce that… So what do you do in terms of sharing data with students, and especially students who are not doing as well?

A5) It’s not my research area but my colleague Linda does work on this and on dashboards. It is such a tricky area. There is so much around ethics, pastoral care, etc.

Students with high self-efficacy but behind the pack will race to catch up and may exceed it. But students with low self-efficacy may drop back or drop out. There is educational psychology work in this area (see Carol Dweck’s work) but little on learning analytics.

But there is also the issue of the impact of showing an individual their performance compared to a group, to their cohort… Does that encourage the student to behave more like the pack which may not be in their best interests. There is still a lot we don’t know about the impact of doing that.

Q6) We are doing some research here with students…

A6) We have a range of these small tasks and we ask them on every screen how difficult the task is and how confident they feel about it, and we track that along with other analytics. For some tasks confidence and confusion are very far apart – very confident and not confused at all – although that can mean you are resistant to learning. But for others, each screen sees huge variation in confidence and confusion levels…

Q7) Given your comments about students not doing what they are expected to do… Do you think that could impact here. Like students in self-assessments ranking their own level of understanding as low, in order to game the system so they can show improvement later on.

A7) It’s a really good question. There isn’t a great motivation to lie – these tasks aren’t part of their studies, they get paid etc. And there isn’t a response test which would make that more likely. But in the low confusion, high confidence tasks… the feedback and discussion afterwards suggests that they are confused at times, and there is a disjoint. But if you do put a dashboard in front of students, they are very able to interpret their own behaviour… They are really focused on their own performance. They are quite reflective… And then my colleagues Linda and Paul ask what they will do and they’ll say “Oh, I’ll definitely borrow more books from the library, I’ll definitely download that video…” and six weeks later they are interviewed and there is no behaviour change… Perhaps not surprising that people don’t always do what they say they will… We see that in learning analytics too.

Q8) [Didn’t quite catch all of this but essentially about retention of students]

A8) We have tried changing the structure and assessment of one of our courses, versus the first run, because of our changed understanding from analytics. And we have also looked at diagnostic assessment in the first three weeks of a course as a predictor of later performance. In that you see a classic “browsing in the book store” type behaviour – we are not concerned about them. But those who purchase the first assessment task – we can see they can do well and are able to… And they tend to stick with the course. But we see another type – a competent crowd who engage early on, but fall off. It’s those ones that we are interested in and who are ripe for retaining.
