How to become a Learning Designer, assess your skills and get feedback on what you need to improve

I have been contacted recently by many people who are interested in becoming Learning Designers but are unsure whether they have the right skills or where to go to learn more about the role.

I am launching a scorecard to help people identify the best ways to get their first role by assessing what skills they need, even if they are nervous that they are not yet ready.

It is simple: you answer 15 yes-or-no questions and automatically receive customised recommendations.

I have just set it up, so I would love some feedback. 

Here is the link: https://learningdesigner.scoreapp.com/

Fix what is broken before you start something new

New is shiny, new is exciting, and new is noteworthy, but if the new thing is not your limiting factor, adding it will only make things worse. Most success in life and work comes from doing the things you know you should be doing but aren’t.

To quote The Phoenix Project, ‘Any improvement that is not at the constraint is an illusion.’ We tend to think that to grow or improve, we need to add something new, but if that new thing is not the constraint, it is unlikely to make a positive difference.

The Theory of Constraints is a methodology for identifying the most important limiting factor (i.e., constraint) that stands in the way of achieving a goal and then systematically improving that constraint until it is no longer the limiting factor.

Leanproduction.com

Imagine the goal is to develop more online courses, but you are having trouble getting Subject Matter Experts (SMEs) to commit to their projects and deliver the required inputs on time. You might need better onboarding, clearer guidance on time commitments, better project management processes, improved project selection, or more straightforward tools and advice for the SME. From the outside, the expected route to improving output would be to add more Learning Designers and take on more projects, when what you first need to address is why SMEs are not committing.
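As a rough sketch in code (the pipeline stages and capacity numbers below are entirely made up for illustration), the constraint idea boils down to output being capped by the slowest stage, so improving anything else changes nothing:

```python
# A minimal sketch of the Theory of Constraints applied to a course pipeline.
# The stage names and capacities are hypothetical, purely for illustration.

pipeline = {
    "SME input": 2,            # courses' worth of SME material per month (the constraint)
    "learning design": 6,      # courses a design team can work on per month
    "media production": 5,     # courses the media team can support per month
    "QA and release": 8,       # courses that can be quality-assured per month
}

def throughput(stages):
    """Overall output is limited by the slowest stage (the constraint)."""
    return min(stages.values())

print(throughput(pipeline))  # 2 - limited by SME input

# Hiring more Learning Designers improves a non-constraint and changes nothing.
pipeline["learning design"] = 10
print(throughput(pipeline))  # still 2

# Fixing SME onboarding and commitment raises the constraint and lifts output.
pipeline["SME input"] = 4
print(throughput(pipeline))  # 4
```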

Better, More, New

Alex Hormozi explains this theory using the leaky bucket analogy. Think of your project or business as a bucket and the water as your customer. To handle more water, you first have to fix the holes in your bucket, and then you can increase the flow of water and finally add new buckets. Better, more, new.

Try it

Alex suggests the following activity: list the 25 things you should be doing to improve your work, then complete these things before thinking of doing something new.

Let me know how it goes…

The Student Futures Commission

To mark the launch of the Student Futures Commission, the UPP Foundation, using Cibyl as a research partner, sent out a survey to 1.5 million students at over 140 institutions to understand their university experience during the pandemic. 2,147 students responded between 14 and 19 May 2021. Like the rest of us, students miss face-to-face community. 

Students want universities to prioritise a return to in person teaching and are missing face-to-face interaction around their wider student experience, according to a major new survey.

Student Futures Commission

The key findings: 

  • The preferences for study structure next year: 
    • 45% mostly in-person with online teaching once or twice per week. 
    • 29% fully face-to-face
    • 21% mostly online
    • 6% fully online
  • The majority of students did not participate in any extracurricular activities this academic year
  • 63% believe the pandemic has negatively affected them academically
  • 48% do not believe they have missed any aspect of teaching despite disruptions to delivery
  • 72% are neutral or satisfied with changes to academic assessments
  • 65% believe their course will still help them find a job. 

The full data set can be downloaded from the UPP Foundation website.

How would you go about becoming an expert at designing online learning?

I read a tweet this morning that asked: if you could be in the top 1% of experts for any skill, what would it be? I have been building my skills in the design of online learning for several years, so it got me thinking about what expertise looks like in my field. I wrote the following question at the top of a page and started to make a list. 

How would you go about becoming an expert at designing online learning? 

Here are my steps to developing expertise in the design of online and blended learning courses.

  1. Follow a documented set of learning and design principles
  2. Develop a model for estimating effort and costs
  3. Follow a repeatable development process
  4. Know the fundamentals of project management and follow them religiously
  5. Treat the course creator like the hero of the story, support them and collaborate.
  6. Have a Quality Assurance process linked to the design principles
  7. Set clear expectations for students, create metrics to monitor against these, and have interventions in place when they are not met.
  8. Collect and analyse lots of data and user feedback
  9. Iterate, iterate, iterate
  10. Frequently update your learning and design principles, costing model, and development process

Notes: Firstly, I have explicitly focused on the design of courses and separated this from the very different development and delivery skills. Secondly, I have taken some liberties by putting all the learning and design principles into a single step. These two areas are vast and cover everything from accessibility and user experience to psychology and learning and teaching models. Thirdly, within the third step of following the development process, I currently prefer to use a rapid prototyping model that follows the design thinking steps, including the creation of student personas, and UCL’s ABC workshop for mapping out the course. Finally, this is a first attempt at a list, and I might wake up tomorrow, realise I have missed a whole section of the field, and need to update it. If you are in the area already or are interested in developing your expertise, then I hope this list is useful.

If you have questions or want to add to the list, message me on Twitter. I would love to see other people’s lists for building expertise in the design of online courses too.

Refactoring, Reuse, and Learning Design

The increase in digital technology in many fields has also brought software engineering language and practices into these areas. I studied Information Systems and Management at university, so I am more guilty than most of this trend. I have introduced rapid prototyping, the Capability Maturity Model, and daily stand-ups to my team’s work, to name just a few.

Refactoring: The process of changing a software system in such a way that it does not alter the external behaviour of the code, yet improves its internal structure.

Martin Fowler

This week I came across a company using ‘refactoring’ as a learning design term. In software engineering, programmers use refactoring to describe going back to old code and cleaning it up. Refactoring is a continual process of improving code and reducing the number of lines while maintaining functionality. Code is usually written quickly to solve a functionality problem, so programmers revisit it and rewrite it to run more efficiently. Reuse is a significant part of the refactoring process, where a programmer copies code from another program to quickly replicate its functionality and simplicity somewhere else.
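For readers outside software, here is a tiny, hypothetical example of what refactoring looks like in practice: the function’s behaviour (its inputs and outputs) stays the same, but its internal structure is cleaned up:

```python
# Before refactoring: hastily written, with duplicated looping logic (invented example).
def grade_summary(scores):
    total = 0
    for s in scores:
        total = total + s
    average = total / len(scores)
    passed = 0
    for s in scores:
        if s >= 40:
            passed = passed + 1
    return average, passed

# After refactoring: same inputs, same outputs, cleaner internal structure.
def grade_summary_refactored(scores):
    average = sum(scores) / len(scores)
    passed = sum(1 for s in scores if s >= 40)
    return average, passed

# External behaviour is unchanged, which is the whole point of refactoring.
assert grade_summary([35, 55, 70]) == grade_summary_refactored([35, 55, 70])
```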

The company, which I will not name, used refactoring to describe unbundling a course and then restacking it into different offerings: splitting a degree or Master’s into its separate modules and then offering these individually, or in groups, to other potential students. This might mean offering the first 180 credits of a degree, or packaging a combination of modules from all three years into a certificate. It might also mean offering working professionals the option of studying a single module that they need for work. The idea they tried to get across was that universities already have these bundles of modules that can be rearranged into courses that attract a wider audience, but I do not think the term quite works.

Reuse: An Engineering strategy where the development process is geared to reusing existing software

Ian Sommerville

After the session, I revisited my university notes to see whether this use of ‘refactoring’ was a marketing effort taking liberties with the term. I came across a line, which I am not sure is mine or a direct quote from a book, saying that refactoring ‘allows us to think about reuse of previous components or looking at alternative ways of doing things.’ Reuse is borrowing code from existing software to reduce the amount of new code required when developing a new system. I can’t help but feel that ‘reuse’ is a better technical term for what was implied, but it is not as flashy. 
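If I put the unbundling idea into code-shaped terms, it looks far more like reuse than refactoring: existing modules are taken as they are and recombined into new offerings, without changing their internals. A hypothetical sketch (the module codes and bundle names are invented):

```python
# A hypothetical sketch of 'reuse' applied to course unbundling: existing modules
# are recombined into new offerings without changing the modules themselves.

masters_degree = {
    "LD101": "Foundations of Learning Design",
    "LD102": "Learning Technologies",
    "LD201": "Assessment and Feedback",
    "LD202": "Course Production at Scale",
}

def bundle(modules, codes, title):
    """Reuse existing modules by packaging a subset as a new offering."""
    return {"title": title, "modules": {code: modules[code] for code in codes}}

# The same modules, reused in smaller bundles aimed at different audiences.
certificate = bundle(masters_degree, ["LD101", "LD102"], "PGCert Learning Design")
single_cpd = bundle(masters_degree, ["LD201"], "CPD: Assessment and Feedback")

print(certificate["title"], list(certificate["modules"]))
print(single_cpd["title"], list(single_cpd["modules"]))
```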

Let me know on Twitter if I am wrong, or if you want to share other terms taken from software engineering and misused elsewhere (think Agile). We can have a group eye roll.

n.b. The rest of the presentation was excellent, and they had great ideas.

The Expectation Gap Survey

WONKHE and Pearson today released the analysis of their second Student Expectation Gap survey. The survey was available throughout December 2020 and covered English and Welsh universities, with 3,389 student responses. Students have understood the situation academics are in and are satisfied with their responsiveness to feedback and support requests; however, only 40% agree that their experience has been of sufficiently good quality.

What we take from the findings is that among the students we surveyed, the fundamentals are generally in place. Teaching staff seem to be (mostly) engaging and responsive, and though some students flagged specific frustrations about learning remotely, most reported good access to learning resources.

WONKHE

The responses showed that 46% of courses were delivered entirely online, and a further 14% started with some face-to-face teaching and then moved entirely online during the term. Only 33% of students had campus-based sessions throughout the period. 80% of students had fewer than 10 hours of timetabled sessions per week, and 17% had fewer than two hours (mostly PGT); the rest of their courses were independent study.

The pandemic has accelerated the move to technology-enhanced learning. According to this survey, students are open to keeping the changes once the government lifts the social distancing rules. Universities now have the challenge of assessing what delivery looks like post-COVID. They must decide what should be retained in the short term, what to develop for the longer-term strategically, and what to remove.

The survey suggests students want:

  1. More significant interaction between students on campus, supplemented online through discussion forums
  2. More contact time with tutors in the classroom, online in seminars, through remote check-ins with tutors, and via email.
  3. Encouragement and support to become independent learners through online formative self-assessment, more frequent assessments, and progress reviews that indicate how they are performing on the course.
  4. A more consistent approach to teaching across modules
  5. The campus and classrooms used for interactive tasks and activities, practical experiences, lab-time, and fieldwork. 
  6. Online learning used to add flexibility, remove constraints around scheduled contact hours, and enhance learning delivery.
  7. Better user experience (UX) design of the VLE to improve signposting and set expectations around learning.
  8. Content broken into manageable chunks interspersed with a large variety of activities and knowledge checks.
  9. Online access to wellbeing, careers, and academic support services.
  10. More skills development through independent study learning activities for academic writing, digital learning, project and time management, the confidence to engage with groups, information literacy, and independent learning.

You can read the summary and the research findings on the WONKHE website.

Group size and interactions in online courses

The Open University (OU) in England was set up in 1969 by the UK government to widen access to higher education. The university has over 160,000 students, almost all studying ‘off-campus’, currently categorised as distance learning in the HESA data, but this term may need updating. The OU has had a long-standing principle of splitting cohorts into groups of 25 students. With almost all UK courses currently delivered entirely online due to a lockdown, I want to know what effect group size has on interaction levels. Is there an optimum group size for highly interactive online courses?

Cohort numbers are important: we want to run courses with lots of interaction, where students engage in active and collaborative learning that improves their outcomes. It is also vital to keep costs down by controlling the volume of staff interaction provided, so classes are sustainable and represent value for money. And we want a balance for students: enough opportunities for interaction without them feeling lost and disconnected.

My first search found a great quote from a 1969 paper in The Journal of Social Psychology: ‘...as group size increases, individual participation decreases.‘ While this paper looked at on-campus, free discussion within small groups, it was a good starting point. In a group of two, students have to be highly engaged, whereas a group of five gives individuals space to hide or take a step back. 

However, anecdotally from my teaching days, sometimes larger groups can create exciting conversations and develop a social norm of participation that does not happen in smaller groups. I assume that optimum group size might differ for synchronous and asynchronous learning activities, between different pedagogic approaches, teacher expectations and interaction levels, and technical and non-technical subjects.

Group sizes

Some recommended sizes include Sieber’s (2005) 12 for instructors new to teaching online and Tomei’s (2006) suggestion of 12 for postgraduate courses. Colwell and Jenks (as cited in Burruss, Billing, Brownrigg, Skiba, & Connors, 2009) suggest an upper limit of 20 for undergraduate and 8 to 15 for postgraduate. In a paper by Parks-Stamm et al. (2017), student interaction in classes of 14 or fewer students increased with more instructor participation, but this mattered less in larger groups of 15-30 students. Orellana (2006) states that 16 was perceived by academics teaching online as the optimal group size for achieving the highest level of interaction.

An Inside Higher Ed article interviewed several American universities with established online portfolios about optimum group size. The University of Massachusetts at Lowell has 28,000 online enrollments; it caps undergraduate classes at 27 students and postgraduate courses at 25. Granite State College in New Hampshire keeps group sizes between 12 and 15 students, and at the other end, Brigham Young University-Idaho’s average class size is 37. WCET, a digital learning policy group for universities, sets a ‘rule of thumb’ of 20-25 students.

Initial recommendations

In my short search I could not find anyone recommending group sizes of over 27 students, but there were many suggestions that group size is not the best metric to use on its own. A sensible approach is to start with the OU’s suggested groups of 25 students and then monitor each group: student performance, withdrawals, instructor response time, engagement measures (including the volume of student/instructor interactions), and student feedback. This data will allow you to assess whether the group size, interaction levels, and course design meet the students’ learning and social needs. You could also provide regular opportunities for smaller groups of 2-3 students to work together, for those who would benefit from more intense interaction.
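As a rough sketch of what that monitoring might look like in practice (the data fields and thresholds below are assumptions, not a recommendation), you could aggregate a few simple measures per group and flag the ones that need attention:

```python
# A rough sketch of per-group monitoring, with hypothetical fields and thresholds.

from statistics import mean

groups = {
    "Group A (25 students)": {
        "posts_per_student": [4, 0, 7, 2, 5, 1, 3],   # sample of weekly forum posts
        "withdrawals": 1,
        "instructor_response_hours": [6, 20, 12],
    },
    "Group B (25 students)": {
        "posts_per_student": [0, 1, 0, 2, 0, 1, 0],
        "withdrawals": 3,
        "instructor_response_hours": [30, 48, 40],
    },
}

def flag_group(metrics, min_posts=2.0, max_withdrawals=2, max_response=24):
    """Flag a group whose engagement or support measures look concerning."""
    flags = []
    if mean(metrics["posts_per_student"]) < min_posts:
        flags.append("low interaction")
    if metrics["withdrawals"] > max_withdrawals:
        flags.append("withdrawals")
    if mean(metrics["instructor_response_hours"]) > max_response:
        flags.append("slow instructor response")
    return flags

for name, metrics in groups.items():
    print(name, flag_group(metrics) or "no flags")
```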

Using Abbing’s brand model to develop a service offer

University leadership teams are currently planning what delivery will look like next academic year. A form of blended learning will likely be maintained even if social distancing rules are relaxed. Educational technology and academic development teams will need to restructure their services to provide academic departments with the support they need to transition from this year’s delivery model to a more sustainable and quality-driven model for the future. But what does that service offer look like, and how can it be designed to give academic teams the freedom to explore what this new future looks like?

Author/Copyright holder: Erik Roscam Abbing. Copyright terms and licence: CC BY-NC-SA 2.0

Erik Roscam Abbing’s brand model could be used by EdTech teams as a starting point for creating their new service blueprint. The first step is to map out the team’s own identity, vision, mission, and behaviours. An understanding of the Capability Maturity Model can also feed into the team’s desired brand. I have added below my current thoughts on the first phase for my team. If you have any questions or want to collaborate on ideas, get in contact with me on Twitter @samueljtanner

Team Identity

We have moved towards a Learning Design skill set in the team rather than the more traditional Learning Technologist role. Each member of the team would consider themselves a ‘techie’ and has expertise that sits somewhere in the nexus of three core skill areas: learning and teaching, multimedia and technology development, and design. Learning Designers operate as project managers, follow design thinking methodologies using personas and prototypes, and adopt a scholarly approach to quality assurance and continuous improvement.

Vision

We believe in the transformational nature of technology, and that learning and teaching can be made better when technology is used to design student-centred experiences. Technology allows learning and teaching to be:

  • Flexible: accessible to anyone who wants to learn, at whatever stage of life they are, and whatever their context.
  • Personalised: designed to meet students’ individual goals and provide choice as these change.
  • Active and collaborative: engaging learning experiences that prepare students with the skills they need for the workplace, including problem-solving, teamwork, communication, and resilience. 
  • Redefined: using technology to create student experiences that were previously impossible due to physical constraints.

Mission

By 2025, all students will have a flexible, personalised, and active and collaborative learning experience that uses technology to provide better learning outcomes.

Behaviour

We are: 

  • partnering with academic teams to co-design modules and courses
  • defining what quality looks like and how to get there sustainably 
  • sharing ideas of what is possible and what works
  • building an easy-to-use and seamlessly integrated technology ecosystem that provides the tools needed 

My ideas will be different from yours

The ideas here are just a brain dump around the direction I am taking my team, but I suggest using the same framework for your institution. Phase two will look at the identity, vision, mission, and behaviour of those teaching at university. My team is a service for academic departments to help them teach students, and so our customers are the lecturers. It is a time of disruption for the role of academics, and the answers to the questions in phase two will be very different now than six years ago when I moved from further education to the university sector. I have some research to do, but I imagine that the brand promise will be something along the lines of… 

Brand promise: Your Learning Designer will help you design, develop, and deliver a flexible module more quickly and easily, and with a better student experience, than if you had done it independently.

The Coursera quality metrics

I came across these quality metrics from Coursera in some reading today and thought they were interesting.

Components of quality metrics definitions

Engagement (Completion rate): The proportion of completion-eligible learners who complete courses and items

Satisfaction (Rate of 5-star reviews): The proportion of star ratings – given by course completers – that are a perfect five stars. This metric captures more variability than average star ratings.

Skill development (Average score delta): The average increase in skill scores, demonstrated in graded assignments and projects in the course.

Career outcomes (Rate of career benefit): The proportion of completers, responding to our survey, who report receiving career benefits from the course.

Coursera
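To make these definitions concrete, here is a minimal sketch, with made-up numbers, of how the four proportions might be calculated; it also shows why the rate of 5-star reviews spreads courses out more than the average rating does:

```python
# A minimal sketch of the four metric definitions above, using made-up data.

ratings_course_x = [5, 5, 5, 4, 5, 5, 4, 5, 5, 5]
ratings_course_y = [5, 4, 4, 5, 4, 5, 4, 4, 5, 4]

def completion_rate(completers, eligible):
    """Engagement: proportion of completion-eligible learners who complete."""
    return completers / eligible

def five_star_rate(ratings):
    """Satisfaction: proportion of ratings that are a perfect five stars."""
    return sum(1 for r in ratings if r == 5) / len(ratings)

def average_rating(ratings):
    return sum(ratings) / len(ratings)

def skill_delta(scores_before, scores_after):
    """Skill development: average increase in skill scores on graded work."""
    return sum(a - b for a, b in zip(scores_after, scores_before)) / len(scores_before)

def career_benefit_rate(reporting_benefit, respondents):
    """Career outcomes: proportion of responding completers reporting a benefit."""
    return reporting_benefit / respondents

print(completion_rate(620, 1000))                                          # 0.62
print(average_rating(ratings_course_x), average_rating(ratings_course_y))  # 4.8 vs 4.4
print(five_star_rate(ratings_course_x), five_star_rate(ratings_course_y))  # 0.8 vs 0.4
print(skill_delta([40, 55, 60], [70, 75, 80]))                             # ~23.3
print(career_benefit_rate(73, 100))                                        # 0.73
```

Note how the two example courses sit close together on average rating (4.8 vs 4.4) but far apart on the rate of 5-star reviews (0.8 vs 0.4), which is the variability point Coursera makes.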

In the paper Great online learning outcomes happen by design, Coursera states that ‘completion rates among most populations of learners are substantially higher than 50% and can be far higher in courses that adhere to Coursera pedagogy best practises‘. For student satisfaction, they state, ‘The average star rating across courses on Coursera is 4.7 out of 5 stars’. For career outcomes, 73% of responding students claim a positive job-related outcome. 

The engagement and satisfaction numbers are far lower than what we would see in a good university course. This is before you take into account that satisfaction scores at universities are usually taken mid-year and include students who will not complete, whereas Coursera only asks completers. However, it is worth noting that these numbers are a dramatic improvement from the early days of MOOCs, when 15% completion rates were not unheard of and satisfaction was low. What is more impressive is that Coursera is operating at scale, with 70 million students making up nearly 200 million online enrollments on over 4,000 courses provided by around 200 different universities.

These improvements are down to investment from Coursera and other MOOC providers in delivery models and quality. It is worth watching these large providers, with their massive data sets, intense focus on the student experience, and lower costs. Perhaps MOOCs might yet deliver on the promise people initially hoped they would.

Read the full paper, Great online learning outcomes happen by design, on Coursera’s website.