Guest post from Chris Melia, Learning Technologist, University of Central Lancashire. DigiReady: Equipping our students for the modern workplace

Chris is a Learning Technologist working directly with UCLan’s Faculty of Health and Wellbeing as digital lead. Currently an MIE Expert, MIE Trainer and Microsoft Certified Educator, Chris is an active member of the Microsoft community, regularly contributing to and advocating the use of technology solutions in education. He played a key role in the deployment and development of Microsoft Surface technology across the University’s academic community. Chris is currently leading a new initiative which recognises the development of 21st-century digital skills to support students in their future employment. He is also an active member of the Association for Learning Technology (ALT) and holds Fellowship of the Higher Education Academy (HEA).

Picture by Chris Bull for the Association for Learning Technology, 13/9/18. ALTC 2018, day three. www.chrisbullphotographer.com


It has never been more important to prepare our students for the modern workplace by equipping them with future-facing digital skills. The Jisc digital experience insights survey 2018 found that only 41% of students surveyed felt their course of study ‘prepared them for the digital workplace’, and only 40% agreed that they had regular opportunities to ‘review and update their digital skills’. At the University of Central Lancashire, the Technology Enabled Learning and Teaching (TELT) team have worked closely with our academic community to address these growing concerns.

“The 21st century demands ‘21st century skills’. Our students are embarking on career paths which are invariably changing at a rapid pace, particularly in relation to technology. Many of our students are aspiring to become teachers of Physical Education and being ‘digitally literate’ is a crucial prerequisite of employability in this domain.”
Andrew Sprake – Lecturer in Physical Education
Jess Macbeth – Senior Lecturer in Sport Studies

It was identified across several disciplines that students were already demonstrating a range of digital skills, often without realising it themselves and usually without any kind of formal recognition that could aid their future employability.

“At UCLan, the midwifery curriculum prides itself on allowing students to develop digital literacy skills and encouraging them to be ‘digitally ready’ when seeking employment. With employers often stating that digital skills are an essential requirement for those applying for the role of a newly qualified midwife, the team see this as an essential component of the midwifery course. Lecturers lead by example, by ensuring that all teaching and learning resources are delivered using digital approaches, encouraging the students to engage with these methods of learning.”

Neesha Ridley – Senior Lecturer in Midwifery

Our approach was to develop ‘DigiReady’, a new student certification underpinned by the Jisc digital capabilities framework and adapted from a more recent Microsoft tool. Implemented at course level, it centres on eight core skills, ranging from effective communication to online safety/security and profile management.

Students build up evidence of skill development across these areas, recording it in an e-portfolio using OneNote Class Notebook. Evidence can include annotated screenshots, audio/video reflections and participation in interactive activities. The e-portfolio gives tutors instant access to each student’s individual space, where they can monitor progress and provide valuable feedback along the way. Students are also asked to build a short presentation or video answering three reflective questions, drawing on evidence from their portfolio and their overall DigiReady journey. This final digital artefact aims to provide a valuable resource to support students’ future employability.

Jean Duckworth and Hazel Partington, both academics at the University, lead a suite of online Masters-level courses and modules. Students arrive with a range of digital skill levels: some may have studied or worked in a technology-rich environment, while others have not yet developed their skills in this area. The team begin each course with an intensive induction covering Outlook, Microsoft Teams, Skype, Adobe Connect and the University’s virtual learning environment. By doing this, students get to know each other and gain the tools to engage fully in their studies. These skills are developed further as students progress through the course or module.

While the initiative is still in an early pilot phase that will inform its future development, we know it will play a crucial role in the digital development of our students.

“Employers often comment on how ‘digitally ready’ students from our University are when they apply for jobs. By encouraging students to embrace and develop digital literacy, we are equipping them with lifelong skills that they will use throughout their careers, so they are confident and competent to provide care using digital systems in the workplace.”
Neesha Ridley – Senior Lecturer in Midwifery

“We expect the introduction of the DigiReady programme will showcase the development of important 21st century skills to employers, stakeholders or course providers.”
Hazel Partington & Jean Duckworth – Senior Lecturers in the Faculty of Health and Wellbeing

“Initial feedback from the pilot programme has been overwhelmingly positive and will inevitably set our students apart.”
Andrew Sprake – Lecturer in Physical Education
Jess Macbeth – Senior Lecturer in Sport Studies

Guest blog post from Non Scantlebury, Academic Engagement Manager, The University of Hertfordshire: our co-host and organiser for the 4th Jisc building digital capability community of practice event held on 21 November 2018


Non Scantlebury

Those who were able to attend the Jisc building digital capability community of practice event at the University of Hertfordshire last month were treated to a veritable feast of activities taking place across the further education (FE) and higher education (HE) sectors on the theme of ‘digital skills and capabilities’!

The insightful and challenging keynote, delivered by Professor Sal Jarvis and Dr Karen Barton, illustrated how impactful the experience of working in partnership with Jisc had been at a local level and how much our collaborative working has contributed to strategic engagement with digital capabilities development at the University of Hertfordshire.

Our participation in the Jisc student digital experience insights service (previously known as the student digital experience tracker) and the discovery tool pilots coincided with a major transition to a new Canvas-based learning environment. The speakers shared how these initiatives had come together to influence our thinking and inform our evolving vision of becoming a fully ‘digitally capable organisation’. Like many in our sector, we have much excellent practice to celebrate, while also acknowledging the challenge of supporting scalable and flexible digital capabilities development as we move forward.

Local findings from our insights survey chimed with those highlighted in the national 2018 digital experience insights reports. In particular, when students at the University of Hertfordshire were asked about their experience of the digital technologies used on their course, they responded:


Figure 1: Student responses when asked about their experience of digital technologies used on their course

This showed how much they appreciated the use of these technologies in their course experience, but it was balanced by a finding that they did not always see the link between the use of these technologies and their impact on their employability skills:


Figure 2: student perceptions of importance of digital skills

The implementation of a new Learning Environment system had also provided a refreshed opportunity to review academic staff training and development in relation to course design and digital capabilities development.

Dr Barton outlined the ‘Learning Landscape’ approach, developed by staff working in the Learning Technology and Innovation Centre, to support the delivery of academic training.

The method and approach have now been adapted and extended to meet the professional development needs of all our staff, and will include digital capabilities development in the future.


Figure 3: University of Hertfordshire Learning Landscape approach

Specific approaches to training academic and academic support staff in the effective use of digital technologies for learning and teaching are delivered through our Guided Learner Journey module developed on the Canvas platform.

To engage academics, we have formed support teams of expert staff aligned to each school which deliver workshops, presentations and literally ‘knock on doors’ and ask if anyone wants any help! These teams comprise educational technologists, learning design champions, student technology mentors and librarians and have proven very effective in supporting the embedding and sharing of new practice!

Gillian Fielding shared the benefits to organisations of participating in the UCISA digital capabilities survey. By benchmarking local provision across the broader sector, it has helped members to prioritise and stimulate strategic thinking. The most recent results point to the need to include digital capabilities in strategic planning, to increase engagement from other departments such as human resources, to stipulate digital capabilities more clearly in job descriptions and staff recruitment processes, and to prioritise the delivery of digital skills for students as a key employability outcome.

As well as being treated to an update from Jisc on their digital wellbeing project and plans for their building digital capabilities services, we were offered a full ‘smorgasbord’ of intriguing PechaKucha presentations and were dazzled by the awesome timekeeping abilities of those presenting against the clock!

The format was particularly praised as a method to share excellent practices happening everywhere.

Dr Fiona Handley, University of Brighton, shared research she had led investigating staff-student partnerships, highlighting the types of activities students get involved in, such as creating digital content (videos), delivering digital skills training sessions, giving technical advice and presenting at conferences. She highlighted that there is still some way to go: most current projects are led by professional staff rather than led and driven by students. This highlights a need to consider how we might redress the balance and increase opportunities for more equal or student-driven partnerships.

Joe Wilson, City of Glasgow College, introduced us to the City Learning 4.0 initiative, aimed at preparing students for citizenship and industry. Creative approaches such as ‘Digital Mondays’ and Google Educators have been important in stimulating engagement among staff and students.


Figure 4: PechaKucha presentation from City of Glasgow College

Terese Bird, Leicester Medical School, touched on the importance of adopting creative and considerate digital practices when designing and delivering digital learning.


Figure 5: Step Change: the game of organisational digital capability

Clare Killen and I then delivered an interactive board game session based on an adaptation of the Jisc framework for organisational digital capabilities and its four-step model of strategic steps, combined with a range of community-based case studies. The activity aimed to stimulate groupwork and problem solving while participants explored the four key steps of the framework through conversation, shared practice and reflective dialogue. We certainly got great feedback, and you can now download a copy of the game resources from the University of Hertfordshire open access repository!

We rounded the day off with more stimulating PechaKuchas, including one from Sarah Sherman of the Bloomsbury Learning Environment, who described their consortium-based approach to developing a brilliant Moodle course that prepares both staff and students to be ‘digitally ready’ for learning and teaching, piloting in 2019.

Finally, James Duke, Bishop Grosseteste University, shared the work they had done on building their organisational model, and Alicia Wallace, Gloucestershire College, showcased how the college engages with the Jisc framework at critical points in staff induction and the annual review process to set goals and targets and deliver focused training and development.


Figure 6: Bishop Grosseteste University organisational model


Figure 7: Gloucestershire College PechaKucha presentation

All the presentations, resources and outputs from the day are available from the Jisc building digital capabilities events page!

Non Scantlebury
Academic Engagement Manager
The University of Hertfordshire
@Nondigilib


Register now for the digital capability community of practice – 21 November 2018

We are delighted to be working in collaboration with the University of Hertfordshire to jointly run this event, which will take place on 21 November 2018 at the Fielder Centre.

This event is our fourth community of practice event for staff and leaders responsible for developing digital capability in their organisations.

18 months on from the launch of this vibrant community, the network event offers a unique opportunity for colleagues to work together on all aspects of digital capability for staff and/or students. The network has over 100 active participants and an emphasis on leadership by the community, learning from each other and sharing what works.

View our event page for programme details and how to register. Please note that registration is free but booking is required.

Visit our website to find out more about the community and previous meetings.

Building digital capability service now launched

We have had a busy summer preparing for the launch of our new building digital capability service and we would like to share with you some updates from the team.

Next Community of Practice event
Registration is now open for our next Community of Practice event, running on Wednesday 21st November at the University of Hertfordshire. Everyone is welcome, please register now from our event page and we hope to see many of you there!

New website
Our new Building digital capabilities website has now been launched and is available from https://digitalcapability.jisc.ac.uk. You can access our range of advice and guidance from here as well as find out more information about the additional benefits the service offers, including the discovery tool, and how to subscribe. We have also developed a new role profile for Professional services staff in education, available with the other role profiles from https://digitalcapability.jisc.ac.uk/what-is-digital-capability/.

For subscribers to the service additional pages in a ‘logged in’ section provide access to pathways through our advice, guidance, tools and resources.

Discovery tool
The updated version of the discovery tool is also now live at: https://jisc.potential.ly/ – with a free reduced version available to all staff (please note this is NOT for students as only the question set for staff is available in this free version).

The full version of the tool is available for staff and students within subscribing institutions, see our website for more information on what is included in both free and full versions.

If you have any queries or feedback for us please contact us at: DigitalCapability@jisc.ac.uk

Notes and presentations from the 3rd Digital capabilities community of practice event – 22 May 2018

University of Leicester, College Court fountain

Photo credit: Heather Price

The University of Leicester provided a great setting for our third community of practice meeting. With eighty-five delegates participating in person and many more joining in online (using the hashtag #digitalcapability), this was one of our most vibrant and productive meetings to date.

This is a brief summary of the event. All links to slides, recordings and other outputs from these sessions are available from the Jisc event page.

Dr Ross Parry, Associate Professor and Deputy Pro Vice Chancellor (digital) at the University of Leicester, set the scene for the day with his opening keynote, Digital capabilities as a strategic priority. He talked about the importance of creating a shared vision and gave a number of insights gained from his experience of developing and implementing the University’s digital strategy. He said: “You can have all the tech in the world, but it’ll make little difference if you don’t also have a community with the confidence and fluency to use it in creative and exciting ways”. (Watch a recording of Dr Ross Parry’s keynote)

The three parallel community-led sessions focussed on practical strategies to engage students, senior leaders and human resource teams. This was an opportunity for participants to share their experiences, discuss with colleagues and identify opportunities for collaboration.

Students – Facilitated by Frances Deepwell, Director of Leicester Learning Institute, University of Leicester and Natalie Norton, Head of Technology enhanced learning and digital literacies, University of Winchester. (Padlet notes on practical strategies to engage students)

Senior Leaders – Facilitated by Dr Ross Parry, Deputy Pro Vice Chancellor (digital), University of Leicester and Dr Leigh Casey, Associate Director Organisational Development, University of Leicester

Human resource teams – Facilitated by Sarah Knight, Head of change – student experience, Jisc (Padlet notes on practical strategies to engage human resource teams)

These were followed by the first set of four pecha kucha sessions:

  1. Future facing learning – Paul Durston, Teesside University
  2. Digital Leadership for Students: Development of an online resource – Vikki McGarvey, Learning and information services manager, Staffordshire University Library
  3. Can student-staff partnerships support the development of digital teaching and learning practices? – Alex Patel and Bethany Cox, University of Leicester
  4. Digital Leaders – Integrating digital in York’s leadership programmes – Susan Halfpenny, Teaching and Learning Manager, University of York; Michelle Blake, head of relationship management, University of York

(Watch recordings of the first set of pecha kucha presentations )

Kerensa Jennings

The second keynote, How iDEA is developing digital citizens was delivered by Kerensa Jennings from the Duke of York Inspiring Digital Enterprise Award (iDEA). Kerensa gave an overview of this international programme which aims to help address the digital skills gap. She explained that all iDEA resources are free to use and are being increasingly taken up by UK FE colleges and other learning providers.

(Watch a recording of Kerensa Jennings’ keynote)

Clare Riley, Microsoft Education

Clare Riley, Microsoft Education (Photo credit – Nevin Moledina)

Sarah Knight and Heather Price then gave a brief update from the Jisc digital capability team and we were able to discuss specific aspects in four parallel sessions:

  • Digital Discovery tool surgery – Heather Price, Jisc
  • How can we support students with the development of their digital capabilities using the Jisc discovery tool for learners? Helen Beetham and Sarah Knight, Jisc
  • Mapping of Microsoft resources to the digital capability framework – Shri Footring (Jisc), Nevin Moledina (University of Leicester) and Clare Riley (Microsoft)
  • Building digital capability service site – Clare Killen and Alicja Shah, Jisc

The event closed with the second set of four pecha kucha sessions:

  1. Practising Digitally @ NTU – Elaine Swift, Digital practice manager, Nottingham Trent University
  2. MedEd meets the real world – building capability in HE and NHS workplaces – Cath Fenn, Senior academic technologist, University of Warwick
  3. To infinity and beyond: achieving the University’s ambitions through digital capability – Mike Quarrell, Workforce development co-ordinator and Alison Small, Head of registry services and change, University of Derby
  4. Can you escape the digital challenge? – A Pecha Kucha in rhyme about our Digital Escape Room event – Mark Hall, digital learning developer, Bishop Grosseteste University

(Watch recordings of the second set of pecha kucha presentations)

Overall, I was struck by the sense of energy throughout the day, evident in the keynotes, presentations and workshops as well as in the depth of questions and conversations. Delegates mentioned that they found the keynotes, presentations and opportunities to network and share ideas particularly valuable.

This is a community led event and we are really keen to work in partnership to run the next one, due to be held in November 2018. Please get in touch with us in the team if you might be interested in hosting the next event.

Digital discovery tool: please give us your feedback!

Over the last few weeks we’ve been immersed in individual feedback on the experience of using the Digital discovery tool. This has prompted some significant revisions to the content and format of the questions for staff, as described in an earlier post. As we are now at the end of the pilot, we’ll be able to compare feedback since the changes were made and see whether users find them an improvement. (Remember, you’ll still have access to the tool until 13 July.)

Some of the same issues have been reported by our student users, along with some new ones, such as relevance to different subject areas. We’ll be reporting back on this feedback shortly, with our planned response. You’ll have an opportunity to hear more about individual staff and student responses in our webinar at 12.30–14.00 on Tuesday 19 June (links to follow).

Now we are keen to hear about the experience of our lead contacts and how the Digital discovery tool has been used at organisational level. We have just launched the evaluation form (you will receive a personal email with a link to this). All the questions are optional, to help you focus on those areas where you really have something to say. But of course, the more you can tell us, the more we can improve.

In particular, we ask about any evidence you have that use of the Discovery tool has led to change, either for individual users or more generally in the organisation. It’s really helpful if you have carried out a focus group or consultation event, and there are resources to help you do this on the evaluation page of this site. There’s also a handy reminder here of the evaluation process overall. And Shri’s recent blog post covers some of the organisational issues you might be thinking about as you compose your feedback.

There is a whole section of feedback about your experience of using the organisational data dashboard, so it’s a good idea if you have downloaded your most recent data and thought about how it might be used. See our guide on how to download the data, and blog post on Making use of your data.

We’d appreciate all organisational responses by the 29th June, as we’ll be analysing these results shortly after. There’ll be an opportunity to hear and discuss our findings at our second webinar on Thursday 19th July, 12.30-14.00.

Three emerging insights from the Digital discovery pilot

Co-authored by Clare Killen

Over one hundred universities, colleges and other providers are piloting the Jisc Digital discovery tool in the UK and overseas. The design of this tool encourages individuals to reflect on and develop their digital capabilities. It provides a summary of their self-assessment in response to nuanced question prompts as well as suggestions for further development with links to relevant, interactive resources. Whilst it is very much a personal tool, additional features allow institutional leads tasked with supporting digital capabilities development to gain insights from anonymised data and translate them into the institutional context.

Jisc team members have visited several pilot institutions to support the implementation process. In doing so, and through our in-depth conversations, we have learned about what works, at a practical level, when it comes to providing opportunities to develop the digital capabilities of staff and students in the various organisations. Further insights have emerged from conferences, events and meetings featuring presentations from our pilots, for example, the Student Experience Experts meeting and the Digital capabilities session at Digifest18.

As the roll-out gathers pace, we are starting to gain some fascinating insights into how institutions are using the opportunities offered by this tool to bring about positive change in their organisations. There are some clear themes emerging around what organisations that are benefiting from the Digital discovery process typically have in place:

1. Clear strategic vision

We are seeing that organisations with a clear message about the importance of digital technologies, communicated and understood by everyone, provide a meaningful context for use of the discovery tool.

“It is important to have a clear strategy and people need to know that digital is part of the strategy and part of what they do. You need to engage people in it, allow them to see how it affects them and why it is important to them. It needs to be exciting, so for example, we have run several big events that inspire and excite people around the idea of using technology to support teaching and learning and the college business.”
Penny Langford, head of e-learning, Milton Keynes College

2. Culture

Having a safe space in which teams can explore their thinking about their own priorities for development creates an environment in which individuals can thrive.

“The individual reports which each member of my team had generated discussions and comparisons, with staff considering their different roles and how that has had an impact upon their individual percentage. More than that though, it made them consider how they might acquire skills where they didn’t score as highly. I have eLearning Technologists and Librarians in my team and each had different scores, particularly in the Information Literacy category, which prompted all manner of discussion around the fake news agenda and critically evaluating information sources.”
Sarah Crossland, academic services manager, Doncaster College and University Centre

3. Connections

Establishing connections between individuals’ self-identified aims, the overall picture for all staff, and the resources available for professional development helps organisations meet their strategic aims.

“We wanted to identify gaps in staff confidence in their digital skills and use this information to target staff training and support. We looked at other products but there was nothing really out there to meet those requirements. We were looking for a standardised tool and wanted something to self-motivate staff. The approach taken by the Digital discovery tool supports that.”
Joseph Pilgrim, digital learning co-ordinator, ACT Training

Digital capability community of practice

The next digital capability community of practice event is being hosted in partnership with the University of Leicester on 22 May 2018. This provides an opportunity to learn about related initiatives and to hear more from the wider community, including many members taking part in the pilot of the Digital discovery tool.
While registration for this event has now closed, the keynote sessions will be live streamed. Follow the hashtag #digitalcapability on the day and presentations and any outputs will be available from the event page.

There is still time to engage staff

If you are part of the pilot, you still have time to engage staff, perhaps through end of term staff development events. Remember that feedback is required by the end of May but the Digital discovery tool will continue to be available until 13 July 2018.

How HR teams support staff digital capability

At the end of 2017 we began a short review into how Human Resources (HR) departments support staff to develop their digital capability. We developed an online survey and interviewed some of the respondents to try to capture a snapshot of current practice.

Initial results

The results of these activities confirmed our initial expectation that many HR teams have been working across several areas of the digital capability framework, often in partnership with other teams within their institutions. However, for both HE and FE respondents there was significant variation in responses to the questions about HR team involvement in the six core digital capability areas. While 90% of respondents said they were involved in supporting staff ICT proficiency, only 50% said they were involved in supporting staff with information, data and media literacy; digital communication, collaboration and participation; or digital learning and teaching. 84% said they were not involved in digital creation, problem solving and innovation, and 58% said they were not involved in digital identity and wellbeing.

Later questions and in-depth interviews revealed that many HR teams are in universities or colleges which are just starting to take an institution-wide approach to staff and student digital capabilities. One of the challenges for HR teams is in identifying their roles and potential areas where they could input to institution-wide initiatives and the developments of strategies for developing digital capabilities. Whilst some HR teams were aware of the Jisc tools and resources to support this work, many had not seen them before or had not engaged with them. It became clear to us that there was a need for some practical materials to help HR teams map their various activities (often split into specialist sections) to the digital capabilities framework.

The original survey is still open so if you did not get a chance to respond earlier we would still welcome your input.

https://jisc-beta.onlinesurveys.ac.uk/hr-support-of-staff-digital-capabilites

New materials for HR teams

HR teams cover a wide range of activities that require them to consider and/or support staff digital capabilities across their institutions. These include recruitment and selection, onboarding, appraisal/performance review, learning and development, relationship management and health and wellbeing. Data management and analytics, increasingly sophisticated institutional systems and the impact of social media mean that Human Resource teams themselves need a range of digital capabilities to effectively carry out their work.

We have produced two sets of PowerPoint slides that could be used within HR teams, and we are interested to find out whether they are useful. Thanks are due to Abi Mawhirt, Head of People and Organisational Development at Dundee and Angus College, who worked with us to refine these slides and to make sure we did not have any serious omissions. Abi will be using the slides within her own institution, and some other HR teams have said they might try them out.

HR teams could use the slides (or select those most relevant to their context) to consider their activities, identify and build on strengths, and spot gaps or areas where they could enhance their support of staff digital capabilities. This may also highlight areas where HR teams could take the lead, for example digital identity and wellbeing.


The first set maps HR activities and roles to the Jisc digital capabilities framework. It highlights where HR teams can input to institution-wide approaches to staff digital capabilities and offers some suggestions for activities where they could get involved. Some of these areas involve other teams, so the slides encourage HR input in support of the teams leading on a particular area.


The second set offers a view of HR activities through the Jisc digital capabilities framework. Each area of HR activity is mapped to the six key elements of the framework, highlighting where HR teams can have an impact on the digital capabilities of staff (and, to a lesser extent, students).

We have also highlighted those activities that relate to digital capabilities of staff in HR teams.


Please pass these on to your own HR team and ask them to try them out. We have produced a brief PDF document which offers ideas on how they might be used.

Here are some of the suggestions:

  1. Use the slides to deliver a team presentation highlighting areas of most relevance to the team.
  2. Use the slides or a selection of slides in a presentation to focus on particular aspects – either a particular area of HR activities such as recruitment and selection or on a specific area of the digital capabilities framework such as Digital wellbeing.
  3. Use the slides as a pdf document to share within teams and follow up with workshops to consider them within your own context.
  4. Get different teams within HR to focus on specific slides (or pdf pages) and ask them to come up with an action plan following their discussions.
  5. Use the slides or some of the content to present to different teams within the organisation to highlight what you are doing in different areas of digital capability or what you would like to do.
  6. Use the materials to highlight areas for joint working or partnership approaches to other teams or departments within the institution.
  7. Link to other Jisc digital capabilities, guidance, tools or resources to highlight possible HR roles across the institution.

We would like to gather some feedback about these so that we can adapt or enhance them. Link to a brief survey.

Let us know what you think. Help us make them better.

Discovery tool: understanding the questions

We have just been through an interim analysis of feedback from staff users of the Digital discovery tool. Thank you for directing so many staff to complete the feedback form – 225 general staff and 150 teaching staff have done so already, and it has been an invaluable resource.

The feedback so far has been very positive, with some interesting perceptions that we will report in the next blog post. This post is about some of the changes we have made to the content of questions. It also seems like a good opportunity to explain a bit more of the thinking that goes into the three question types, and into the reasons for designing the discovery tool in the way we have. There is some general information at the top of the post, and more detail further down for those who are interested in the different question types.

Development rather than testing

At the start of the design process we had to make a significant decision. We could have written ‘testing’ questions, as in a typical assessment, to find out what users really understand about digital applications and approaches. But we decided to write ‘developmental’ questions instead. These are designed to develop understanding, for example by making clear what ‘better’ (deeper, better judged) performance looks like. Rather than hiding the ‘right’ answer, they make transparent what expert digital professionals do and ask users to reflect and report: ‘do I do that?’

We have gone down this road partly because we are not convinced that testing abstract understanding is the best indicator of actual practice, and partly because this approach is more acceptable to end users. Staff want to be treated as professionals, and to take responsibility for assessing and moving forward their own practice. Also, we are not designing in a platform that supports item-by-item matching of feedback to response. So it’s not possible for the feedback itself to be closely matched to users’ input – as it would be in an assessment system – and our questions themselves have to do a lot of the work.

This has important implications for the meaning of the scoring ‘bands’ that we use to assign feedback to users (more of this shortly).

Where do the question items come from?

Essentially, to design the questions we first developed a wide range of real-world activities that digital professionals do. We’ve tested those out with expert panels, and also against the relevant professional profile(s) – which have had professional body involvement.

Of course we could just have presented these activities in a random order, and this was an early design idea. But the digital capabilities framework already had good recognition in the sector, and we needed a navigational aid. So in the case of the generic assessments (for staff and students) we allocated activities to the different framework areas, e.g. ‘data literacy’. In the case of role-specialist assessments, we used specialist requirements from the relevant profile, such as ‘face-to-face teaching’ or ‘assessment and feedback’ in the case of the teaching assessments.

We then took one activity that was central to the area in question and framed it as a ‘confidence’ question (‘How confident do you feel about doing x?’). We developed another activity into a mini-scenario or example to create a ‘depth’ question, with four levels of response possible (‘Which of these best reflects your response?’). Six further activities became options in a ‘breadth’ question (‘Which of these can you do? Select any or all that apply to you’). This gives us three questions, covering eight activities, for each area of practice. There is more about the different question types below.
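As a rough illustration, the construction rule above can be sketched as a small data model. Everything here – the class and field names, the example activities, even the sample area – is hypothetical, invented for the sketch rather than taken from the discovery tool itself:

```python
# Illustrative sketch of the question-construction rule described above.
# All names and example activities are hypothetical, not from the actual tool.
from dataclasses import dataclass

@dataclass
class AreaQuestions:
    area: str
    confidence: str        # one activity framed as 'How confident...?'
    depth_levels: list     # one activity expanded into four levels of response
    breadth_options: list  # six further activities, 'select all that apply'

    def activity_count(self) -> int:
        # one confidence activity + one depth activity + six breadth options
        return 1 + 1 + len(self.breadth_options)

data_literacy = AreaQuestions(
    area="data literacy",
    confidence="How confident do you feel about working with data in spreadsheets?",
    depth_levels=[
        "I rarely question where data comes from",
        "I sometimes check data sources",
        "I usually check and compare sources",
        "I routinely verify, compare and document sources",
    ],
    breadth_options=[
        "sort and filter a data set",
        "build a chart from data",
        "use formulas to summarise data",
        "clean up a messy data set",
        "interpret summary statistics",
        "judge the credibility of statistics used in public debate",
    ],
)
assert data_literacy.activity_count() == 8  # three questions, eight activities
```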

We have not statistically tested whether responses to all three questions in one area hang together to create a distinct and separate factor. There is the opportunity to do that with system data at this point, but our first aim was to create a navigable user experience – making sense and generating helpful feedback – rather than to validate a model.

Ideally the feedback we give to users would relate to their responses for each of the eight different activities. Without this option, we have used scoring bands to allocate roughly appropriate feedback to users, based on their responses to the three questions. It’s not exact, and some users have picked that up. However, most users rate the quality of feedback highly – it has the most positive comments of any feature – so we know we are getting it more or less right. We hope we have dealt with the lack of specificity by offering a range of ‘next steps’ that participants can choose from, according to their own interests and self-assessed development needs.
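To make the banding idea concrete, here is a minimal sketch of how responses to the three question types might be combined into a single score and mapped to a feedback band. The weights, ranges and thresholds are all invented for illustration; the discovery tool’s actual scoring model is not described in this post:

```python
# Hedged sketch of band-based feedback allocation: responses to the three
# question types are combined into one score, and the score is mapped to a
# feedback band. All weights and thresholds here are invented examples.

def overall_score(confidence: int, depth: int, breadth: int) -> float:
    # confidence (1-4) is weighted low, reflecting its unreliability;
    # depth (1-4) and breadth (0-6 options selected) carry more weight
    return 0.5 * confidence + 2.0 * depth + 1.0 * breadth

def feedback_band(score: float) -> str:
    # illustrative thresholds dividing the score range into three bands,
    # each of which would be attached to a different piece of feedback
    if score < 6:
        return "developing"
    elif score < 11:
        return "capable"
    return "proficient"

# low responses across the board land in the lowest band
print(feedback_band(overall_score(confidence=2, depth=1, breadth=2)))
```

Because the band, not the individual responses, selects the feedback, two users with quite different answer patterns can receive the same feedback text, which is the lack of specificity the post describes.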

You’ll understand from this that scoring is an artefact of the system we are using and the design choices we have made within it, not an objective measure of any kind.

We were pleased when we analysed system data from the first two months of use to see that in all but three of the 45 generic staff questions, and in all the teaching staff questions, the scoring bands were evenly distributed. This means that the questions were doing a good job of discriminating among staff according to their (self-declared) expertise, and the full range of scoring bands and feedback was being used. Three questions had median scores outside of the normal range, and a couple of sections elicited comments that users did not feel their feedback reflected their actual capability (‘information literacy’ was one). Rather than changing the underlying scoring model for these questions, we decided it was more appropriate to work on the content to try to produce a more even distribution of responses around a central median point. So if users’ scores differ from the median, that should mean something – but we can’t say that it means anything about their objective performance.
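The kind of distribution check described above can be sketched as follows: for each question, compute the median of users’ scores and flag questions whose median falls outside an expected central range. The data, question names and range are all made up for illustration:

```python
# Rough sketch of a median-based distribution check on per-question scores.
# Question names, scores and the expected range are invented examples.
from statistics import median

scores_by_question = {
    "q1_data_literacy": [1, 2, 2, 3, 3, 4],        # spread around the centre
    "q2_digital_identity": [4, 4, 4, 4, 3, 4],     # skewed high
    "q3_information_literacy": [1, 1, 1, 2, 1, 1], # skewed low
}

expected_range = (2, 3)  # central band we expect each median to fall in

flagged = [q for q, scores in scores_by_question.items()
           if not (expected_range[0] <= median(scores) <= expected_range[1])]
print(flagged)  # questions whose content may need rebalancing
```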

Of course users who answer the questions after the changes were made on 5 May will not be scoring in the same way as users who answered the questions before. (It’s also possible that in making the changes suggested by user feedback, we have inadvertently shifted the scoring for some other questions – we will be checking this.) This will need to be communicated to any staff who are returning to use the discovery tool again. It will also need to be taken into account when looking at data returns, since data from before and after the changes can’t be treated as one data set. This is one reason we have cautioned against using scoring data to draw any firm conclusions, particularly during this pilot period when the content is still evolving.

We hope you will convey to all the staff who took the time to complete a feedback form that we have listened to their views – and that you and they will feel that the revised questions are an improvement. This is why this pilot process is so valuable.

How have the questions changed in response to feedback?

(Some changes to wording and options are based on findings from early user testing rather than on the more general feedback we gained via the user feedback forms.)

We’ve slightly changed the layout of questions and added some more navigational text to clarify how to answer them.

We’ve removed or clarified some terms that were not well understood. Overall we know there is a need for a glossary – ideally with examples and links. That is something Lou will be working on for the future service. We’ve also changed a couple of examples we were using for illustration. There have been many discussions about the pros and cons of examples. Some people find generic terms difficult to understand without examples: but more people object when examples are used, because they favour some applications or approaches over others that are equally valid. Examples can confuse further: ‘if I don’t use that tool, I’m obviously not doing it (right)’. Overall we have gone light on examples, and we hope users’ understanding of terms will improve when we have a detailed glossary we can link to.

We have tried to focus more on activities users do at work, in an educational organisation (college or university). There were some negative comments about references to digital practices beyond this space. However, because of the need to cover a very wide range of roles – and because some roles don’t allow people to express digital capabilities they actually have – we can’t avoid offering some examples from beyond a narrowly-defined work role. For example, one of the activities under ‘digital identity’ is ‘manage social media for an organisation, group or team‘, and under ‘data literacy’ we have ‘judge the credibility of statistics used in public debate’. This is to allow users who don’t manage social media or evaluate statistics as part of their job to reflect on whether they have these capabilities anyway – perhaps gained in their personal life or another role. And indeed to consider whether these activities might be useful to them.

We’ve changed several references to social media, as a number of users objected to what they felt was an underlying assumption that social media would or should be used, and that this was a positive sign of capability. There are still several ways that users can show they are making wise judgements about the appropriateness of social media.

We’ve tried our best to use prompts that reflect capability (‘could do’, ‘would do’, ‘have ever done’) rather than current practice (‘do’, ‘do regularly’), which may be constrained by organisational issues or may reflect judgements not to use. However, we are also mindful that self-reported practice (‘I actually do this’) is usually more accurate than self-reported ability (‘I could do this if I wanted to’). Where we feel it is justified, we have continued to ask about actual use. So long as users understand that they are not being judged, it seems appropriate for the questions and feedback to indicate areas where they are not as capable as they might be if their organisation were more supportive of different practices, or their job role offered more digital opportunities.

There have been changes to the teaching questions, again to focus on pedagogical judgement rather than digital practice. There are now quite a number of caveats e.g. ‘if appropriate to my learners‘, which were suggested by more expert users. Of course we always listen to our experts (!) but as designers we’re aware that introducing caveats like this makes the questions longer and more complex, creating more cognitive load for users, and potential annoyance. We will monitor completion rates to see if this is a problem.

We have particularly reviewed the assessment questions and the online learning questions to be sure we are covering the very wide range of good practice in these areas.

There follows more detail on specific question types and the changes we have made to each of these.

‘Confidence’ questions

Why have we included questions that ask users ‘How confident do you feel about…?’ when we know that self-assessed confidence is generally unreliable? We do this at the start of each element to give users an orientation towards the questions that follow – ‘this is the area of practice we are looking at next’ – and a sense that they are in control. By trusting users to rate themselves, we are both reassuring them that they are not being ‘tested’, and asking them to be honest and searching in their responses. We have weighted the scoring for this question at a low level to reflect users’ tendency to answer inaccurately – though in fact when we came to compare confidence scores with scores on the other two question types in the same area of practice, there was a positive match.

In feedback, quite a number of users mentioned the tone of these questions positively. However, some felt that they were too ‘subjective’, or ‘vague’. We have tried to deal with this in the update by focusing some questions more tightly on specific practices within the overall area we are looking at. So for example in the generic staff set, under ‘digital creativity’ we ask: ‘How confident are you creating digital content e.g. video, audio, animations, graphics, web pages?’ In the teaching set, under ‘learning resources’, we ask ‘How confident are you about using digital resources within the rules of copyright?‘ We have to find a practice that is generic enough to be available to staff in a wide variety of different roles, but specific enough for the response to feel rooted in a real-world activity.

We have had internal discussions about whether to move the confidence questions to the end of each set, or to remove them altogether. For now they stay where they are.

 

‘Depth’ questions

These questions are the most difficult to write and currently the most troublesome to end users. There are some ongoing issues with how they are presented on screen, and we are looking into whether any improvements are possible, but for now we have reworded the questions to make the steps to answer them as clear as we can.

These questions offer a short situation or example. Users select the one response that best matches what they would do or what expertise they have. The layout of the question reflects the progression logic: the first option reflects the lowest level of judgement or expertise, and the fourth option reflects the highest. There is no trickery here. We describe how progressively more expert practitioners think or act, and ask users to report where they sit on that scale. (At the moment, the visual cues do not make clear that it is a scale, or that higher levels of judgement encompass and include the lower ones.)

 

Beyond the difficulties some users had in ‘reading’ the answer logic for these questions, it is clear that we have to get the progression logic right in each case. When people disagree with our judgement about what is ‘more expert’, they don’t like these questions. When they agree, they say they are ‘nuanced’, ‘thoughtful’, and ‘made me think‘. We know that our users expect us to reflect issues of judgement and discrimination (‘how well is digital technology being used?’) at least as much as extent of use (‘how many different digital tools?’). So we know these questions have to be in there. They have to reflect important issues of digital thinking or mindset, and we have to get them right – in a very small number of words!

Our recent updates aim to clarify the focus on judgement and experience rather than extent of use. And we have added modifiers such as ‘when appropriate’ or ‘if appropriate for your learners’ (teaching staff) to emphasise that we don’t believe technology is always the answer – but good judgement about technology is. This creates more words on the screen, which will put off some users, but we want our champions to feel that our words represent thoughtful practice and not a shallow checklist of skills.

‘Breadth’ questions

These are in many ways the least problematic. They offer a range of digital activities that staff may do already, may want to do, or may not even have thought about. As before, we try to clarify that we don’t think digital practices are always the best, but we do want people to extend their repertoire so they have more experience of what does (and doesn’t) work. We try to use wording that values skills users have, even if they can’t use them currently due to their role or organisational context. We have tried to avoid very role-specific activities, but not to preclude the possibility that people might develop some professionally-relevant skills in their personal lives, or take on tasks from ‘other’ roles that they enjoy. We include fairly basic activities that many users will be able to select, and quite advanced activities that offer something to aspire to. The ‘nudge’ information is obvious: think about doing some of these things if you don’t or can’t already.

 

What next?

We are always interested in your views on the questions and other content. The user feedback forms will remain live until the end of the pilot project and we expect to make some further updates to content at that point. Please keep asking your users to access these from within the Potential.ly platform.

If you are an institutional lead, you will shortly have an opportunity to give us feedback via your own detailed evaluation survey. You can also give us comments and feedback at any time via our expert feedback form – please direct other stakeholders and interested users to do this too.

Engaging users with the Digital discovery tool

There are only a few weeks to go before we wrap up this pilot phase of the Digital discovery tool, but there is still time to get new users involved. Some pilot sites have finished engaging users and are now evaluating how things have gone, but others are still looking for staff and students to give the discovery tool a try.

There are five new promotional posters from the Jisc team that can help. These can be adapted with an institutional logo and the details of any live workshops or support materials.

Screen Shot 2018-04-19 at 23.17.53

Download your posters here:

There are other ideas for engaging users on our Guidance page: Engaging users.

Thinking ahead, lead contacts at all the pilot sites will be sent a survey about their experience on 5 June. The survey is quite comprehensive, as this is our best source of information about how the Digital discovery tool is being used in practice. There are 15 questions, covering user engagement, support and follow-up for the discovery tool, and whether there have been any individual or organisational benefits. We ask for this to be completed by 30 June.

Before completing the form, we suggest that leads run a focus group or consultation event with users. This will allow evidence to be gathered that can help to answer the evaluation questions. There are materials for running consultation events on our Guidance page: evaluating with users, but this doesn’t have to be complicated. It could be as simple as getting some users together and exploring a couple of the questions on the evaluation form.

Just now, we are using all the valuable feedback from users to make some refinements. You may notice these in the questions and feedback for staff. There will be more significant updates once the pilot has finished. It’s really helpful if you can point your users to these feedback forms, which are found on their dashboards. We can only make things better with their help – and yours!