Building digital capability service now launched

We have had a busy summer preparing for the launch of our new building digital capability service and we would like to share with you some updates from the team.

Next Community of Practice event
Registration is now open for our next Community of Practice event, running on Wednesday 21st November at the University of Hertfordshire. Everyone is welcome; please register now via our event page and we hope to see many of you there!

New website
Our new Building digital capabilities website has now launched and is available at https://digitalcapability.jisc.ac.uk. From here you can access our range of advice and guidance, find out more about the additional benefits the service offers, including the discovery tool, and learn how to subscribe. We have also developed a new role profile for professional services staff in education, available alongside the other role profiles at https://digitalcapability.jisc.ac.uk/what-is-digital-capability/.

For subscribers to the service, additional pages in a ‘logged in’ section provide access to pathways through our advice, guidance, tools and resources.

Discovery tool
The updated version of the discovery tool is also now live at: https://jisc.potential.ly/ – with a free reduced version available to all staff (please note this is NOT for students as only the question set for staff is available in this free version).

The full version of the tool is available for staff and students within subscribing institutions; see our website for more information on what is included in both free and full versions.

If you have any queries or feedback for us, please contact us at: DigitalCapability@jisc.ac.uk

Notes and presentations from the 3rd Digital capabilities community of practice event – 22 May 2018

University of Leicester, College Court fountain (photo credit: Heather Price)

The University of Leicester provided a great setting for our third community of practice meeting. With eighty-five delegates participating in person and many more joining in online (using the hashtag #digitalcapability), this was one of our most vibrant and productive meetings to date.

This is a brief summary of the event. All links to slides, recordings and other outputs from these sessions are available from the Jisc event page.

Dr Ross Parry, Associate Professor and Deputy Pro Vice Chancellor (digital) at the University of Leicester, set the scene for the day with his opening keynote, Digital capabilities as a strategic priority. He talked about the importance of creating a shared vision and gave a number of insights gained from his experience of developing and implementing the University’s digital strategy. He said: “You can have all the tech in the world, but it’ll make little difference if you don’t also have a community with the confidence and fluency to use it in creative and exciting ways”. (Watch a recording of Dr Ross Parry’s keynote)

The three parallel community-led sessions focused on practical strategies for engaging students, senior leaders and human resources teams. This was an opportunity for participants to share their experiences, discuss with colleagues and identify opportunities for collaboration.

Students – Facilitated by Frances Deepwell, Director of Leicester Learning Institute, University of Leicester and Natalie Norton, Head of Technology enhanced learning and digital literacies, University of Winchester. (Padlet notes on practical strategies to engage students)

Senior Leaders – Facilitated by Dr Ross Parry, Deputy Pro Vice Chancellor (digital), University of Leicester and Dr Leigh Casey, Associate Director Organisational Development, University of Leicester

Human resource teams – Facilitated by Sarah Knight, Head of change – student experience, Jisc (Padlet notes on practical strategies to engage human resource teams)

These were followed by the first set of four pecha kucha sessions:

  1. Future facing learning – Paul Durston, Teesside University
  2. Digital Leadership for Students: Development of an online resource – Vikki McGarvey, Learning and information services manager, Staffordshire University Library
  3. Can student-staff partnerships support the development of digital teaching and learning practices? – Alex Patel and Bethany Cox, University of Leicester
  4. Digital Leaders – Integrating digital in York’s leadership programmes – Susan Halfpenny, Teaching and Learning Manager, University of York; Michelle Blake, head of relationship management, University of York

(Watch recordings of the first set of pecha kucha presentations)


The second keynote, How iDEA is developing digital citizens, was delivered by Kerensa Jennings from the Duke of York Inspiring Digital Enterprise Award (iDEA). Kerensa gave an overview of this international programme, which aims to help address the digital skills gap. She explained that all iDEA resources are free to use and are increasingly being taken up by UK FE colleges and other learning providers.

(Watch a recording of Kerensa Jennings’ keynote)

Clare Riley, Microsoft Education (photo credit: Nevin Moledina)

Sarah Knight and Heather Price then gave a brief update from the Jisc digital capability team, and we were able to discuss specific aspects in four parallel sessions:

  • Digital Discovery tool surgery – Heather Price, Jisc
  • How can we support students with the development of their digital capabilities using the Jisc discovery tool for learners? – Helen Beetham and Sarah Knight, Jisc
  • Mapping of Microsoft resources to the digital capability framework – Shri Footring (Jisc), Nevin Moledina (University of Leicester) and Clare Riley (Microsoft)
  • Building digital capability service site – Clare Killen and Alicja Shah, Jisc

The event closed with the second set of four pecha kucha sessions:

  1. Practising Digitally @ NTU – Elaine Swift, Digital practice manager, Nottingham Trent University
  2. MedEd meets the real world – building capability in HE and NHS workplaces – Cath Fenn, Senior academic technologist, University of Warwick
  3. To infinity and beyond: achieving the University’s ambitions through digital capability – Mike Quarrell, Workforce development co-ordinator and Alison Small, Head of registry services and change, University of Derby
  4. Can you escape the digital challenge? – A Pecha Kucha in rhyme about our Digital Escape Room event – Mark Hall, digital learning developer, Bishop Grosseteste University

(Watch recordings of the second set of pecha kucha presentations)

Overall, I was struck by the sense of energy throughout the day. This was evident in the keynotes, presentations and workshops, as well as in the depth of questions and conversations. Delegates mentioned that they found the keynotes, presentations, and the opportunities to network and share ideas particularly valuable.

This is a community-led event and we are really keen to work in partnership to run the next one, due to be held in November 2018. Please get in touch with the team if you might be interested in hosting the next event.

Digital discovery tool: please give us your feedback!

Over the last few weeks we’ve been immersed in individual feedback on the experience of using the Digital discovery tool. This has meant some significant revisions to the content and format of the questions for staff, as described in an earlier post. As we are now at the end of the pilot, we’ll be able to compare feedback gathered since the changes were made with earlier feedback and see whether users find them an improvement. (Remember you’ll still have access to the tool until 13 July).

Some of the same issues have been recorded by our student users, along with some new ones such as relevance to different subject areas. We’ll be reporting back on this feedback shortly, with our planned response. You’ll have an opportunity to hear more about individual staff and student responses in our webinar at 12.30–14.00 on Tuesday 19th June (links to follow).

Now we are keen to hear about the experience of our lead contacts and how the Digital discovery tool has been used at organisational level. We have just launched the evaluation form (you will receive a personal email with a link to it). All the questions are optional, to help you focus on those areas where you really have something to say. But of course the more you can tell us, the more we can improve.

In particular, we ask about any evidence you have that use of the Discovery tool has led to change, either for individual users or more generally in the organisation. It’s really helpful if you have carried out a focus group or consultation event, and there are resources to help you do this on the evaluation page of this site. There’s also a handy reminder here of the evaluation process overall. And Shri’s recent blog post covers some of the organisational issues you might be thinking about as you compose your feedback.

There is a whole section of feedback about your experience of using the organisational data dashboard, so it’s a good idea if you have downloaded your most recent data and thought about how it might be used. See our guide on how to download the data, and blog post on Making use of your data.

We’d appreciate all organisational responses by the 29th June, as we’ll be analysing these results shortly after. There’ll be an opportunity to hear and discuss our findings at our second webinar on Thursday 19th July, 12.30-14.00.

Three emerging insights from the Digital discovery pilot

Co-authored by Clare Killen

Map showing locations of UK pilots

Over one hundred universities, colleges and other providers are piloting the Jisc Digital discovery tool in the UK and overseas. The design of this tool encourages individuals to reflect on and develop their digital capabilities. It provides a summary of their self-assessment in response to nuanced question prompts, as well as suggestions for further development with links to relevant, interactive resources. Whilst it is very much a personal tool, additional features allow institutional leads tasked with supporting digital capabilities development to gain insights from anonymised data and translate them into the institutional context.

Jisc team members have visited several pilot institutions to support the implementation process. In doing so, and through our in-depth conversations, we have learned about what works, at a practical level, when it comes to providing opportunities to develop the digital capabilities of staff and students in the various organisations. Further insights have emerged from conferences, events and meetings featuring presentations from our pilots, for example, the Student Experience Experts meeting and the Digital capabilities session at Digifest18.

As the roll-out gathers pace, we are starting to gain some fascinating insights into how institutions are using the opportunities offered by this tool to bring about positive change in their organisations. There are some clear themes emerging around what organisations that are benefiting from the Digital discovery process typically have in place:

 1. Clear strategic vision

We are seeing that a clear message about the importance of digital technologies, communicated and understood by everyone, provides a meaningful context for the use of the discovery tool.

“It is important to have a clear strategy and people need to know that digital is part of the strategy and part of what they do. You need to engage people in it, allow them to see how it affects them and why it is important to them. It needs to be exciting, so for example, we have run several big events that inspire and excite people around the idea of using technology to support teaching and learning and the college business.”
Penny Langford, head of e-learning, Milton Keynes College

2. Culture

Having a safe space in which teams can explore their thinking about their own priorities for development creates an environment in which individuals can thrive.

“The individual reports which each member of my team had, generated discussions and comparisons, with staff considering their different roles and how that has had an impact upon their individual percentage. More than that though, it made them consider how they might acquire skills where they didn’t score as highly. I have eLearning Technologists and Librarians in my team and each had different scores, particularly the Information Literacy category. Which prompted all manner of discussion around the fake news agenda and critically evaluating information sources.”
Sarah Crossland, academic services manager, Doncaster College and University Centre

3. Connections

Organisations also benefit from establishing connections between individuals’ self-identified aims, the overall picture for all staff, and the resources available to support professional development towards organisational strategic aims.

“We wanted to identify gaps in staff confidence in their digital skills and use this information to target staff training and support. We looked at other products but there was nothing really out there to meet those requirements. We were looking for a standardised tool and wanted something to self-motivate staff. The approach taken by the Digital discovery tool supports that.”
Joseph Pilgrim, digital learning co-ordinator, ACT Training

Digital capability community of practice

The next digital capability community of practice event is being hosted in partnership with the University of Leicester on 22 May 2018. This provides an opportunity to learn about related initiatives and hear more from the wider community, including many members who are taking part in the pilot of the Digital discovery tool.
While registration for this event has now closed, the keynote sessions will be live streamed. Follow the hashtag #digitalcapability on the day; presentations and any outputs will be available from the event page.

There is still time to engage staff

If you are part of the pilot, you still have time to engage staff, perhaps through end of term staff development events. Remember that feedback is required by the end of May but the Digital discovery tool will continue to be available until 13 July 2018.

How HR teams support staff digital capability

At the end of 2017 we began a short review into how Human Resources (HR) departments support staff to develop their digital capability. We developed an online survey and interviewed some of the respondents to try to capture a snapshot of current practice.

Initial results

The results of these activities confirmed our initial expectation that many HR teams have been working across several areas of the digital capability framework, often in partnership with other teams within their institutions. However, for both HE and FE respondents there were quite significant variations in responses to the questions about HR team involvement in the six core digital capability areas. Whilst 90% of people said they were involved in supporting the ICT proficiency of staff, only 50% said they were involved in supporting staff with information, data and media literacy; digital communication, collaboration and participation; or digital learning and teaching. 84% said they were not involved in digital creation, problem solving and innovation, and 58% said they were not involved in digital identity and wellbeing.

Later questions and in-depth interviews revealed that many HR teams are in universities or colleges which are just starting to take an institution-wide approach to staff and student digital capabilities. One of the challenges for HR teams is identifying their roles and the potential areas where they could input to institution-wide initiatives and the development of strategies for developing digital capabilities. Whilst some HR teams were aware of the Jisc tools and resources to support this work, many had not seen them before or had not engaged with them. It became clear to us that there was a need for some practical materials to help HR teams map their various activities (often split into specialist sections) to the digital capabilities framework.

The original survey is still open, so if you did not get a chance to respond earlier, we would still welcome your input.

https://jisc-beta.onlinesurveys.ac.uk/hr-support-of-staff-digital-capabilites

New materials for HR teams

HR teams cover a wide range of activities that require them to consider and/or support staff digital capabilities across their institutions. These include recruitment and selection, onboarding, appraisal/performance review, learning and development, relationship management and health and wellbeing. Data management and analytics, increasingly sophisticated institutional systems and the impact of social media mean that Human Resource teams themselves need a range of digital capabilities to effectively carry out their work.

We have produced two sets of PowerPoint slides that could be used within HR teams and we are interested to find out if they are useful. Thanks are due to Abi Mawhirt, Head of People and Organisational Development at Dundee and Angus College, who worked with us to refine these slides and to make sure we did not have any serious omissions. Abi will be using the slides within her own institution and some other HR teams have said they might try them out.

HR teams could use the slides (or select the ones that they feel are most relevant to their context) to consider their activities, identify and build on strengths, as well as identify any gaps or areas where they could enhance their support of staff digital capabilities. It may highlight areas where HR teams could take the lead, for example in the area of Digital identity and wellbeing.


The first set maps HR activities and roles to the Jisc digital capabilities framework. It highlights where HR teams can input to institution-wide approaches to staff digital capabilities and offers some suggestions for activities where they could get involved. Some of these areas involve other teams, and HR input would support the teams leading on a particular area.


The second set offers a view of HR activities through the Jisc digital capabilities framework. Each area of HR activity is mapped to the six key elements of the framework, highlighting where HR teams can have an impact on the digital capabilities of staff (and, to a lesser extent, students).

We have also highlighted those activities that relate to digital capabilities of staff in HR teams.


Please pass these on to your own HR team and ask them to try them out. We have produced a brief pdf document which offers ideas for how they might be used.

Here are some of the suggestions:

  1. Use the slides to deliver a team presentation highlighting areas of most relevance to the team.
  2. Use the slides or a selection of slides in a presentation to focus on particular aspects – either a particular area of HR activities such as recruitment and selection or on a specific area of the digital capabilities framework such as Digital wellbeing.
  3. Use the slides as a pdf document to share within teams and follow up with workshops to consider them within your own context.
  4. Get different teams within HR to focus on specific slides (or pdf pages) and ask them to come up with an action plan following their discussions.
  5. Use the slides or some of the content to present to different teams within the organisation to highlight what you are doing in different areas of digital capability or what you would like to do.
  6. Use the materials to highlight areas for joint working or partnership approaches to other teams or departments within the institution.
  7. Link to other Jisc digital capabilities, guidance, tools or resources to highlight possible HR roles across the institution.

We would like to gather some feedback about these so that we can adapt or enhance them. Link to a brief survey.

Let us know what you think. Help us make them better.

Discovery tool: understanding the questions

We have just been through an interim analysis of feedback from staff users of the Digital discovery tool. Thank you for directing so many staff to complete the feedback form – 225 general staff and 150 teaching staff have done so already, and it has been an invaluable resource.

The feedback so far has been very positive, with some interesting perceptions that we will report in the next blog post. This post is about some of the changes we have made to the content of questions. It also seems like a good opportunity to explain a bit more of the thinking that goes into the three question types, and into the reasons for designing the discovery tool in the way we have. There is some general information at the top of the post, and more detail further down for those who are interested in the different question types.

Development rather than testing

At the start of the design process we had to make a significant decision. We could have written ‘testing’ questions, as in a typical assessment test, to find out what users really understand about digital applications and approaches. But we decided to write ‘developmental’ questions instead. These are designed to develop understanding, for example by making clear what ‘better’ (deeper, better judged) performance looks like. Rather than hiding the ‘right’ answer, they make transparent what expert digital professionals do and ask users to reflect and report: ‘do I do that?’

We have gone down this road partly because we are not convinced that testing abstract understanding is the best indicator of actual practice, and partly because this approach is more acceptable to end users. Staff want to be treated as professionals, and to take responsibility for assessing and moving forward their own practice. Also, the platform we are designing in does not support item-by-item matching of feedback to responses. So it’s not possible for the feedback itself to be closely matched to users’ input – as it would be in an assessment system – and our questions themselves have to do a lot of the work.

This has important implications for the meaning of the scoring ‘bands’ that we use to assign feedback to users (more of this shortly).

Where do the question items come from?

Essentially, to design the questions we first developed a wide range of real-world activities that digital professionals do. We’ve tested those out with expert panels, and also against the relevant professional profile(s) – which have had professional body involvement.

Of course we could just have presented these activities in a random order, and this was an early design idea. But the digital capabilities framework already had good recognition in the sector, and we needed a navigational aid. So in the case of the generic assessments (for staff and students) we allocated activities to the different framework areas, e.g. ‘data literacy’. In the case of role-specialist assessments, we used specialist requirements from the relevant profile, such as ‘face-to-face teaching’ or ‘assessment and feedback’ in the case of the teaching assessments.

We then took one activity that was central to the area in question and framed it as a ‘confidence’ question (‘How confident do you feel about doing x?’). We developed another activity into a mini-scenario or example to create a ‘depth’ question, with four levels of response possible (‘Which of these best reflects your response?’). Six further activities became options in a ‘breadth’ question (‘Which of these can you do? Select any or all that apply to you’). This gives us three questions, covering eight activities, for each area of practice. There is more about the different question types below.

We have not statistically tested to discover whether responses to all three questions in one area hang together to create a distinct and separate factor. There is the opportunity to do that with system data at this point, but our first aim was to create a navigable user experience – making sense and generating helpful feedback – rather than to validate a model.

Ideally the feedback we give to users would relate to their responses for each of the eight different activities. Without this option, we have used scoring bands to allocate roughly appropriate feedback to users, based on their responses to the three questions. It’s not exact, and some users have picked that up. However, most users rate the quality of feedback highly – it has the most positive comments of any feature – so we know we are getting it more or less right. We hope we have dealt with the lack of specificity by offering a range of ‘next steps’ that participants can choose from, according to their own interests and self-assessed development needs.
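To make the banding idea concrete, here is a minimal sketch of how responses to the three question types in one area might be combined into a single score and mapped to a feedback band. The weights, thresholds and band names are illustrative assumptions for explanation only, not the discovery tool’s actual scoring model.

```python
# Illustrative sketch only: the weights, thresholds and band names are
# assumptions for explanation, not the discovery tool's real scoring model.

def area_score(confidence, depth, breadth_selected):
    """Combine the three question types for one capability area.

    confidence: 1-4 self-rating (deliberately weighted low)
    depth: 1-4, the level chosen in the scenario question
    breadth_selected: number of activities ticked, out of six
    """
    return 0.5 * confidence + 2 * depth + breadth_selected  # maximum 16

def feedback_band(score):
    """Map an area score to one of three bands used to select feedback text."""
    if score < 7:
        return "developing"
    elif score < 12:
        return "capable"
    return "proficient"

print(feedback_band(area_score(confidence=3, depth=2, breadth_selected=4)))  # "capable"
```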

You’ll understand from this that scoring is an artefact of the system we are using and the design choices we have made within it, not an objective measure of any kind.

We were pleased when we analysed system data from the first two months of use to see that in all but three of the 45 generic staff questions, and in all the teaching staff questions, the scoring bands were evenly distributed. This means that the questions were doing a good job of discriminating among staff according to their (self-declared) expertise, and the full range of scoring bands and feedback was being used. Three questions had median scores outside of the normal range, and a couple of sections elicited comments that users did not feel their feedback reflected their actual capability (‘information literacy’ was one). Rather than changing the underlying scoring model for these questions, we decided it was more appropriate to work on the content to try to produce a more even distribution of responses around a central median point. So if users’ scores differ from the median, that should mean something – but we can’t say that it means anything about their objective performance.
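As a rough illustration of the kind of check described above, the sketch below (Python/pandas, with invented data and column names) computes the median band for each question and flags any whose median falls outside an assumed middle range – the sort of questions we would then look at rewording.

```python
import pandas as pd

# Invented response data: one row per answer, recording which scoring band
# (1-4) the answer fell into. Real system data would have many more rows.
responses = pd.DataFrame({
    "question_id": ["Q1", "Q1", "Q1", "Q2", "Q2", "Q2", "Q3", "Q3", "Q3"],
    "band":        [2,    3,    3,    1,    1,    2,    3,    4,    4],
})

medians = responses.groupby("question_id")["band"].median()

# Flag questions whose median sits outside an assumed 'normal' middle range,
# i.e. candidates for rewording so that responses spread more evenly.
outliers = medians[(medians < 2) | (medians > 3)]
print(medians)
print("Questions to review:", list(outliers.index))
```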

Of course users who answer the questions after the changes were made on 5 May will not be scoring in the same way as users who answered the questions before. (It’s also possible that in making the changes suggested by user feedback, we have inadvertently shifted the scoring for some other questions – we will be checking this.) This will need to be communicated to any staff who are returning to use the discovery tool again. It will also need to be taken into account when looking at data returns, since data from before and after the changes can’t be treated as one data set. This is one reason we have cautioned against using scoring data to draw any firm conclusions, particularly during this pilot period when the content is still evolving.

We hope you will convey to all the staff who took the time to complete a feedback form that we have listened to their views – and that you and they will feel that the revised questions are an improvement. This is why this pilot process is so valuable.

How have the questions changed in response to feedback?

(Some changes to wording and options are based on findings from early user testing rather than the more general feedback we gained via the user feedback forms.)

We’ve slightly changed the lay-out of questions and added some more navigational text to clarify how to answer them.

We’ve removed or clarified some terms that were not well understood. Overall we know there is a need for a glossary – ideally with examples and links. That is something Lou will be working on for the future service. We’ve also changed a couple of examples we were using for illustration. There have been many discussions about the pros and cons of examples. Some people find generic terms difficult to understand without examples: but more people object when examples are used, because they favour some applications or approaches over others that are equally valid. Examples can confuse further: ‘if I don’t use that tool, I’m obviously not doing it (right)’. Overall we have gone light on examples, and we hope users’ understanding of terms will improve when we have a detailed glossary we can link to.

We have tried to focus more on activities users do at work, in an educational organisation (college or university). There were some negative comments about references to digital practices beyond this space. However, because of the need to cover a very wide range of roles – and because some roles don’t allow people to express digital capabilities they actually have – we can’t avoid offering some examples from beyond a narrowly-defined work role. For example, one of the activities under ‘digital identity’ is ‘manage social media for an organisation, group or team‘, and under ‘data literacy’ we have ‘judge the credibility of statistics used in public debate’. This is to allow users who don’t manage social media or evaluate statistics as part of their job to reflect on whether they have these capabilities anyway – perhaps gained in their personal life or another role. And indeed to consider whether these activities might be useful to them.

We’ve changed several references to social media, as a number of users objected to what they felt was an underlying assumption that social media would or should be used, and that this was a positive sign of capability. There are still several ways that users can show they are making wise judgements about the appropriateness of social media.

We’ve tried our best to use prompts that reflect capability (‘could do’, ‘would do’, ‘have ever done’) rather than current practice (‘do’, ‘do regularly’), which may be constrained by organisational issues or may reflect judgements not to use. However, we are also mindful that self-reported practice (‘I actually do this’) is usually more accurate than self-reported ability (‘I could do this if I wanted to’). Where we feel it is justified, we have continued to ask about actual use. So long as users understand that they are not being judged, it seems appropriate for the questions and feedback to indicate areas where they are not as capable as they might be if their organisation were more supportive of different practices, or their job role offered more digital opportunities.

There have been changes to the teaching questions, again to focus on pedagogical judgement rather than digital practice. There are now quite a number of caveats e.g. ‘if appropriate to my learners‘, which were suggested by more expert users. Of course we always listen to our experts (!) but as designers we’re aware that introducing caveats like this makes the questions longer and more complex, creating more cognitive load for users, and potential annoyance. We will monitor completion rates to see if this is a problem.

We have particularly reviewed the assessment questions and the online learning questions to be sure we are covering the very wide range of good practice in these areas.

There follows more detail on specific question types and the changes we have made to each of these.

‘Confidence’ questions

Why have we included questions that ask users ‘How confident do you feel about..?’ when we know that self-assessed confidence is generally unreliable? We do this at the start of each element to give users an orientation towards the questions that follow – ‘this is the area of practice we are looking at next’ – and a sense that they are in control. By trusting users to rate themselves, we are both reassuring them that they are not being ‘tested’, and asking them to be honest and searching in their responses. We have weighted the scoring for this question at a low level to reflect users’ tendency to answer inaccurately – though in fact, when we came to compare confidence scores with scores on the other two question types in the same area of practice, there was a positive match.

In feedback, quite a number of users mentioned the tone of these questions positively. However, some felt that they were too ‘subjective’, or ‘vague’. We have tried to deal with this in the update by focusing some questions more tightly on specific practices within the overall area we are looking at. So for example in the generic staff set, under ‘digital creativity’ we ask: ‘How confident are you creating digital content e.g. video, audio, animations, graphics, web pages?’ In the teaching set, under ‘learning resources’, we ask: ‘How confident are you about using digital resources within the rules of copyright?’ We have to find a practice that is generic enough to be available to staff in a wide variety of different roles, but specific enough for the response to feel rooted in a real-world activity.

We have had internal discussions about whether to move the confidence questions to the end of each set, or to remove them altogether. For now they stay where they are.

 

‘Depth’ questions

These questions are the most difficult to write and currently the most troublesome to end users. There are some ongoing issues with how they are presented on screen, and we are looking into whether any improvements are possible, but for now we have reworded the questions to make the steps to answer them as clear as we can.

These questions offer a short situation or example. Users select the one response that best matches what they would do or what expertise they have. The lay-out of the question reflects the progression logic: the first option reflects the lowest level of judgement or expertise, and the fourth option reflects the highest. There is no trickery here. We describe how progressively more expert practitioners think or act, and ask users to report where they sit on that scale. (At the moment, the visual cues do not make clear that it is a scale, or that higher levels of judgement encompass and include the lower ones.)

 

Beyond the difficulties some users had in ‘reading’ the answer logic for these questions, it is clear that we have to get the progression logic right in each case. When people disagree with our judgement about what is ‘more expert’, they don’t like these questions. When they agree, they say they are ‘nuanced’, ‘thoughtful’, and ‘made me think‘. We know that our users expect us to reflect issues of judgement and discrimination (‘how well is digital technology being used?’) at least as much as extent of use (‘how many different digital tools?’). So we know these questions have to be in there. They have to reflect important issues of digital thinking or mindset, and we have to get them right – in a very small number of words!

Our recent updates aim to clarify the focus on judgement and experience rather than extent of use. And we have added modifiers such as ‘when appropriate’ or ‘if appropriate for your learners’ (teaching staff) to emphasise that we don’t believe technology is always the answer – but good judgement about technology is. This creates more words on the screen, which will put off some users, but we want our champions to feel that our words represent thoughtful practice and not a shallow checklist of skills.

‘Breadth’ questions

These are in many ways the most unproblematic. They offer a range of digital activities that staff may do already, may want to do, or may not even have thought about. As before, we try to clarify that we don’t think digital practices are always the best, but we do want people to extend their repertoire so they have more experience of what does (and doesn’t) work. We try to use wording that values skills users have, even if they can’t use them currently due to their role or organisational context. We have tried to avoid very role-specific activities, but not to preclude the possibility that people might develop some professionally-relevant skills in their personal lives, or take on tasks from ‘other’ roles that they enjoy. We include fairly basic activities that many users will be able to select, and quite advanced activities that offer something to aspire to. The ‘nudge’ information is obvious: think about doing some of these things if you don’t or can’t already.

 

What next?

We are always interested in your views on the questions and other content. The user feedback forms will remain live until the end of the pilot project and we expect to make some further updates to content at that point. Please keep asking your users to access these from the Potential.ly platform.

If you are an institutional lead, you will shortly have an opportunity to give us feedback via your own detailed evaluation survey. You can also give us comments and feedback at any time via our expert feedback form – please direct other stakeholders and interested users to do this too.

Engaging users with the Digital discovery tool

There are only a few weeks to go before we wrap up this pilot phase of the Digital discovery tool, but there is still time to get new users involved. Some pilot sites have finished engaging users and are now evaluating how things have gone, but others are still looking for staff and students to give the discovery tool a try.

There are five new promotional posters from the Jisc team that can help. These can be adapted with an institutional logo and the details of any live workshops or support materials.


Download your posters here:

There are other ideas for engaging users on our Guidance page: Engaging users.

Thinking ahead, lead contacts at all the pilot sites will be sent a survey about their experience on 5 June. The survey is quite comprehensive, as this is our best source of information about how the Digital discovery tool is being used in practice. There are 15 questions, covering user engagement, support and follow-up for the discovery tool, and whether there have been any individual or organisational benefits. We ask for this to be completed by 30 June.

Before completing the form, we suggest that leads run a focus group or consultation event with users. This will allow evidence to be gathered that can help to answer the evaluation questions. There are materials for running consultation events on our Guidance page: evaluating with users, but this doesn’t have to be complicated. It could be as simple as getting some users together and exploring a couple of the questions on the evaluation form.

Just now, we are using all the valuable feedback from users to make some refinements. You may notice these in the questions and feedback for staff. There will be more significant updates once the pilot has finished. It’s really helpful if you can point your users to these feedback forms, which are found on their dashboards. We can only make things better with their help – and yours!

 

Organisational data available

If you are part of our organisational pilot of the Digital discovery tool, you will now have access to your data dashboard with visual results from your staff users. Guidance for accessing and reading your data visualisations can be found here.

There is also a Collaborate webinar on Tuesday 17 April at 13:00 which will walk you through the process and help you to make use of your data. You can access the webinar live here, or after the event you can access the recording here.

The rest of this post is about how you might make use of the data in your organisation. Please remember that the data provided as part of the pilot is still in development. We are in the process of finding out what data is useful. You should not rely on these data visualisations as a definitive source of information about staff training needs.

Making use of your data

You may want to use the number of staff completions – possibly broken down by department – to compare the number of staff who have fully engaged with the number of staff you hoped to reach at the start of the project. Who has and who has not engaged? Do you have feedback from your engagement sessions or a follow-up process (e.g. focus group) to explain any differences? How might you encourage engagement from other groups of staff?

You could also compare the number of staff who have completed the general (‘all staff’) assessment with the number completing the specialist teaching assessment(s). How would you explain any differences? Again consult with your users: were teaching staff more motivated and satisfied by the role-specific assessment?

The ‘in progress’ data allows you to see if there is a significant drop-off as staff are going through an assessment. This is a figure Jisc is looking at closely, as the user experience needs to be easy and supportive – that is our responsibility. But if you find differences in the drop-off rate across different staff groups, could this be because of differences in the support you make available to them?
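For leads who prefer to work with the raw numbers, here is a minimal sketch of how completion and drop-off figures might be summarised by department, assuming you have exported a simple per-user table (the column names and figures are invented for the example):

```python
import pandas as pd

# Invented export: one row per user who started an assessment.
users = pd.DataFrame({
    "department": ["Arts", "Arts", "Arts", "Science", "Science", "Library"],
    "status":     ["completed", "completed", "in progress",
                   "completed", "in progress", "in progress"],
})

# Count completed vs in-progress users per department.
summary = (users.groupby("department")["status"]
                .value_counts()
                .unstack(fill_value=0))

# Proportion of starters in each department who have not yet finished.
summary["drop_off_rate"] = (summary["in progress"] / summary.sum(axis=1)).round(2)
print(summary)
```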

Scoring band data should be interpreted with great caution. Jisc is using this data to ensure that the questions we ask produce a reasonably even spread of medians across the different areas of digital capability. But this is a broad aspiration: it is inevitable that some areas will prove more challenging to users than others. Also, some areas are essential for all staff (such as digital wellbeing), while others such as information, media or data literacy are more important in different roles.

This is why all our feedback to individual users asks them to reflect on their role and its demands before deciding how to prioritise their next steps. It is also why you should not compare scoring bands across completely different areas of digital capability and conclude that your staff have a ‘deficit’ in one area as compared with another. If you want to make comparisons, look at overall sector scoring bands and compare with the relevant banding in your organisation. But even this should be done with great care, particularly if you have a low number of users overall or in one departmental group, as this will skew the results.
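If you do make that kind of comparison, a cautious way to do it is to compare the proportion of users in each band rather than raw counts, keeping your overall numbers in view. A minimal sketch, with invented figures standing in for your organisational dashboard and the sector benchmark:

```python
import pandas as pd

# Invented figures: users per scoring band for one capability area.
bands  = ["band 1", "band 2", "band 3", "band 4"]
org    = pd.Series([4, 10, 9, 2], index=bands)            # your organisation
sector = pd.Series([900, 2400, 2100, 600], index=bands)   # sector benchmark

comparison = pd.DataFrame({
    "org %":    (org / org.sum() * 100).round(1),
    "sector %": (sector / sector.sum() * 100).round(1),
})
comparison["difference"] = comparison["org %"] - comparison["sector %"]
print(comparison)

# With only a couple of dozen users, differences of a few percentage points
# are likely to be noise rather than a real gap.
print("organisation sample size:", org.sum())
```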

Scores are all self-assigned, and their purpose is to ensure that users get appropriate feedback. If staff believe that their scores are being used for another purpose, they may not answer questions honestly, and the value of the Digital discovery tool will be severely limited.

Jisc encourages you to use the Digital discovery tool to support a dialogue with staff about the training and development they need. The spread of scoring bands across different departments may encourage you to target training in specific areas towards specific groups of staff. Because of the caveats above, you should not do this without consulting with the staff involved. Where staff score lower than others in their sector, this is definitely a cue for you to investigate whether they would appreciate more training and support, but it is not a performance measure and should never be used as such.

Following up and closing the feedback loop

The information you gather from the Digital discovery tool can be used to start conversations:

  • with HR and staff development about overall staff training and development needs;
  • with teaching staff about their confidence with digital teaching, learning and assessment, and their further development needs;
  • with IT and e-learning teams about support for specific systems and practices;
  • with budget-holders about investing in staff development resources and in online services.

You should report back to your staff users about how you are using this data, and what you are doing to support them more effectively in the future.

Using Discovery tool data to refine the questions and scoring

Thanks to the aggregate data we are getting from our first pilot users, we have been able to compare the median scores for each of the questions asked, and look at some other stats across the different assessments.

We were pleased to see from the first data returns that ‘depth’ and ‘breadth’ questions produce the medians we would expect, with one or two exceptions. We’ve worked on these outlying questions to make it a bit easier (or in one case a bit harder) to score in the middle range. This should bring the medians more into line with each other, making it easier and more valid to look across aggregate scores and compare areas of high and low self-assessment.

Median question scores across all capabilities – ‘all staff’ assessment, snapshot from early March 2018

There will always be some natural variation in average scores, because we are asking about different areas of practice, some of which will be more quickly adopted or more generally accomplished than others.

We were particularly pleased to find on testing that there is a positive correlation between confidence and responses to the other questions in the same area (i.e. expertise and range). We would expect this, but it is good to have it confirmed. However, although there was a meaningful range of responses, almost no users rated themselves less than averagely confident, so we are looking to adjust the scoring bands to reflect this. We don’t attach a great deal of weight to this question type, precisely because it is known that users tend to over-state their confidence, but it is included to encourage reflection and a sense of personal responsibility.
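For anyone curious how a check like this could be run, here is a minimal sketch using a rank correlation between users’ confidence ratings and their combined score on the other two question types in the same area (the data and column names are invented):

```python
import pandas as pd

# Invented per-user scores for a single capability area.
scores = pd.DataFrame({
    "confidence":    [2, 3, 3, 4, 4, 2, 3, 4],
    "depth_breadth": [5, 7, 6, 9, 10, 4, 8, 9],
})

# Spearman rank correlation: a positive value means users who rate themselves
# more confident also tend to report greater depth and breadth of practice.
rho = scores["confidence"].corr(scores["depth_breadth"], method="spearman")
print(f"Spearman correlation: {rho:.2f}")
```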

You will see the impact of this work when we reach the mid-April review point, along with some further changes to the content and platform indicated by our user feedback. More about this below.

Scoring is designed to deliver appropriate feedback

As you see, we’re doing what we can to ensure that the scores individuals assign themselves are meaningful, so they allow relevant feedback to be delivered. The question types available don’t allow us to match selected items with feedback items (e.g. items not chosen in the grid or ‘breadth’ questions with ‘next steps’ suggestions in the personal report). This means relying on aggregate scores for each digital capability area. The pilot process is allowing us to find out how well the scoring process delivers feedback that users feel is right for them, and how the different areas relate to one another (or don’t!). However, the questions and scoring are not designed to provide accurate data to third parties about aptitude or performance. So scoring data, even at an aggregate level, should be treated with a great deal of caution. We are issuing new guidance on interpreting data returns very shortly.


The radial diagram gives a quick overview of individual scores

The aim of the Digital discovery tool is developmental, so it’s clear what progress looks like and ‘gaming’ the scores is simple. Our contextualising information is designed to remove this temptation, by showing that the discovery process is for personal development and not for external scrutiny. Our feedback from staff in particular suggests that if there is any suggestion of external performance monitoring, they won’t engage or – if required to engage – they won’t answer honestly. Which of course will mean there is no useful information for anyone!

 

The ongoing evaluation process


Showing where to find the evaluation form on the dashboard

As well as examining user data, of course, we have access to the individual evaluation forms that (some) respondents fill out on completion. This is giving us some really useful insights into what works and what doesn’t. However, at the moment we think the sample of respondents is weighted towards people who already know quite a lot about digital capability as a concept and a project. The views of people with a lot invested are really important to us, but we also need feedback from naive users, who may have a very different experience. Please encourage as many as possible of your users to complete this step. The evaluation form is available from a link on the user dashboard (see above).

In addition we have taken a variety of expert views, and we are just about to launch a follow-up survey for organisational leads. This will ask you about what you have found beneficial about the project, what has supported you to implement it in your organisation, what you would change, and how you would prefer Jisc to take the Discovery tool project forward. Please look out for the next blog post and launch!

Resources in the Digital discovery tool

The Digital discovery tool provides links to a wide range of resources for each of the digital capability framework areas.

The platform delivers these resources in two ways.

Browse resources on your dashboard

When people log in to the tool they are presented with a tailored welcome page/dashboard offering appropriate assessments based on the selections they make during log-in. The dashboard also includes sets of resources for each of the six broad digital capability areas. You can scroll through these sets and browse the resources that we have mapped to these areas. We offer a brief description of each resource in this view.


Once you see a resource that looks interesting you can click on it to find out more. For each resource we have identified key audiences and level as appropriate and provide a brief description to help you decide how relevant it is to you. When you click on the URL in the resource page you will be taken directly to that resource outside of the discovery tool.

For some resources we offer suggested activities or reflections and a space to record them to save for the future.


Find resources in your assessment report

When you complete an assessment, you receive a personal report which offers results, feedback and suggestions for some next steps that you could take. You are also offered links to selected resources for each area. These are offered in the same kind of scrolling list, with a summary about each resource. When you print your assessment results report, the resources are offered as a simple list of links so that you can revisit them at a time convenient to you.

Resource selection

Resources included in the discovery tool come from a wide range of publishers. They are checked for accuracy, relevance and quality. They are all free to use although some may require users to register.

These publishers include:

  • national or international bodies (such as Jisc, Nesta, HEFCE, SCONUL, EU bodies)
  • professional bodies (such as CILIP, AoC, UUK)
  • educational institution resources produced for staff or students but which could be of interest to a wide range of users.
  • individual academics who have set up websites or blogs
  • educational consultants or specialists who have websites or blogs
  • networks of educators or specialist collaborators (e.g. supporting citizenship, research, innovation)
  • Wikipedia and Wikiversity
  • commercial companies (such as Microsoft, Adobe, Google)

Jisc has been working closely with some publishers, including the Microsoft educator community and the Duke of York Inspiring Digital Enterprise Award (iDEA), to map their resources to the digital capabilities framework and include them within the tool. Jisc is also working with the subscription-based online learning platform Lynda.com to map their resources to the framework.

Jisc is aware that many educational institutions subscribe to resource collections and may want the discovery tool to link out to them. This is something we are thinking about and hope to implement in the future.

Each resource included in the discovery tool is reviewed for relevance to the framework area, content and quality. Many of the resources also reflect the next steps suggestions.

Following feedback from our pilot phases we have attempted to limit the number of resources that are offered to prevent overload. The collection is not meant to be comprehensive – it has been selected to map to the digital capability framework, the questions and the feedback.

While we only have limited space, we are always looking for great new resources so please let us know if you can recommend one. Even if we can’t include it straight away we will review it for future use.

Resource description

We provide information to help you decide how relevant the resource might be for you. Each resource has a description of the aims and content.

We highlight if a resource is aimed at a specific audience, sector or level. Several resources are aimed at a specific audience but could also be of value to people in other sectors or with other roles. For example, a resource aimed at students may be of value to a staff member whose capability levels are just developing in that area.

All the resources are mapped to the digital capability framework and to the different areas covered in the assessments. For example, the same resource may appear in the section about media literacy, or in the teacher assessment on creating learning resources.

Some of the resources have a very specific focus such as ‘managing your emails’ while others are broader and cover a range of digital literacies.

We have included a wide range of formats – from whole courses or sections of courses to downloadable learning resources. We have links to videos, websites, networks, screencasts, toolkits, reports and guides. We have included links to the Jisc guides as these often offer links to further resources. Some of the resources are in PDF format, which will require you to download a PDF reader such as Adobe Acrobat.

Resource management

Jisc has longstanding experience of managing resource collections and will be updating and maintaining this collection. This means that if you go back to an assessment report you may sometimes find different resources listed. Dead links will result in resources being removed from the collection. If you find any links that do not work, please report them to us.