Three emerging insights from the Digital discovery pilot

Co-authored by Clare Killen

Over one hundred universities, colleges and other providers are piloting the Jisc Digital discovery tool in the UK and overseas. The design of this tool encourages individuals to reflect on and develop their digital capabilities. It provides a summary of their self-assessment in response to nuanced question prompts as well as suggestions for further development with links to relevant, interactive resources. Whilst it is very much a personal tool, additional features allow institutional leads tasked with supporting digital capabilities development to gain insights from anonymised data and translate them into the institutional context.

Jisc team members have visited several pilot institutions to support the implementation process. In doing so, and through our in-depth conversations, we have learned about what works, at a practical level, when it comes to providing opportunities to develop the digital capabilities of staff and students in the various organisations. Further insights have emerged from conferences, events and meetings featuring presentations from our pilots, for example, the Student Experience Experts meeting and the Digital capabilities session at Digifest18.

As the roll-out gathers pace, we are starting to gain some fascinating insights into how institutions are using the opportunities offered by this tool to bring about positive change in their organisations. There are some clear themes emerging around what organisations that are benefiting from the Digital discovery process typically have in place:

 1. Clear strategic vision

We are seeing that a clear message about the importance of digital technologies, communicated and understood by everyone, provides a meaningful context for the use of the discovery tool.

“It is important to have a clear strategy and people need to know that digital is part of the strategy and part of what they do. You need to engage people in it, allow them to see how it affects them and why it is important to them. It needs to be exciting, so for example, we have run several big events that inspire and excite people around the idea of using technology to support teaching and learning and the college business.”
Penny Langford, head of e-learning, Milton Keynes College

2. Culture

Having a safe space in which teams can explore their thinking about their own priorities for development creates an environment in which individuals can thrive.

“The individual reports which each member of my team had, generated discussions and comparisons, with staff considering their different roles and how that has had an impact upon their individual percentage. More than that though, it made them consider how they might acquire skills where they didn’t score as highly. I have eLearning Technologists and Librarians in my team and each had different scores, particularly the Information Literacy category. Which prompted all manner of discussion around the fake news agenda and critically evaluating information sources.”
Sarah Crossland, academic services manager, Doncaster College and University Centre

3. Connections

Organisations benefit from establishing connections between individuals’ self-identified aims, the overall picture for all staff, and the resources available to support professional development in line with organisational strategic aims.

“We wanted to identify gaps in staff confidence in their digital skills and use this information to target staff training and support. We looked at other products but there was nothing really out there to meet those requirements. We were looking for a standardised tool and wanted something to self-motivate staff. The approach taken by the Digital discovery tool supports that.”
Joseph Pilgrim, digital learning co-ordinator, ACT Training

Digital capability community of practice

The next digital capability community of practice event is being hosted in partnership with the University of Leicester on 22 May 2018. This provides an opportunity to learn about related initiatives and hear more from the wider community, including many members who are taking part in the pilot of the Digital discovery tool.
While registration for this event has now closed, the keynote sessions will be live streamed. Follow the hashtag #digitalcapability on the day and presentations and any outputs will be available from the event page.

There is still time to engage staff

If you are part of the pilot, you still have time to engage staff, perhaps through end of term staff development events. Remember that feedback is required by the end of May but the Digital discovery tool will continue to be available until 13 July 2018.

How HR teams support staff digital capability

At the end of 2017 we began a short review into how Human Resources (HR) departments support staff to develop their digital capability. We developed an online survey and interviewed some of the respondents to try to capture a snapshot of current practice.

Initial results

The results of these activities confirmed our initial expectation that many HR teams have been working across several areas of the digital capability framework, often in partnership with other teams within their institutions. However, for both HE and FE respondents there was quite significant variation in responses to the questions about HR team involvement in the 6 core digital capability areas. Whilst 90% of people said they were involved in supporting the ICT proficiency of staff, only 50% said they were involved in supporting staff with information, data and media literacy; digital communication, collaboration and participation; or digital learning and teaching. 84% said they were not involved in digital creation, problem solving and innovation, and 58% said they were not involved in digital identity and wellbeing.

Later questions and in-depth interviews revealed that many HR teams are in universities or colleges which are just starting to take an institution-wide approach to staff and student digital capabilities. One of the challenges for HR teams is identifying their roles and the areas where they could contribute to institution-wide initiatives and to the development of strategies for developing digital capabilities. Whilst some HR teams were aware of the Jisc tools and resources to support this work, many had not seen them before or had not engaged with them. It became clear to us that there was a need for some practical materials to help HR teams map their various activities (often split into specialist sections) to the digital capabilities framework.

The original survey is still open so if you did not get a chance to respond earlier we would still welcome your input.

New materials for HR teams

HR teams cover a wide range of activities that require them to consider and/or support staff digital capabilities across their institutions. These include recruitment and selection, onboarding, appraisal/performance review, learning and development, relationship management and health and wellbeing. Data management and analytics, increasingly sophisticated institutional systems and the impact of social media mean that Human Resource teams themselves need a range of digital capabilities to effectively carry out their work.

We have produced two sets of PowerPoint slides that could be used within HR teams and we are interested to find out if they are useful. Thanks are due to Abi Mawhirt, head of people and organisational development at Dundee and Angus College, who worked with us to refine these slides and to make sure we did not have any serious omissions. Abi will be using the slides within her own institution and some other HR teams have said they might try them out.

HR teams could use the slides (or select the ones that they feel are most relevant to their context) to consider their activities, identify and build on strengths, and identify any gaps or areas where they could enhance their support of staff digital capabilities. This exercise may highlight areas where HR teams could take the lead, for example Digital identity and wellbeing.


This set maps HR activities and roles to the Jisc digital capabilities framework. It highlights where HR teams can input to institution wide approaches to staff digital capabilities and offers some suggestions for activities where they could get involved. Some of these areas involve other teams and would encourage HR input to support teams leading on a particular area.


This set offers a view of HR activities through the Jisc digital capabilities framework. Each area of HR activities is mapped to the 6 key elements of the Digital capabilities framework and highlights where HR teams can impact on digital capabilities of staff (and to a lesser extent students).

We have also highlighted those activities that relate to digital capabilities of staff in HR teams.

Please pass these on to your own HR team and ask them to try them out. We have produced a brief pdf document which offers ideas for how they might be used.

Here are some of the suggestions:

  1. Use the slides to deliver a team presentation highlighting areas of most relevance to the team.
  2. Use the slides or a selection of slides in a presentation to focus on particular aspects – either a particular area of HR activities such as recruitment and selection or on a specific area of the digital capabilities framework such as Digital wellbeing.
  3. Use the slides as a pdf document to share within teams and follow up with workshops to consider them within your own context.
  4. Get different teams within HR to focus on specific slides (or pdf pages) and ask them to come up with an action plan following their discussions.
  5. Use the slides or some of the content to present to different teams within the organisation to highlight what you are doing in different areas of digital capability or what you would like to do.
  6. Use the materials to highlight areas for joint working or partnership approaches to other teams or departments within the institution.
  7. Link to other Jisc digital capabilities, guidance, tools or resources to highlight possible HR roles across the institution.

We would like to gather some feedback about these so that we can adapt or enhance them. Link to a brief survey.

Let us know what you think. Help us make them better.

Discovery tool: understanding the questions

We have just completed an interim analysis of feedback from staff users of the Digital discovery tool. Thank you for directing so many staff to complete the feedback form – 225 general staff and 150 teaching staff have done so already, and it has been an invaluable resource.

The feedback so far has been very positive, with some interesting perceptions that we will report in the next blog post. This post is about some of the changes we have made to the content of questions. It also seems like a good opportunity to explain a bit more of the thinking that goes into the three question types, and into the reasons for designing the discovery tool in the way we have. There is some general information at the top of the post, and more detail further down for those who are interested in the different question types.

Development rather than testing

At the start of the design process we had to make a significant decision. We could have written ‘testing’ questions, as in a typical assessment test, to find out what users really  understand about digital applications and approaches. But we decided to write ‘developmental’ questions instead. These are designed to develop understanding, for example by making clear what ‘better’ (deeper, better judged) performance looks like. Rather than hiding the ‘right’ answer, they make transparent what expert digital professionals do and ask users to reflect and report: ‘do I do that?’

We have gone down this road partly because we are not convinced that testing abstract understanding is the best indicator of actual practice, and partly because this approach is more acceptable to end users. Staff want to be treated as professionals, and to take responsibility for assessing and moving forward their own practice. Also, the platform we are designing in does not support item-by-item matching of feedback to responses. So it’s not possible for the feedback itself to be closely matched to users’ input – as it would be in an assessment system – and our questions themselves have to do a lot of the work.

This has important implications for the meaning of the scoring ‘bands’ that we use to assign feedback to users (more of this shortly).

Where do the question items come from?

Essentially, to design the questions we first developed a wide range of real-world activities that digital professionals do. We’ve tested those out with expert panels, and also against the relevant professional profile(s) – which have had professional body involvement.

Of course we could just have presented these activities in a random order, and this was an early design idea. But the digital capabilities framework already had good recognition in the sector, and we needed a navigational aid. So in the case of the generic assessments (for staff and students) we allocated activities to the different framework areas, e.g. ‘data literacy’. In the case of role-specialist assessments, we used specialist requirements from the relevant profile, such as ‘face-to-face teaching’ or ‘assessment and feedback’ in the case of the teaching assessments.

We then took one activity that was central to the area in question and framed it as a ‘confidence’ question (‘How confident do you feel about doing x?’). We developed another activity into a mini-scenario or example to create a ‘depth’ question, with four levels of response possible (‘Which of these best reflects your response?’). Six further activities became options in a ‘breadth’ question (‘Which of these can you do? Select any or all that apply to you’). This provides us with three questions, 8 activities, for each area of practice. There is more about the different question types below.
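To make that structure concrete, here is a minimal sketch in Python of how one area’s question set might be represented. The class and field names, the confidence scale and the example activities are our own illustrations, not the tool’s actual data model:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ConfidenceQuestion:
    # "How confident do you feel about doing x?" - answered on a sliding scale
    activity: str
    scale: Tuple[int, int] = (1, 5)  # illustrative range, not the tool's actual scale

@dataclass
class DepthQuestion:
    # A mini-scenario with four responses ordered from least to most expert
    scenario: str
    levels: List[str] = field(default_factory=list)  # exactly four, lowest first

@dataclass
class BreadthQuestion:
    # "Which of these can you do? Select any or all that apply" - six options
    activities: List[str] = field(default_factory=list)

@dataclass
class FrameworkArea:
    # One area of practice: three questions built from eight underlying activities
    name: str
    confidence: ConfidenceQuestion
    depth: DepthQuestion
    breadth: BreadthQuestion

# Illustrative content only - not taken from the Discovery tool itself
area = FrameworkArea(
    name="data literacy",
    confidence=ConfidenceQuestion(activity="working with data in spreadsheets"),
    depth=DepthQuestion(
        scenario="You are asked to make sense of an unfamiliar data set...",
        levels=[
            "I would ask a colleague to handle it",
            "I can read simple charts and tables",
            "I can analyse and chart the data myself",
            "I can check, model and critique the data",
        ],
    ),
    breadth=BreadthQuestion(activities=[
        "filter and sort a spreadsheet",
        "build a pivot table",
        "visualise a data set",
        "judge the credibility of statistics used in public debate",
        "clean up a messy data set",
        "combine data from two different sources",
    ]),
)
print(area.name, "-", len(area.breadth.activities), "breadth options")
```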

We have not statistically tested to discover whether responses to all three questions in one area  hang together to create a distinct and separate factor. There is the opportunity to do that with system data at this point, but our first aim was to create a navigable user experience – making sense and generating helpful feedback – rather than to validate a model.

Ideally the feedback we give to users would relate to their responses for each of the eight different activities. Without this option, we have used scoring bands to allocate roughly appropriate feedback to users, based on their responses to the three questions. It’s not exact, and some users have picked that up. However, most users rate the quality of feedback highly – it has the most positive comments of any feature – so we know we are getting it more or less right. We hope we have dealt with the lack of specificity by offering a range of ‘next steps’ that participants can choose from, according to their own interests and self-assessed development needs.

You’ll understand from this that scoring is an artefact of the system we are using and the design choices we have made within it, not an objective measure of any kind.

We were pleased when we analysed system data from the first two months of use to see that in all but three of the 45 generic staff questions, and in all the teaching staff questions, the scoring bands were evenly distributed. This means that the questions were doing a good job of discriminating among staff according to their (self-declared) expertise, and the full range of scoring bands and feedback was being used. Three questions had median scores outside of the normal range, and a couple of sections elicited comments that users did not feel their feedback reflected their actual capability (‘information literacy’ was one). Rather than changing the underlying scoring model for these questions, we decided it was more appropriate to work on the content to try to produce a more even distribution of responses around a central median point. So if users’ scores differ from the median, that should mean something – but we can’t say that it means anything about their objective performance.
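As an illustration of this kind of check, the sketch below computes a median for each question from anonymised responses and flags any question whose median drifts well away from the rest. The data format, question identifiers and review threshold are assumptions for the example, not the actual system export:

```python
import statistics
from collections import defaultdict

# Anonymised responses as (question_id, score) pairs - an assumed format, not the real export
responses = [
    ("data_literacy_depth", 3), ("data_literacy_depth", 2), ("data_literacy_depth", 4),
    ("info_literacy_breadth", 5), ("info_literacy_breadth", 5), ("info_literacy_breadth", 4),
    ("digital_wellbeing_depth", 2), ("digital_wellbeing_depth", 3), ("digital_wellbeing_depth", 3),
]

scores_by_question = defaultdict(list)
for question_id, score in responses:
    scores_by_question[question_id].append(score)

medians = {q: statistics.median(s) for q, s in scores_by_question.items()}
overall = statistics.median(medians.values())

# Questions whose median sits well away from the overall pattern are candidates
# for rewording, rather than for changing the underlying scoring model.
for question_id, median in sorted(medians.items()):
    flag = "REVIEW" if abs(median - overall) > 1 else "ok"
    print(f"{question_id:26s} median={median}  {flag}")
```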

Of course users who answer the questions after the changes were made on 5 May will not be scoring in the same way as users who answered the questions before. (It’s also possible that in making the changes suggested by user feedback, we have inadvertently shifted the scoring for some other questions – we will be checking this.) This will need to be communicated to any staff who are returning to use the discovery tool again. It will also need to be taken into account when looking at data returns, since data from before and after the changes can’t be treated as one data set. This is one reason we have cautioned against using scoring data to draw any firm conclusions, particularly during this pilot period when the content is still evolving.

We hope you will convey to all the staff who took the time to complete a feedback form that we have listened to their views – and that you and they will feel that the revised questions are an improvement. This is why this pilot process is so valuable.

How have the questions changed in response to feedback?

(Some changes to wording and options are based on findings from early user testing rather than on the more general feedback we gained via the user feedback forms.)

We’ve slightly changed the layout of questions and added some more navigational text to clarify how to answer them.

We’ve removed or clarified some terms that were not well understood. Overall we know there is a need for a glossary – ideally with examples and links. That is something Lou will be working on for the future service. We’ve also changed a couple of examples we were using for illustration. There have been many discussions about the pros and cons of examples. Some people find generic terms difficult to understand without examples: but more people object when examples are used, because they favour some applications or approaches over others that are equally valid. Examples can confuse further: ‘if I don’t use that tool, I’m obviously not doing it (right)’. Overall we have gone light on examples, and we hope users’ understanding of terms will improve when we have a detailed glossary we can link to.

We have tried to focus more on activities users do at work, in an educational organisation (college or university). There were some negative comments about references to digital practices beyond this space. However, because of the need to cover a very wide range of roles – and because some roles don’t allow people to express digital capabilities they actually have – we can’t avoid offering some examples from beyond a narrowly-defined work role. For example, one of the activities under ‘digital identity’ is ‘manage social media for an organisation, group or team’, and under ‘data literacy’ we have ‘judge the credibility of statistics used in public debate’. This is to allow users who don’t manage social media or evaluate statistics as part of their job to reflect on whether they have these capabilities anyway – perhaps gained in their personal life or another role. And indeed to consider whether these activities might be useful to them.

We’ve changed several references to social media, as a number of users objected to what they felt was an underlying assumption that social media would or should be used, and that this was a positive sign of capability. There are still several ways that users can show they are making wise judgements about the appropriateness of social media.

We’ve tried our best to use prompts that reflect capability (‘could do’, ‘would do’, ‘have ever done’) rather than current practice (‘do’, ‘do regularly’), which may be constrained by organisational issues or may reflect judgements not to use. However, we are also mindful that self-reported practice (‘I actually do this’) is usually more accurate than self-reported ability (‘I could do this if I wanted to’). Where we feel it is justified, we have continued to ask about actual use. So long as users understand that they are not being judged, it seems appropriate for the questions and feedback to indicate areas where they are not as capable as they might be if their organisation were more supportive of different practices, or their job role offered more digital opportunities.

There have been changes to the teaching questions, again to focus on pedagogical judgement rather than digital practice. There are now quite a number of caveats, e.g. ‘if appropriate to my learners’, which were suggested by more expert users. Of course we always listen to our experts (!) but as designers we’re aware that introducing caveats like this makes the questions longer and more complex, creating more cognitive load for users, and potential annoyance. We will monitor completion rates to see if this is a problem.

We have particularly reviewed the assessment questions and the online learning questions to be sure we are covering the very wide range of good practice in these areas.

There follows more detail on specific question types and the changes we have made to each of these.

‘Confidence’ questions

Why have we included questions that ask users ‘How confident do you feel about…?’ when we know that self-assessed confidence is generally unreliable? We do this at the start of each element to give users an orientation towards the questions that follow – ‘this is the area of practice we are looking at next’ – and a sense that they are in control. By trusting users to rate themselves, we are both reassuring them that they are not being ‘tested’, and asking them to be honest and searching in their responses. We have weighted the scoring for this question at a low level to reflect users’ tendency to answer inaccurately – though in fact when we came to compare confidence scores with scores on the other two question types in the same area of practice, there was a positive match.
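As a rough sketch of what a low weighting for confidence might look like, the function below combines the three question scores for one area. The weights and the 0 to 1 normalisation are invented for illustration; they are not the tool’s actual scoring model:

```python
def area_score(confidence: float, depth: float, breadth: float) -> float:
    """Combine the three question scores for one area into a single area score.

    All inputs are assumed to be normalised to 0-1. The confidence score is
    deliberately given a low weight, reflecting its tendency to be over-stated.
    (The weights are illustrative only.)
    """
    weights = {"confidence": 0.15, "depth": 0.45, "breadth": 0.40}
    return (weights["confidence"] * confidence
            + weights["depth"] * depth
            + weights["breadth"] * breadth)

# High self-rated confidence contributes much less than demonstrated depth and breadth
print(area_score(confidence=1.0, depth=0.5, breadth=0.5))   # 0.575
print(area_score(confidence=0.2, depth=0.9, breadth=0.8))   # 0.755
```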

In feedback, quite a number of users mentioned the tone of these questions positively. However, some felt that they were too ‘subjective’, or ‘vague’. We have tried to deal with this in the update by focusing some questions more tightly on specific practices within the overall area we are looking at. So for example in the generic staff set, under ‘digital creativity’ we ask: ‘How confident are you creating digital content e.g. video, audio, animations, graphics, web pages?’ In the teaching set, under ‘learning resources’, we ask ‘How confident are you about using digital resources within the rules of copyright?’ We have to find a practice that is generic enough to be available to staff in a wide variety of different roles, but specific enough for the response to feel rooted in a real-world activity.

We have had internal discussions about whether to move the confidence questions to the end of each set, or to remove them altogether. For now they stay where they are.


‘Depth’ questions

These questions are the most difficult to write and currently the most troublesome to end users. There are some ongoing issues with how they are presented on screen, and we are looking into whether any improvements are possible, but for now we have reworded the questions to make the steps to answer them as clear as we can.

These questions offer a short situation or example. Users select the one response that best matches what they would do or what expertise they have. The layout of the question reflects the progression logic: the first option reflects the lowest level of judgement or expertise, and the fourth option reflects the highest. There is no trickery here. We describe how progressively more expert practitioners think or act, and ask users to report where they sit on that scale. (At the moment, the visual cues do not make clear that it is a scale, or that higher levels of judgement encompass and include the lower ones.)


Beyond the difficulties some users had in ‘reading’ the answer logic for these questions, it is clear that we have to get the progression logic right in each case. When people disagree with our judgement about what is ‘more expert’, they don’t like these questions. When they agree, they say they are ‘nuanced’, ‘thoughtful’, and ‘made me think’. We know that our users expect us to reflect issues of judgement and discrimination (‘how well is digital technology being used?’) at least as much as extent of use (‘how many different digital tools?’). So we know these questions have to be in there. They have to reflect important issues of digital thinking or mindset, and we have to get them right – in a very small number of words!

Our recent updates aim to clarify the focus on judgement and experience rather than extent of use. And we have added modifiers such as ‘when appropriate’ or ‘if appropriate for your learners’ (teaching staff) to emphasise that we don’t believe technology is always the answer – but good judgement about technology is. This creates more words on the screen, which will put off some users, but we want our champions to feel that our words represent thoughtful practice and not a shallow checklist of skills.

‘Breadth’ questions

These are in many ways the most unproblematic. They offer a range of digital activities that staff may do already, may want to do, or may not even have thought about. As before, we try to clarify that we don’t think digital practices are always the best, but we do want people to extend their repertoire so they have more experience of what does (and doesn’t) work. We try to use wording that values skills users have, even if they can’t use them currently due to their role or organisational context. We have tried to avoid very role-specific activities, but not to preclude the possibility that people might develop some professionally-relevant skills in their personal lives, or take on tasks from ‘other’ roles that they enjoy. We include fairly basic activities that many users will be able to select, and quite advanced activities that offer something to aspire to. The ‘nudge’ information is obvious: think about doing some of these things if you don’t or can’t already.


What next?

We are always interested in your views on the questions and other content. The user feedback forms will remain live until the end of the pilot project and we expect to make some further updates to content at that point. Please keep asking your users to access the feedback form from within the platform.

If you are an institutional lead, you will shortly have an opportunity to give us feedback via your own detailed evaluation survey. You can also give us comments and feedback at any time via our expert feedback form – please direct other stakeholders and interested users to do this too.

Engaging users with the Digital discovery tool

There are only a few weeks to go before we wrap up this pilot phase of the Digital discovery tool, but there is still time to get new users involved. Some pilot sites have finished engaging users and are now evaluating how things have gone, but others are still looking for staff and students to give the discovery tool a try.

There are five new promotional posters from the Jisc team that can help. These can be adapted with an institutional logo and the details of any live workshops or support materials.


Download your posters here:

There are other ideas for engaging users on our Guidance page: Engaging users.

Thinking ahead, lead contacts at all the pilot sites will be sent a survey about their experience on 5 June. The survey is quite comprehensive, as this is our best source of information about how the Digital discovery tool is being used in practice. There are 15 questions, covering user engagement, support and follow-up for the discovery tool, and whether there have been any individual or organisational benefits. We ask for this to be completed by 30 June.

Before completing the form, we suggest that leads run a focus group or consultation event with users. This will allow evidence to be gathered that can help to answer the evaluation questions. There are materials for running consultation events on our Guidance page: evaluating with users, but this doesn’t have to be complicated. It could be as simple as getting some users together and exploring a couple of the questions on the evaluation form.

Just now, we are using all the valuable feedback from users to make some refinements. You may notice these in the questions and feedback for staff. There will be more significant updates once the pilot has finished. It’s really helpful if you can point your users to these feedback forms, which are found on their dashboards. We can only make things better with their help – and yours!


Organisational data available

If you are part of our organisational pilot of the Digital discovery tool, you will now have access to your data dashboard with visual results from your staff users. Guidance for accessing and reading your data visualisations can be found here.

There is also a Collaborate webinar on Tuesday 17 April at 13:00 which will walk you through the process and help you to make use of your data. You can access the webinar live here, or after the event you can access the recording here.

The rest of this post is about how you might make use of the data in your organisation. Please remember that the data provided as part of the pilot is still in development. We are in the process of finding out what data is useful. You should not rely on these data visualisations as a definitive source of information about staff training needs.

Making use of your data

You may want to use the number of staff completions – possibly broken down by department – to compare the number of staff who have fully engaged with the number of staff you hoped to reach at the start of the project. Who has and who has not engaged? Do you have feedback from your engagement sessions or a follow-up process (e.g. focus group) to explain any differences? How might you encourage engagement from other groups of staff?

You could also compare the number of staff who have completed the general (‘all staff’) assessment with the number completing the specialist teaching assessment(s). How would you explain any differences? Again consult with your users: were teaching staff more motivated and satisfied by the role-specific assessment?

The ‘in progress’ data allows you to see if there is a significant drop-off as staff are going through an assessment. This is a figure Jisc is looking at closely, as the user experience needs to be easy and supportive – that is our responsibility. But if you find differences in the drop-off rate across different staff groups, could this be because of differences in the support you make available to them?
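For example, a lead with an export of user records could tabulate completion and drop-off by department along these lines. The record format and status values are assumed for the sketch and may not match the real dashboard data:

```python
from collections import Counter

# One record per user: (department, status) - the status values here are assumed;
# the real export may label completed and in-progress attempts differently.
records = [
    ("Humanities", "completed"), ("Humanities", "in_progress"),
    ("Engineering", "completed"), ("Engineering", "completed"), ("Engineering", "in_progress"),
    ("Library", "completed"),
]

started = Counter(dept for dept, _ in records)
completed = Counter(dept for dept, status in records if status == "completed")

for dept in sorted(started):
    done = completed.get(dept, 0)
    drop_off = 1 - done / started[dept]
    print(f"{dept:12s} started={started[dept]:2d}  completed={done:2d}  drop-off={drop_off:.0%}")
```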

Scoring band data should be interpreted with great caution. Jisc is using this data to ensure that the questions we ask produce a reasonably even spread of medians across the different areas of digital capability. But this is a broad aspiration: it is inevitable that some areas will prove more challenging to users than others. Also, some areas are essential for all staff (such as digital wellbeing), while others such as information, media or data literacy are more important in different roles.

This is why all our feedback to individual users asks them to reflect on their role and its demands before deciding how to prioritise their next steps. It is also why you should not compare scoring bands across completely different areas of digital capability and conclude that your staff have a ‘deficit’ in one area as compared with another. If you want to make comparisons, look at overall sector scoring bands and compare with the relevant banding in your organisation. But even this should be done with great care, particularly if you have a low number of users overall or in one departmental group, as this will skew the results.
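A simple way to make that comparison, shown here with invented figures and the level names used in the feedback reports, is to set the organisational band distribution for one capability area alongside the sector distribution and note how a small sample undermines any apparent differences:

```python
# Scoring-band distribution for ONE capability area; all figures are invented.
sector = {"developing": 0.30, "capable": 0.50, "proficient": 0.20}
ours = {"developing": 0.45, "capable": 0.40, "proficient": 0.15}
n_users = 18  # small samples skew percentages, so treat differences with caution

print(f"{'band':12s}{'sector':>9s}{'ours':>9s}{'diff':>9s}")
for band in ("developing", "capable", "proficient"):
    diff = ours[band] - sector[band]
    print(f"{band:12s}{sector[band]:9.0%}{ours[band]:9.0%}{diff:+9.0%}")

if n_users < 30:
    print("Small sample: discuss the figures with the staff involved before acting on them.")
```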

Scores are all self-assigned, and their purpose is to ensure that users get appropriate feedback. If staff believe that their scores are being used for another purpose, they may not answer questions honestly, and the value of the Digital discovery tool will be severely limited.

Jisc encourages you to use the Digital discovery tool to support a dialogue with staff about the training and development they need. The spread of scoring bands across different departments may encourage you to target training in specific areas towards specific groups of staff. Because of the caveats above, you should not do this without consulting with the staff involved. Where staff score lower than others in their sector, this is definitely a cue for you to investigate whether they would appreciate more training and support, but it is not a performance measure and should never be used as such.

Following up and closing the feedback loop

The information you gather from the Digital discovery tool can be used to start conversations:

  • with HR and staff development about overall staff training and development needs;
  • with teaching staff about their confidence with digital teaching, learning and assessment, and their further development needs;
  • with IT and e-learning teams about support for specific systems and practices;
  • with budget-holders about investing in staff development resources and in online services.

You should report back to your staff users about how you are using this data, and what you are doing to support them more effectively in the future.

Using Discovery tool data to refine the questions and scoring

Thanks to the aggregate data we are getting from our first pilot users, we have been able to compare the median scores for each of the questions asked, and look at some other stats across the different assessments.

We were pleased to see from the first data returns that ‘depth’ and ‘breadth’ questions produce the medians we would expect, with one or two exceptions. We’ve worked on these outlying questions to make it a bit easier (or in one case a bit harder) to score in the middle range. This should bring the medians more into line with each other, making it easier and more valid to look across aggregate scores and compare areas of high and low self-assessment.

Median question scores, all capabilities: ‘all staff’ assessment, snapshot from early March 2018.

There will always be some natural variation in average scores, because we are asking about different areas of practice, some of which will be more quickly adopted or more generally accomplished than others.

We were particularly pleased to find on testing that there is a positive correlation between confidence and responses to other questions in the same area (i.e. expertise and range). We would expect this, but it is good to have it confirmed. However, although there was a meaningful range of responses, almost no users were rating themselves less than averagely confident, so we are looking to adjust the scoring bands to reflect this. We don’t attach a great deal of weight to this question type, precisely because it is known that users tend to over-state their confidence, but it is included to encourage reflection and a sense of personal responsibility.
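To illustrate the kind of check described here, the sketch below correlates self-rated confidence with a combined depth-and-breadth score for the same area, using invented paired data and a plain Pearson correlation (the actual analysis method is not specified in this post):

```python
import statistics

# Paired scores for one capability area, one pair per user (invented data):
# self-rated confidence vs a combined depth-and-breadth ('expertise') score.
confidence = [3, 4, 4, 5, 2, 5, 3, 4]
expertise = [2.5, 3.0, 3.5, 4.0, 2.0, 4.5, 3.0, 3.5]

def pearson(xs, ys):
    # Plain Pearson correlation coefficient
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(f"confidence vs expertise: r = {pearson(confidence, expertise):.2f}")
```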

You will see the impact of this work when we reach the mid-April review point, along with some further changes to the content and platform indicated by our user feedback. More about this below.

Scoring is designed to deliver appropriate feedback

As you see, we’re doing what we can to ensure that the scores individuals assign themselves are meaningful, so they allow relevant feedback to be delivered. The question types available don’t allow us to match selected items with feedback items (e.g. items not chosen in the grid or ‘breadth’ questions with ‘next steps’ suggestions in the personal report). This means relying on aggregate scores for each digital capability area. The pilot process is allowing us to find out how well the scoring process delivers feedback that users feel is right for them, and how the different areas relate to one another (or don’t!). However, the questions and scoring are not designed to provide accurate data to third parties about aptitude or performance. So scoring data, even at an aggregate level, should be treated with a great deal of caution. We are issuing new guidance on interpreting data returns very shortly.


The radial diagram gives a quick overview of individual scores

The aim of the Digital discovery tool is developmental, so it is always clear what progress looks like – which means ‘gaming’ the scores would be simple. Our contextualising information is designed to remove this temptation, by showing that the discovery process is for personal development and not for external scrutiny. Our feedback from staff in particular suggests that if there is any suggestion of external performance monitoring, they won’t engage or – if required to engage – they won’t answer honestly. Which of course will mean there is no useful information for anyone!


The ongoing evaluation process


Showing where to find the evaluation form on the dashboard

As well as examining user data, of course, we have access to the individual evaluation forms that (some) respondents fill out on completion. This is giving us some really useful insights into what works and what doesn’t. However, at the moment we think the sample of respondents is weighted towards people who already know quite a lot about digital capability as a concept and a project. The views of people with a lot invested are really important to us, but we also need feedback from naive users who may have a very different experience. Please encourage as many as possible of your users to complete this step. The evaluation form is available from a link on the user dashboard (see above).

In addition we have taken a variety of expert views, and we are just about to launch a follow-up survey for organisational leads. This will ask you about what you have found beneficial about the project, what has supported you to implement it in your organisation, what you would change, and how you would prefer Jisc to take the Discovery tool project forward. Please look out for the next blog post and launch!

Resources in the Digital discovery tool

The Digital discovery tool provides links to a wide range of resources for each of the digital capability framework areas.

The platform delivers these resources in two ways.

Browse resources on your dashboard

When people log in to the tool they are presented with a tailored welcome page/dashboard offering appropriate assessments based on the selections they make during log-in. The dashboard also includes sets of resources for each of the six broad digital capability areas. You can scroll through these sets and browse the resources that we have mapped to these areas. We offer a brief description of the resource in this view.


Once you see a resource that looks interesting you can click on it to find out more. For each resource we have identified key audiences and level as appropriate and provide a brief description to help you decide how relevant it is to you. When you click on the URL in the resource page you will be taken directly to that resource outside of the discovery tool.

For some resources we offer suggested activities or reflections and a space to record them to save for the future.


Find resources in your assessment report

When you complete an assessment, you receive a personal report which offers results, feedback and suggests some next steps that you could take. You are also offered links to selected resources for each area. These are offered in the same kind of scrolling list with a summary about the resource. When you print your assessment results report the resources are offered as a simple list of links so that you can revisit these at a time convenient to yourself.

Resource selection

Resources included in the discovery tool come from a wide range of publishers. They are checked for accuracy, relevance and quality. They are all free to use although some may require users to register.

These publishers include:

  • national or international bodies (such as Jisc, Nesta, HEFCE, SCONUL, EU bodies)
  • professional bodies (such as CILIP, AoC, UUK)
  • educational institution resources produced for staff or students but which could be of interest to a wide range of users
  • individual academics who have set up websites or blogs
  • educational consultants or specialists who have websites or blogs
  • networks of educators or specialist collaborators (e.g. supporting citizenship, research, innovation)
  • Wikipedia and Wikiversity
  • commercial companies (such as Microsoft, Adobe, Google)

Jisc has been working closely with some publishers, including the Microsoft educator community and the Duke of York Inspiring Digital Enterprise Award (iDEA), to map their resources to the digital capabilities framework and include them within the tool. Jisc is also working with a subscription-based online learning platform to map its resources to the framework.

Jisc is aware that many educational institutions subscribe to resource collections and may want the discovery tool to link out to them. This is something we are thinking about and hope to implement in the future.

Each resource included in the discovery tool is reviewed for relevance to the framework area, content and quality. Many of the resources also reflect the next steps suggestions.

Following feedback from our pilot phases we have attempted to limit the number of resources that are offered to prevent overload. The collection is not meant to be comprehensive – it has been selected to map to the digital capability framework, the questions and the feedback.

While we only have limited space, we are always looking for great new resources so please let us know if you can recommend one. Even if we can’t include it straight away we will review it for future use.

Resource description

We provide information to help you decide how relevant the resource might be for you. Each resource has a description of the aims and content.

We highlight if a resource is aimed at a specific audience, sector or level. Several resources are aimed at a specific audience but could also be of value to people in other sectors or with other roles. For example, a resource aimed at students may be of value to a staff member whose capability is still developing in that area.

All the resources are mapped to the digital capability framework and to the different areas covered in the assessments. For example, the same resource may appear in the section about media literacy, or in the teacher assessment on creating learning resources.

Some of the resources have a very specific focus such as ‘managing your emails’ while others are broader and cover a range of digital literacies.

We have included a wide range of formats – from whole courses or sections of courses to downloadable learning resources. We have links to videos, websites, networks, screencasts, toolkits, reports and guides. We have included links to the Jisc guides as these often offer links to further resources. Some of the resources are in PDF format, which will require a PDF reader such as Adobe Acrobat.

Resource management

Jisc has longstanding experience of managing resource collections and will be updating and maintaining this collection. This means that if you go back to an assessment report you may sometimes find different resources listed. Dead links will result in resources being removed from the collection. If you find any links that do not work, please report them to us.
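A dead-link check of the kind described could be as simple as the following sketch, which uses the third-party requests library and placeholder resource titles and URLs rather than the real collection:

```python
import requests

# Placeholder titles and URLs - in practice these would come from the collection metadata.
resources = {
    "Managing your emails": "https://example.org/managing-emails",
    "Copyright basics for teachers": "https://example.org/copyright-basics",
}

dead = []
for title, url in resources.items():
    try:
        response = requests.head(url, allow_redirects=True, timeout=10)
        if response.status_code >= 400:
            dead.append((title, url, f"HTTP {response.status_code}"))
    except requests.RequestException as exc:
        dead.append((title, url, str(exc)))

# Anything flagged here would be reviewed and, if genuinely dead, removed from the collection.
for title, url, reason in dead:
    print(f"Flag for removal: {title} ({url}) -> {reason}")
```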

Piloting the Digital discovery tool with students

While our current pilot projects have been getting the Discovery tool into the hands of staff, we’ve been working behind the scenes on the student version. We’re pleased to say that this is performing well in user testing, with students particularly keen on the detailed final report. We’ll be promoting this more positively to learners as the prize at the end of their journey. Meanwhile we’re making some final improvements to the content, thanks to all the feedback from users and experts.

All this means that we’re looking for existing pilot institutions that are keen to extend the experience to students. You can express an interest by completing this sign-up form, and you can read more about what’s involved below.

About the student Digital discovery tool


The student version is designed exactly like the staff version, as described in this blog post. So users answer questions of three types and receive a detailed feedback report with suggested next steps and links to resources.

The content is designed to be:

  • Practice based: users start with practical issues, and the language is designed to be accessible and familiar
  • Self-reported: we trust users to report on their own digital practices. We attach very little weight to self-reported confidence, but we do expect learners to report accurately on what they do in specific situations (depth), and on which digital activities they undertake routinely (breadth).
  • Nudges and tips: the questions are designed to get users thinking about new practices and ideas before they even get to their feedback report.
  • Generic: different subject areas present very different opportunities to develop digital skills – and make very different demands. We aim to recognise practices that have been gained on course (after all these make an important contribution to students’ digital capability!) but where possible we reference co-curricular activities that all students could access.

Student users will find only one assessment on their dashboard, unlike many staff who will find a role-specialised assessment alongside the generic assessment ‘for all’. Most of the elements in the student assessment are the same as in the staff generic assessment, mapped to the digital capabilities framework. But the content is adapted to be more relevant to students, and the resources they access are designed to be student-facing, even where they deal with many of the same issues.

The ‘learning’ element of the framework is split across two areas to reflect its importance to students. These are ‘preparing to learn’ with digital tools (mainly issues around managing access, information, time and tasks), and ‘digital learning activities’. There is also an additional element, ‘digital skills for work’, that sits at the same level as ‘digital identity’ and ‘digital wellbeing’ in the framework, reflecting the importance of the future workplace in learners’ overall motivation to develop their digital skills.

The feedback encourages learners to think about which elements they want to develop further, based on their own course specialism and personal interests. Where they score low on issues such as digital identity that we know are critical, we prompt them to seek advice. So use of the discovery tool may lead to higher levels of uptake of other resources and opportunities – and we hope this is seen as a successful outcome!

There is some minor variation between the versions for HE and FE students, but we have done our best to keep these to a minimum. Our research and consultations don’t suggest that sector is an important factor in discriminating the digital skills students have or need. However, we do recognise that students vary a great deal in their familiarity with the digital systems used in colleges and universities. So we’ve designed this assessment to be suitable for students who are some way into their learning career, right up to those preparing for work.

It is not intended for arriving or pre-arrival students. We are considering a special assessment for students at this important transition, but there are some problems with developing this:

  1. These students vary much more in their experience of digital learning, so it is much harder to design content that is not too challenging (and off-putting) for some, while being too basic for others.
  2. We are concerned that organisations might see it as a substitute for preparing students effectively to study in digital settings – this is not a responsibility that can be delivered by a self-reflective tool.
  3. We have learned from students that the most important content of an induction or pre-induction ‘toolkit’ is institution-specific – depending on the specific systems and policies in place.

So at the moment our focus for arriving students is to work with Tracker users to design a digital induction ‘toolbag’. The ‘bag’ is simply a framework that colleges can use to determine for themselves – from their Tracker findings and other data – how they want arriving students to think about digital learning, and what ‘kit’ of skills, devices etc they will need. More of this over on the Tracker blog soon.

What the Digital discovery tool for students is not

As above, the Discovery tool is not an induction toolkit, or any kind of toolkit. It doesn’t deal with local systems and policies, which are critical to students becoming capable learners in your institution. It does prompt learners to think about a whole range of skills, including their general fluency and productivity with digital tools, which will support them to adopt new systems and practices while they are learning.

The Discovery tool offers access to high quality, freely-available resources, in a way that encourages learners to try them. In future you may be able to point students to your own local resources as well. But it isn’t a course of study and there’s no guarantee that learners will follow up the suggestions or resources offered.

The scoring system is designed to ensure students get relevant feedback, and to motivate them to persevere to the report at the end. It has no objective meaning and should not be used for assessment, either formally or informally. We have deliberately designed the questions to be informative, so it’s always clear what ‘more advanced’ practice looks like. Users who want to gain an artificially high score can do so easily, but we don’t find this happening – so long as they see the development report as the prize, rather than the score itself.

About the pilot process

Just like the staff pilot, we’re looking for quality feedback at this stage. If you’d like to be part of the journey, we’d be delighted to have your support. You’ll need to complete this sign-up form before the end of 23rd March – it’s a simple expression of interest – after which we’ll notify participants and send out your log-in codes. Our Guidance has been updated to ensure it is also relevant to the student pilot, and you’ll have dedicated email support. Access will be open to students until the end of May 2018.

Because this is a pilot, we are still improving the content and still learning how best to introduce it to students to have the most positive outcomes. This means changes are likely. It also means we’ll ask you and your students to give feedback on your experiences, as with the staff pilot.

Join us now: complete the sign-up form.


Digital discovery tool launched today

Universities, colleges and independent providers that have signed up to pilot the Digital discovery tool will receive their access codes today. On this page you can learn more about the new Discovery tool, the platform, the different assessments available, and the guidance that will help you put it all into practice.

Where we are today

The open pilot is taking place in 101 organisations (57 HE, 35 FE and 9 ‘other’) between now and the end of May 2018. You can find out more about the pilot organisations and their different approaches in this blog post.

The version launched today:

  • is based on a new platform from
  • offers completely new, user-tested questions + feedback for staff
  • links to a host of new resources, all openly available
  • offers further specialist questions + feedback for staff with a teaching role (in HE or in FE and Skills)

The platform provider is working with Jisc on the development of the new platform for the Digital discovery tool. The team has experience of delivering an accurate personality indicator that helps students understand their strengths and ‘stretch’ areas across twenty-three traits and prepare for employment. Their platform offers a clear visual interface for the Digital discovery assessments and feedback report.

Digital capability resources are available through the dashboard in an attractive, accessible style. This screenshot shows the browse view. Answering the assessment questions creates a personalised report for each user, with recommended resources to follow up.


The new design

The Digital discovery tool is designed according to the following principles:

  • Practice based: users start with practical issues as a way in to digital capability thinking
  • Self-reported: we trust users to report on their own digital practices. The scoring-for-feedback system means it is pointless for users to over-rate themselves.
  • Nudges and tips: the questions are designed to get users thinking about new practices and ideas, before they read a word of their feedback report.

  • Broad relevance: we have tried to avoid referencing specific technologies or applications to make the content relevant across a wide range of roles and organisations. Sometimes we use familiar examples to illustrate what we mean by more general terms.

All users are offered an assessment called ‘digital capabilities for all’, based on the 15 elements (6 broad areas) of the Jisc Digital capability framework. There are very few differences in the questions for staff in different roles or sectors, and students answer many of the same questions too, though the feedback and resources they get are a bit different.

Some users are also offered a specialist assessment, depending on the role they choose when they sign in. At the moment we are offering an additional question set for teaching staff – ‘digital capabilities for teaching’ – as this was the priority group identified in our pre-pilot consultations. We will shortly offer another specialist set for learners, and one for staff who undertake or support research. More may follow, depending on demand. Users can choose to complete only the general or only the specialised assessment, but they must complete all the questions in an assessment before they get the relevant report.

The questions

There are questions of three kinds.

confidence question

Confidence question: rate your confidence with a digital practice or skill, using a sliding scale. The opportunity for self-assessment triggers users to be reflective and helps them to feel in control of the process.

depth question

Depth question: select the one response out of four that best describes your approach to a digital task. This helps users identify their level of expertise and see how more expert practitioners behave in the same situation.

breadth question

Breadth question: select the digital activities you (can) do, from a grid of six. We have tuned these so most users will be able to select at least one, but it will be difficult to select all six.

At the moment we know that some elements are harder to score highly on than others. Once we have a large data set to play with, we will be able to adjust these differences. But it may just be the case that some areas of digital capability are more challenging than others…

The feedback

Once all the questions in an assessment have been completed, users receive a visualisation of their scores, and a feedback report. The report can be downloaded to read and reference in the user’s own time – alone or with a colleague, mentor or appraiser.



The feedback report includes, for each element assessed:

  • Level: this is shown as one of ‘developing’, ‘capable’ or ‘proficient’. Some text explains what this means in each case.
  • Score: this shows clearly how the user’s responses have produced the level grading
  • Next steps: what people at this level could try next if they want to develop further
  • Resources: links to selected resources for exploration

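To show how a score might map onto the three feedback levels listed above, here is a minimal sketch; the thresholds and the 0 to 1 normalisation are purely illustrative and are not the Discovery tool’s actual scoring bands:

```python
def level_for_score(score: float) -> str:
    """Map a normalised area score (0-1) to one of the three feedback levels.

    The level names come from the feedback report; the thresholds are invented
    for illustration and are not the Discovery tool's real scoring bands.
    """
    if score < 0.4:
        return "developing"
    if score < 0.75:
        return "capable"
    return "proficient"

for score in (0.25, 0.6, 0.9):
    print(f"{score:.2f} -> {level_for_score(score)}")
```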

The resources

All the resources available through the Discovery tool – whether they are recommended in the user’s personal report, or browsed from the dashboard – are freely available, quality assured, and tagged to different elements of the digital capabilities framework.

Making it better

This is a pilot, which means we are still learning how the Digital discovery tool might be useful in practice, and making improvements to the content and interface. For example, users may see some changes to the visual design over the coming weeks and months.

  • End-users are asked to fill in a short feedback form once they have completed one or more assessments.
  • A smaller group of ‘pilot plus’ institutions are going through the process with additional interventions and monitoring from the Jisc Building digital capabilities team, to help us learn from them more intensively.
  • All institutional leads are being asked to fill in an evaluation form and to run a focus group with staff to explore the impacts and benefits of the project.

These interventions help us to improve the Discovery tool and the support we provide for digital capabilities more generally.

What next?

In another post we will explore how organisational leads can understand and use the data returned to them. We are also developing, for launch in March 2018:

  • A version for students studying in HE institutions, and for students in FE and Skills
  • A prototype for the Building digital capability website, to bring all our digital capability services and resources together
  • Four institutional case study videos
  • A senior leaders’ briefing paper
  • A study into how HR departments are supporting the development of staff digital capabilities (see Lou McGill’s blog post for more details)

Key resources

Digital capability community continues to grow

The second network event for our digital capability community of practice took place in Birmingham on 30 November 2017 with around 100 participants from over 54 colleges and universities coming together to share practice, exchange ideas and work together. The presentations, resources and Periscope recordings are available from our event page.

The strong interest in this community of practice reflects the centrality of digital capabilities to all aspects of educational practice, and a recognition that digital capabilities are not only vital for the employability and future career prospects of our students but also have the potential to enhance institutional reputations and aid organisational efficiency.

Sharing practice

The contributions made on the day, by our presenters and through collaborative engagement in workshop sessions, were greatly valued.

“I wanted to hear examples of other institutions’ approaches to developing digital capability, and there were plenty of examples.”

Developing a holistic institutional approach

Our keynote speaker, Karen Barton, director of learning and teaching at the University of Hertfordshire, shared the university’s holistic institutional approach to developing digital capabilities. The formation of a digital capability steering group has been a key enabler, engaging senior stakeholders and sponsoring wider participation by their teams.

The Jisc digital capabilities framework proved useful in getting dialogue going and helping others to get to grips with the language and vocabulary used to describe digital capabilities. Karen also talked about the University of Hertfordshire’s model for staff development at different stages of their careers and the work on mapping where digital capabilities fit into their broader CPD framework and learning landscape.

Ongoing work includes establishing a student experience academic research group, with a sub-group focusing on technology enhanced learning, and exploring whether the academic CPD model can be applied to other role profiles.

Watch the Periscope recording of Karen’s session or view the slides on our event page. See also the University of Hertfordshire’s institutional story about their participation in the first-stage pilot of the discovery tool.

Community-led discussions

Participants at the first digital capability community of practice event in May 2017 requested time for community-led discussions and topics identified by those registered for the November 2017 event included:

  • Effective staff development strategies – how to upskill staff with digital capabilities
  • Developing organisational approaches to digital capability and getting buy-in from senior managers
  • Measuring the impact of initiatives, tools and strategies on staff/student capability
  • Student digital capability, embedding digital capabilities into the curriculum and student/staff partnerships

Facilitated by community members, participants were tasked with identifying critical issues and sharing experiences and solutions – the outputs are captured on our padlet. A variety of strategies were used – in one group, participants were challenged to come up with 20 ideas/potential solutions in just five minutes with most achieving the target before being further challenged to identify one thing they could action the following day. This proved a very effective way of moving from general discussion to action-focused solutions in a short period of time.

Strategies for engaging staff

Community members cited staff engagement in digital capabilities as one of their most critical issues and so the opportunity to hear from four community members on their differing approaches was informative and insightful.

Non Scantlebury and Jo Parker both shared innovative techniques they’d used to engage staff in conversations around digital capability. Non asked participants to share their favourite apps and reveal their digital superpowers mapped to the framework; Jo used the ‘love letters and break up letters’ approach which elicited deep and more emotive feedback about the digital discovery tool.

Randeep Sami and Delon Commosioung shared strategies and practical examples of how they are engaging staff in their respective colleges.  Randeep explored the concept of the digital classroom and shared details of their 21st century teaching programme; Delon outlined how working as part of the quality team has helped to position effective use of technology as integral to teaching, learning and assessment.

One of the highlights of the meeting was a series of five Pecha Kucha sessions from community members willing to share their experience, practice and strategies. These short seven-minute presentations shared journeys so far, outlined institutional approaches and transformative ambitions, bringing the day to a well-paced end.

Video recordings and presentations are available to view on our event page.

Looking ahead to 2018

Building digital capability project update

A lot has happened since the first digital capability community of practice event in May 2017:

  • 96+ institutions have signed up to take part in the second phase of our pilot of the digital capability discovery tool, which runs from December 2017 to May 2018. See Helen Beetham’s blog post Digging deeper with the discovery tool for a useful analysis of the motivations and aims of those signed up to pilot the new tool. See also Helen Beetham and Tabetha Newman’s update on the digital discovery tool, including a succinct and entertaining guide to the differences between the discovery tool and the student experience tracker – complete with appropriate hats!
  • New for 2018: senior leaders’ briefing and video case studies – Recognising the strategic importance of digital capabilities, Jisc will be producing a senior leaders’ briefing in March 2018 along with four institutional video case studies. A study of how HR departments are supporting the development of digital capabilities is also underway, with a report and case studies available in April 2018 – see Lou McGill’s blog post for details of how you can take part.
  • Visioning the new building digital capability service – Jisc is also working on the development of a new web-based portal designed to provide organisations and individuals with clear routes through the wealth of information, support options and resources available to support digital capabilities development. Keep up to date by signing up to the digital capability mailing list and the project blog. As the prototype of the digital capability service is being developed, we are looking for volunteers to get involved in some short online user testing activities (30 minutes or less). If you would like to take part, please get in touch with Alicja Shah.
  • A series of training events and webinars on curriculum confidence, digital well-being and identity, and digital leadership is also running over the next few months.

Shaping the next agenda: your take-aways and thoughts for future events

While Jisc has founded this community, the focus is very much on building a sustainable network and on enabling participants to share their collective wealth of experience. Feedback from the event is very positive and naturally reflects the different stages people are at in their own personal and institutional journeys.

Participants valued the opportunities to hear the developmental journeys of others and highlighted other areas they would like to see more focus on at future events.

“It was good to be able to discuss issues and ideas with like-minded people as a small group.”

“It has given me some ideas to try out.”

Suggestions for the next event include creating time and space for:

  • Discussion of the changing landscape of learner expectations and needs, societal views on education, the effects of this disruption, and how digital capabilities feature in it
  • Feedback and case studies from students on their experiences of digital capabilities development
  • Networking with colleagues

What did you take away from the event?

What would you like to see on the programme for the next event?

Use the comments below or share your thoughts via the digital capability mailing list.

Save the date

We are delighted to announce that the next digital capability community of practice event will be hosted by the University of Leicester on 22 May 2018.

Do join us for what promises to be another rich exchange of ideas, approaches, strategies and resources.