Digital discovery: FAQs

This list of Frequently Asked Questions began life as the discussion thread from the Digital discovery pilot start-up webinar on January 10th 2018. It is updated regularly as questions arise, particularly on the Digital discovery pilot jiscmail list. Our main resources for institutional leads are accessed from this guidance page.

Questions for staff users

Q: How do staff sign up to the Digital discovery tool?
A: From the home page (https://jisc.potential.ly/):

  1. Click on the ‘sign-up’ link just below the ‘sign in’ box.
  2. Provide a name and email address.
  3. Create a password (this needs to be at least 8 characters, including one capital letter and one number; a simple check is sketched below).
  4. Enter the code you have been given.
  5. Select your organisation, department and role from the drop-down lists.
  6. Read the data privacy notice and tick to confirm.
  7. Click ‘create account’.

After this, users can sign in using just their email address and password. It should be possible to register in around a minute.
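
If you would like to check in advance whether a password meets this rule (for example in local sign-up guidance or a helper script), the rule can be expressed as a simple check. This is an illustrative sketch only, not code from the Potentially platform:

  import re

  def is_valid_password(password):
      # The rule stated above: at least 8 characters, including
      # at least one capital letter and one number.
      return (len(password) >= 8
              and re.search(r"[A-Z]", password) is not None
              and re.search(r"[0-9]", password) is not None)

  # Example: is_valid_password("Summer2018") returns True.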

Q: Why do staff have to set up an individual account?
A: This is to ensure:

  1. they are provided with assessments appropriate to their sector and role, and
  2. we can provide anonymised data to organisations about staff completion rates and development needs, broken down for example by role or department.

Staff identities are not recorded with their answers, and we never share individual reports.
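
To illustrate what aggregated, anonymised data means in practice, here is a minimal sketch of how a completion count by department might be produced. This is our illustration of the principle, not Potentially’s actual reporting pipeline:

  from collections import Counter

  # Hypothetical records: (department, scoring band) pairs, with no
  # names or email addresses attached.
  responses = [
      ("Library", "Developing"),
      ("Library", "Capable"),
      ("Computing", "Proficient"),
  ]

  # Aggregate completions per department; individual identities are
  # never part of the data.
  completions_by_department = Counter(dept for dept, band in responses)
  print(completions_by_department)  # Counter({'Library': 2, 'Computing': 1})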

Q: What devices and browsers are supported?
A: Potentially is a web-based platform, designed for easy access, sign-up and use. The Digital discovery tool should run on most devices with a browser and internet connection. You should not need to download any software or browser you don’t already have.

The Discovery tool has been tested with both Windows and Apple Mac desktops plus a variety of tablets and mobile devices (Android, iOS) with the following browsers:

  • Google Chrome
  • Firefox
  • Microsoft Edge
  • Safari

For the best experience, we recommend completing the questions and viewing the report and resources on a full-sized screen (desktop, laptop or tablet) at 1024×768 resolution, using the latest version of Chrome or Safari. Google Chrome tends to work best.

Q: Is the Digital discovery tool fully accessible?
A: Potentially aims to meet level AA of the Web Content Accessibility Guidelines (WCAG 2.0) as a minimum and is working hard towards this goal. During the pilot, Potentially will continuously test and develop the beta version of the tool towards the AA standard, and any feedback from pilot users is welcome.

Q: How long should it take for an individual to complete the Digital discovery process?
A: The ‘capabilities for all’ assessment should take no more than 25 minutes and covers all six areas (15 elements) of the digital capability framework. Specialised assessments (‘for teaching’, ‘for learning’ etc.) should take around 15 minutes. On completion, users receive a report which they can download and review at their leisure.

Q: Does the process have to be completed in one session?
A: Users can leave a question and return to it, or change their answer, using the ‘next’ and ‘previous’ buttons. They can also save and exit the questions at any point and return for another session. Users will be asked to complete any questions they have missed before submitting their answers, which triggers the release of their report. If users choose to take the assessment again, their previous report will be overwritten, so they are encouraged to download and save a version at the time.

Q: What should users do once they have completed their session?
A: Users should click to log out, as the system doesn’t currently log users out automatically. This is important for users of a shared machine to ensure their responses can’t be viewed by others.

Q: Will there be an option for users to indicate if a question isn’t relevant to them?
A: We have tried to avoid this situation by splitting questions into capabilities ‘for all’ and capabilities for specialised areas of practice. At the moment ‘for teaching’ is available, with slightly different versions for HE and for FE and Skills. ‘For all’ questions have slightly different versions for staff and students. We have user tested for perceived relevance. Users will only see the questions relevant to them, as determined by the information they give at sign-up. For example, staff who identify as having an Academic/Training role will see the generic questions for staff, and the specialist questions for staff with a teaching role.

Any assessment that aims to be relevant across a wide range of roles and organisations will include items that are unfamiliar to some users. This is partly the point of the discovery tool – to raise awareness of opportunities beyond the user’s current knowledge and practice. So long as it is understood that the questions and the scoring are there to give feedback and not to judge performance, we hope staff will feel comfortable identifying that some activities are not in their current skillset.

Q: What is the difference between the personal feedback report and the reports available to institutional leads?
A: The personal feedback report exists only on screen and (if selected) as a downloaded pdf on the user’s own device or computer. Responses and reports are confidential to the end-user unless they choose to share them. Organisational leads receive aggregated, anonymised data about issues such as how many staff have completed the Discovery tool and their average scores.

Our research and testing has shown that users need to feel in control of the information they give about their digital capability, otherwise they may not answer questions accurately. The process depends on self-assessment and self-observation to allow users to receive relevant feedback. If staff do not participate confidently, any aggregated data provided to institutional leads will be inadequate or inaccurate or both.

Questions for the institutional lead

Q: Why is there so much guidance – can’t we just point users to the URL?
A: We hope the Discovery tool is useful in itself. But we expect it to be much more useful if it is introduced in the context of a whole-organisation approach to digital capability. This is why we ask you to think ahead about issues such as: how you will engage staff (and, later, students); how you will support staff to use the tool and respond to their reports; how you will evaluate the process and investigate any impact. We also ask you to consider what other resources from Jisc could help your organisation in its digital journey.

Q: Can I see and give feedback on the content of questions and reports?
A: The content of questions is available from the main guidance page. We don’t recommend you use the questions in this format with end-users, but please use them to better understand the process. We don’t currently offer the content of reports as this is generated uniquely for individual users.

Please send us any feedback on the questions and report content, and on your own experience as a user, using this form, which has been designed to capture the comments of experts like you. Your users should be directed to use only the feedback form they can access from their dashboard.

Q: Is there any cost to taking part in the pilot?
A: For the duration of the pilot, the Discovery tool and associated guidance are free of charge. We do ask you to undertake some evaluation of your experience as an organisation, and encourage your users to complete the evaluation form that is bundled with the Discovery tool. We are currently exploring the options post-pilot (from July 2018).

Q: What are our responsibilities to Jisc?
A: As a pilot organisation, your responsibility is to support the staff taking part – bearing in mind that this is a pilot run of a beta product. You also undertake to provide us with some feedback to help us evaluate the overall experience and any impact at your organisation, which involves encouraging all staff to complete the feedback form they can access from their tool dashboards. There is a fuller account of Jisc’s responsibilities and yours on the page of essential information for organisational leads.

Q: Does the tool allow benchmarking against the HE sector at institutional level?
A: We expect to share some benchmarking data, e.g. the total number of completions and aggregated staff scores across the pilot. It won’t be possible to identify any institution’s data separately. We will explore whether detailed benchmarking data is useful in practice as part of our evaluation of the pilot.

Q: How do the staff and student Discovery tools differ from the student (and now staff) Tracker surveys?
A: The Digital experience tracker is a survey. We ask students (and soon we will be asking staff) to assess aspects of their digital experience. The survey provides objective, benchmarked evidence about organisational performance. It does not provide any information about individual performance. However, the customisable pages in the Tracker surveys provide an opportunity to ask questions that are key to digital capability, for example whether staff/students have taken up key training, are confident about using core organisational systems, or agree that they have specific digital competences.

There is no intrinsic benefit to staff and students in completing the Tracker survey, but the organisation benefits from their feedback and can pass the benefits back by responding and making targeted improvements.

The Digital discovery tool is a self-directed tool with the focus on raising awareness and empowering users to manage their own digital development. The score is arrived at through self-assessment and self-reporting, and is used to provide suitable feedback and further resources. Scores are not objective, though it may be meaningful to compare scores on the same questions, e.g. across departments or organisations.

The Discovery tool primarily benefits individuals with understanding and resources. The Tracker primarily benefits organisations with data about how users experience the digital environment and curriculum. For more information on the Digital experience tracker see the project blog https://digitalstudent.jiscinvolve.org/wp/.

Q: Why don’t you recommend the Discovery tool for staff induction?
A: Staff induction is a critical process that establishes the relationship between a new member of staff and the organisation. Important among the issues that will be covered are the digital systems in use by the organisation, the specific skills required for the job role, and how staff will be supported and developed in their role. These are organisationally very specific, and a generic tool can’t deal with them in sufficient detail. We don’t feel it is appropriate for an online experience to stand in for a personal discussion about organisational expectations and/or personal needs. The role-specific questions also take it for granted that the user understands the norms of their role and their professional practice. For all these reasons we ask that the Discovery tool be used with staff who are already established in their role. We are considering a version that could be used – perhaps with local adaptations – as part of an induction process.

Q: Does the Discovery tool support LTI or Shibboleth so we can use single institutional sign-on?
A: The tool doesn’t currently have this functionality, but we are exploring single sign-on for the future, in relation to institutional sign-on for Jisc services overall.

Questions about the student version

Q: When will the student version of the Digital discovery tool be available?
A: The student version is currently in pilot and we expect to make it generally available in September as part of the new service. A new mailing list has been set up for communication and discussion among the pilots, and any further updates will be shared via this list. The Guidance for institutional leads available from the blog has been updated to incorporate guidance on engaging both staff and students.

Q: Why don’t you recommend the Digital discovery tool for new and arriving students?
A: Student induction is a critical process that establishes expectations on both sides. The digital systems in use for learning, teaching and assessment, and the sources of support available, are crucial elements of the induction process, but they tend to be very specific to the organisation. A generic tool can’t possibly deal with these issues in enough detail to meet incoming students’ needs. Although self-assessment is useful at transition points, we are concerned that the Digital discovery tool might be used instead of other, more appropriate introductions to digital learning. The questions and feedback have therefore been designed for students who are at least somewhat established in their course and in their learning practices. We are considering a future version for incoming students that could be used alongside other resources.

Questions about the resources

Q: Are the resources that are recommended free?
A: Yes.

Q: Can we add our own resources?
A: We are looking at the possibility of providing this option in the future. The level of interest is something we will be assessing as part of the evaluation process. The use of metadata means that new resources can in principle be added, providing they are tagged to existing elements in the question sets (e.g. ‘data literacy’).
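
As an illustration of how such tagging could work (the field names below are hypothetical, as the Potentially metadata schema has not been published), a tagged resource record might look something like this:

  # Hypothetical resource record, for illustration only.
  resource = {
      "title": "Getting started with data literacy",     # example title
      "url": "https://example.org/data-literacy-intro",  # placeholder URL
      "elements": ["data literacy"],  # must match elements in the question sets
      "audience": "staff",
  }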

Q: Can users browse the resources without taking the assessments?
A: Yes, this is possible from the dashboard. The ‘browse’ view presents resources in the six broad areas of digital capability, not in the smaller elements which are the focus of the feedback reports.

Q: Is the resource metadata harvestable and shareable with other library search solutions?
A: This is an interesting question that we are exploring further.

Questions about data collected on sign-up and organisational data returns

Q: When will we be able to see our organisational data returns?
A: You should be able to see your data returns w/c 9th April, using the ‘data’ icon in the left-hand column of your tool dashboard. Only institutional leads will have access to the data report linked from that icon.

Q: What will the report provide me with?
A: The report will provide you with:

  • Number of staff completions by assessment (question set) and by department
  • Overall digital capability scoring bands (i.e. Developing, Capable and Proficient) for your organisation across the digital capability areas
  • Digital capability scoring bands (i.e. Developing, Capable and Proficient) by department and assessment
  • Overall digital capability scoring bands in your sector across the digital capability areas

Guidance on making sense of this data will shortly be available, linked from the Guidance for institutional leads.

Q: If we have staff from different areas completing the tool, is there any way of reporting on the different groups?
A: Alongside the overall organisational scoring band results, we will also provide anonymised visualisations at department level. This data is based on the department headings that users selected during sign-up (see the section on ‘Choice of departments’ in the sign-up guide for staff for a full list).

Q: Does the service comply with GDPR?
A: Yes. The service is GDPR compliant because we only receive anonymised data, as stated in the Potentially privacy notice (see the statement in the Guidance for individual users, which outlines the approach for individual use and will be of interest to institutions too; it is also available from the tool dashboard). If data is properly anonymised, so that individuals cannot be identified, it is GDPR compliant since it doesn’t put personal data at risk. We are also undertaking an additional audit to ensure that all Jisc services remain compliant going forward, as more guidance has been released since the regulation’s introduction. You may also be interested in Jisc’s GDPR guidance for institutions.

Q: Can we ensure that our output is ours and is not to be used for other purposes?
A: Jisc will only see an anonymised view of the data (anonymised at the individual level, although we can identify institutions for the purpose of sharing your data back with you). We may use your data to inform the development of the tool (for example, analysing responses to question sets to see how they are working), or to present an overall picture publicly, e.g. of overall use and outcomes, but nothing specific to any individual or organisation. We will also share your data back with all pilot institutions as part of an anonymised and aggregated sector-level view for benchmarking purposes, as requested by many institutions.
