Digital discovery: FAQs

This list of Frequently Asked Questions began life as the discussion thread from the Digital discovery pilot start-up webinar on 10 January 2018. It is updated regularly as questions arise, particularly on the Digital discovery pilot JISCMail list. Our main resources for institutional leads can be accessed from this guidance page.

Questions for staff users

Q: How do staff sign up to the Digital discovery tool?
A: From the home page (https://jisc.potential.ly/):

  1. Click on the ‘sign-up’ link just below the ‘sign in’ box.
  2. Provide a name and email address.
  3. Create a password (this needs to be at least 8 characters, including one capital letter and one number; see the sketch below).
  4. Enter the code you have been given.
  5. Select your organisation, department and role from the drop-down lists.
  6. Read the data privacy notice and tick to confirm.
  7. Click ‘create account’.

After this, users can sign in using just their email address and password. It should be possible to register in around a minute.
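
The password rule in step 3 can be expressed as a simple check. The following is a minimal sketch for illustration only; Potentially’s actual validation may differ:

  import re

  def is_valid_password(password: str) -> bool:
      # The stated rule: at least 8 characters, including at least one
      # capital letter and one number. (Illustrative only; the platform's
      # actual validation may differ.)
      return (
          len(password) >= 8
          and re.search(r"[A-Z]", password) is not None
          and re.search(r"[0-9]", password) is not None
      )

  print(is_valid_password("password1"))  # False: no capital letter
  print(is_valid_password("Password1"))  # True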

Q: Why do staff have to set up an individual account?
A: This is to ensure:

  1. they are provided with assessments appropriate to their sector and role, and
  2. we can provide anonymised data to organisations about staff completion rates and development needs, broken down for example by role or department.

Staff identities are not recorded with their answers, and individual reports are never shared by us.
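
To illustrate the kind of aggregation involved, here is a minimal sketch with made-up field names and scores (not Jisc’s actual reporting code) showing how completion data could be summarised without any identities attached:

  from collections import defaultdict

  # Hypothetical response records: no names or email addresses are stored,
  # only the organisational groupings chosen at sign-up and the scores.
  responses = [
      {"department": "Health", "role": "Academic/Training", "score": 62},
      {"department": "Health", "role": "Professional services", "score": 48},
      {"department": "Business", "role": "Academic/Training", "score": 71},
  ]

  def aggregate_by(records, key):
      # Group records by the given key and report completion counts and
      # average scores per group.
      groups = defaultdict(list)
      for record in records:
          groups[record[key]].append(record["score"])
      return {
          group: {"completions": len(scores),
                  "average_score": round(sum(scores) / len(scores), 1)}
          for group, scores in groups.items()
      }

  print(aggregate_by(responses, "department"))
  print(aggregate_by(responses, "role"))

A real organisational return would also likely need safeguards such as suppressing very small groups, so that individuals cannot be inferred from averages.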

Q: What devices and browsers are supported?
A: Potentially is a web-based platform, designed for easy access, sign-up and use. The Digital discovery tool should run on most devices with a browser and internet connection. You should not need to download any software or browser you don’t already have.

The Discovery tool has been tested on Windows and Apple Mac desktops and on a variety of tablets and mobile devices (Android and iOS) with the following browsers:

  • Google Chrome
  • Firefox
  • Microsoft Edge
  • Safari

For the best experience, we recommend completing the questions and viewing the report and resources on a full-sized screen (desktop, laptop or tablet) at 1024×768 resolution, using the latest version of Chrome or Safari; Google Chrome tends to work best.

Q: Is the Digital discovery tool fully accessible?
A: Potentially aims to meet level AA of the Web Content Accessibility Guidelines (WCAG 2.0) as a minimum and is working hard towards this goal. During the pilot process, Potentially will be continuously testing and developing the beta version of the tool towards meeting the AA standards, and any feedback from pilot users is welcome.

Q: How long should it take for an individual to complete the Digital discovery process?
A: The ‘capabilities for all’ assessment should take no more than 25 minutes and covers all 6 areas (15 elements) of the digital capability framework. Specialised assessments (‘for teaching’, ‘for learning’ etc.) should take around 15 minutes. On completion, users receive a report which they can download and review at their leisure.

Q: Does the process have to be completed in one session?
A: Users can leave a question and return to it, or change their answer, using the ‘next’ and ‘previous’ buttons. They can also save and exit the questions at any point and return for another session. Users will be asked to complete any questions they have missed before submitting their answers, which triggers the release of their report. If users choose to take the assessment again, their previous report will be overwritten, so they are encouraged to download and save a version at the time.

Q: What should users do once they have completed their session?
A: Users should click to log out, as the system doesn’t currently log users out automatically. This is important for users of a shared machine to ensure their responses can’t be viewed by others.

Q: Will there be an option for users to indicate if a question isn’t relevant to them?
A: We have tried to avoid this situation by splitting questions into capabilities ‘for all’ and capabilities for specialised areas of practice. At the moment ‘for teaching’ is available, with slightly different versions for HE and for FE and Skills. ‘For all’ questions have slightly different versions for staff and students. We have user-tested the questions for perceived relevance. Users will only see the questions relevant to them, as determined by the information they give at sign-up. For example, staff who identify as having an Academic/Training role will see the generic questions for staff, and the specialist questions for staff with a teaching role.

Any assessment that aims to be relevant across a wide range of roles and organisations will include items that are unfamiliar to some users. This is partly the point of the discovery tool – to raise awareness of opportunities beyond the user’s current knowledge and practice. So long as it is understood that the questions and the scoring are there to give feedback and not to judge performance, we hope staff will feel comfortable identifying that some activities are not in their current skillset.

Q: What is the difference between the personal feedback report and the reports available to institutional leads?
A: The personal feedback report exists only on screen and (if selected) as a downloaded PDF on the user’s own device or computer. Responses and reports are confidential to the end-user unless they choose to share them. Organisational leads receive aggregated, anonymised data about issues such as how many staff have completed the Discovery tool and their average scores.

Our research and testing have shown that users need to feel in control of the information they give about their digital capability, otherwise they may not answer questions accurately. The process depends on self-assessment and self-observation to allow users to receive relevant feedback. If staff do not participate confidently, any aggregated data provided to institutional leads will be inadequate, inaccurate, or both.

Questions for the institutional lead

Q: Why is there so much guidance – can’t we just point users to the URL?
A: We hope the Discovery tool is useful in itself, but we expect it to be much more useful if it is introduced in the context of a whole-organisation approach to digital capability. This is why we ask you to think ahead about issues such as: how you will engage staff (and later, students); how you will support staff to use the tool and respond to their report; and how you will evaluate the process and investigate any impact. We also ask you to consider what other resources from Jisc could help your organisation in its digital journey.

Q: Can I see and give feedback on the content of questions and reports?
A: The content of questions is available from the main guidance page. We don’t recommend you use the questions in this format with end-users, but please use them to better understand the process. We don’t currently offer the content of reports as this is generated uniquely for individual users.

Please send us any feedback on the questions and report content, and on your own experience as a user, using this form. This has been designed to capture the comments of experts like yourself. Your users should be directed to use only the feedback form they can access from their dashboard.

Q: What is the difference between the personal feedback report and the reports available to institutional leads?
A: The personal feedback report exists only on screen and (if selected) as a downloaded PDF on the user’s own device or computer. Responses and reports are confidential to the end-user unless they choose to share them. Organisational leads receive aggregated, anonymised data about issues such as how many staff have completed the Discovery tool and their average scores.

Our research and testing have shown that users need to feel in control of the information they give about their digital capability, otherwise they may not answer questions accurately. The process depends on self-assessment and self-observation to allow users to receive relevant feedback. If staff don’t participate confidently, their responses won’t be accurate, and any aggregated data provided to institutional leads will be inaccurate too.

Q: Is there any cost to taking part in the pilot?
A: For the duration of the pilot, the Discovery tool and associated guidance are free of charge. We do ask you to undertake some evaluation of your experience as an organisation, and encourage your users to complete the evaluation form that is bundled with the Discovery tool. We are currently exploring the options post-pilot (from July 2018).

Q: What are our responsibilities to Jisc?
A: As this is a pilot run of a beta product, your responsibilities are to support the staff taking part. You also undertake to provide us with some feedback to help us evaluate the overall experience and any impact at your organisation. There is a fuller account of Jisc’s responsibilities and yours on the page of essential information for organisational leads.

Q: Does the tool allow benchmarking against the HE sector at institutional level?
A: We expect to share some benchmarking data, e.g. the total number of completions and aggregated staff scores across the pilot. It won’t be possible to identify any institution’s data separately. We will explore whether detailed benchmarking data is useful in practice as part of our evaluation of the pilot.

Q: How do the staff and student Discovery tools differ from the student (and now staff) Tracker surveys?
A: The Digital experience tracker is a survey. We ask students (and soon we will be asking staff) to assess aspects of their digital experience. The survey provides objective, benchmarked evidence about organisational performance; it does not provide any information about individual performance. However, the customisable pages in the Tracker surveys provide an opportunity to ask questions that are key to digital capability, for example whether staff/students have taken up key training, are confident about using core organisational systems, or agree that they have specific digital competences.

There is no intrinsic benefit to staff and students in completing the Tracker survey, but the organisation benefits from their feedback and can pass the benefits back by responding and making targeted improvements.

The Digital discovery tool is a self-directed tool with the focus on raising awareness and empowering users to manage their own digital development. The score is arrived at through self-assessment and self-reporting, and is used to provide suitable feedback and further resources. Scores are not objective, though it may be meaningful to compare scores on the same questions, e.g. across departments or organisations.

The Discovery tool primarily benefits individuals with understanding and resources. The Tracker primarily benefits organisations with data about how users experience the digital environment and curriculum. For more information on the Digital experience tracker see the project blog https://digitalstudent.jiscinvolve.org/wp/.

[Figure: Tracker vs Discovery]

Q: Does the Discovery tool support LTI or Shibboleth so we can use single institutional sign-on?
A: The tool doesn’t currently have this functionality, but we are exploring single sign-on for the future, in relation to institutional sign-on for Jisc services overall.

Questions about the student version

Q: When is the student version of the Digital discovery tool available?
A: We’re looking at small-scale testing of the learner questions in late January/February, and a further release in March. It’s likely we’ll take a phased approach to release, so if you are interested in being part of early testing please let us know at digitalcapability@jisc.ac.uk.

Questions about the resources

Q: Are the resources that are recommended free?
A: Yes.

Q: Can we add our own resources?
A: We are looking at the possibility of providing this option in the future; the level of interest is something we will assess as part of the evaluation process. Because resources are tagged with metadata, new resources can in principle be added, provided they are tagged to existing elements in the question sets (e.g. ‘data literacy’).
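
As a sketch of how such tagging could work in principle (the field and element names here are illustrative; the platform’s actual metadata scheme is not documented in this FAQ):

  # Hypothetical resource records tagged against elements of the
  # digital capability framework.
  resources = [
      {"title": "Working with spreadsheets",
       "elements": ["data literacy"]},
      {"title": "Managing your online profile",
       "elements": ["digital identity", "digital wellbeing"]},
  ]

  def resources_for(element, catalogue):
      # Return every resource tagged with the given framework element, so a
      # newly added, correctly tagged resource is picked up automatically.
      return [r for r in catalogue if element in r["elements"]]

  print(resources_for("data literacy", resources))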

Q: Can users browse the resources without taking the assessments?
A: Yes, this is possible from the dashboard. The ‘browse’ view presents resources in the six broad areas of digital capability, not in the smaller elements which are the focus of the feedback reports.

Q: Is the resource metadata harvestable and shareable with other library search solutions?
A: This is an interesting question that we are exploring further.

Questions about data collected on sign-up and organisational data returns

There will be more information about this shortly.
