Digital Maturity Assessment 2023 – Section Eleven
Annex: Explanatory Notes
Methodology
The Scottish Government/COSLA Digital Maturity Assessment was conducted amongst 41 organisations within the Scottish health and social care sector between April and June 2023.
Survey content was developed via a series of consultations with 31 subject matter experts.
The survey consisted of over 400 questions grouped into 20 topics, each of which was assigned to one of three themes. It was presented as an online survey to be completed either by individual organisations or, where possible, as a joint effort across whole local health and social care systems.
The survey was hosted on a bespoke online platform that offered participants a number of features to help them complete the assessment efficiently. These included:
- the ability to answer questions and sections in any order;
- the ability to assign whole sections to different colleagues;
- the ability to poll any number of colleagues on any number of questions to help determine the best response;
- the ability to issue a shorter, generalised version of the assessment as an anonymous survey to general staff, to help respondents identify the most appropriate assessment response;
- the ability to conduct remote and in-person conferences to work through any part of the assessment as a group;
- the ability to see further information for each question, including a definition of ‘what good looks like’;
- the ability to include notes with their submission;
- the ability to deposit supporting evidence with their submission; and
- the ability to contact support in real time for assistance with technical matters and questions about the survey content.
Further support was provided through two weekly drop-in sessions held via MS Teams throughout the data collection period.
Overall, more than 1,500 participants from 41 organisations collaborated on over 30,000 distinct occasions to complete the assessment. Additionally, more than 5,900 staff from over 30 organisations completed the staff survey.
Following the data collection phase, a series of validation sessions was held to query outlying survey results and to gain qualitative context for organisations. As a result of these sessions, changes to some organisations’ submissions were recommended. For this reason, the data reported in this document may change to a small extent after publication.
Assessment structure
The assessment is divided into three themes, each of which includes a number of sections. For some sections, responses were sought separately by service type.
Scoring and weighting
For the most part, the questions in the digital maturity assessment are qualitative in nature and use a Likert-style answer option scale. To enable some of our analysis, we have assigned scores to each answer option:
- Disagree completely (Score: 0)
- Somewhat disagree (Score: 25)
- Neither agree nor disagree (Score: 50)
- Somewhat agree (Score: 75)
- Agree completely (Score: 100)
- Don’t know (Score: 0)
- Not applicable (Not scored)
Additionally, some of the capabilities sections include quantitative questions that assess how widely participants’ digital practices have been adopted. These questions use a percentage scale with the following score assignments:
- 0% (Score: 0)
- 1% to 20% (Score: 20)
- 21% to 40% (Score: 40)
- 41% to 60% (Score: 60)
- 61% to 80% (Score: 80)
- 81% to 100% (Score: 100)
- Don’t know (Score: 0)
- Not applicable (Not scored)
One of the quantitative questions in the Records, Assessments & Plans section, which concerns the format of digital records held, uses the following answer options and associated scores (all three scoring scales are illustrated in the short sketch after this list):
- Unstructured (Score: 0)
- Semi-structured (Score: 50)
- Structured (Score: 100)
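As an illustration, the score assignments set out above amount to simple lookups from answer option to score, with ‘Not applicable’ responses carrying no score. The sketch below is a minimal Python illustration using hypothetical names; it assumes that unscored responses are simply excluded from averages, and it is not the code used to produce this report.

    # Hypothetical illustration of the score assignments described above.
    LIKERT_SCORES = {
        "Disagree completely": 0,
        "Somewhat disagree": 25,
        "Neither agree nor disagree": 50,
        "Somewhat agree": 75,
        "Agree completely": 100,
        "Don't know": 0,
        "Not applicable": None,   # not scored
    }

    PERCENTAGE_SCORES = {
        "0%": 0,
        "1% to 20%": 20,
        "21% to 40%": 40,
        "41% to 60%": 60,
        "61% to 80%": 80,
        "81% to 100%": 100,
        "Don't know": 0,
        "Not applicable": None,   # not scored
    }

    RECORD_FORMAT_SCORES = {
        "Unstructured": 0,
        "Semi-structured": 50,
        "Structured": 100,
    }

    def average_score(responses, scale):
        """Mean of the scored responses, ignoring unscored ('Not applicable') answers."""
        scored = [scale[answer] for answer in responses if scale[answer] is not None]
        return sum(scored) / len(scored) if scored else None

For example, average_score(["Somewhat agree", "Not applicable", "Agree completely"], LIKERT_SCORES) returns 87.5.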
No weighting by theme, section, service or question has been applied. While themes, sections, services and questions clearly do not all carry the same weight, our subject matter experts concluded that their relative importance varies greatly between individual organisations, and that a generalised weighting would therefore do more to distort reporting than to enhance it.
Aggregations in this report follow the assessment’s hierarchy: questions are aggregated into services (where available), questions or services into sections, and sections into themes. Disregarding this hierarchy (e.g. by aggregating questions directly into themes) may produce different results.
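To illustrate the point about hierarchy, the sketch below compares a hierarchical aggregation with a flat one over the same question scores; the two differ whenever sections contain different numbers of questions. The data and names are invented for illustration only.

    # Hypothetical theme containing two sections with question-level scores (0-100).
    theme = {
        "Section A": [100, 100, 100, 100],   # four questions, section mean 100
        "Section B": [0, 50],                # two questions, section mean 25
    }

    def mean(values):
        return sum(values) / len(values)

    # Following the assessment's hierarchy: questions -> sections -> theme.
    hierarchical = mean([mean(scores) for scores in theme.values()])        # (100 + 25) / 2 = 62.5

    # Disregarding the hierarchy: all questions pooled directly into the theme.
    flat = mean([score for scores in theme.values() for score in scores])   # 450 / 6 = 75.0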
Score homogeneity
Throughout this report we rely on averages calculated for different parts of our data: sometimes this includes all data collected; at other times we use only data from a relevant subset (for example, “mental health services”).
Where necessary, we have provided additional analysis to demonstrate the consistency or homogeneity of the data being used. This is because health and social care in Scotland is often fragmented, and the degree to which that fragmentation affects digital maturity can be highly relevant.
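As a sketch of what such additional analysis can involve, the example below reports a spread measure alongside a subset average. The data and the choice of measure (population standard deviation) are illustrative assumptions, not a description of the analysis performed for this report.

    from statistics import mean, pstdev

    # Hypothetical section scores for one subset of the data,
    # e.g. organisations providing mental health services.
    subset_scores = [62.5, 70.0, 31.0, 88.0, 55.5]

    subset_average = mean(subset_scores)   # the headline figure reported
    subset_spread = pstdev(subset_scores)  # lower values indicate more homogeneous scores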
No information pertaining to any single participating organisation has been published here, and no comparisons between individual organisations have been included.
Updates and future assessments
Following the Scottish Government/COSLA Digital Maturity Assessment 2023 reported in this document, participating organisations will update their assessments at times of their own choosing, to better accommodate their individual pace of change.
Updates to this report will be made available annually based on data collected.
Contacts
If you have any questions about this report or the Scottish Government/COSLA Digital Maturity Assessment, please contact sg@dma.works.