DfE Digital, Data and Technology
News and updates from the Department for Education Digital, Data and Technology team

Working together on user research
_Image of palms facing upwards in a circle_

As a Fast Streamer, I spent six months at the Department for Education (DfE) working as a user researcher on our internal digital platform for reporting IT issues. The research, commissioned by leadership, aimed to understand how users experience the platform, covering everything from email problems to security incidents. The team was mainly made up of business analysts and developers. Because the last research on the tool had taken place a few years ago, user research was unfamiliar to them.

## **Deciding how to do my research**

I had a choice: do the research alone and share my findings at the end, or bring my team along with me for the journey. I chose collaboration. I knew that working together would make the findings more meaningful and demonstrate the value of user research. But I also knew it would be more challenging, as time was limited and the team had competing priorities. I needed to make it easy for them to get involved.

## **What I did**

First, I spoke to my leadership team about my intentions and got their support. I introduced user research to the wider team through a short talk, explaining what it is and what a user researcher typically does. We held a workshop to define what we wanted to learn. This helped shape research questions that reflected the team's priorities, not just mine.

Next, I ran a session on observing research, including interactive activities on unconscious bias. I wrote discussion guides based on our research questions and recruited participants. I shared a sign-up sheet so team members could join sessions when they were available. Nine out of ten colleagues attended a session.

I then invited the whole team to an analysis workshop, where we reviewed the findings together and linked them back to our original questions. I synthesised the common themes and shared the final insights with the team.

## **How it went**

I'm proud of what I achieved, but like anything in life, it wasn't perfect.

Some things that went well:

* making sure everyone in workshops had a voice, regardless of grade. I made it clear that we would go around the room in many sessions to avoid hearing from the same people in every workshop or meeting
* getting a senior stakeholder to attend, despite a busy diary, so they understood the research process
* getting everyone excited; many people gave feedback that they found the workshops and research sessions interesting. I think making the first workshop as engaging as possible really helped
* exposing everyone to my findings before the research was even finished, so the team was not taken by surprise. They had already seen the evidence for themselves and did not need to be convinced

Some things that could have gone better:

* getting the timing right was hard. I never wanted to take up too much of my team's time, as I knew they were busy with their own workstreams. This meant that sometimes I scheduled a workshop for one hour, even though I really wanted more time with everyone
* I didn't engage much with people beyond my immediate team, because of the heavy focus on my closest colleagues, who worked on the tool. While the team already essentially knew what my final report would say, some stakeholders outside the team hadn't joined the journey due to data protection. Bringing them on board at the end needed to be done with care

Throughout, I tried to keep things light and engaging. A bit of humour helped us stay connected and made the process more enjoyable.
## **The result**

_Screenshot from social media platform X_

In the end, technical staff gained a deeper understanding of the user's perspective, something that felt refreshingly different from their usual work. Most importantly, we identified valuable opportunities to improve the platform, making access to IT support quicker and easier for users. This helps them stay focused on delivering the government's priorities.
October 2, 2025 at 2:53 AM
Creating 3 principles of informed consent for user research
Informed consent is the process by which an individual chooses whether to take part in research, after a user researcher gives them information about what the research involves. Gaining informed consent means not only communicating what participation involves, but also making sure the form of communication and any paperwork are accessible to participants. It can be a daunting task for user researchers, depending on their level of experience and the complexity of the project.

To help with this, the Department for Education (DfE) user research informed consent working group has created 3 principles:

1. We want research to be fair for everyone. We are aware of power imbalances and try to reduce them in our research.
2. We make sure people can give informed consent in ways that work for them.
3. We check that people understand what they are consenting to and are happy to take part.

In this blog post, we cover:

* what the DfE informed consent working group is
* how we developed the informed consent guiding principles
* how user researchers and anyone else involved in research can use the principles
* the toolkit we're developing to support the principles, and how you can get involved

If you want to know more about using the 3 principles, you can view them in this slide-pack that you can share with colleagues: Informed consent guiding principles slide-deck

## **The informed consent working group**

In 2024, a group of DfE user researchers formed a working group to find better ways to approach informed consent. We were inspired by a talk given to our UR community by Katherine Smales from University College London, about an innovative approach she used for research with young children.

In our working group, we identified that:

* informed consent is not a one-size-fits-all process
* existing templates for gaining informed consent may not fully address the diverse needs of participants and the requirements of different research methods
* gaining informed consent requires researchers to adapt their approaches
* varying levels of expertise amongst researchers make adapting approaches harder

To reflect this, we created the following problem statement:

_How can we ensure that DfE URs and relevant colleagues understand how to always inform participants and gather their consent in appropriate ways?_

_And what kinds of guidance, support or tools are needed to help with this?_

## **How we developed the principles**

Inclusivity and accessibility were central to our approach. We began with desk research focused on the needs of participants, including those with access needs and young children. A key insight from our research was that there is no one-size-fits-all approach, especially since participants may have many overlapping needs.

We developed flexible guiding principles which:

* include practical advice and real-life case studies
* help URs adapt their approach before, during and after research takes place

We then created a prototype slide-pack based on our findings and conducted 2 rounds of user testing with URs:

* from 6 government departments and 1 local authority
* ranging from junior to senior researchers who worked on diverse projects, including research with vulnerable groups

We also sought feedback from DfE's user research community Ethics Forum and our Inclusive User Research working group. The principles went through 2 major iterations before the version we're sharing today.
## **Using the principles**

You can view the slide-pack and share it with colleagues. The guiding principles are a supportive tool rather than a rigid checklist. We hope they will help URs and anyone else involved in research to navigate diverse contexts. They can use this pack when planning how they will gather informed consent, or to develop or review their knowledge of informed consent. The pack includes the principles, their definitions, explanations of why they matter, real case studies, and tips for planning and delivering informed consent.

## **Next steps**

The principles are a strong foundation, but our research revealed URs need practical tools to apply them. For example, they might need more support to identify situations where different forms of informed consent are needed, or to create tools like picture-based information sheets. Over the coming months, we'll be developing a toolkit to help user researchers apply the principles.

> **Find out more or get involved**
>
> We'd love to hear from you! Whether you have feedback, ideas, or want to collaborate on the toolkit, please get in touch:
>
> 📩 **[email protected]**

**Thank you**

We thank all our past and present members: Imran Akhtar, Elena Bracey, Heather Bramwell, Katie Carnie, Solène Heinzl (project co-ordinator and Informed Consent working group lead), Rosalie Lord, Latifa Mahdi, Nataliya Mykhalchenko, Lucy Sutton and Denny Vlaeva. We thank Arrun Gaydhani and Rob Dale for their contributions, and Tom Adams, head of user research at DfE, for his guidance throughout the process. Last but not least, we are grateful to all the URs who engaged with us. These principles would not exist without you. We hope you'll find our future work just as interesting to engage with.
July 17, 2025 at 2:38 AM
Using research crits to improve our work
A research crit is a place for researchers to come together and give each other advice and feedback on their work. The benefits of a crit are that they:

* encourage you to work collaboratively, in the open
* allow for rapid sharing of experience and knowledge
* are a place to get constructive feedback on early work
* encourage researchers to learn from each other

## How they started

In the Children & Families Digital portfolio at DfE, we have a group of experienced user researchers working across several projects and policy areas. We wanted to find ways to share our knowledge and experience with each other to improve the quality of our work. We also wanted to share more about what we're working on across the portfolio.

Crits are a well-established part of the UCD professions at DfE, but they're often design focused. We wanted to see how well they could work for the research process. One of our researchers, James Bolchover, had run research crits in a previous role and suggested we trial them.

We came together to agree the terms for the crit. This written agreement for how the crits should run is recorded at the top of the Lucid board we use to run the session. The agreement outlines that we:

* should be present in the session
* respect and listen to everyone's opinion – seniority isn't a factor
* support each other
* inspire each other

## How we run them

We already have a fortnightly meetup together on Mondays. We agreed that once a month, we'd repurpose this for a crit. We started off by asking a week in advance over Slack if anyone had things to bring to crit the following week. Uptake was low and we did have to skip a few crits to start with. When we investigated, we found that a week in advance was too long to wait to crit a problem. People needed feedback on things they were solving in the moment.

We changed the process and now put a message on Slack on a Monday morning asking what people want to discuss that day. Doing it this way means people bring things that are top of mind for them, and we can give them timely feedback. We've found recently that researchers want crits more frequently, so we're doing them almost fortnightly now.

## Things we crit

We've covered many topics in our crits so far. The only criterion is that we don't review anything that's finished; it must be a work in progress. Some things we've looked at include:

* a plan for a stakeholder workshop
* a blog about sample bias
* how to do roleplay as a research method
* ways to present a journey map
* versions of our insight libraries and how best to present them
* consent forms for research with young people
* a survey plan

## Tools we use

One of our principles is that crits shouldn't take a lot of extra work. James created a Lucid board for us with the template below. This template covers:

* the background of the project and what stage the research is at
* what is being critted (links, screenshots and so on)
* why we want a crit, including what the researcher is unsure about
* two scales to understand how sensitive the researcher is feeling about the feedback, and how far along in the process they are and how able they are to make changes
* a space for feedback from others
* after-session thoughts and updates

The researcher fills out the relevant sections before the session, and we use the meeting to provide feedback and discuss it. We also have follow-up sessions for researchers to share and reflect on what they eventually did with the work.
_Copy of a crit template_

## What we've learned

We've found research crits to be a great way to share what we're working on without adding too much of a burden on our researchers' time. They've improved our practice, as we've consistently made changes because of each other's feedback. To name a few examples: we've restructured our insight library to make it easier to read with labels and tags, we implemented pre-briefing for participants trying out a role play method for the first time, and we rewrote an information sheet to reduce the text for young people.

> This year we're looking for new ways to improve how we share our research and skills with each other. If you have any ideas to share, get in touch at [email protected].
July 17, 2025 at 2:38 AM
Building relationships with gatekeepers to recruit participants
User researchers (URs) conduct research to understand the needs of the people we are designing services for. URs have several routes to recruit participants. You can explore these further by reading these blog posts about recruitment strategies and managing biases when planning recruitment. This post explains who recruitment gatekeepers are, how they can assist in recruiting participants, and how to build good relationships with them. We illustrate this with a case study from the Social Work Workforce programme.

## Working with gatekeepers

A gatekeeper is a person or organisation willing to help recruit research participants when the researcher has no direct access to participants. User researchers identify and approach gatekeepers before seeking support from suppliers of participants, and factor this into planning time. Building relationships with gatekeepers can take longer than other recruitment methods, but provides longer-term benefits. For example, it can support teams with:

* creating a list of users interested in taking part in future research
* developing good rapport with external stakeholders and users
* improving understanding of users and their communication and accessibility needs

URs access gatekeepers via their department's networks (for example, the Department for Education has dedicated teams that liaise with local authorities) or by reaching out to organisations directly.

## How to work well with gatekeepers

Gatekeepers may need warming up to government URs because of potential mistrust of government, research fatigue, or a duty to keep their clients safe (for example, charities who work with people who have had negative experiences of government). When working with gatekeepers, some things to consider are:

* research who gatekeepers are to ensure they are the right person or organisation to collaborate with
* prepare concise communications about who you are and your request
* check with your team to identify any sensitive information that can't be shared externally
* agree an ethical framework and safeguarding plan with gatekeepers to align with their procedures
* explain to gatekeepers what's in it for the people you're trying to reach
* consider what's in it for the gatekeepers and plan to give it to them (for example, a research playback)
* listen to them and feed back to relevant colleagues
* make clear that URs are not decision makers (we report to teams)
* handle the research operations and admin so gatekeepers don't take on additional work to help recruit

_Photo by UX Indonesia on Unsplash_

## Case study: Social Work Workforce URs reaching out to new types of users

### Our problem

How can we recruit research participants we haven't recruited before in a short space of time?

Our team started developing online content to help prospective social workers make informed choices about the right qualifying route. Prospective social workers were a new group of people we hadn't approached before. We needed to understand their experiences so that we could meet their needs as well as we could.

### Our approach

We recruited through organisations who had existing relationships with our target participants. We used three routes: a UR who had previously worked with our users, policy colleagues who linked us with gatekeepers they were collaborating with (for example, training programme suppliers), and course lecturers whose emails were publicly available.
During our initial calls with organisations, we explained that we were government URs, why the research was taking place, who we were looking to speak to, and how long the research phase was. We also stated that participants would be thanked with a financial incentive for their time, respecting their busy lives. We discussed our research ethics, participant consent, and data management.

We knew recruitment communications would be better coming from trusted sources within each organisation. To minimise the burden on our gatekeepers, we provided a recruitment email template with a link to an expression of interest form. Once gatekeepers sent these emails out, we handled the rest: processing sign-ups, speaking directly to potential participants, and booking sessions. At the end of the research, we made sure to email gatekeepers to thank them.

### Outcome

We met our project timeline from recruitment to research delivery, recruiting and conducting research with 18 users with diverse profiles. This was possible due to the support from our gatekeepers. In addition to the research sessions, our discussions with gatekeepers helped us learn more about the various journeys into qualifying. We also built good rapport with gatekeepers, creating opportunities for future collaborations.

Thank you to everyone who helped. We appreciate the support from all participants, colleagues, and gatekeepers.

_Photo by Kaleidico on Unsplash_

**Contact us:** [email protected]

**Are you a DfE UR?** Please contact the research operations team before reaching out to organisations. They may have existing relationships and can help make sure any panel building or creation of lists aligns with the departmental approach.

**Are you part of the social work workforce?** Join our list of people interested in taking part in research to improve digital services for the social work workforce: https://dferesearch.fra1.qualtrics.com/jfe/form/SV_03qTiKhMgXVbVfo
July 4, 2025 at 2:37 AM
Making use of open source tools
The social work induction programme (SWIP) is a two-year programme requiring newly qualified social workers to demonstrate their skills and knowledge with evidence from their practice. At the end of the programme, social workers are assessed against the new post-qualifying standards. By improving the quality and consistency of early career development and assessment, we aim to improve the quality of social work practice, leading to better outcomes for the children and families we serve.

On the SWIP digital team, we're designing a digital service that will support the delivery of this programme. Through our service, social workers will be able to access learning resources, submit assessments, receive feedback and track their progress towards the standards.

**Choosing a learning management system**

We initially explored the option of building a new custom platform to deliver the programme. While this would offer us the highest degree of design freedom, it became clear that lots of existing learning management systems (LMS) on the market already provided the features we need for the SWIP digital service. Using an existing learning management system means that we can deliver the service more quickly and cost-effectively.

Based on our research into existing systems, Moodle emerged as the most suitable platform for our needs. It is one of the most widely adopted learning management systems across the education, business and government sectors. One of Moodle's key advantages is that it's open source, offering the flexibility to tailor both its functionality and appearance to our specific needs while being more cost-effective than building a custom system from scratch. We can fully customise its appearance to be compatible with the GOV.UK Design System and adapt its functionality to meet the needs of our service users.

Another key advantage of Moodle is its strong compliance with security and data privacy standards, ensuring that user information remains protected. It can also support large-scale deployments and handle high volumes of user traffic, with some public sector sites hosting more than 500,000 users.

**Use across UK government**

Moodle is available through the G-Cloud 14 Digital Marketplace, making it a pre-approved solution for UK government bodies. It's already used by several government departments and agencies, including the Ministry of Justice, the Department for Business and Trade, the UK Health Security Agency and the NHS. Moodle sites are used across the public sector to deliver:

* compliance training and tracking
* employee onboarding and skill development
* career development and advancement
* HR and workplace safety training
* competency-based training and management

Because of Moodle's widespread use in a variety of services across government, we're confident that we can adapt it to meet our needs and to comply with Government Digital Service (GDS) standards.

**Implementing Moodle: our process**

We integrated Moodle with our bespoke account management system, allowing us to separate users by organisation within a single site, improving efficiency and reducing costs. We also explored integration with GOV.UK services, including a proof of concept for single sign-on using GOV.UK One Login. To ensure a consistent user experience, we began developing a custom Moodle theme aligned with GDS design and accessibility standards.

**Theming Moodle for GDS**

Moodle supports a range of plug-ins, including custom themes.
We created a custom GOV.UK theme that can be installed and selected on Moodle, ensuring visual consistency with the GOV.UK Design System. This is made possible through the govuk-frontend package, which is maintained by the GOV.UK team. When the team adds new features, makes changes or fixes bugs, we can update our code using the latest version. Moodle uses a range of components and we're progressively theming many of them, such as headers, footers and buttons, so the platform increasingly mirrors the look and feel of a GOV.UK service.

**Before**

_The Moodle home screen showing the default theme, before the GOV.UK theme has been applied_

**After**

_The Moodle home screen showing the custom GOV.UK theme in use. The header, typefaces and components now match GDS design patterns._

**Contribute to our work**

Theming Moodle for GDS will be an ongoing process as we determine which areas of Moodle we will use to deliver the features our users need. One of our goals is to share the work we've done on the Moodle GOV.UK theme so that other projects within DfE and wider government can use it and contribute to it.

The govuk-moodle-theme repository can be found on GitHub, under the dfe-digital organisation. It is a public, open-source repository, so anyone can reference our theme's releases in their project to apply the theme to their instance of Moodle and even choose to build on top of it. We're inviting any teams who are interested in using Moodle for their services to contribute to this work and help us to expand the repository.

Explore or contribute to the GitHub repository
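If you're curious what consuming govuk-frontend looks like in practice, here's a minimal sketch of the JavaScript side, assuming the govuk-frontend npm package. The real theme also wires the package's SCSS and templates into Moodle's own build pipeline, which isn't shown here:

```typescript
// Minimal sketch: initialising GOV.UK component behaviour in a front end
// that consumes the govuk-frontend npm package. This is illustrative only;
// the theme's actual Moodle integration (SCSS, templates) is not shown.
import { initAll } from 'govuk-frontend';

// Components such as buttons, accordions and error summaries need their
// JavaScript initialised once the page has loaded.
document.addEventListener('DOMContentLoaded', () => {
  initAll();
});
```

Because the styles and behaviour come from the published package rather than a fork, picking up a new govuk-frontend release is a dependency bump rather than a reimplementation.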
June 26, 2025 at 2:37 AM
How we’re providing real-time accessibility statements and issue management
In late 2023, the Department for Education (DfE) DesignOps team noticed a recurring problem across the services DfE provided to users. Many of our digital services did not have accessibility statements, or had out-of-date or incorrect statements. This made it difficult for users to understand how accessible our services were, and we lacked the data needed to identify common accessibility issues or track progress.

_Dashboard showing a summary of issues and statuses_

**Identifying the challenge**

We recognised the challenge: teams struggled to produce accessibility statements and keep them accurate and compliant with the Public Sector Accessibility Regulations 2018. Without a central way to manage these statements, it was impossible to consistently identify accessibility risks or track fixes effectively.

**Setting our objectives**

We set ourselves the task of creating a service that could manage accessibility statements and issues centrally. In January 2024, we recruited a senior accessibility specialist who guided us in aligning our approach with best practices and legal standards. They were instrumental in developing learning and training tools in the department, working with digital teams to understand accessibility requirements, and supporting them with audits, troubleshooting, advice and recommendations.

Throughout 2024, we also involved a group of T Level students from Bury College who were with us as part of a work placement. They joined us one day each week, mapping user journeys, researching user needs, and identifying the crucial data points needed to capture accessibility issues, supporting the wider goal of automating accessibility statement generation.

_A screenshot of screen mockups in Figma showing the questions users would be asked when creating an issue_

**Taking practical steps**

By early 2025, with the students completing their placement, the DesignOps team took their valuable work forward. In April 2025, we launched a private beta involving several key DfE services to understand whether it improved the management of issues and accessibility statements for teams. This beta allows teams to:

* create, update and close accessibility issues efficiently
* assign issues to relevant team members
* generate real-time reports aligned with WCAG criteria
* provide dynamically generated accessibility statements via unique URLs

We also built our product to support multi-tenant use, meaning arm's length bodies and partner agencies can use it with their own branding on their statements.

**Positive early outcomes**

The initial feedback from our private beta is very positive. Teams now have clearer visibility of accessibility issues, enabling them to act quickly and keep statements accurate and compliant. Users benefit from up-to-date accessibility statements, better support, and clearer information.

_A screen in the product which shows issues across all registered products in the service_

We've also identified several future features:

* recording detailed accessibility audit findings directly in our system
* allowing auditors to upload audit results via CSV files, reducing manual data entry
* monitoring GitHub, SVN and Azure DevOps repositories for accessibility issues

Additional ideas include integrating automated accessibility checks into continuous integration processes and enabling automatic onboarding from service registers for departments.

**Cross-government show and tell**

We held a cross-government show and tell on 13 May, where we demoed the product.
Questions and feedback were forthcoming, and it's clear there is a need for a product like this in other departments. As we work in the open, the code for the product is freely available to use and, with a little configuration, could be used by anyone to track issues and provide real-time accessibility statements.

**Our next steps**

Over the coming months, we'll:

* onboard additional DfE services and partner agencies into the private beta
* enhance our reporting based on user feedback
* collaborate with GDS to consider broader government adoption
* develop a public beta plan for how we could support other government departments to use the service

By bringing together issue management, dynamic accessibility statements and targeted guidance, we aim to significantly improve digital accessibility across government services.
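To illustrate the idea behind real-time statements, here's a hypothetical sketch of the kind of data model that makes them possible. The type names and function are ours, purely illustrative, not the product's actual API:

```typescript
// Hypothetical sketch: issues are recorded against WCAG criteria, and a
// service's statement is generated from whatever is currently open, so it
// never goes stale. Names and shapes are illustrative, not the real product.

type IssueStatus = 'open' | 'in progress' | 'closed';

interface AccessibilityIssue {
  wcagCriterion: string; // e.g. '1.4.3 Contrast (Minimum)'
  description: string;   // what a user actually experiences
  status: IssueStatus;
}

// Rebuild the statement summary on every request to its unique URL.
function renderStatementSummary(serviceName: string, issues: AccessibilityIssue[]): string {
  const open = issues.filter((issue) => issue.status !== 'closed');
  if (open.length === 0) {
    return `${serviceName} has no known accessibility issues.`;
  }
  const lines = open.map((issue) => `- ${issue.wcagCriterion}: ${issue.description}`);
  return `${serviceName} has ${open.length} known accessibility issue(s):\n${lines.join('\n')}`;
}
```

The point of the design is that closing an issue in the central system immediately updates the published statement, with no manual editing step in between.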
May 22, 2025 at 2:31 AM
Learning from Accessibility Experts: A Day at the DfE Lab
On a crisp spring morning, user researcher Sree and interaction designer Claire travelled to Sheffield to visit the Department for Education's Accessibility Lab. Their goal was to understand how digital services function for those who navigate the world differently.

## Inside the Lab: where digital barriers become visible

_Claire, Sree and Jane at DfE's Accessibility Lab, Sheffield_

We expected a technical demonstration with a run-through of tools and accessibility best practices. What we got was something much more human: a window into the lived experience of those who rely on assistive technologies daily. Guided by Jane Dickinson, an accessibility specialist at DfE, we explored tools like Dragon, JAWS, ZoomText, and Fusion. Jane not only explained how they work but showed us how easily they can fail when services aren't built with accessibility in mind.

## Insights from testing assistive tools

### Dragon: voice recognition for hands-free navigation

Dragon allows people with mobility impairments to control a computer using voice commands. Jane demonstrated how Dragon struggled with buttons on a DfE service and the BBC homepage because they weren't coded correctly, highlighting a gap between design and code.

### JAWS: screen reader for non-visual navigation

JAWS relies on well-structured content: proper headings, labelled buttons, and descriptive links. Jane showed how unlabelled links like "Read more" or "Download" confuse JAWS users. Without descriptive ARIA labels, browsing becomes chaotic and frustrating. As Jane put it:

> "If a page isn't structured properly, it's a nightmare to navigate."

### ZoomText: for low vision users

ZoomText is a magnification tool that helps users navigate visually. However, it requires users to hover or click on links to have them read aloud, unlike JAWS, which reads automatically. At higher magnification, text can become distorted, affecting readability.

### Fusion: combining JAWS and ZoomText

Fusion combines magnification of up to 20x with auditory feedback for individuals with partial vision loss. But Jane showed us that even a 3x zoom can cause layout issues, like pixelation and clipped content, especially when sites don't reflow properly.

### Keyboard-only navigation

Keyboard navigation is essential for users who can't use a mouse, relying on shortcuts like the Alt key. But inconsistent implementation makes things harder. Jane pointed out unlabelled buttons on the BBC homepage that would leave keyboard users guessing:

> "If something isn't labelled properly, it just gets skipped over."

### Captions for hearing impairments

Captions aren't just for deaf users; they help everyone. But live captions often lag, making comprehension harder. Testing BBC video content, we saw captions fall out of sync with speech.

## Seeing the world through the eyes of others

_Sree and Claire testing visual impairment simulation glasses_

During our lab experience, we tested simulation glasses that mimic visual impairments such as cataracts (blurred vision), tunnel vision (loss of peripheral vision), and left-sided hemianopia (half the visual field disappears). It's humbling to see how much of the digital world becomes difficult to use under these conditions, highlighting the need for inclusive and thoughtful design.
_The Visual Impairment North-East (Vine) Simulation Package_

## In conversation with accessibility experts

To enhance our understanding of accessibility, we interviewed Jane Dickinson and Jake Lloyd, two accessibility specialists at the Department for Education (DfE).

Jane highlighted a significant concern: the tendency to address accessibility only at the final stages of development.

> "It's not enough to test for accessibility. Real users need to shape the design from the beginning."

Jane also noted that many users hesitate to disclose their accessibility needs for fear of being seen as difficult. Even when reports are written to improve accessibility, they often go ignored.

> "I can spend a whole day writing a report, and sometimes nothing changes."

Despite these challenges, Jane celebrated the wins, like a blind user who was able to access their payslip independently for the first time:

> "One of our blind users told me, 'For the first time, I didn't have to ask someone to read my payslip. I could do it myself.' That made all the work worth it."

Even small changes, like properly labelling buttons, can make a service more usable. Jake emphasised the importance of building for keyboard navigation and screen readers from the very start.

> "There are so many accessibility issues that come from not thinking about keyboard accessibility… It affects focus, visibility, and how well voice and assistive tech tools work."

He highlighted issues like repetitive, unclear links in patterns such as "Check your answers" (there's a sketch of the hidden-text technique at the end of this section):

> "Something like the 'Check your answers' pattern has links that just say 'Change'… If you're just using a screen reader and you're navigating through a bunch of links… you're only going to hear 'change'. So providing some hidden screen reader text, giving more context to that link, can be really helpful."

## A holistic approach to accessibility

The accessibility specialists broke down their layered approach to testing the accessibility of services:

* **automated testing** to catch common issues early
* **manual testing** using only a keyboard or different zoom levels
* **assistive tech checks** like screen readers and voice controls
* **code reviews** to ensure correct HTML and component use

As Jake put it, accessibility goes beyond the Web Content Accessibility Guidelines (WCAG) standards:

> "I'll also record issues that don't fail WCAG but still create barriers, like having to tab 30 times to reach an 'apply filter' button."

Jake warned against treating accessibility as an afterthought:

> "Where teams haven't thought about accessibility and inclusive design up front and early on, complex issues tend to come out of that."

## Not boring. Not optional.

A myth Jake wants to debunk is that accessible design equals boring design.

> "You can still be innovative. Your website can look good and be accessible if you plan it that way from the start," he said. "Unfortunately, some organisations continue to treat accessibility as an afterthought, which remains a cultural issue."

Our specialists pointed out that advocacy and awareness are key to changing this mindset:

> "Having people with actual lived experience that can demonstrate the way that they interact with digital content can be really powerful… Here's someone who is blind. They use a screen reader to navigate your service, and they can't do it."

They stressed that one in four people have a disability. Can you afford to turn them away with inaccessible services?
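As promised above, here's a minimal sketch of the hidden-text technique Jake describes. The `govuk-visually-hidden` class is a real GOV.UK Design System utility; the helper function is ours, purely illustrative:

```typescript
// Minimal sketch of Jake's 'Change' link example: each link gets visually
// hidden context so a screen reader announces 'Change name' or
// 'Change address' rather than a bare, ambiguous 'Change'.
// The helper is illustrative; govuk-visually-hidden is a real GOV.UK class.
function changeLink(href: string, context: string): string {
  return `<a class="govuk-link" href="${href}">` +
    `Change<span class="govuk-visually-hidden"> ${context}</span>` +
    `</a>`;
}

// Sighted users see two identical 'Change' links; screen reader users
// hear 'Change name' and 'Change address'.
console.log(changeLink('/name', 'name'));
console.log(changeLink('/address', 'address'));
```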
## Why accessibility matters for everyone

Jane and Jake made it clear: accessibility isn't just for disabled users. It benefits all of us. Captions help on a noisy train. Good contrast helps in bright light. And if zooming to 400% breaks your layout, it's not just low vision users who suffer.

> "If it's not thought about up front, then it affects a lot of people."

## Accessibility isn't a task, it's a mindset

As user researchers and designers, we focus on how people interact with digital services. In Sheffield, we were learners, not experts. This experience wasn't about merely checking off accessibility guidelines. It was about understanding the impact when those guidelines are not met. Leaving Sheffield, we carried a renewed resolve to champion accessibility. The best accessibility work ensures people don't need to ask for help in the first place.

## Useful resources

* Training - Accessibility manual
* Making your service accessible: an introduction - Service Manual - GOV.UK
* Accessibility and inclusive design manual - Accessibility manual
* Home | Web Accessibility Initiative (WAI) | W3C
* W3Cx: Introduction to Web Accessibility | edX
* Practical Accessibility - Practical Accessibility for web designers and developers
April 26, 2025 at 2:28 AM
Introducing the DfE Plain Language standard
In digital, data and technology in DfE, we've published a Plain Language standard. The Plain Language standard ensures that users can:

* find the information they need
* understand what they find
* act on what they understand

The standard incorporates our ways of working and best practice for user-centred design in government. It should not create any more work for content designers, or anyone working on content.

### A department-specific standard to support delivery

There are lots of different standards that we need to meet when building digital government services. While services are typically assessed against the 14 points of the government Service Standard, sometimes we use other standards to further assure our services. For example, this can include:

* accessibility conformance
* data protection
* personal data handling in user research

The DfE Plain Language standard provides a framework and a departmental commitment to use plain language in our services. Our working hypothesis is that a standard which formalises the use of plain language will support teams to advocate for its use.

### Applying standards and guidance

In DfE service teams, we follow and apply GDS Service Manual guidance, and work to meet the Service Standard. But not all services or websites go through assessment checks. We know that existing guidance can be challenged or that people might not know it exists. The Plain Language standard aims to help us to meet our commitment as a public body to build services that everyone can use.

### How we designed the standard

When the International Organization for Standardization (ISO) published its plain language standard, content designers in DfE mapped out how we work to meet it and identified any gaps.

_ISO plain language standard mapped against the Service Standard_

We shared this work with the DfE content community, user-centred policy design colleagues, and across government with content designers at HMRC and GDS. We also explored whether having a language-based standard could support the work we do as content designers.

### What we learnt

Our research showed that a Plain Language standard could support or add to GDS guidance and help people advocate for using plain language. We also knew that other disciplines have standards which support service delivery, for example the Technology Code of Practice (TCoP). The scope of our standard could be a minimum viable standard to support raising the quality of services.

### Scoping the standard

We ran a workshop with senior content designers and content designers from across DfE and GOV.UK to explore what a DfE Plain Language standard could look like, how we could measure it, and when it would be used.

_DfE design workshop to explore a Plain Language standard_

We took the first draft to the wider DfE content community to ask if they could:

* apply the standard
* identify what guidance might be needed
* identify any gaps

We heard that 'a standard gives more clout than guidance' and that it could be shared with stakeholders and contractors, to support and reinforce our ways of working. We also heard challenges about overly formalising the content design process, and that sometimes government needs to use jargon. Meeting the standard is about working to GDS guidance and best practice. We recognise that as a government department we must sometimes use jargon, when the jargon is what users recognise rather than plain, simple wording.
This is why we acknowledge that where there is a business or user need for jargon, we still focus on writing clear content, directed by user research. We iterated the standard, shared it with GDS and had a second community crit. We refined guidance for the tools we use to support meeting the standard, and we considered cross-government content discussions questioning how helpful a reading age is, so we removed it from the standard. The Plain Language standard was approved by our standards forum in February, as part of a wider piece of digital, data and technology standards work in DfE. We'll be working with our senior content designers to understand how useful the standard is in supporting service delivery.
March 20, 2025 at 2:47 AM
Applying filters consistently in DfE
Filters are a common component used in many services across DfE. There are many variations in filter styling and functionality across the department, and we have had no guidance on how to use filters.

## Discovery

Our patterns and components working group ran a discovery to look at the different ways teams use filters. We reached out to the GOV.UK design system team to discuss their filter component. Due to the prioritisation of other components, it will be a long time before they add it to their design system.

We asked designers and user researchers in DfE to share examples of how they use filters on a Lucid board. We documented 14 different services using a filter, including any insights about user research or accessibility issues. We then reviewed them as a group and found that in most cases (there's illustrative markup for this layout at the end of this post):

* services use a vertical filter on the left of the page with a govuk-grid-column-one-third div
* they display results on the right with a govuk-grid-column-two-thirds div
* services use a variation of the MoJ filter component

## What we learnt

During the discovery we found some common problems and areas we need to learn more about. This included:

* users missing the "Show filters" button on mobile devices
* the position of the "Apply filters" button causing users to scroll excessively on long lists
* difficulty finding specific filters when there are many options in a category

## Mobile design

We heard of several instances of users missing the "Show filters" button when using a mobile device. We think users may miss this grey secondary action button for a few reasons, including:

* it looks inactive
* the colour of the button is too similar to the background colour of the filters
* the button is too far away from the results it relates to

## Moving the "Apply filters" button

We heard from several teams that adding an "Apply filters" button at the bottom of a long set of filters was helpful for their users. It stopped users having to scroll down a long list to select the filters they needed, then scroll back up to apply them.

## Categories with many filter options

Filter categories that contain many options can become long and hard to read or navigate. The MoJ design system suggests a few ways to help:

* show or hide filter categories in an accordion
* enable users to scroll through the list of filter options in a category
* reduce the number of filter options in a category
* make long filter categories searchable

However, some of these options come with their own issues. For example, it can be hard for a person using a mobile device to scroll accurately through a list of filter options.

## Conclusion

We decided that the MoJ filter component is suitable for what most teams need when adding a filter component to their service. We chose not to create our own filter component, but have created some guidance on how to use filters in DfE. We recommend teams use the MoJ filter in most cases. But be flexible on how you present filters based on the context of the service and user needs.

## Share your feedback

> We want to hear from you if you've been in a team that has used any of these approaches and found any usability or accessibility issues.
>
> Please share any insights you gather.
>
> You can contact the DfE patterns and components group on DfE Slack, or contact the MoJ design system team on cross-government Slack.
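As a footnote to the discovery findings above, here's an illustrative sketch of the common layout we observed, written as a TypeScript template for clarity. The grid classes are real GOV.UK Design System classes; the helper function and the simplified filter markup are ours, not the MoJ filter component itself:

```typescript
// Illustrative sketch of the common layout we observed: filters in a
// one-third column on the left, results in a two-thirds column on the right.
// The grid classes are real GOV.UK Design System utilities; the inner filter
// markup here is a simplified stand-in for the MoJ filter component.
function filterPageLayout(filtersHtml: string, resultsHtml: string): string {
  return `
    <div class="govuk-grid-row">
      <div class="govuk-grid-column-one-third">
        ${filtersHtml}
        <!-- teams told us a button at the bottom of long filter lists
             reduced scrolling back to the top to apply selections -->
        <button type="submit" class="govuk-button">Apply filters</button>
      </div>
      <div class="govuk-grid-column-two-thirds">
        ${resultsHtml}
      </div>
    </div>`;
}
```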
March 4, 2025 at 2:26 AM