Case study: Disclosure Scotland's online application form

The ask

As part of the digital transformation programme for Disclosure Scotland, the online application for their basic disclosure certificates needed an overhaul. I worked with my colleagues at BJSS to turn a poor user experience into one that was clear, accessible, responsive, fast and easy to use. Of course, we also hoped to bring a little bit of delight into it, too.

Challenges

There was little to no useful data on the existing form: it had no analytics, so we had only the application volumes and a breakdown of some of the content of the applications. There was ongoing debate about whether our higher priority should be transformation of the end-to-end process of applying for a disclosure, or transition of the existing process onto a new technology platform. GDS and MyScot.gov guidelines differed, and in some places conflicted with each other, and we had to follow both as best we could.

Approach

Identifying requirements

The first order of business was to gather the business requirements, as the application form serves a very important function in gathering information for a background check. We worked with a BA from BJSS to identify what data the service required, and in what format. We also discussed with Disclosure Scotland what we wanted the experience of filling in the application to be, identifying key goals such as “fast,” “easy” and “informative.” The design team built a shared Sketch pattern library to use during wireframing, based on the MyScot.gov design guidelines and supplemented by the GDS style guide where necessary. This gave us a unique and fairly comprehensive set of components, which enabled us to work swiftly and consistently.

Research

We ran extensive research with the Disclosure Scotland research team, including home visits, user interviews, information gathering from partner companies who worked more directly with users, and analysis of what data we had on current applications. This provided us with a wealth of qualitative information about user experiences with the current form, and quantitative data about the contents of applications.

User journeys

Taking the existing form, we broke it down into a user journey that represented how a typical user would traverse it, and used this to identify particular points of friction that could be removed. Then, with the BJSS BAs, the Disclosure Scotland PO and the research team, we developed a user journey plan that grouped questions serving common business needs into coherent sets to pose to users. This helped us adopt the conversational approach we wanted the experience to have.

Wireframes

Using the pattern library we had built up, we were able to quickly put together high-fidelity designs in Sketch to test our initial idea based on the user flow. We printed out these quickly composed screens and presented them to some members of the public in a quick-and-dirty bit of guerrilla testing. This was extremely useful for gathering rough feedback about how much sense our question groupings made to end users, and the response was very positive. With our initial ideas tentatively validated, and some criticism to build upon, we felt confident enough to put more work into the idea.

Prototype

We coded a prototype from scratch, using only a very basic JavaScript script to manage the multi-step form. We chose to avoid the GDS prototype toolkit because we wanted our initial prototype to be as portable as possible, so that we could run it anywhere without having to set up an environment. The prototype was made to be as close to the finished interface as possible, without plugging in any back-end functionality or worrying about code quality, but while ensuring we kept to high accessibility standards.
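
To give a sense of scale, here is a minimal sketch of that kind of multi-step script. It is illustrative rather than the code we shipped: the .step class and the data-next/data-prev button hooks are assumptions, not our actual markup.

    // Minimal multi-step form handler: shows one <fieldset class="step">
    // at a time inside the page's single <form>. All the hooks here are
    // illustrative, not the markup we actually used.
    (function () {
      var steps = Array.prototype.slice.call(
        document.querySelectorAll('form .step')
      );
      var current = 0;

      function showStep(index) {
        steps.forEach(function (step, i) {
          // The hidden attribute removes inactive steps from both the
          // visual layout and the accessibility tree.
          step.hidden = (i !== index);
        });
        // Move focus to the new step's legend so screen readers announce it.
        var legend = steps[index].querySelector('legend');
        if (legend) {
          legend.setAttribute('tabindex', '-1');
          legend.focus();
        }
      }

      document.addEventListener('click', function (event) {
        if (event.target.matches('[data-next]') && current < steps.length - 1) {
          showStep(++current);
        } else if (event.target.matches('[data-prev]') && current > 0) {
          showStep(--current);
        }
      });

      showStep(current);
    })();

Keeping the prototype to static pages plus a small script like this meant it could be opened straight from a laptop or phone during testing sessions, with no server or environment setup involved.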

Testing

We took this prototype out to do as much testing as possible, across as wide a cross-section of our audience as possible. This included business users; home users; people with disabilities; people of different ethnicities, genders and sexualities; people with and without criminal records; and a range of different trades and market segments. All of this testing was done with an appropriate mix of mobile and desktop usage.

Iteration

After each round of testing we gathered the significant comments and allocated priorities. This included flagging show-stopping bugs and dead ends in the user journey, noting inconsistencies in the design, identifying friction points, and highlighting unclear content. We would then address these problems in order of priority, working through solutions, testing them again, and putting the appropriate solutions into the backlog for delivery.

Delivery

Once we considered a feature tested enough to work well as part of an MVP, it was transcribed into user stories and put into the backlog for delivery. Improvements discovered through testing were added as they were found, and run through estimation and prioritisation exercises to ensure each was delivered at the right time. This constant flow of features and improvements allowed the development team to operate in a truly agile fashion, adjusting the focus of upcoming sprints to accept new enhancements while keeping to a well-defined set of priorities.

Post-live

We did not stop iterating and improving after the initial beta release of the application form. Once real users were interacting with it, we could examine the real-time analytics we had installed on the site, as well as feedback provided via a form added to the end of the journey specifically for users to share their thoughts. We used the information gathered from both to inform and direct our continued user testing, and to refine the experience.
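
As an illustration of how drop-off can be measured with per-step events (the endpoint and event names below are hypothetical; this case study does not name the analytics tool we used):

    // Hypothetical sketch of per-step event tracking for measuring drop-off.
    // The /analytics/events endpoint and the event names are illustrative.
    function trackStep(stepName) {
      var payload = JSON.stringify({
        event: 'step_viewed',
        step: stepName,
        timestamp: Date.now()
      });
      // sendBeacon keeps working while the page unloads, so we still hear
      // about users who abandon the form mid-journey.
      if (navigator.sendBeacon) {
        navigator.sendBeacon('/analytics/events', payload);
      }
    }

    // Example: call trackStep('applicant-address') each time a step is
    // shown; the ratio of views between consecutive steps gives the
    // per-step drop-off rate.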

Outcome

We can happily report that the end result of our work was a responsively designed online application form that consistently received very positive usability test feedback, showed an extremely low drop-off rate in our analytics, drew glowing comments through the feedback form, achieved a AAA accessibility rating, and collected reliable data for the processing of applications. A success all round!