Enterprise Software Design & Research

Bluetooth Launch Studio

  • Timeframe: 22 Months (2014-2016)
  • Personal role: Lead researcher and designer

Overview

Bluetooth is a wireless technology found in billions of devices. How do people make new products with Bluetooth while ensuring compatibility with existing devices? The Bluetooth Special Interest Group creates standards and tools to help.

I led the two-year process of creating Launch Studio, a web app for testing Bluetooth devices. As the lead UX designer and researcher on the project, I drove the product vision from inception through beta releases, drawing on a deep understanding of engineering practices, policies, and divergent user needs.

All new Bluetooth products must be tested to ensure compliance with standards. Launch Studio determines testing requirements and documents test results. This tool provides 75% of the SIG’s income, around $15 million annually.

Challenges
  • Integrate a fragmented collection of tools for testing electronic hardware and managing regulatory requirements.
  • Provide control to experts while making the tool accessible for newcomers to Bluetooth product development.
  • Collaborate with an industry establishment invested in the status quo. Entrenched consultants depend in part on the tool being too complex for newcomers to use on their own.
  • Address the inherent complexity in the underlying system of technical requirements.
Outcomes
  • Cultivated a collaborative design process within an engineering-driven organization and learned how to work effectively with roles adjacent to the design team.
  • Created a starting point for novice users that eliminates the most complex and time-consuming steps of testing.
  • Addressed underlying user needs throughout the system, rather than simply refreshing the UI.
  • Developed the maturity and influence of UX practice within the organization.
My Role
  • Design leadership, strategy, and vision
  • User research field studies
  • Information architecture
  • User interface design and testing
  • Prototyping
  • Stakeholder management

Launch Studio replaces a fragmented, siloed set of tools. Over the preceding decade, layers of complexity were added as policy changed. The process was so complex and opaque that a cottage industry of paid consultants grew up around it. Novice users were so intimidated that, when testing was required, 97% chose to hire these consultants rather than learn the tool themselves.

The Research Process

The Bluetooth SIG is a member-driven organization with advisory boards for various technical roles. I worked with a board of experts, BTI, which defines testing procedures and policies. This is an independent group of expert users, but one with political power and insider status. I also worked with the BQEs, the independent consultants who guide clients through the testing process.

In a three-day workshop, we explored the existing tools, pain points, and the policy and regulatory constraints that shaped our scope. While the group had prepared multiple specific feature and UI suggestions, I kept them focused on the overall process. By stepping back from the user interface and examining the underlying process, we were able to discover the root causes of difficulties and better explore how the tools could work.

Expert group workshop. We created role-based personas that exposed collaboration needs and established priorities. Members had never before been asked to contribute to our tools. The activity built the political capital for the design team to continue the project.
The existing process, according to the experts. Nobody had realized how much complexity the exceptions and corner cases added to the process. Making these exceptions clear became our strategy for simplifying the tool.
An ideal, simplified process that serves all roles in the Bluetooth ecosystem. Consciously accommodating corner cases and complexity in the underlying process allowed a UI that could follow one straightforward path, making requirements clear. Because much of the old process was not dictated by policy documents, but by what the existing tools supported, we had to carefully navigate policy and convention.

Research Findings

The research workshop revealed a handful of critical insights:

The business began the project expecting to combine two main tools: a web app for determining test requirements, and an offline tool for conducting the testing. I discovered not only that the existing approach was working well, but that the people using the suite’s offline testing platform (PTS) filled an entirely different role from those who used the online tools to build a test plan. The existing division of tools, though archaic, aligned well with the different user groups and their needs.

The Managers

This group of experts asked for more features rather than improvements that would make existing features work well. Individual interviews revealed an ulterior motive: an easy-to-use tool undermines the consultants' business. They wanted the SIG to provide more expert-oriented features, but not to remove the barriers to entry for novices. Challenging the expert-centric culture of the existing tools, both internally and externally, became a cornerstone of the product.

The Testers

Later, I spoke to a different user group: professionals who conduct the testing. Unlike the managers and policy makers from the earlier workshop, these testers knew the technology and terminology, but infrequently worked with the existing online tools. More commonly, they consumed test plans from the online tools and submitted their results to test managers.

Paper prototype
Paper prototype for user testing. Initial concepts were tested with people in diverse roles. This helped expose additional needs and requirements.

I conducted interviews by first asking about the existing process and tools, then testing a paper prototype. These conversations revealed new issues surrounding collaboration and tasks conducted manually outside of our tools.

Test plans are created by test managers using one of the online tools. The test plans are then used by test engineers, who run the tests on multiple platforms. The engineers document the test results and evidence, then upload them back to the online tools. Documentation is completed manually, usually in an Excel spreadsheet, a process that is tedious and prone to error. With thousands of tests and testers, plus multiple versions of the hardware and software under test, test managers never get a clear picture of their progress. By speaking with people who manage evidence but don't run their own tests, I was able to better understand the process and solve problems that a narrow, UI-centric approach would never have identified.
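
To make that round trip concrete, here is a minimal sketch of the exchanged data as a structured model rather than a spreadsheet. The interfaces and field names are invented for illustration; they are not Launch Studio's actual schema.

```typescript
// Hypothetical model of the test-plan round trip described above.
// All names are illustrative; Launch Studio's real schema may differ.

interface TestCase {
  id: string;                          // e.g. "TC-0042" (invented identifier)
  platform: string;                    // hardware/software version under test
  status: "pending" | "pass" | "fail";
  evidenceUrl?: string;                // uploaded by the test engineer
}

interface TestPlan {
  projectId: string;
  manager: string;                     // the test manager who created the plan
  cases: TestCase[];
}

// A manager's progress view: the structured equivalent of tallying
// results by hand in a spreadsheet.
function progress(plan: TestPlan): { done: number; total: number } {
  const done = plan.cases.filter((c) => c.status !== "pending").length;
  return { done, total: plan.cases.length };
}
```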

Challenge: Onboarding

Research exposed a process with a complex set of testing requirements that could shift in scope depending upon seemingly minor details. People didn't know how to get started or how much work would be required. They needed a clear starting point to expose the required steps, adding certainty to the process and scope of the testing requirements.

The underlying process that Launch Studio supports can be complex, but in the majority of cases it can be reduced to a simple product declaration form. The challenge is directing users down the correct path: the simple route when allowed, the full, complex process only when required.

Launch Studio supports two main processes: Qualification and Declaration. For simple cases where the hardware changes are minimal, policy requires only a simple declaration. For scenarios where the Bluetooth hardware design has changed, qualification testing is required before declaration. Launch Studio uses two distinct paths to make the simple cases effortless without marginalizing the complex scenarios.
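
As a sketch of that routing, the core branch can be expressed in a few lines of TypeScript. The Project shape and the predicate name are assumptions for illustration only; the real policy test is far more nuanced.

```typescript
// Illustrative only: the real decision depends on hundreds of pages
// of qualification policy, not a single boolean.

type Path = "Qualification" | "Declaration";

interface Project {
  hardwareDesignChanged: boolean; // has the Bluetooth design been modified?
}

function choosePath(project: Project): Path {
  // Changed designs must complete qualification testing before they
  // can be declared; unchanged designs go straight to declaration.
  return project.hardwareDesignChanged ? "Qualification" : "Declaration";
}
```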

Two versions of the navigation bar
Main navigation. Launch Studio is a unified tool with differing paths based upon individual requirements. This provides a clear sense of the work to be done. Steps may be completed in any order, easing collaboration between each step's specialists.

The core issue is determining whether a user needs to conduct qualification testing. The qualification process takes months and costs thousands of dollars, so it’s important to avoid it if not required. Unfortunately, the requirements are hundreds of pages long and still don’t provide a clear answer.

The first approach to this problem asked two simple questions at the beginning of every project; the answers altered the subsequent steps to match the requirements.

Early version of project basics questionnaire
An early version of project basics used a questionnaire to determine if Qualification is required.

When the design was tested, the experts resented having to answer questions when they already knew which process they needed. Furthermore, the questions oversimplified the process and didn’t capture every nuance of the requirements.

Charged with making the tool accessible for everyone, but unable to hide the technical requirements, we created the Getting Started page. This page orients novices by explaining that there are two processes, then provides simple examples of each scenario. Users consciously choose the correct path based upon their requirements. For most cases, this provides a clear answer.

Final version of getting started
The getting started page orients users and exposes the process. Users consciously choose the correct path based upon their requirements.

For those who still don’t know which path to take, a second method is offered. The “Help me decide” button opens a modal with yes-or-no questions. Through progressive disclosure, up to three questions are asked and a verdict is rendered.
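
One way to model such a progressive-disclosure wizard is as a small decision tree, sketched below in TypeScript. The question wording and branching are simplified stand-ins; the real wizard's copy and logic follow the policy documents.

```typescript
// A minimal decision-tree sketch of a "Help me decide" style wizard.
// Questions and branches are simplified stand-ins, not the real policy.

type Verdict = "Qualification" | "Declaration";

interface Question {
  text: string;
  yes: Question | Verdict; // next question, or a final verdict
  no: Question | Verdict;
}

const tree: Question = {
  text: "Are you changing the Bluetooth hardware design?",
  yes: "Qualification",
  no: {
    text: "Are you rebranding a product that is already listed?",
    yes: "Declaration",
    no: {
      text: "Does your product reuse an existing qualified design?",
      yes: "Declaration",
      no: "Qualification",
    },
  },
};

// Progressive disclosure: ask one question at a time, descending the
// tree until a verdict is reached (at most three questions deep).
function decide(node: Question, ask: (text: string) => boolean): Verdict {
  const next = ask(node.text) ? node.yes : node.no;
  return typeof next === "string" ? next : decide(next, ask);
}
```

Keeping the tree as plain data makes the branches easy to revise as policy changes, without touching the wizard's UI code.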

This was a delicate compromise between distinct user groups. Experts insisted on power and direct control of all aspects. The other 90% of people needed help understanding the requirements. The help wizard offers direction and explanation of requirements while not hindering experts.

Help me decide questionnaire
The “Help me decide” button opens a modal with yes-or-no questions in simple language, guiding users to a verdict while staying out of the experts’ way.

If the correct path remains unclear, links to customer service, in-depth guides, and even the policy documents are provided. This gives everyone a simple way to reach a resolution without hiding the necessary complexity.

Interactive Prototype

An interactive HTML/CSS/JS prototype was developed early in the design process. It helped the UX team establish a pattern library and style guide, and it was used extensively for user testing and internal decision making. After a private beta, Launch Studio was released in 2017, and the prototype was taken offline.

Conclusion

  • We created a unified starting point so that anyone could understand the process requirements, making the scope of product testing clear.
  • I led research into understanding needs in a technically complex policy environment.
  • I managed and communicated product compromises among distinct and often opposed stakeholders.
  • I introduced a collaborative design process to an engineering organization, enabling future UX team impact.