Date:
June 2017
Client:
Ultrasonic Welding Manufacturer
Role:
UX Designer & Visual Designer
My Role
- Lead user experience designer tasked with applying prior research, information architecture, ideation, prototyping, and user testing to the creation of a new HMI.
- Set the visual design direction and assisted in building the visual design pattern library
The Project
- An ultrasonic welding manufacturer was looking to design a next-generation welder with an updated HMI that tied closely into its new industrial design
The Challenge
- Create a unified design for five devices with varying features & capabilities within five months
- Large/medium screen (7” to 12”)
- Small screen (4.3”)
- Mobile/remote access
- Desktop
- Tablet
- Phone (Portrait and landscape orientation)
- Account for five broad tasks & four personas
- Weld or recipe creation
- Data analytics
- Troubleshooting & diagnostics
- Weld production
- Error handling
- Personas
- Engineers, technicians, managers/supervisors, operators
- Unforeseen challenges
- Client was not familiar with software development and required more upfront product management
- Loss of a project manager
- Understaffed project
The Strategy
- Targeted design sprints for each device
- Object-oriented design
- Constant iterative process with key stakeholders & SMEs
The Solution
- A dashboard landing screen with quick access to information & navigation pertinent to all of the personas
- A unique recipe creation process that allows engineers to view results immediately and adjust weld parameters on the fly with innovative views of graphs, charts and tables
- Integrated data analytics that encompasses the system, weld production, and recipe creation
- A beautiful piece of software intertwined with a sleek industrial design
Continue on for the process…
The Process - Design sprints
The client had spent a year doing in-depth research on their current product line, examining their users and how they interacted with the HMI. They categorized the tasks into seven broad categories, identified the five most important categories and tasks they wanted to design solutions for, and identified key personas who used the software in various ways. The personas were broken down by tasks, traits, habits, motivations, and technological experience.
Our goal was to run design sprints for each of the devices because they would all be used in different ways. Sprints lasted two weeks and were broken into five phases: understand, diverge, converge, prototype & test.
Understand
We synthesized the client's upfront research by breaking out the personas and listing goals and trends. We took the five major categories and mapped out how and when each was used. We also identified problems and opportunities within each of the tasks that we had to design for.
Diverge
We selected 2-3 problems and opportunities, identified in the previous phase, from each category to diverge on. This list was generated and presented to the design team the following day. We were able to gather five designers and two key client stakeholders. After the team was briefed on the findings from the understand phase, we set out to diverge and design for each major category.
Converge
After the team converged on their initial ideas, we presented the concepts to the client with a description of each sketch. They got back to us immediately, and we set out to converge and iterate on those ideas further.
Prototyping & Testing
We decided to prototype using Axure and created 1-4 screens per persona workflow. We presented the concepts to SMEs using qualitative testing methods, asking questions such as 'What do you see here?', 'What do you think you can do here?', and 'How would you perform tasks using this screen?', and ran a System Usability Scale (SUS) test on their current product line. Their current system scored a 70.
Both prototypes (large/medium & mobile) were created in Axure. We used Axure breakpoints to show the software in landscape and portrait on mobile devices.
The feedback was primarily positive; however, we needed to revise the recipe creation process and simplify the dashboards.
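For readers unfamiliar with how the SUS numbers above (70, and later 82.5) are produced, here is a minimal sketch of the standard SUS scoring formula. The function name and the example responses are hypothetical; the formula itself is the standard one: odd-numbered (positively worded) items contribute response minus 1, even-numbered (negatively worded) items contribute 5 minus response, and the 0-40 total is multiplied by 2.5 to yield a 0-100 score.

```python
def sus_score(responses):
    """Compute a System Usability Scale score from one participant's
    ten responses, each on a 1-5 Likert scale."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses on a 1-5 scale")
    # Items 1, 3, 5, 7, 9 (index 0, 2, ...) are positively worded;
    # items 2, 4, 6, 8, 10 are negatively worded.
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical participant with fairly positive responses
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 3]))  # → 77.5
```

A mean score across participants is what gets reported; scores in the low 70s are generally considered average, which is why the baseline of 70 left clear room for improvement.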
Testing Round 2
We spent two and a half weeks after our initial testing regrouping and iterating on the solutions we had come up with. To make sure we were on the right track, we wanted to retest the revisions, especially recipe creation, the dashboard, data collection, and general system behavior. We also had enough time to increase the fidelity. Oftentimes, I find that users have difficulty abstracting concepts and visual elements; it's best to be as explicit as possible.
The Process - Iteration, prototyping & user testing
We decided to do a week-long whiteboarding session with three UX designers, iterating on the solutions we had created and the feedback we had received in the previous testing sessions. We listed out all of the key features of each task or workflow in a given section and then went to town.
Here is an example of a whiteboarding sketch based on my concept for weld creation and a previous production-screen concept that another designer envisioned.
Because we had answered so many questions early on through our longer sprints, we were able to hit the ground running on our prototypes. We decided to stick to designing for the main persona workflows and based our prototypes on accomplishing the key tasks such as production, recipe creation, and data analytics. As opposed to the previous prototypes, these were clickable and fully functioning.
One key feature that engineers asked for was the ability to view data and edit weld parameters at the same time. We solved this by creating a bank of parameters that allowed users to set necessary parameters such as control modes, and then add additional ones such as suspect and reject limits. These were stored in a sliding drawer on the right side of the screen. In the center were the weld results and on the left side was the weld history list.
Each designer was given a workflow to prototype. Once prototyping was complete, we created a script to test the various functions. User testing occurred remotely over GoToMeeting. Our testing scripts were typically four pages long, and each test ran around an hour. We needed to validate our simplified dashboard concepts across all personas, while the rest of the test was persona-specific (engineers, operators, technicians, etc.). We asked users to perform tasks and scored each task on a pass or fail basis. Qualitative questions were held until the end.
After the test, we asked participants to complete a SUS questionnaire for the prototype. Our new design scored an 82.5 (the original software scored a 70), and each participant said the system either improved or greatly improved their previous workflows. Feedback was overwhelmingly positive.
The Process - Visual design
The client was interested in a modern look. Originally we went with a skeuomorphic design. The client was happy with the direction at first; however, after we explained that it would increase development time and reduce performance, they asked us to look into other possible directions. We eventually decided to use Google Material Design patterns. Material Design offers a modern look because it's clean and understated, but more importantly, it scales better across platforms and lends itself well to repeatable design patterns.
The typeface was Open Sans. It tests well, has a wide range of styles, and is free.
The last deliverable was a pattern library. I am a believer that the best pattern libraries are built from building blocks. Using the Pattern Lab approach created by Brad Frost, we designed each component of the UI to be built up from basic elements. For instance, the basic block is a background element used in almost every panel. We provided the margins & padding and background colors, described the behavior, and provided a physical unit of scale to align with a 10px grid. Unlike web design, software design isn't as grid-based; however, we suggested building elements to a 10px scale. Another example is the large component. We provided colors, text styles, and margins. From there the client could build buttons, dropdowns, parameter bank buttons, etc.
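The building-block idea above can be sketched as a small set of spacing tokens derived from the 10px base unit. The token names and values below are hypothetical illustrations, not the actual pattern library; the point is that every measurement snaps to the same grid, so composed components stay aligned.

```python
BASE_UNIT = 10  # px; the grid the pattern library aligns to (assumed base)

def spacing(multiplier):
    """Return a spacing value snapped to the 10px grid."""
    return BASE_UNIT * multiplier

# Hypothetical tokens a component (panel, button, drawer) would reference
# instead of hard-coding pixel values.
TOKENS = {
    "panel.padding": spacing(2),   # 20px inner padding for the basic block
    "panel.margin": spacing(1),    # 10px between panels
    "button.height": spacing(4),   # 40px touch target
    "drawer.width": spacing(32),   # 320px parameter-bank drawer
}
```

Because every token is a multiple of the base unit, the client can derive new components (buttons, dropdowns, parameter bank buttons) that automatically align without needing a screen comp for each one.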
Device: 7” - 10” Touchscreen
Device: Android/iPhone mobile
Wrap Up
- Our new design increased the SUS score by 12.5 points. We brought a C-level design to an A.
- Design sprints were not the best process for such a large application; however, we made them work because the final prototype was fairly broad.
- The pattern library was designed to allow the client to use repeatable elements and patterns efficiently without needing many screen comps.