I had a conversation recently with a huge technology company, and they wanted to know if their work in human-centered design guards against experience bias. The short answer? Probably not.
When we say experience bias, we’re not talking about our own cognitive biases; we’re talking about bias at the digital interface layer (design, content, etc.). The truth is that pretty much every app and site you interact with is designed either based on the perceptions and abilities of the team that created it, or for one or two high-value users. If users don’t have experience with design conventions, lack digital understanding, don’t have technical access, etc., we’d say the experience is biased against them.
The solution is to shift to a mindset where organizations create multiple versions of a design or experience customized to the needs of diverse users.
To go back to the tech company I was talking with: any company’s investment in empathetic design is essential, but, as someone who has launched and run design functions, I need to call out a few dirty secrets.
The first is that UX and design teams are often handed a very limited set of target users by a strategy or business function, and experience bias starts there. If the business doesn’t prioritize a user, then the design team won’t have the permission or budget to create experiences for them. So even if the company is pursuing human-centered design or employs design thinking, it’s often just iterating against a user profile based on commercial interests, one not aligned with any definition of diversity in terms of culture, race, age, income level, ability, language or other factors.
The other dirty secret is that human-centered design frequently assumes humans design all of the UX, services and interfaces. If the solution to experience bias is to create tailored variations based on users’ different needs, this hand-crafted UI model won’t cut it, especially when the teams making it often lack diversity. Prioritizing a variety of experiences based on user needs requires either a fundamental change in design processes or leveraging machine learning and automation in creating digital experiences — both necessary in a shift to experience equity.
How to diagnose and address experience bias
Addressing experience bias starts with understanding how to diagnose where it might appear. These questions have been helpful in understanding where the problem can exist in your digital experiences:
Content and language: Does the content make sense to an individual?
Many applications require special technical understanding, use jargon oriented to the company or industry, or assume technical knowledge.
Take any financial services or insurance website: the assumption is that you understand the company’s terms, industry and nomenclature. If the days of an agent or banker translating for you are going away, then the digital experience needs to do that translating instead.
UI complexity: Does the interface make sense based on my abilities?
If I have a disability, can I navigate it using assistive technology? Am I expected to learn how to use the UI? The way that one user needs to navigate an interface may be very different based on ability or context.
For example, design for an aging population would prioritize more text and rely less on subtle visual cues. In contrast, younger people tend to do well with color-coding or preexisting design conventions. Think about terrible COVID-19 vaccine websites that made it your problem to figure out how to navigate and book appointments, or how each of your banks has radically different ways to navigate to similar information. It used to be that startups had radically simple UIs, but feature upon feature makes them complex even for veteran users; just look at how Instagram has changed in the past five years.
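To make this concrete, here is a minimal TypeScript sketch of what adapting a UI to ability and context signals might look like. It reads the standard `prefers-reduced-motion` and `prefers-contrast` media queries plus a hypothetical larger-text preference; the setting names and values are illustrative assumptions, not a recommended implementation.

```typescript
// Minimal sketch: adapt UI density, text size and motion to ability/context signals.
// The `UserPrefs` shape and the specific values are illustrative assumptions.

interface UserPrefs {
  largerText?: boolean; // e.g., set from an explicit in-app setting
}

interface UiSettings {
  baseFontSizePx: number;
  useSubtleVisualCues: boolean;
  enableAnimations: boolean;
}

function resolveUiSettings(prefs: UserPrefs): UiSettings {
  // Standard media queries exposed by modern browsers.
  const reducedMotion = window.matchMedia('(prefers-reduced-motion: reduce)').matches;
  const highContrast = window.matchMedia('(prefers-contrast: more)').matches;

  return {
    // Larger base text for users who ask for it (e.g., older audiences).
    baseFontSizePx: prefs.largerText ? 20 : 16,
    // Avoid leaning on subtle color/contrast cues when high contrast is requested.
    useSubtleVisualCues: !highContrast && !prefs.largerText,
    // Respect the OS-level reduced-motion setting.
    enableAnimations: !reducedMotion,
  };
}

// Usage: apply the resolved settings to the document root.
const settings = resolveUiSettings({ largerText: true });
document.documentElement.style.fontSize = `${settings.baseFontSizePx}px`;
```

The point of the sketch is simply that ability and context can be first-class inputs to the interface, rather than something users are expected to work around.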
Ecosystem complexity: Are you placing responsibility on the user to navigate multiple experiences seamlessly?
Our digital lives aren’t oriented around one site or app — we use collections of tools for everything we do online. Almost every digital business or product team aspires to keep users locked into their walled garden and rarely considers the other tools a user might encounter based on whatever they’re trying to accomplish in their lives.
If I’m sick, I may need to engage with insurance, hospitals, doctors and banks. If I’m a new college student, I may have to work with multiple systems at my school, along with vendors, housing, banks and other related organizations. And it’s the users who get blamed when they have difficulty stitching together different experiences across an ecosystem.
Inherited bias: Are you using systems that generate content, design patterns built for a different purpose or machine learning to personalize experiences?
If so, how do you ensure these approaches are creating the right experiences for the users you’re designing for? If you leverage content, UI and code from other systems, you inherit whatever bias is baked into those tools. One example is the dozens of AI content and copy generation tools now available: if those systems generate copy for your site, you import their bias into your experience.
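One pragmatic guardrail, sketched below in TypeScript under my own assumptions (the crude syllable heuristic and the score threshold are illustrative, not a standard), is to score generated copy for readability before it ships and route anything dense or jargon-heavy back to a human.

```typescript
// Sketch of one guardrail: score machine-generated copy for readability before it ships.
// The syllable counter is a rough heuristic and the cutoff of 60 is an assumption.

function countSyllables(word: string): number {
  const w = word.toLowerCase().replace(/[^a-z]/g, '');
  if (w.length <= 3) return 1;
  // Approximate: count vowel groups, ignoring a trailing silent 'e'.
  const groups = w.replace(/e$/, '').match(/[aeiouy]+/g);
  return Math.max(1, groups ? groups.length : 1);
}

function fleschReadingEase(text: string): number {
  const sentences = Math.max(1, (text.match(/[.!?]+/g) || []).length);
  const words = text.split(/\s+/).filter(Boolean);
  const syllables = words.reduce((sum, w) => sum + countSyllables(w), 0);
  // Flesch reading ease: higher scores mean plainer language.
  return 206.835 - 1.015 * (words.length / sentences) - 84.6 * (syllables / words.length);
}

// Flag generated copy that scores below a plain-language threshold
// so a human can simplify it before it reaches users.
function needsHumanReview(generatedCopy: string, minScore = 60): boolean {
  return fleschReadingEase(generatedCopy) < minScore;
}
```

A readability score is a blunt instrument, but even a blunt check surfaces inherited jargon before users are left to untangle it.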
To start building more inclusive and equitable experience ecosystems right now, new design and organizational processes are needed. While AI tools that help generate more customized digital experiences will play a big role in new approaches to front-end design and content in the coming years, there are five immediate steps any organization can take:
Make digital equity part of the DEI agenda: While many organizations have diversity, equity and inclusion goals, these rarely translate into their digital products for customers. Having led design at large companies and worked in digital startups, I’ve seen the same problem in both: a lack of clear accountability to diverse users across the organization.
The truth is that at big and small companies alike, departments compete for impact and over who is closest to the customer. The starting point for digital experiences or products is defining and prioritizing diverse users at the business level. If a mandate exists at the most senior levels to create a definition of digital and experience equity, then each department can define how it serves those goals.
No design or product team can make an impact without management and funding support, and the C-suite needs to be held accountable for ensuring this is prioritized.
Prioritize diversity in your design and dev teams: There’s been a lot written about this, but it’s vital to emphasize that teams that lack any diverse perspective will create experiences based on their privileged background and abilities.
I would add that it’s essential to recruit people who have experience designing for diverse users. How is your organization changing its hiring process to improve its design and developer groups? Who are you partnering with to help source diverse talent? Are your DEI goals just checkboxes on a hiring form, circumvented when you hire the designer you already had in mind? Do your agencies have clear and proactive diversity programs? How well-versed are they in inclusive design?
A few of Google’s initiatives are exemplary here: To improve representation in the talent pipeline, it has shifted funding for machine learning courses from predominantly white institutions to a more inclusive range of schools, enabled free access to TensorFlow courses and sent free tickets to BIPOC developers so they can attend events like Google I/O.
Redefine what and whom you test with: Too often, user testing (if it happens at all) is limited to the most profitable or important user segments. But how does your site work with an aging population or with younger users who don’t ever use desktop computers?
One of the key aspects of equity versus equality in experience is developing and testing a variety of experiences. Too often, design teams test ONE design and tweak it based on user feedback (again, if they’re testing at all). Though it might be more work, creating design variations that consider the needs of older users, mobile-only users, people from different cultural backgrounds and so on allows you to link designs to digital equity goals.
Shift your design goal from one design for all users to launching multiple versions of an experience: Common practice for digital design and product development is to create a single version of any experience based on the needs of the most important users. A future where there’s not one version of any app or site, but many iterations that align to diverse users, flies in the face of how most design organizations are resourced and create work.
However, this shift is essential in a pivot to experience equity. Ask simple questions: Does your site/product/app have a variation with simple, larger text for older audiences? When designing for lower-income households, can mobile-only users complete the tasks you expect of them as easily as people who could switch to a desktop to finish?
This goes beyond simply having a responsive version of your website or testing variations to find the best possible design. Design teams should have a goal of launching multiple focused experiences that tie directly back to prioritized diverse and underserved users.
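As a rough sketch of what this could look like in code, the TypeScript below maps prioritized user-need segments to experience variants. The segment names and configuration fields are hypothetical; the point is only that each variant becomes an explicit, testable artifact tied to an equity goal rather than a tweak to a single master design.

```typescript
// Minimal sketch: map prioritized user-need segments to experience variants.
// Segment names and configuration fields are hypothetical illustrations.

type Segment = 'default' | 'older-adult' | 'mobile-only' | 'low-bandwidth';

interface ExperienceVariant {
  layout: 'dense' | 'simplified';
  baseFontSizePx: number;
  imageQuality: 'full' | 'compressed';
  copyTone: 'standard' | 'plain-language';
}

const variants: Record<Segment, ExperienceVariant> = {
  'default':       { layout: 'dense',      baseFontSizePx: 16, imageQuality: 'full',       copyTone: 'standard' },
  'older-adult':   { layout: 'simplified', baseFontSizePx: 20, imageQuality: 'full',       copyTone: 'plain-language' },
  'mobile-only':   { layout: 'simplified', baseFontSizePx: 16, imageQuality: 'compressed', copyTone: 'standard' },
  'low-bandwidth': { layout: 'simplified', baseFontSizePx: 16, imageQuality: 'compressed', copyTone: 'plain-language' },
};

// Each variant can be tied back to a digital-equity goal and tested on its own,
// rather than tweaking a single design for the "most important" user.
export function variantFor(segment: Segment): ExperienceVariant {
  return variants[segment];
}
```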
Embrace automation to create variations of content and copy for each user group: Even if we create design variations or test with a wide range of users, I’ve often seen content and UI copy treated as an afterthought. Especially as organizations scale, content either becomes more jargon-filled or so overpolished that it’s meaningless.
If you lift copy from existing material (say, marketing copy) and drop it into an app, how are you limiting people’s understanding of what the tool is for or how to use it? If the solution to experience bias is variation in front-end design based on the needs of the individual, then one smart way to dramatically accelerate that is to understand where automation can be applied.
We’re at a moment in time where there is a quiet explosion of new AI tools that will radically change the way UI and content are created. Look at the volume of copy-driven AI tools that have come online in the last year — while they’re largely aimed at helping content creators write ads and blog posts faster, it’s not a stretch to imagine a custom deployment of such a tool within a large brand that takes users’ data and dynamically generates UI copy and content on the fly for them. Older users may get more textual descriptions of services or products that have zero jargon; Gen Z users may get more referential copy with a heavier dose of imagery.
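Here is a hedged TypeScript sketch of that idea: UI copy is requested per segment, with plain-language or tone constraints baked into the prompt. The `generateCopy` function is a stand-in for whatever copy-generation tool or model a team actually uses, and the segment guidelines are assumptions for illustration.

```typescript
// Sketch: request per-segment UI copy from a generation service.
// `generateCopy` stands in for whatever copy-generation tool or model a team
// actually uses; the segment names and guidelines are assumptions.

type CopySegment = 'older-adult' | 'gen-z' | 'default';

const segmentGuidelines: Record<CopySegment, string> = {
  'older-adult': 'Plain language, no jargon, fuller textual descriptions.',
  'gen-z': 'Shorter, more casual copy that can lean on imagery.',
  'default': 'Clear, concise product copy.',
};

// Stub implementation: in practice this would call your copy-generation tool or model.
async function generateCopy(prompt: string): Promise<string> {
  return `[generated copy for prompt: ${prompt}]`;
}

export async function uiCopyFor(
  segment: CopySegment,
  purpose: string // e.g., "explain what this savings product does"
): Promise<string> {
  const prompt = `${segmentGuidelines[segment]} Write UI copy to ${purpose}.`;
  const draft = await generateCopy(prompt);
  // Generated copy should still pass human and readability review before launch.
  return draft;
}
```

In practice, the output of any pipeline like this should still pass the kind of readability and human review described earlier.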
No-code platforms point to a similar opportunity: everything from Webflow to Thunkable speaks to the possibility of dynamically generated UI. And while Canva’s designs may feel generic at times, thousands of businesses are using it to create visual content rather than hiring designers.
So many companies use Adobe Experience Cloud yet seemingly ignore the experience automation functions buried inside it. Ultimately, the role of designers will shift from handcrafting bespoke experiences to curating dynamically generated UI; just look at how animation in film has evolved over the past 20 years.
The future of design variation powered by machine learning and AI
The steps above are oriented toward changing the way organizations address experience bias using current-state technology. But if the future of addressing experience bias is rooted in creating design and content variations, AI tools will start to play a critical role. We already see a huge wave of AI-driven content tools like Jarvis.ai, Copy.ai and others, along with automation tools built into Figma, Adobe XD and other platforms.
AI and machine learning technology that can dynamically generate front-end design and content is still nascent in many ways, but there are interesting examples I’d call out that speak to what’s coming.
The first is the work Google released earlier this year with Material You, its design system for Android devices, which is intended to be highly customizable for users and to have a high degree of accessibility built in. Users can customize color, type and layout, giving them a high degree of control, and machine learning features are emerging that may change the designs based on user variables such as location or time of day.
While the personalization aspects are initially pitched as giving users more ability to customize for themselves, reading through the details of Material You reveals a lot of possible intersections with automation at the design layer.
It’s also important to call out the work that organizations have been doing around design principles and interactions for how people experience AI; for example, Microsoft’s Human-AI eXperience program, which covers a core set of interaction principles and design patterns that can be used in crafting AI-driven experiences alongside an upcoming playbook for anticipating and designing solutions for human-AI interaction failures.
These examples are indicators of a future that assumes interactions and designs are generated by AI, but there are precious few examples of how this manifests in the real world yet. The point is that, to reduce bias, we need to evolve to a place where there is a radical increase in variation and personalization in front-end designs, and that is precisely where the trends at the intersection of AI and design are heading.
These technologies and new design practices will converge to create an opportunity for organizations to radically change how they design for their users. If we don’t begin to look now at the question of experience bias, we won’t have an opportunity to address it as this new era of front-end automation takes hold.