The Branch platform suffered from inconsistent design, which frustrates users and hinders development. I took part in the UI Enhancements Project, which tackled this by auditing the platform and streamlining its components.
What is Branch?
Branch is a tech company headquartered in Palo Alto, CA, providing mobile deep-linking services to over 100,000 brands around the globe. I was lucky to come on as their UI/UX intern from May to August 2024!
The Design Problem
These design inconsistencies aren't just a minor inconvenience. For users, they increase cognitive load, forcing people to work harder to navigate the product. They also harm the brand's credibility, making the product feel less thoughtfully crafted. Internally, the lack of standardization wastes valuable resources: engineers and PMs spend time debating minor design decisions instead of focusing on functionality and solving real problems.
Project Overview
Role: UI/UX Intern
Task Force:
Mollie Cox, Senior Director of Product Design
Ali Ardalan, Principal Product Manager
Craig Simpson, Engineering Manager
Julie Wang, UI/UX Intern (me!)
Wilson Khieu, Product Manager Intern
Timeline: 3 months (May–August 2024)
Our Goal
While at first we didn't know exactly how the project would pan out, we kept taking the next logical step. Ultimately, the project went through three distinct phases:
Step 1: UI/UX Design Audit
Step 2: Documentation of Inconsistencies
Step 3: Updating the Design System
I'll go through each of these phases in more detail in the following sections!
Getting Our Bearings
To develop a high-level understanding of the platform and identify areas for improvement, we began with a UI/UX design audit. When I was onboarded, my manager had just started the audit and brought me on to help her evaluate the platform. Using Nielsen's 10 Heuristics as a guiding framework, I explored the platform, documenting design inconsistencies as well as intuitive observations about usability. I also reviewed recorded customer calls to hear what users had to say.
My observations and notes helped validate her findings and suggested how the issues could be addressed. From these findings, I began noticing common issues, which we grouped together.
Audit Scope
Our audit focused on three key areas on the platform:
Summary Page: The most visited page on the platform, as it serves as the starting point for all users.
LinkHub: A major product where customers can track, manage, and create links.
Journeys: A newer product that empowers customers to control the entire… well, journey.
We chose these three areas because they represent high-traffic, high-value parts of the platform, offering a well-rounded sample for our audit.
Evaluating as a First-Time User
I happened to be new to the Branch platform, which made me the perfect test case for evaluating its intuitiveness. Each of the three focus areas extended beyond their dashboard pages, so I started by exploring the platform naturally, jotting down my initial impressions as a first-time user.
Many elements on the platform were unintuitive. At first, I dismissed my confusion as a lack of familiarity with the product—but I quickly challenged that mindset. After all, isn't the goal to design a user experience easy enough for anyone?
Example: The bookmark function allows a user to save the currently applied parameters, but the icon sits under the "Shareable Reports" section, which doesn't quite make sense. Also, previously saved parameters have no descriptive details—only a title and the initials of the creator.
After my first-impressions run, I went through the pages again more methodically. I examined each area in detail, mapping out user flows with flowcharts, evaluating and documenting the design from top to bottom, and cross-referencing my findings with Nielsen's 10 Heuristics to ensure a thorough evaluation.
Example: A flowchart I made for LinkHub. I made several more flowcharts for the other pages and deeper subflows within them!
Compiling Findings
From this process, I compiled 51 pages of notes! (A bit padded by ample screenshots, but still—whew.) These notes broke down each focus area into general themes, specific UI/UX concerns, and recommendations for improvement.
General Themes Found:
Inconsistent UI in looks & behavior
Uncommon interaction design
Essential information hidden in tooltips
From these themes, we identified focus areas, each representing "fruit on the ground—we just need to pick it up" (a phrase our Head of Product often used). These areas were selected as quick wins that could immediately elevate the platform's user experience with minimal effort, making them the central focus of this project's actionable next steps.
7 Key Focus Areas Identified:
Tooltips
Error Messages
Date Pickers
Navigation
Form Fields
Buttons
Empty States
Audit 2.0, like a Super Audit
With the 7 key areas identified, I then located and documented all inconsistencies within these categories to create a comprehensive record the team could act on.
I conducted a detailed sweep of the platform, expanding beyond the scope of the initial audit. With support from QA Engineers, I gained access to test environments, allowing me to reach every possible page and interaction within the platform. I meticulously documented instances of inconsistency, categorizing them under the 7 key focus areas.
This deeper dive revealed the complexities of the platform’s inconsistencies. It became clear that solving these issues would involve more than simply standardizing individual components. In many cases, entire UI layouts were fundamentally different from newer designs and required significant overhauls to align with updated standards. What seemed like a straightforward task—such as replacing buttons—turned into a larger effort to rework entire sets of components.
In Every Timeline…
To ensure I could document every possible issue, the QA engineering team also provided me with multiple dummy accounts at varying access levels, letting me explore the platform thoroughly from every user's perspective.
This step expanded on the initial audit, taking a broader and deeper approach. For example, I documented every variation of filter buttons on the platform to find out how many truly existed (I identified 7 distinct types—not including states like hover or disabled).
Screenshot of issues documented under the Form Fields and Buttons areas. Each main focus area had multiple types of issues, which were further grouped. This information was also transferred to a spreadsheet for the PMs to analyze and prioritize.
By the end of this process, I had a complete inventory of the platform's inconsistencies. The documentation eventually became the team's source of truth: I later updated it with solutions to every problem, and it served as a comprehensive playbook for which problems to fix and how to fix them.
Establishing Stricter Rules
The existing design system allowed an overly broad spectrum of variations, leading to inconsistent usage across the platform. This flexibility also invited ad-hoc creations, further compounding the inconsistency problem.
By updating the design system and restricting allowable patterns, we could address the problem at its source. I tackled this by reducing unnecessary component variations, limiting the power of choice, and providing clearer guidelines within the design system. Different strategies, though, were needed for different cases:
Simple cases: For components with multiple existing variations, we selected one standard and removed the rest.
Redesigns: Some components required slight redesigns before standardization, which involved additional iterations.
Legacy components: Older UI kits presented the biggest challenges. I collaborated closely with engineers to update these components without breaking functionality, balancing modernization with technical feasibility.
Example: Validation Messages in Form Fields
Validation messages are part of a component we call "Form Fields," which had several problems associated with it. In this example, we'll focus on just the format of the validation messages—the responsive feedback that appears underneath a field in response to user input.
As with the general themes uncovered in our audit and documentation, validation messages appeared in various formats throughout the platform. And while the design system offered some guidance on the topic, it wasn't documented clearly—and it seemed no one actually tried to adhere to it anyway.
On the left: the various formats of validation messages found on the platform. Alignment, letter case, icon, and even font varied. On the right: our design system documentation listed every possible permutation of a form field, making it hard to parse and understand the design intention.
This is an example of a simple case. Many variations already existed—all we needed to do was pick one as the standard and enforce its design. Meanwhile, I also updated the design system documentation so that instead of listing every single possible form, it states which elements are and are not optional within the component.
New Design System Documentation
I set up the components so that optional elements were boolean layers in Figma, so future designers can simply select Yes/No for their appearance.
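The same "optional elements as Yes/No booleans" idea can be mirrored on the engineering side as component props. Here's a minimal hypothetical sketch (the interface and function names are illustrative, not Branch's actual code): optional elements are booleans, while the validation message's format is fixed in one standardized renderer rather than left to each call site.

```typescript
// Illustrative sketch: optional elements of a form field modeled as booleans,
// mirroring the Yes/No boolean layers in the Figma component.
interface FormFieldProps {
  label: string;
  showHelperText: boolean;     // optional element: helper text under the field
  showCharacterCount: boolean; // optional element: character counter
  // The validation message may or may not be present, but its *format* is
  // never a choice—when present, it always renders in the standard style.
  validationMessage?: string;
}

// One standardized renderer for validation messages:
// sentence case, icon always included.
function renderValidationMessage(message: string): string {
  const sentenceCase = message.charAt(0).toUpperCase() + message.slice(1);
  return `⚠ ${sentenceCase}`;
}
```

The point of the sketch is that the design system's restrictions survive into code: a developer can toggle whether an element appears, but not how it looks.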
Example: Alert Messages
One of our 7 key focus areas, Error Messages, actually referred to what is formally known internally as Alerts. This is an example of a component that wasn't a simple case, and needed a bit more work to set straight.
Our design system documented multiple optional elements within an Alert (it can have an icon, or a button, or a title, etc.) and listed every single permutation that could exist—48 versions! Meanwhile, on the live platform, we were seeing all sorts of variations that seemed to be created ad hoc.
On the left, our design system did not specify different behavioral types of Alerts, but listed 48 versions. On the right, I grouped Alerts found on the platform by behavior, which would help inform the components that needed to be defined later.
For this case, I conducted research to understand common practices and the typical behavior of informational messaging. In the image above, I’ve already categorized them by behavior, but deciding what groups should exist required careful consideration of their usage and functionality.
Ultimately, I identified three distinct types of informational messaging: Default Alerts, Toast Messages, and Big Alerts, each serving a unique purpose.
New Design System Documentation
Updated documentation removes the power of options. For example, there is no choice regarding the appearance of icons—icons are required on all Alerts. This gives both designers and developers one clear path as to what should be used.
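In code, "removing the power of options" can mean encoding the three messaging types as a closed union and making the icon a derived property rather than a choice. A minimal hypothetical sketch (type and function names are illustrative, not Branch's actual components):

```typescript
// The three standardized types of informational messaging.
type AlertKind = "default" | "toast" | "big";

interface AlertProps {
  kind: AlertKind;
  message: string;
  // No `showIcon` boolean: icons are required on all Alerts, so the icon
  // is derived from severity instead of being left to the caller.
  severity: "info" | "success" | "warning" | "error";
}

// Icons are fixed per severity—one clear path for designers and developers.
const SEVERITY_ICONS: Record<AlertProps["severity"], string> = {
  info: "ℹ",
  success: "✓",
  warning: "⚠",
  error: "✕",
};

function alertIcon(props: AlertProps): string {
  return SEVERITY_ICONS[props.severity];
}
```

Because `AlertKind` is a closed union, the compiler itself rejects a 49th ad-hoc variation—the type system enforces the same restriction the documentation now states.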
Through this process, I substantially reduced the number of component variations*, significantly streamlining the platform's design system. Each updated component was documented alongside its corresponding issue from the inconsistencies documentation, providing engineers with a clear solution to each problem. Additionally, I worked closely with engineering teams to strategize and execute these updates in the codebase.
*Within the 7 focus areas. The reduction is a rough estimate; there are multiple complex cases where a direct reduction calculation does not accurately reflect the standardization done.
What's Next: Implementation is a WIP
My time at Branch was an incredible learning experience. It was my first opportunity to formally work on design systems, and I had the privilege of collaborating with talented coworkers and mentors who consistently set me up for success. With support from my manager, I gained the confidence to step into a more proactive role, which marked a significant milestone in my professional growth!
Our goal was:
Streamline Branch design components, thereby improving customer experience as well as internal development processes.
Did we achieve it?
While I cut down on the number of variations in our focus areas, there is still implementation to do—and beyond that, even larger UI/UX considerations outside the focus areas!
By the end of my internship, engineers had a clear and actionable plan for updating components thanks to our UI/UX audit, the documentation, and design system updates. Implementation was already underway, and I had actively participated in several meetings to collaborate with engineers, understand their constraints, and strategize effective solutions.
Impact:
Internal Efficiency → High-Quality Product → Customer Trust & Retention → High ROI ($$$)
While implementation was still in progress when I left, the importance of this work cannot be overstated.
Streamlining design components and patterns is about more than aesthetics—it’s about creating a platform that earns and maintains users’ trust. Consistent and thoughtful design signals that Branch values its customers’ experiences and pays attention to the details. This foundation of trust strengthens the brand’s relationship with its users and reinforces its credibility.
Internally, standardizing design patterns eliminates unnecessary back-and-forths over minor decisions, enabling teams to focus on impactful changes. By reducing friction in the design and development process, we set the stage for faster iterations and a more cohesive product that better serves both customers and internal stakeholders.