
How to measure the dev side of a design system

Measuring a design system is both essential and challenging for ensuring its longevity and success. There are already some ways to measure one, but mostly from a design point of view. Still, developers play a critical role in the success of a design system: they implement its components and patterns in the projects, and are therefore the cornerstone of the design system’s efficiency. By cutting duplicated effort, a design system should save development time, and that saving is worth measuring.

Feedback from both the system and its users is crucial to determine progress and improve the design system in the long run. However, obtaining this data can be challenging and is often not a priority until major issues or barriers have emerged. Here are some tips and best practices for quickly measuring a design system’s effectiveness from a technical perspective.

Measuring design system adoption

Tracking adoption is an essential metric for assessing the success of a design system. This includes monitoring the usage of the design system’s components and patterns by developers and adherence to the guidelines and best practices outlined in the system. By measuring the impact of these initiatives, the design system team can demonstrate concrete progress and justify the need for continued resources to maintain the design system in the future.

Tracking component usage isn’t the right metric

To measure the adoption of a design system, it may be tempting to track how often components and patterns appear in the codebase, as well as the percentage of code that follows the guidelines. For React libraries, react-scanner can scan and extract this data, while CODEOWNERS can help you identify which teams use components at the repository level on GitHub.
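To get started with react-scanner, a small configuration file is all it takes. Here is a minimal sketch; the package name @acme/design-system and the src folder are placeholders for your own setup:

```js
// react-scanner.config.js — a minimal sketch (package name and paths are placeholders)
module.exports = {
  crawlFrom: "./src",                   // where to start scanning JSX/TSX files
  includeSubComponents: true,           // also count things like <Menu.Item />
  importedFrom: "@acme/design-system",  // only count components imported from the design system
  processors: [
    // writes a JSON report with usage counts and props per component
    ["count-components-and-props", { outputTo: "./ds-usage.json" }],
  ],
};
```

Running npx react-scanner -c react-scanner.config.js then produces a JSON report you can archive or feed into a dashboard.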

Some mature design system teams do find this data useful: it can help them refine their list of components and increase consistency by deliberately offering limited UX flows. For instance, you may want to remove a form component that sees very little use and can be substituted with another one.

However, while counting the number of components used in a project may say something about the effectiveness of its design system, it does not provide a complete picture, and a high number of components does not necessarily indicate a well-designed system. Tools like react-scanner are better suited to assessing the risk of future changes than to defining success: a large usage count is a reason to be careful when changing a component, not proof that the system is working.

Steve Dennis, discussing how to measure adoption of Onfido’s design system, highlights another issue with tracking component usage: the counts are ambiguous. If a button component is included in the header of an app and that header appears on 30 different pages, should it be counted as 30 buttons or one? The header component itself contains one button, while the number of pages where the header (and therefore the button) appears is a different metric entirely. The former is an internal measure within the design system, while the latter measures usage “in the wild.” Neither is a measure of success on its own, but both have other benefits.

Track the projects using components

Instead of tracking the number of times a component is used, Dennis proposes measuring the number of products that have implemented a particular component at least once in their codebase. This approach allows them to determine which components are being utilized and which are not, as well as identify which products make heavy use of the design system components and which ones make minimal or no use of them.


Data-chart example from Onfido’s design system measuring component adoption across projects
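There is no off-the-shelf tool for exactly this view, but if every product produces a react-scanner report, a short script can reduce them to the per-product picture Dennis describes. A rough sketch, assuming one JSON report per product that maps component names to usage counts:

```js
// adoption-by-product.js — a rough sketch, not Onfido's actual tooling.
// Assumes each product has produced a react-scanner "count-components" report,
// i.e. a JSON object mapping component names to usage counts.
const fs = require("fs");

const reports = {
  // product name -> path to its react-scanner report (placeholder paths)
  checkout: "./reports/checkout.json",
  dashboard: "./reports/dashboard.json",
  onboarding: "./reports/onboarding.json",
};

const adoption = {}; // component name -> products using it at least once

for (const [product, reportPath] of Object.entries(reports)) {
  const counts = JSON.parse(fs.readFileSync(reportPath, "utf8"));
  for (const [component, count] of Object.entries(counts)) {
    if (count > 0) {
      (adoption[component] ??= []).push(product);
    }
  }
}

// Components used by many products and components used by none are both interesting.
console.table(
  Object.entries(adoption).map(([component, products]) => ({
    component,
    products: products.length,
  }))
);
```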

At Twilio, they use Octokit to collect data by tracking which organizations have a package.json file or Node.js code, which tells them which projects use Twilio’s Paste design system. They developed a command-line utility that generates a report by analyzing the import statements at the top of each file in their codebase. It builds an index linking those files to their dependencies, which can then be queried by package and, if desired, by export. If you want to know more, the Twilio Segment team has shared their open-source solution.
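Their open-sourced solution is the reference here; purely as a hedged illustration of the first step (finding which repositories declare the design system as a dependency), a sketch with Octokit’s code search could look like this, with the org and package names as placeholders:

```js
// find-adopters.js — a rough sketch of the idea, not Twilio's actual tooling.
// Searches an organization's code for package.json files that mention the design system package.
const { Octokit } = require("@octokit/rest");

const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });

async function findAdopters() {
  // "acme" and "@acme/design-system" are placeholders for your org and package.
  const { data } = await octokit.rest.search.code({
    q: '"@acme/design-system" in:file filename:package.json org:acme',
  });

  // Each result points at a package.json in a repository that declares the dependency.
  const repos = new Set(data.items.map((item) => item.repository.full_name));
  console.log(`${repos.size} repositories depend on the design system:`);
  for (const repo of repos) console.log(` - ${repo}`);
}

findAdopters().catch(console.error);
```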

In one year, Twilio’s design system adoption grew from 7 organizations and 11 repositories to 19 organizations and 60 repositories.

Another approach is to highlight design system elements directly on the pages where they are applied. Onfido and ProductBoard leveraged their consistent naming of design system colors to create a tool for visualizing usage through color overrides. This tool helped them quickly identify opportunities for using design system components and provided developers with specific guidance. Onfido simply uses the -ods prefix of its CSS classes to track the components and design tokens coming from its design system. They also use the Custom Style Script browser extension to override existing styles on a page: any element whose class carries the -ods prefix gets an outline so it can be spotted at a glance. This gives them an overview of their UI coverage that they can toggle on and off on any web-based product that uses Onfido’s design system.


Visualizing Castor’s components on a page for Onfido Design System
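The overlay itself takes very little code. As a minimal sketch of the idea (the ods- prefix follows Onfido’s naming; swap in your own), a snippet run from the console or a custom-style browser extension could be:

```js
// highlight-ds.js — a minimal sketch of the overlay idea, meant to be pasted in the
// console or loaded through a "custom style" browser extension.
// "ods-" is used here as an example prefix; replace it with your own design system's prefix.
const PREFIX = "ods-";

const style = document.createElement("style");
style.id = "ds-coverage-overlay";
style.textContent = `
  [class*="${PREFIX}"] {
    outline: 2px solid magenta !important;
    outline-offset: -2px;
  }
`;

// Toggle the overlay on and off by running the snippet again.
const existing = document.getElementById("ds-coverage-overlay");
existing ? existing.remove() : document.head.appendChild(style);
```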

Tracking the lack of adoption as a metric

Understanding how many projects use the design system and how many rely on custom code is also helpful in evaluating adoption. It shows which projects haven’t adopted the design system yet, and asking those teams about their reasons is a good way to start understanding why. It’s also interesting to evaluate your design debt, identify which projects or teams are generating it, and try to understand why.

Filip Daniško explains how they track adoption at ProductBoard by using tools like ESLint and Stylelint to detect non-design-system values. This approach has two benefits: it flags the formation of new technical debt and makes it possible to track it over time. That said, technical debt can also present an opportunity and lead to productive conversations, since it acts as a use case that can improve a component within the system.
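ProductBoard’s exact rules aren’t detailed here, but as a hedged example of the idea, Stylelint’s built-in declaration-property-value-allowed-list rule can flag any color that doesn’t come from a design token (the --ds- custom property prefix is a placeholder):

```js
// .stylelintrc.js — a hedged sketch, not ProductBoard's actual configuration.
// Flags color-related declarations that don't go through a design token variable.
// The "--ds-" custom property prefix is a placeholder for your own token naming.
module.exports = {
  rules: {
    "declaration-property-value-allowed-list": {
      // only allow colors expressed as design token custom properties
      "/^(color|background-color|border-color)$/": ["/^var\\(--ds-/"],
    },
  },
};
```

Every violation then becomes a measurable unit of design debt you can count and chart.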

To implement this effectively, they maintain a monorepo containing all their projects, which lets them run ESLint and Stylelint across all of them at once and save the output to a JSON file using the output-file CLI argument.
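That JSON output is what makes the debt trackable over time. A small sketch that sums ESLint errors and warnings per package from such a report (the report path and the packages/* monorepo layout are assumptions):

```js
// debt-report.js — a small sketch, assuming ESLint was run with
// `--format json --output-file eslint-report.json` at the monorepo root.
const fs = require("fs");

const results = JSON.parse(fs.readFileSync("./eslint-report.json", "utf8"));

// Group error/warning counts by the top-level package folder, e.g. "packages/checkout".
const debtByPackage = {};
for (const result of results) {
  const match = result.filePath.match(/packages\/([^/]+)\//);
  const pkg = match ? match[1] : "(root)";
  debtByPackage[pkg] = (debtByPackage[pkg] ?? 0) + result.errorCount + result.warningCount;
}

console.table(debtByPackage);
```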

Additionally, you can track the number of ESLint rules overridden to determine how the Design System is being used and where individual developers need different styles or components. 

Arseny Smoogly from Rangle.io even built an open-source project to track and compare the use of design system components against one-off components and understand what percentage of the code comes from the system versus custom code. Arseny gave a presentation on this topic at React Summit 2022 if you want to learn more.

Similarly, Twilio found it helpful to identify which projects are not using their components correctly, going a step further than tracking adoption alone. Knowing which repositories use Twilio’s design system, they use NodeGit to clone every relevant repository into a “repos” folder and then scan them with react-scanner to track the components imported, the frequency of imports, the props used, and the files they appear in. As a result, they can quickly spot components being misused, for example because some props are missing or because someone added a // @ts-ignore. Twilio then uses this information to start a conversation with the team, creating an issue or reaching out to the developers via Slack with enough context for a productive discussion.
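react-scanner’s raw-report processor records the props used at each call site, which is enough to catch the simplest kinds of misuse. A hedged sketch, assuming a hypothetical rule that every Button must receive a variant prop:

```js
// find-misuse.js — a hedged sketch, not Twilio's actual tooling.
// Assumes a react-scanner "raw-report" (component -> { instances: [{ props, location }] })
// and a hypothetical rule that every Button must be given a "variant" prop.
const fs = require("fs");

const report = JSON.parse(fs.readFileSync("./raw-report.json", "utf8"));

const instances = report.Button?.instances ?? [];
const missingVariant = instances.filter((instance) => !("variant" in instance.props));

for (const instance of missingVariant) {
  const { file, start } = instance.location;
  console.log(`Button without a variant prop: ${file}:${start.line}`);
}
console.log(`${missingVariant.length} of ${instances.length} Button usages look suspicious.`);
```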

Measuring design system versioning

Another point to consider is how your design system is implemented in the projects and the different versions that might coexist. Suppose a project still uses components from version 4 of your design system while you are currently shipping version 8. It is worth identifying which projects or teams are on older versions and thinking about how to get everyone onto the latest one.
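One lightweight way to get that picture is to read the design system’s version range from each project’s package.json and compare it against the latest release. A sketch using the semver package, with repo paths, the package name, and the latest version as placeholders:

```js
// version-drift.js — a minimal sketch for spotting projects on old design system versions.
// Paths, the package name and the "latest" version are placeholders for your own setup.
const fs = require("fs");
const semver = require("semver");

const DS_PACKAGE = "@acme/design-system";
const LATEST = "8.2.0";

const projects = ["./repos/checkout", "./repos/dashboard", "./repos/onboarding"];

for (const projectPath of projects) {
  const pkg = JSON.parse(fs.readFileSync(`${projectPath}/package.json`, "utf8"));
  const range = pkg.dependencies?.[DS_PACKAGE] ?? pkg.devDependencies?.[DS_PACKAGE];
  if (!range) {
    console.log(`${projectPath}: does not use ${DS_PACKAGE}`);
    continue;
  }
  // minVersion() turns a range like "^4.1.0" into the lowest version it allows.
  const installed = semver.minVersion(range);
  const behind = semver.lt(installed, LATEST);
  console.log(`${projectPath}: ${range}${behind ? "  <-- behind latest" : ""}`);
}
```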

At Twilio, they were able to execute a modified version of the scan to monitor the adoption of alternative component libraries. This helped keep a record of how previous systems were being phased out and which components were most widely used. The version numbers proved to be very useful when they encountered a customer-facing issue with the @twilio-paste/core package prior to version 4.2.4. With this report’s help, they could identify the teams with impacted products and assist in resolving the problem.

Measuring changes

The amount and frequency of changes developers make is another interesting metric for evaluating the benefits of a design system in a project.

Cristiano Rastelli of HashiCorp writes about the significant reduction in changes and their frequency due to the company’s design system in this insightful article. Cristiano began by using the command line to gather data from the entire git history and extract the commits involving LESS file changes. This gave him the dates of the commits, the number of files altered, and, most importantly, the number of lines of code added or removed with each change. With this data, he built a Node-based parser/processor on top of a git-log parser that takes the output of those commands as text files and produces large JSON files containing a daily list of all changes.
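Cristiano’s article walks through his actual parser; as a much smaller, hedged sketch of the same idea, a Node script can shell out to git log --numstat and aggregate added and removed lines per day for LESS files:

```js
// changes-per-day.js — a small sketch inspired by the approach, not Cristiano's actual parser.
// Aggregates lines added/removed per day for .less files across the whole git history.
const { execSync } = require("child_process");

const log = execSync(
  'git log --numstat --date=short --pretty=format:"--%ad" -- "*.less"',
  { encoding: "utf8", maxBuffer: 1024 * 1024 * 100 }
);

const perDay = {}; // date -> { added, removed }
let currentDate = null;

for (const line of log.split("\n")) {
  if (line.startsWith("--")) {
    currentDate = line.slice(2); // a commit date like "2023-01-15"
  } else if (line.trim()) {
    const [added, removed] = line.split("\t");
    const day = (perDay[currentDate] ??= { added: 0, removed: 0 });
    // binary files show "-" instead of a number; treat them as 0
    day.added += Number(added) || 0;
    day.removed += Number(removed) || 0;
  }
}

console.table(perDay);
```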

With the help of these metrics and visualization techniques, Cristiano uncovered valuable insights into the impact of HashiCorp’s design system on their engineering team. The data clearly showed a reduction in the team’s workload, attributed to the use of pre-existing UI components in their design system, instead of writing new CSS for every new feature. This proved that the design system was indeed saving the engineers time and effort.


The amount and frequency of changes before and after the introduction of a design system at HashiCorp

Measuring developer experience

Don’t forget for whom you are building this design system: the people who will use it. Sadly, teams too often pay too little attention to how components are implemented and consumed, resulting in a poor developer experience that can push developers to hack around the system and write their own markup or code.

Design systems are a lot of things: tooling, methods, processes, and so on. But if there is one thing that makes a design system efficient and gives developers (and designers too) the right experience, it is speaking the same language across teams. Measuring how easy and convenient your design system is for your developers is a valuable metric, as it leads to better collaboration and faster implementation.

Twilio is even considering a GitHub bot to automate workflows: they think their design system’s consumers would benefit greatly from automatically opened GitHub issues or PRs on impacted repositories when breaking changes ship, along with comprehensive how-to-upgrade guides.

Measuring developer experience isn’t just about key figures; qualitative data matters too. Gathering feedback, running team temperature checks, and sending regular surveys to understand how developers use your design system can all be valuable. Check our other article about how to measure a design system to learn more about qualitative metrics; there’s even a survey template you can reuse.

How to use the data you get from your design system?

Collecting data is one thing, but knowing how to use it is another. Consider your design system metrics pure gold: they will help you grow your design system, prove its value, and justify asking for more budget and resources.

Show it, share it, spread it

First, you need to find the right way to display your data. One way is to create detailed reports summarizing key metrics and usage patterns and share them with relevant stakeholders. Use tools like Google Analytics, Metabase, or Mixpanel to create interactive dashboards that display real-time data and metrics. You could also use data visualization tools like Tableau or D3.js to build interactive visualizations that make the data easy for stakeholders to understand.

You can also build a portal or website that allows stakeholders to view and download data from your design system so that these metrics live on a specific platform and are accessible anytime, anywhere to anyone.


Visual representation of the changes in two codebases at HashiCorp

Use these visual reports to build presentations that highlight key findings and insights, and present them to stakeholders at meetings or team retreats. Consider holding regular meetings, for instance once per quarter, to present your metrics and the progress of your design system to your organization. It shows transparency and helps adoption, as other teams see the design system becoming more and more serious.

Design System ROI

In general, data is only one piece of the puzzle, but it is helpful in demonstrating the design system’s ROI (return on investment), especially if multiple metrics are used.

You could estimate the time saved per component by comparing the efficiency of using a reusable component versus building one from scratch, then multiply the metric by the number of components used to determine time saved. This is what our friends at Sparkbox did with an interesting experiment to check if “design systems help developers produce better code faster”.


One of the results from Sparkbox’s experiment
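The calculation itself is simple enough to keep in a small script or a spreadsheet. A sketch with made-up numbers, only to show its shape:

```js
// roi-estimate.js — a sketch with made-up numbers, only to illustrate the calculation.
const hoursFromScratch = 6;   // average time to build a comparable component from scratch
const hoursWithSystem = 1.5;  // average time to integrate the design system component
const usages = 120;           // how many times design system components were used this quarter
const hourlyRate = 75;        // fully loaded hourly cost of a developer (currency of your choice)

const hoursSaved = (hoursFromScratch - hoursWithSystem) * usages;
const moneySaved = hoursSaved * hourlyRate;

console.log(`Estimated time saved: ${hoursSaved} hours (~${moneySaved} in development cost).`);
```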

Improvements to your daily work are another opportunity to track your design system’s ROI. Metrics about adoption can also help you make better-informed decisions about how to prioritize your work, which is a huge win. Having metrics that support and validate your choices is powerful for making sure you’re leading your design system in the right direction.

Finally, how fast new members get onboarded is a precious metric to prove the value of your design system. The faster they learn to work with your system, the more valuable it is to your organization.

Twilio’s design system team measures onboarding success by providing support to new starters and reducing bounce rates. They identify their newest adopters by watching for repositories that add the design system as a dependency, and reach out to offer assistance. They can also spot when people stop using the design system and contact them to find out why.

Wrap up

Overall, measuring a design system from a developer’s point of view involves tracking its adoption rate, time savings, impact on the developer experience, and maintenance effort. By gathering this data, you can better understand how the design system benefits the team and the product, and identify areas where it could be improved.

However, demonstrating that your design system is efficient, cost-saving, and effort-reducing does not mean that people will lose their jobs or that work hours will be cut. Introducing a design system to a company is not intended to replace people, but to enhance their work processes: the goal is to help employees focus on high-value tasks and minimize repetitive work. Design systems free people from these routine tasks, allowing them to showcase their unique talents and bring value to the organization. After all, there’s no substitute for the creativity and ingenuity of a human being.