The problem with “just using the theme” in dataviz
READ TIME - 6 minutes ⏳
Most organizations are very deliberate about what their people are allowed to do with data.
They train employees on compliance, security, privacy, and governance. They document processes. They certify behaviors. They formalize responsibility.
But when it comes to how the organization visually expresses itself, the approach is often the opposite: implicit, informal, and largely undocumented.
Yet the paradox is obvious.
Every day, dozens (or hundreds) of people inside the company create artifacts that represent the brand. Dashboards, slide decks, operational reports, steering committee material, social posts, internal tools. These artifacts do not live in marketing silos; they circulate broadly and carry meaning.
If someone can open PowerPoint, Google Slides, or a BI tool and publish something that contains company data or communicates direction, they are already acting as a brand operator.
The only question is whether they are trained for it.
The hidden cost of untrained identity
Most companies do not lack brand guidelines. They lack operationalized identity.
Logos exist. Color palettes exist. Typography exists. Sometimes there is even a PowerPoint template or a brand portal. But these elements tend to live as static references rather than executable decisions.
As soon as work moves into tools like Power BI, things start to drift.
Values are re-entered manually. Colors are approximated. Typography is guessed. Spacing becomes subjective. Even when a theme is applied, edge cases appear almost immediately: an extra color, a special state, a non-standard layout.
At that point, people improvise.
A few improvised choices are manageable. Hundreds of them create confusion.
What looks like a small visual inconsistency is, in reality, a trust leak. Users rarely articulate it, but they feel it. If the interface looks loosely controlled, the underlying rigor of the data itself becomes easier to question.
Form and substance are not independent. In data visualization, they reinforce (or undermine) each other.
Design decisions without design tokens are like bricks stacked without cement.
Each brick may be well made. The material can be high quality. But without a binding agent, the structure relies on careful placement and constant vigilance. The moment scale increases, or someone new joins the site, the integrity of the whole thing becomes fragile.
Design tokens are the cement. They do not replace the bricks; they make them work together.
Colors, typography, spacing, and components only work when they are tied together in one system. That is what makes them architecture, not just parts.
Why Design Tokens change the nature of the problem
Design tokens are often introduced as a technical mechanism. A way to share colors between tools. A JSON file. A design-system concept.
That framing undersells their real function.
A design token is not a value. It is a design decision with a name.
- Instead of encoding `#DF2A4C`, you reference `brand.primary`.
- Instead of choosing a font size, you reference `text.body`.
- Instead of hard-coding spacing, you reference `space.m`.
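In code, that naming shift can be sketched as a tiny lookup layer. Everything below is illustrative: the token names come from the examples above, and the values and `resolve` helper are invented for the sketch.

```python
# Hypothetical design tokens: each entry is a named decision, not a raw
# value re-typed in every tool. Names follow the examples in the text;
# values are invented for illustration.
TOKENS = {
    "brand.primary": "#DF2A4C",  # the decision "brand primary color"
    "text.body": "14px",         # the decision "body text size"
    "space.m": "16px",           # the decision "medium spacing step"
}

def resolve(token_name: str) -> str:
    """Look up a named decision instead of hard-coding its value."""
    return TOKENS[token_name]
```

The point is not the lookup itself but the indirection: downstream artifacts refer to `brand.primary`, so the hex value lives in exactly one place.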
This shift sounds subtle, but it fundamentally changes how organizations work.
Once decisions are named, they become transferable. Designers, developers, and data teams stop translating intent into their own local languages. They start trading in a shared currency.
One decision can then be expressed in many environments: Figma, codebases, BI themes, documentation platforms. Different outputs, same source of truth.
That is why tokens scale. Not because they are modern, but because they eliminate repeated interpretation.
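A minimal sketch of "one source of truth, many outputs": the same token dictionary feeds both a Power BI-style theme and a CSS fragment. The token names and values are invented, and the theme schema is deliberately simplified (Power BI report themes do use a `dataColors` array, but real theme files carry many more keys).

```python
import json

# Hypothetical token set: one source of truth (names and values invented).
TOKENS = {"brand.primary": "#DF2A4C", "brand.secondary": "#2A4CDF"}

def to_powerbi_theme(tokens: dict) -> str:
    """Express the decisions as a simplified Power BI-style theme JSON."""
    theme = {
        "name": "Brand theme",
        "dataColors": [tokens["brand.primary"], tokens["brand.secondary"]],
    }
    return json.dumps(theme, indent=2)

def to_css_variables(tokens: dict) -> str:
    """Express the same decisions as CSS custom properties."""
    lines = [f"  --{name.replace('.', '-')}: {value};"
             for name, value in tokens.items()]
    return ":root {\n" + "\n".join(lines) + "\n}"
```

Two different outputs, zero re-interpretation: neither function contains a color value of its own.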
Scaling without rewriting reality every time
Yes, it is possible to work without design tokens.
It is also possible to deploy dashboards without version control, governance, or testing.
It works...
...until volume, velocity, or organizational complexity increases.
Without tokens, every update requires re-encoding decisions. New colors must be reintroduced everywhere. Corrections become manual.
Consistency depends on people being careful, not on the system helping them.
Tokens invert that relationship. Change happens once, and propagates outward. You are no longer managing outputs individually; you are managing inputs centrally.
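That inversion can be shown in a few lines. This is a toy sketch with invented names: three "outputs" are derived from one token, so changing the input once updates all of them, with no per-output re-encoding.

```python
# Hypothetical sketch: one input, several derived outputs.
tokens = {"brand.primary": "#DF2A4C"}

def render_outputs(tokens: dict) -> dict:
    """Derive every output (a chart color, a CSS variable, a theme entry)
    from the same token, rather than re-entering the value in each place."""
    primary = tokens["brand.primary"]
    return {
        "chart_color": primary,
        "css": f"--brand-primary: {primary};",
        "theme_data_color": primary,
    }

before = render_outputs(tokens)
tokens["brand.primary"] = "#1A7F5A"  # one change at the source...
after = render_outputs(tokens)       # ...and every output reflects it
```

Managing inputs centrally is exactly this: the rebrand is one line, not a hunt through every dashboard.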
For data visualization teams operating at enterprise scale, this is the difference between control and exhaustion.
Rethinking the Design System as infrastructure
A mature design system is not a style guide. It is infrastructure.
Every element inside it (colors, typography, semantic states, component properties) exists to reduce ambiguity.
Nothing is decorative. Everything is referential.
From that perspective, design tokens are not constraints. They are enablement. Analysts stop making visual decisions and start making analytical ones. Designers stop policing consistency and start improving systems. Governance shifts from enforcement to architecture.
The work becomes quieter and more reliable.
Where this journey actually starts
Adopting tokens does not start with tooling. It starts with observation.
What already exists inside the organization?
What do marketing teams use?
What do analysts complain about?
What workarounds have quietly become standard practice?
Running an inventory, talking to designers, and having informal conversations with data practitioners reveals the real gaps faster than any audit framework. The goal is not perfection. It is alignment.
From there, the transition is gradual. Values stop being typed in. They start being referenced. Libraries grow organically, from colors, to typography, to higher-level constructs.
The system matures as usage matures.
The real outcome: confidence at the point of creation
The success of a design token strategy is not measured by documentation completeness or tooling sophistication.
It is measured at the moment an analyst opens their BI tool.
If they feel confident, aligned, and unblocked
+ if the system guides their choices instead of forcing them to invent them
= then the work is doing its job.
Users recognize patterns. Patterns create familiarity. Familiarity builds trust.
And trust, more than any feature or interaction, is what drives adoption of data products.
When systems are well designed, people stop thinking about them.
They feel guided rather than constrained.
In that sense, design tokens are less about visual control and more about confidence: confidence for the people creating, and trust for the people consuming what gets created.
People rarely disagree on the problem, only on when to address it.
See you next week!
Julien