Kyle Wong · Product & UX Designer · Data Visualization

Senior UX Designer · FM Global

Enterprise Data Visualization Redesign

Poor data literacy costs organizations millions annually in inefficiencies, not because data is missing, but because it is misinterpreted. According to IBM, 43% of executives ignore dashboards because they fail to connect metrics to strategic growth.

At FM Global, I redesigned a mission-critical dashboard used for risk assessment and loss prevention. The redesign reconstructs the decision layer between data and action. Design reduces ambiguity in systems where every decision carries financial consequences.

Project Type: Enterprise dashboard redesign for decision infrastructure
Role: Senior UX Designer
Impact: Reduced system-induced decision errors by 30% · improved user confidence by 40%
Company: FM Global, a leading global commercial and industrial property insurer with an engineering-driven approach to risk management and loss prevention
FM Global dashboard screens shown across desktop and mobile devices

From fragmented data views to a unified decision system: the redesigned FM Global risk assessment dashboard.


Project Goals

The main goal was to conduct an end-to-end redesign of the decision layer between data and action, where small UX failures can introduce real financial risk.

Main Problem

The primary failure wasn’t a lack of data. It was a lack of clarity.

The dashboard allowed inconsistent data states across charts, creating ambiguity in how users interpreted results. Identical actions could produce different outputs depending on context, breaking trust in the system.

This turned a tool meant for decision-making into a source of hesitation and uncertainty.

Solution overview

FM Global solution overview showing the redesigned dashboard system

Introduced a global filtering system and structured data hierarchy to align system behavior with user expectations.

Where is decision risk actually coming from?

Before redesigning the interface, I needed to align stakeholders around a more fundamental question: where is decision risk actually coming from?

I led a stakeholder discovery phase across product leadership, design, account managers, and risk engineering to uncover how each group defined the problem. While everyone agreed the dashboard needed improvement, their mental models differed.

To surface these differences, I tailored my questions to each discipline.

VP of Data & Analytics

Defining decision risk at the executive level

  • What decisions are users expected to make using this dashboard?
  • Where do incorrect interpretations create financial or operational risk?
  • What defines a “high-confidence” decision in this context?
  • Where are users hesitating, second-guessing, or escalating decisions?

Principal and Design Leads

Identifying where the interface introduces cognitive load

  • Where do users struggle to understand relationships between data points?
  • What patterns have tested well or failed in previous iterations?
  • Where does the interface introduce cognitive load or ambiguity?
  • How do we validate that users are interpreting data correctly?

Account Managers

Understanding the day-to-day decision workflow

  • What are your primary goals when using the dashboard day-to-day?
  • What decisions are you expected to make or support with this data?
  • How do you translate this data into insights for clients or internal stakeholders?
  • Where do you feel most confident vs. least confident when interpreting the data?
  • What parts of the dashboard require the most manual effort or explanation?
  • How often do you need to export, reformat, or recreate data outside the system?

Risk Engineering

Tracing how data ambiguity surfaces in real-world decisions

  • What decisions are you responsible for making using this dashboard?
  • How do these decisions impact real-world outcomes (risk exposure, resource allocation, cost)?
  • When reviewing data, what makes you trust or distrust what you’re seeing?
  • Have you ever made a decision and later realized the data was interpreted incorrectly? What happened?
  • What part of the workflow feels the most ambiguous or open to interpretation?
  • If this dashboard disappeared tomorrow, what would your fallback process look like?

These conversations revealed a critical pattern: the system was technically correct, but operationally ambiguous. Users weren't making mistakes because of bad data; they were misinterpreting correct data due to inconsistent system behavior.

Grounding the team in decision risk

To validate the pattern uncovered in stakeholder interviews, I partnered with UX Research to run contextual inquiry sessions and user interviews, observing how users actually worked across dashboards, Excel models, and reporting workflows.

We mapped the end-to-end decision process and uncovered how small inconsistencies in filtering, data hierarchy, and system feedback compounded into larger breakdowns in trust.

To maintain alignment as we moved into design, I embedded continuous feedback loops across teams:

  • Weekly cross-functional reviews to align on system behavior
  • Async Figma workflows with targeted feedback requests
  • Early engineering collaboration to validate edge cases
  • Ongoing synthesis of research into product decisions

By grounding the team in a shared understanding of decision risk, we shifted from debating UI preferences to designing a system that reduces ambiguity at every layer.

Each iteration exposed a different failure mode

Through collaboration with UX Research, I uncovered that the real issue wasn't inefficient workflows; it was inconsistent data behavior across the system. Filtering, sorting, and chart settings operated in fragmented ways, forcing users to repeatedly reapply logic while introducing uncertainty about what data they were actually seeing.

Usability testing revealed that frustration was only a symptom. The deeper problem was a lack of trust: users couldn’t confidently tell whether multiple charts were reflecting the same underlying data, turning what should be a decision-making tool into a source of ambiguity.

Legacy Design

Localized Filtering

System Behavior: Each chart operated as an independent data context.

Why this is dangerous:

  • Users assumed filters applied globally
  • System behaved locally
  • This mismatch created false confidence in outputs

Outcome: 15% error rate. Errors weren't random; they were system-induced misunderstandings. Two charts on the same screen could represent different realities without users realizing it.

Legacy design: localized filters

Legacy FM Global dashboard design showing localized chart filters
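
To make the failure mode concrete, here is a minimal TypeScript sketch of the legacy pattern (all names are illustrative, not the production implementation): each chart owns an independent filter state, so a filter applied to one chart silently leaves its siblings untouched.

```typescript
// Illustrative sketch of the legacy pattern: per-chart (localized) filter state.
type Row = { region: string; year: number; loss: number };

interface Filters {
  region?: string;
  year?: number;
}

class Chart {
  // Every chart instance keeps its own independent filter state.
  private filters: Filters = {};

  constructor(private name: string) {}

  setFilter(update: Filters): void {
    // Applies only to THIS chart; other charts on the screen are untouched.
    this.filters = { ...this.filters, ...update };
  }

  render(data: Row[]): Row[] {
    const visible = data.filter(
      (r) =>
        (this.filters.region === undefined || r.region === this.filters.region) &&
        (this.filters.year === undefined || r.year === this.filters.year)
    );
    console.log(`${this.name}:`, visible);
    return visible;
  }
}

const data: Row[] = [
  { region: "EMEA", year: 2023, loss: 1.2 },
  { region: "APAC", year: 2024, loss: 3.4 },
];

const trend = new Chart("Loss trend");
const byRegion = new Chart("Loss by region");

// The user filters one chart and assumes the whole screen followed.
trend.setFilter({ year: 2024 });

// Two charts, one screen, two different realities:
trend.render(data);    // 2024 rows only
byRegion.render(data); // all rows, silently unfiltered
```

Nothing in this model is wrong in isolation; the risk comes from the mismatch between what the system does (local) and what users assume (global).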

Iterations

Centralized but Hidden Controls

System Behavior: Filters moved to a global location, but visibility decreased.

Why this failed:

  • Users lost awareness of active filters
  • Increased cognitive load
  • Reduced system transparency

Hiding controls reduced visual noise but increased decision ambiguity. The error rate increased to 20–35%. Users hesitated, second-guessed, or misinterpreted results.

Iteration 1: centralized filters

FM Global dashboard iteration with centralized filter controls

Iteration 2: centralized filters

FM Global dashboard iteration showing a refined centralized filter layout

Final System

Global, Persistent Filters

System Behavior:

  • Single source of truth for filters
  • Persistent and visible across all charts
  • Applied consistently across the system

Final system: global persistent filters

Final FM Global dashboard system with global persistent filters
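
For contrast, here is a minimal TypeScript sketch of the final pattern (again with illustrative names): a single store owns filter state, every chart and the always-visible filter bar subscribe to it, and one change updates every view from the same source of truth.

```typescript
// Illustrative sketch of the final pattern: one global, persistent filter store.
interface Filters {
  region?: string;
  year?: number;
}

type Listener = (filters: Filters) => void;

class GlobalFilterStore {
  private filters: Filters = {};
  private listeners: Listener[] = [];

  // Every chart, and the always-visible filter bar, subscribes once.
  subscribe(listener: Listener): void {
    this.listeners.push(listener);
    listener(this.filters); // render immediately from the current state
  }

  // A single write path: one change, one state, every view updated together.
  setFilter(update: Filters): void {
    this.filters = { ...this.filters, ...update };
    this.listeners.forEach((notify) => notify(this.filters));
  }
}

const store = new GlobalFilterStore();

// All views derive from the same state, so they can never disagree.
store.subscribe((f) => console.log("Loss trend re-rendered with", f));
store.subscribe((f) => console.log("Loss by region re-rendered with", f));
store.subscribe((f) => console.log("Filter bar shows active filters:", f));

store.setFilter({ year: 2024 }); // every subscriber updates at once
```

The design choice matters more than the mechanism: because filter state has exactly one owner and is always visible, the system's behavior and the user's mental model can no longer drift apart.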

From Dense Data to Clear Decision Context

The Account Snapshot wasn't failing because it lacked data. It was failing because it didn't clearly communicate what data users were looking at or how to interpret it.

In its original state, the layout blended too many metrics into a single visual pane: account-level performance, industry benchmarks, and supporting metrics. Users could see the data, but struggled to understand the relationship between metrics, making it harder to confidently interpret performance.

Original Design

Shared visual hierarchy between internal and external data

The account snapshot displays a high-level overview of important account information. In the original design, industry comparisons lived directly next to account metrics, which resulted in:

  • Account metrics and industry benchmarks sharing the same visual hierarchy
  • No clear distinction between internal performance and external comparison
  • Dense table structure requiring manual cross-referencing
  • Visual weight that did not reflect importance

This resulted in high cognitive load, slower interpretation, and increased risk of misreading performance context. The interface required users to do the work the system should have been doing.

Legacy design: account snapshot

Legacy FM Global account snapshot design with dense account metrics

Iteration 1

Dual Data States

System Behavior: Unfiltered and filtered data displayed simultaneously.

Why this is dangerous:

  • Introduced competing sources of truth
  • Users couldn’t tell which dataset to trust

Observed Behavior: Users tried clicking values, asked clarifying questions, and misinterpreted static vs. dynamic data.

Iteration 1: account snapshot

FM Global account snapshot iteration showing dual data states

Adding more data increased perceived transparency, but actually introduced competing sources of truth.

I approached this as a decision clarity problem: How do we structure the interface so users can immediately understand performance without needing to interpret relationships manually?

Final System

Singular, Contextual Data Model

System Behavior:

  • One clear data state
  • Visual hierarchy communicates what is affected by filters
  • Static vs. dynamic data separated spatially

This reduced cognitive load, misinterpretation risk, and decision hesitation.

Final system: account snapshot

Final FM Global account snapshot design with a singular contextual data model
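
A short TypeScript sketch of the underlying idea (field names are hypothetical): the data model itself separates static account context from filter-dependent metrics, so the interface can't render an ambiguous mix of the two.

```typescript
// Illustrative sketch: encode the static/dynamic split in the data model itself.

// Static context: never changes with filters; rendered in its own zone.
interface AccountContext {
  accountName: string;
  industry: string;
  policyStart: string; // ISO date
}

// Dynamic metrics: always derived from the single active filter state,
// with the applied filters carried alongside the numbers they produced.
interface FilteredMetrics {
  appliedFilters: { region?: string; year?: number };
  totalLoss: number;
  lossVsIndustryBenchmark: number; // the comparison lives here, not beside raw metrics
}

// The snapshot can only ever hold one filtered state: no dual data states.
interface AccountSnapshot {
  context: AccountContext;
  metrics: FilteredMetrics;
}

const snapshot: AccountSnapshot = {
  context: {
    accountName: "Example Manufacturing Co.", // hypothetical account
    industry: "Industrial",
    policyStart: "2023-01-01",
  },
  metrics: {
    appliedFilters: { year: 2024 },
    totalLoss: 3.4,
    lossVsIndustryBenchmark: -0.8,
  },
};

// Rendering follows the model: context in the static zone, metrics in the
// filtered zone, with active filters always displayed next to them.
console.log(snapshot.context, snapshot.metrics);
```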

Designing for Consequence, Not Convenience

30% reduction in system-induced decision errors
40% improvement in user confidence

This redesign reinforced a key principle: in enterprise systems, the role of design is to ensure data is interpreted correctly.

By restructuring both interaction patterns and information hierarchy, the dashboard evolved from a collection of visualizations into a reliable system for decision-making.

We removed ambiguity not by adding clarity, but by eliminating competing interpretations of the same data.