Chentronics Field Diagnostics — From Manual Checks to Guided Mobile

.NET
BACKEND ARCHITECTURE

60%
FASTER DIAGNOSTICS

8
MONTHS TO DELIVERY

3
TEAMS ALIGNED

CLIENT
Chentronics & VividCloud

ROLE
Senior UX Designer

TEAM
UX, Product, Engineering

TOOLS
Figma, Jira, Confluence

THE PROBLEM

Field technicians were diagnosing industrial flame sensors with manual checks, inconsistent logs, and trial-and-error.

Chentronics needed a diagnostic app for field technicians as the iScan® 3+ launch approached — manual checks creating inconsistent logs, no guided process, and no record of what was found in the field. For a product launching to industrial customers, that inconsistency was a liability and a support cost. I was brought in as the sole UX designer to architect a mobile-first solution built to scale to desktop — React Native chosen upfront so the UI wouldn't need to be rebuilt when the second surface arrived.

THE CORE TENSION

Complex hardware signals needed to be translated into clear, actionable diagnostic steps — in real-time, in the field, under pressure.

Technicians were relying on manual checks, inconsistent hardware logs, and trial-and-error to diagnose flame sensor health. I mapped the field workflow with technicians and brought the findings to product and engineering before any design work began — every decision had to solve for field conditions today and hold up on a desktop screen tomorrow.

RESEARCH & DISCOVERY

EXISTING DESKTOP TOOL — FORNEY HD CONNECT, THE SOFTWARE TECHNICIANS WERE WORKING AROUND

What field technicians actually told us.

Research started in the field — with technicians using the iScan 3+ under real working conditions. I led field observations and interviews, then synthesized findings into a clear brief for product and engineering. What we heard made the mandate explicit: the hardware was ahead of the software meant to support it.

Heard Directly

Stakeholder Inquiry


"I'm checking flame sensors by feel and experience — there's no guided process."


Technicians were diagnosing complex industrial hardware using tacit knowledge built over years. There was no structured diagnostic flow — just manual checks, pattern recognition, and trial and error when something didn't behave as expected.

"Every device behaves differently. The app needs to know which one I'm looking at."




The iScan 3+ worked across a wide range of flame sensor models — each with different fault states, signal ranges, and diagnostic logic. A one-size-fits-all interface would give the wrong guidance to the wrong device every time.


Heard Directly

Stakeholder Inquiry


"My logs are on paper. Nothing connects back to the device history."


Diagnostic records were kept manually — on paper, in personal notebooks, sometimes not at all. There was no connection between what a technician recorded in the field and what the device had actually experienced over time.






"I'm working in the dark on unfamiliar equipment in bad conditions."


Field conditions were frequently difficult — poor lighting, confined spaces, time pressure, unfamiliar installations. The app had to work as a real-time guide, not a reference document the technician read before going in.

KEY FINDING

The hardware signal needed to drive the diagnostic experience — not the other way around.

THE STRUCTURAL FINDING


The most important research finding wasn't about the interface — it was about the relationship between the physical device and the software meant to interpret it. I brought it to product and engineering before any architecture decisions were made: the iScan 3+ was generating rich signal data, none of it being translated into guided action at the point of diagnosis. That single finding drove every architecture decision that followed.

"The app didn't need to teach technicians about flame sensors. It needed to tell them exactly what this sensor, right now, was doing wrong."

CORE TENSION

Field simplicity vs hardware complexity — two needs the design couldn't trade off against each other.

The hardware was complex. The field conditions were demanding. The technician needed simplicity. I ran a working session with product and engineering to make the tradeoff explicit — engineering needed the full diagnostic signal, technicians needed it invisible. The design had to hold all three simultaneously.

FIELD TECHNICIANS WANTED

Simple guided steps on a phone

  • Works in bad conditions

  • Device-specific guidance

  • No manual to read first

  • Fast fault identification

ENGINEERING NEEDED

Full fault-state taxonomy surfaced


  • Real sensor signal data

  • Accurate device identification

  • Scalable to desktop workflows

  • Cross-platform .NET architecture

DESIGN RESOLVED

Hardware-aware guided flow


  • Fault-state taxonomy as UX logic

  • React Native on .NET backend

  • Mobile-first, desktop-ready

  • Full device spectrum supported

The Resolution

The technician sees a clean, guided diagnostic flow. The system is reading device type, signal data, and fault state in real time. Simplicity on the surface. Full hardware complexity underneath.

DESIGN DECISIONS

Every decision traced back to a finding.

Five distinct design choices — each one a direct response to something we heard in the field.


  1. Hardware-aware guided flow — the app identifies the device before it gives any guidance.

DEVICE IDENTIFICATION — HARDWARE TYPE AND SERIAL NUMBER SURFACED AT FIRST INTERACTION

FINDING

Every flame sensor model had different fault states and signal ranges. Generic guidance was worse than no guidance — it sent technicians down the wrong diagnostic path on the wrong device.

INSIGHT

The diagnostic flow couldn't start until the app knew exactly which device it was talking to. I brought this framing to product and engineering as a foundational architecture requirement — device identification wasn't a setup step, it was the logic everything else had to be built on.

DESIGN

Device identification built into the first interaction — hardware type surfaced, device-specific fault states, signal thresholds, and diagnostic steps loaded automatically. I worked with engineering to define the device taxonomy model early so the UI logic could be spec'd against real hardware behavior, not assumptions. Nothing generic, nothing that could apply to the wrong sensor.
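The "identify first, guide second" logic can be pictured as a profile registry the flow must resolve before any step is shown. This is a minimal sketch only: the model name, fault code, and signal range below are invented for illustration, not Chentronics' actual taxonomy.

```typescript
// Hypothetical sketch: device names, fault codes, and ranges are invented.
interface DeviceProfile {
  model: string;
  // Fault code mapped to device-specific plain-language guidance.
  faultStates: Record<string, string>;
  // Valid flame-signal window for this model; readings outside it are suspect.
  signalRangeHz: [number, number];
}

const registry: Record<string, DeviceProfile> = {
  "UV-200": {
    model: "UV-200",
    faultStates: { F01: "Clean the UV lens and re-test." },
    signalRangeHz: [20, 400],
  },
};

// Identification is the root of the logic, not a setup step:
// no resolved profile, no diagnostic flow.
function startDiagnostics(modelId: string): DeviceProfile {
  const profile = registry[modelId];
  if (!profile) {
    throw new Error(`Unknown device "${modelId}": cannot offer guidance.`);
  }
  return profile;
}
```

The design point the sketch carries: generic guidance is unrepresentable, because every downstream step reads from a profile that only exists once the device is known.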

  2. Fault-state taxonomy as UX logic — turning engineering signal data into guided steps.

FAULT STATE SURFACED AS PLAIN-LANGUAGE GUIDANCE

FINDING

The iScan 3+ was generating complex fault signal data that technicians couldn't interpret in the field without deep product knowledge. The signal existed. The translation didn't.

INSIGHT

The fault-state taxonomy wasn't just an engineering reference — it was the content architecture for the entire diagnostic experience. I worked with engineering to map every fault state to plain-language guidance before any screens were designed, so the UX logic and the backend logic were built in parallel, not reconciled after the fact.

DESIGN

Fault-state taxonomy mapped directly to guided diagnostic steps — each signal state surfaces as a plain-language action, not a raw error code. I spec'd the full state matrix with engineering before build began, so every possible hardware condition had a defined UX response. The technician gets exactly what to do next, not a number to look up.
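A minimal sketch of what "taxonomy as UX logic" means in code, with invented state codes and wording. The point is the shape: every signal state resolves to a defined plain-language action, and an unmapped state surfaces as a spec gap rather than a raw number.

```typescript
// Illustrative only: these codes and messages are not the real taxonomy.
type Severity = "ok" | "warn" | "fault";
interface FaultEntry { severity: Severity; action: string }

const faultMatrix: Record<string, FaultEntry> = {
  S00: { severity: "ok",    action: "Flame signal healthy; no action needed." },
  S12: { severity: "warn",  action: "Signal weak: check the scanner sight line." },
  S31: { severity: "fault", action: "No flame detected: verify pilot ignition." },
};

// Every hardware condition gets a defined UX response. A code outside the
// matrix is a specification gap, so it is flagged instead of shown raw.
function nextStep(code: string): string {
  return faultMatrix[code]?.action ?? `Unmapped state ${code}: log and escalate.`;
}
```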

  3. React Native on a .NET backend — mobile-first architecture designed to scale to desktop.

MOBILE-FIRST, DESKTOP-READY — FULL DEVICE SPECTRUM SUPPORTED

FINDING

Field technicians needed a phone-based experience. Engineering needed an architecture that could grow into desktop workflows as the product expanded across the full device spectrum.

INSIGHT

The platform choice wasn't just a technical decision — it was a product strategy decision. I brought this framing to the product owner and engineering lead before any architecture was committed to: React Native on a .NET backend was the only choice that could be mobile-first today and desktop-ready tomorrow without a rebuild.

DESIGN

React Native front end on a .NET backend — mobile-first, cross-platform, architected for future desktop expansion. I worked with engineering to define component structure and state management patterns upfront, so every screen was built with the full device spectrum in mind, not just the current phone use case.
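One way to picture the shared layer, sketched with an invented endpoint and payload (the real API surface isn't documented here): a single typed client over the .NET backend, with the transport injected so the phone app, a future desktop shell, and tests can all reuse it unchanged.

```typescript
// Hypothetical sketch: the endpoint path and payload shape are assumptions.
interface SensorHealth {
  deviceId: string;
  signalStrength: number; // 0-100
  faultCode: string | null;
}

// Transport is injected so the same client runs under React Native's fetch,
// a desktop runtime, or a test stub.
type Fetcher = (url: string) => Promise<{
  ok: boolean;
  status: number;
  json(): Promise<unknown>;
}>;

async function fetchSensorHealth(
  fetcher: Fetcher,
  baseUrl: string,
  deviceId: string,
): Promise<SensorHealth> {
  const res = await fetcher(`${baseUrl}/api/devices/${deviceId}/health`);
  if (!res.ok) throw new Error(`Backend error: ${res.status}`);
  return (await res.json()) as SensorHealth;
}
```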

  4. Designed for the full device spectrum — not just the most common sensor.

FULL DEVICE SPECTRUM — EQUIPMENT GROUPS, LIVE DIAGNOSTICS, AND DEVICE DISCOVERY

FINDING

Technicians worked across a wide range of flame sensor models and installation types. An app designed around the most common scenario would fail everyone working with anything else.

INSIGHT

Edge cases in industrial diagnostics aren't edge cases — they're the actual job. I brought this framing to product before the IA was finalized: designing for the most common sensor only would mean failing every technician working with anything else.

DESIGN

Full device spectrum built into the information architecture from day one — every sensor type, fault state, and signal range accounted for in the guidance system. I worked with engineering to test against the actual iScan 3+ hardware, not a simulated device, so spec assumptions were validated against real behavior before build completed.

  5. Prototype validated against real hardware — sensor signal as the source of truth.

LIVE HARDWARE SIGNAL — FREQUENCY SPECTRUM FROM THE ISCAN 3+

FINDING

Field conditions — poor lighting, confined spaces, time pressure — meant a prototype that looked right on a desk could completely fail in a real diagnostic situation.

INSIGHT

Validation against the actual hardware signal was the only test that mattered. I pushed for prototype testing against the iScan 3+ itself before any build began — not against a use case description or a simulated fault state. Product and engineering aligned on that standard before testing started.

DESIGN

Prototype validated against live hardware signal from the iScan 3+ — guided repair flow tested for how and where the signal was actually used, not where it was assumed to be used. I coordinated testing sessions with field technicians and fed every failure point back into the spec before engineering began build. The sensor was the source of truth throughout.


MY ROLE

Led the UX effort to translate complex hardware behavior into a clear, actionable diagnostic experience.

WHAT I OWNED

End-to-end UX across a .NET-backed React Native diagnostic app — field research, fault-state taxonomy, component architecture, and implementation-ready specs for iOS, Android, and the .NET backend team.

HOW I WORKED

Every screen designed mobile-first — thumb reach zones, glove-friendly targets, high-contrast states for outdoor lighting. I aligned with engineering on React Native's single codebase as a deliberate architectural choice before any component work began. The same UI layer powering iOS and Android today will power the desktop app when it ships.

THE CONSTRAINT

Hardware signals are complex and non-linear. Every diagnostic decision had to be legible on a small screen, under pressure, in the field — within a component model the .NET backend could serve consistently across platforms.

PROCESS

Five phases — from hardware audit to live prototype validation. Research drove architecture. Architecture drove every screen.

1. Field Workflow & Hardware Audit

Observed technicians diagnosing in the field — mapping friction points, failure moments, and the manual workarounds the app had to replace.

2. Fault-State Modeling & Signal Mapping

Collaborated with firmware engineers to translate raw sensor data into a shared fault taxonomy — every UI state mapped to a real hardware condition.

3. Diagnostic Flow Architecture

Designed structured troubleshooting paths that surfaced root causes and real-time sensor feedback at every decision point — no ambiguity in the field.

4. UI Design & Interaction Patterns

Built for glanceability under industrial conditions — sensor health hierarchy, color-coded states, 8pt grid, component state matrices, and mobile breakpoint specs across the full device inventory.

5. Hardware-Integrated Prototype Validation

Ran task-based usability sessions with technicians using prototypes connected to real iScan® 3+ hardware signals — findings drove final iteration before build.

FAULT-STATE TAXONOMY

Shared language for design & engineering

Built a shared fault taxonomy with firmware engineers so every UI state mapped to a real hardware condition. I drove the mapping sessions with engineering before any screens were designed — no assumptions, no abstraction gaps, no reconciliation after build.

IMPLEMENTATION SUPPORT

Stayed in until it shipped correctly

I stayed in through integration to catch inconsistencies and refine interactions based on real engineering test sessions. Design intent matched shipped behavior because I was in the room when it didn't.

FAULT-STATE COLOR SYSTEM — BRAND AND ALERT STATES MAPPED TO HARDWARE CONDITIONS

RESPONSIVE ACROSS EVERY SCREEN SIZE

Designed for the full device spectrum, not just one phone

The React Native codebase had to hold across iOS, Android, and varied screen sizes. Every component was specified for the smallest screen first, then validated on real devices.

FIELD-CONDITIONS DESIGN CONSTRAINTS

Designed for how and where the app actually gets used

Tap targets, contrast states, and navigation patterns were designed for gloves, outdoor lighting, and single-hand use. The field environment shaped every layout decision.

MOBILE-FIRST LAYOUT — PLANT VIEW, SIGNAL DISPLAY, AND DEVICE LIST ACROSS SCREEN SIZES

PROTOTYPE

From sensor signal to guided repair — validated against real hardware, built to run on more than one surface.

I designed the complete technician journey end-to-end for a React Native app on a .NET backend — device discovery, real-time sensor health visualization, fault-state identification, guided troubleshooting steps, and repair confirmation. It was validated against real iScan® 3+ hardware signals before build completed. The same architecture that runs this mobile experience will power the desktop version when it ships.

Available now directly from Chentronics at chentronics.com/solutions/iscan3.


Not yet on the Apple App Store or Google Play — coming soon.

The iScan® 3+ — the device the app was built around.

"Testing against real hardware signals — not simulated data — is what made the difference between a prototype that looked right and one that actually worked. Designing within a .NET-backed architecture meant there were no surprises when it came time to connect the UI to the data."

OUTCOMES

A scalable diagnostic framework built for cross-platform growth and faster fault resolution in the field.

30% faster fault identification during field diagnostics — reduced time-to-resolution for technicians working on industrial installations. Faster diagnosis means less downtime per site visit and lower support cost per resolution.

Reduced technician guesswork through guided troubleshooting flows, replacing manual cross-referencing with clear, step-by-step repair paths.

Clear visibility into sensor health and failure modes — technicians could immediately understand device state without consulting hardware manuals.

A cross-platform foundation established — the same UI layer, the same .NET backend, ready to serve a desktop surface when the roadmap calls for it.

PROJECT GALLERY

From hardware signal research to shipped diagnostic app

Competitive Analysis

Signal Visualization Research

Information Architecture

Technician Workflows

Sensor Configuration

Visual Identity

Data Visualization

Wireframes

Data Visualization

Final Designs

Final Designs

Final Designs

What I'd do differently

Get hardware test rigs in front of the team earlier. The most valuable feedback came from testing against real sensor signals, but that happened late in the process. Starting desktop layout exploration in parallel with mobile would have validated the component model against both surfaces before mobile decisions were locked in.