Paul Wentzell UX

CAMP Systems — Eliminating Paper from Aircraft Oil Analysis

THE PROBLEM

CAMP Systems needed to modernize the oil and filter sample submission process for aircraft maintenance teams. Paper checklists were creating frequent identification errors, incorrect filter usage, and costly resubmissions — in an industry where a missed maintenance signal is a safety and compliance risk.

Client / CAMP Systems

Industry / Aviation Maintenance

Team / UX, Product, Engineering

Role / UX Designer

Tools / Figma

Timeline / 3 months

  • Aviation

  • IoT

  • Data Visualization

  • Mobile-Android

  • Responsive

100% / PAPER PROCESS ELIMINATED

4 / PLATFORM INTEGRATIONS

1 / SPECTROMETRIC OIL ANALYSIS

LIVE ON GOOGLE PLAY

MY ROLE

Led UX research and design for CAMP Systems — from field research and workflow analysis through mobile UI and final handoff.

WHAT I OWNED

End-to-end UX — remote research with maintenance teams, wireframes, interactive prototypes, modular UI component design, contextual filter recommendation logic, lab proximity mapping, and final mobile UI screens.

HOW I WORKED

I designed the mobile experience of CAMP's SOAP platform — the primary interface where maintenance crews track engine data, compliance, and service records in the field. I worked directly with engineering and product to align on data behavior and workflow logic before any screens were finalized. Research ran as remote interviews with maintenance teams to understand how technicians actually move through a job, not how the system assumed they did. Deliverables were annotated mockup files handed off to development.

THE PROBLEM

The Honeywell SOAP Paper Form

Completed by hand in the field before every oil sample submission.

  • 30+ fields completed by hand, no validation

  • Filter selection left entirely to the technician

  • Lab routing decided after the sample was already sent

  • Errors only surfaced once the sample reached the lab

THE CORE TENSION

Aviation maintenance demands zero-error data. The workflow had no way to enforce it.

In aircraft maintenance, a wrong filter or incomplete submission doesn't just cause rework — it compromises engine health data and delays safety diagnostics. I brought this framing to product and engineering before any design work began, making the case that every UI decision had to close the error gap at the point of entry, before mistakes reached the lab.

CHALLENGE

Complaints were surface-level; the issues ran deeper.

Technicians described delays and missing context but couldn't say why — they had normalized the friction and built workarounds into their daily routine. Remote interviews got past the stated complaints and into what was actually breaking the process.

RESPONSE

Latent needs the users didn't know to request.

I mapped what maintenance teams were actually doing against what the workflow assumed they were doing, then brought those findings to product and engineering before any forms were designed. Validation logic and guided entry were built against real submission failure patterns — not assumed pain points.

DESIGN DECISIONS

Every decision traced back to something that wasn't working.

Three distinct design choices — each one a direct response to something we found in the field.

DECISION 1

Validation logic at point of entry — close the error gap before mistakes reach the lab.

FINDING

Paper-based submissions had no validation — wrong filter selections and incomplete entries only surfaced as errors after samples reached the lab. By then, the cost was a failed submission and an engine without a diagnostic record.

INSIGHT

This wasn't a training problem — it was a workflow design problem. Validation had to happen at point of entry, before the technician walked away from the aircraft.

DESIGN

Required fields, filter confirmation, and entry checks enforced inline. Wrong selections surface immediately, in context, before the sample leaves the field.
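The inline-validation approach described above can be sketched in a few lines. This is a hypothetical illustration — the field names, rules, and `validate_submission` helper are assumptions for the sake of the sketch, not CAMP's actual schema or code:

```python
# Hypothetical sketch of point-of-entry validation for a sample submission.
# Field names and rules are illustrative, not the production schema.

REQUIRED_FIELDS = ["tail_number", "engine_serial", "filter_part", "sample_date"]

def validate_submission(form: dict, approved_filters: set) -> list:
    """Return errors to surface inline, before the sample leaves the field."""
    errors = []
    # Required-field check: nothing can be left blank, unlike the paper form.
    for field in REQUIRED_FIELDS:
        if not form.get(field):
            errors.append(f"Missing required field: {field}")
    # Filter confirmation: the selected part must be approved for this engine.
    filter_part = form.get("filter_part")
    if filter_part and filter_part not in approved_filters:
        errors.append(f"Filter {filter_part} is not approved for this engine")
    return errors
```

An empty result means the submission can proceed; anything else blocks it in context, which is the "close the error gap at the point of entry" behavior the decision calls for.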

DECISION 2

Contextual filter recommendation logic — the right filter for the right aircraft, not a list to choose from.

FINDING

Incorrect filter usage was a primary driver of costly resubmissions. Technicians were selecting from a generic list with no guidance tied to the aircraft in front of them.

INSIGHT

This wasn't a knowledge problem — it was a context problem. A list that could return the wrong answer was worse than no list.

DESIGN

Filter recommendation logic tied to aircraft type and engine model. The correct filter surfaced automatically. No generic lists, no room for a plausible-but-wrong selection.
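The "one right answer instead of a list" idea amounts to a lookup keyed by the aircraft in front of the technician. A minimal sketch, assuming a simple (aircraft type, engine model) mapping — the table contents and `recommend_filter` name are illustrative, not CAMP's real equipment data:

```python
# Hypothetical sketch: surface the single approved filter for this aircraft
# instead of a generic pick-list. Mapping contents are made up for illustration.

FILTER_BY_ENGINE = {
    ("Citation XLS", "PW545B"): "FLT-545",
    ("King Air 350", "PT6A-60A"): "FLT-060",
}

def recommend_filter(aircraft_type, engine_model):
    """Return the one approved filter, or None if the pairing is unknown."""
    return FILTER_BY_ENGINE.get((aircraft_type, engine_model))
```

Returning `None` for an unknown pairing matters: the design premise is that no answer is safer than a plausible-but-wrong one, so an unrecognized aircraft should prompt escalation rather than fall back to a generic list.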

DECISION 3

Lab proximity mapping — routing made at point of collection, not corrected after.

FINDING

Manual lab routing was adding days to engine health diagnostics. Corrections happened after delays had already accumulated.

INSIGHT

Lab selection was a maintenance planning decision, not a logistics afterthought. Getting it wrong at submission meant delays that rippled into engine health decisions.

DESIGN

Lab surfaced automatically based on technician location and sample type. No manual lookup, no post-submission correction. The right lab was part of the submission.
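Routing at point of collection reduces to "nearest lab that can handle this sample type." A minimal sketch of that selection, with made-up lab records and a flat-plane distance approximation (a real implementation would use great-circle distance or drive time):

```python
# Hypothetical sketch of lab routing at point of collection. Lab names,
# coordinates, and capabilities are illustrative only.
import math

LABS = [
    {"name": "Lab East", "lat": 40.7, "lon": -74.0, "sample_types": {"oil", "filter"}},
    {"name": "Lab West", "lat": 34.0, "lon": -118.2, "sample_types": {"oil"}},
]

def nearest_lab(lat, lon, sample_type):
    """Return the closest lab that accepts this sample type, or None."""
    # Capability first: a nearby lab that can't process the sample is useless.
    candidates = [lab for lab in LABS if sample_type in lab["sample_types"]]
    if not candidates:
        return None
    # Flat-plane proximity; adequate for a sketch, not for production routing.
    return min(candidates, key=lambda lab: math.hypot(lab["lat"] - lat,
                                                      lab["lon"] - lon))
```

Filtering on capability before distance encodes the decision's framing: lab selection is a maintenance planning choice, not a logistics afterthought.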

OUTCOMES

Streamlining oil sampling and reducing errors — the GoDirect SOAP app shipped on Android.

REDUCED ERRORS & RESUBMISSIONS

Guided input and validation logic eliminated duplicate entries and filter misidentification — the primary drivers of costly resubmissions. Fewer errors at point of entry means fewer failed samples reaching the lab and lower rework cost per submission.

IMPROVED SAMPLE ACCURACY

Filter-hour calculations, contextual recommendations, and lab proximity routing reduced errors that compromised data integrity in the field. Accurate samples mean faster, more reliable engine health decisions for every aircraft in the maintenance program.

FASTER MAINTENANCE DECISIONS

Real-time filter-hour calculations and lab routing gave technicians the information they needed at point of collection — not after a failed submission. Maintenance teams received cleaner data faster, reducing the lag between sample submission and actionable engine health decisions.
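The filter-hour calculation mentioned here is, at its core, elapsed engine hours since the filter was installed. A minimal sketch under that assumption — the function name and inputs are illustrative, not the app's actual data model:

```python
# Hypothetical sketch of a real-time filter-hour calculation: time accumulated
# on the current filter, derived from engine-hour readings.

def filter_hours(engine_hours_now, engine_hours_at_install):
    """Hours flown on the current filter since it was installed."""
    if engine_hours_now < engine_hours_at_install:
        # A reading that precedes the install is a data-entry error,
        # exactly the kind of mistake worth surfacing at point of entry.
        raise ValueError("Current engine hours cannot precede install hours")
    return engine_hours_now - engine_hours_at_install
```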

LIVE ON GOOGLE PLAY - ANDROID

The GoDirect SOAP app is available for Android on the Google Play Store.

PROJECT GALLERY

From field research notes to live submission app on Google Play

Pain Points Research

Concept Sketch

Final Design

What I'd do differently

Get into the field earlier. The most valuable insights came from watching technicians work in real conditions, but that happened later in the process than it should have. Earlier field access would have shaped the validation logic and filter guidance from the start. I'd also push for testing against real lab workflows before handoff, not after. Edge cases in sample routing and filter selection only surfaced once the system met actual submission conditions.