THE PROBLEM

Led UX research and design for CAMP Systems — from field research and workflow analysis through mobile UI and final handoff.
WHAT I OWNED
End-to-end UX — remote research with maintenance teams, wireframes, interactive prototypes, modular UI component design, contextual filter recommendation logic, lab proximity mapping, and final mobile UI screens.
HOW I WORKED
Designed for the mobile experience of CAMP's SOAP platform — the primary interface where maintenance crews track engine data, compliance, and service records in the field. I worked directly with engineers and product to align on data behavior and workflow logic before any screens were finalized. Research was done through remote interviews with maintenance teams to understand how technicians actually move through a job, not how the system assumed they did. Deliverables were annotated mockup files handed off to development.
Aviation maintenance demands zero-error data. The workflow had no way to enforce it.
Complaints were surface-level; the issues ran deeper: latent needs the users didn't know to request.
I mapped what maintenance teams were actually doing against what the workflow assumed they were doing, then brought those findings to product and engineering before any forms were designed. Validation logic and guided entry were built against real submission failure patterns — not assumed pain points.
Every decision traced back to something that wasn't working.
Validation logic at point of entry — close the error gap before mistakes reach the lab.
FINDING
INSIGHT
This wasn't a training problem — it was a workflow design problem. Validation had to happen at point of entry, before the technician walked away from the aircraft.
DESIGN
Required fields, filter confirmation, and entry checks enforced inline. Wrong selections surface immediately, in context, before the sample leaves the field.
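The inline checks described above can be sketched as a single validation pass run before a submission leaves the field. This is a minimal illustration, not the shipped implementation; the field names (`aircraft_id`, `filter_part`, and so on) are assumptions, since the app's actual data model isn't shown here.

```python
# Hypothetical point-of-entry validation sketch. Field names are
# illustrative, not the SOAP app's real schema.

REQUIRED_FIELDS = ("aircraft_id", "engine_model", "filter_part", "sample_date")

def validate_submission(form: dict, expected_filter: str) -> list:
    """Return inline error messages; an empty list means the entry is clean."""
    errors = []
    # Required-field check: nothing leaves the field half-filled.
    for field in REQUIRED_FIELDS:
        if not form.get(field):
            errors.append("Missing required field: " + field)
    # Filter confirmation: the selected filter must match the one
    # expected for this aircraft, caught before the sample ships.
    if form.get("filter_part") and form["filter_part"] != expected_filter:
        errors.append(
            "Filter " + form["filter_part"]
            + " does not match expected " + expected_filter
        )
    return errors
```

The point of the sketch is the ordering: errors surface while the technician is still at the aircraft, not after the lab rejects the sample.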
Contextual filter recommendation logic — the right filter for the right aircraft, not a list to choose from.
FINDING
Incorrect filter usage was a primary driver of costly resubmissions. Technicians were selecting from a generic list with no guidance tied to the aircraft in front of them.
INSIGHT
This wasn't a knowledge problem — it was a context problem. A list that could return the wrong answer was worse than no list.
DESIGN
Filter recommendation logic tied to aircraft type and engine model. The correct filter surfaced automatically. No generic lists, no room for a plausible-but-wrong selection.
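The core of that design decision, a lookup keyed on aircraft and engine that returns exactly one filter rather than a list, can be sketched as follows. The table contents and part numbers are invented for illustration.

```python
# Hypothetical contextual filter recommendation: one answer per
# (aircraft type, engine model) pair, never a list to choose from.
# All entries are illustrative.

FILTER_TABLE = {
    ("jet_a", "engine_x"): "FLT-210",
    ("jet_a", "engine_y"): "FLT-340",
    ("turboprop_b", "engine_z"): "FLT-115",
}

def recommend_filter(aircraft_type: str, engine_model: str) -> str:
    try:
        return FILTER_TABLE[(aircraft_type, engine_model)]
    except KeyError:
        # Surface an explicit gap rather than a plausible-but-wrong default.
        raise LookupError(
            "No filter mapped for " + aircraft_type + "/" + engine_model
        )
```

Failing loudly on an unmapped pair mirrors the insight above: a wrong-but-plausible answer is worse than no answer.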
Lab proximity mapping — routing made at point of collection, not corrected after.
FINDING
Manual lab routing was adding days to engine health diagnostics. Corrections happened after delays had already accumulated.
INSIGHT
Lab selection was a maintenance planning decision, not a logistics afterthought. Getting it wrong at submission meant delays that rippled into engine health decisions.
DESIGN
Lab surfaced automatically based on technician location and sample type. No manual lookup, no post-submission correction. The right lab was part of the submission.
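A minimal sketch of that routing step: filter labs by sample type, then pick the nearest one to the technician's location. The lab list, coordinates, and sample types here are illustrative assumptions.

```python
import math

# Hypothetical lab proximity routing: nearest lab that handles the
# sample type, chosen at point of collection. Lab data is illustrative.

LABS = [
    {"name": "Lab A", "lat": 40.7, "lon": -74.0, "types": {"oil", "fuel"}},
    {"name": "Lab B", "lat": 33.9, "lon": -118.4, "types": {"oil"}},
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def route_lab(lat, lon, sample_type):
    """Return the closest lab able to process this sample type."""
    candidates = [lab for lab in LABS if sample_type in lab["types"]]
    return min(candidates, key=lambda lab: haversine_km(lat, lon, lab["lat"], lab["lon"]))
```

Because routing happens at submission, the choice is made once, correctly, instead of being corrected after days of delay.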
Streamlining oil sampling and reducing errors — the GoDirect SOAP app shipped on Android.
REDUCED ERRORS & RESUBMISSIONS
Guided input and validation logic eliminated duplicate entries and filter misidentification — the primary drivers of costly resubmissions. Fewer errors at point of entry means fewer failed samples reaching the lab and lower rework cost per submission.
IMPROVED SAMPLE ACCURACY
Filter-hour calculations, contextual recommendations, and lab proximity routing reduced errors that compromised data integrity in the field. Accurate samples mean faster, more reliable engine health decisions for every aircraft in the maintenance program.
FASTER MAINTENANCE DECISIONS
Real-time filter-hour calculations and lab routing gave technicians the information they needed at point of collection — not after a failed submission. Maintenance teams received cleaner data faster, reducing the lag between sample submission and actionable engine health decisions.
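The filter-hour calculation mentioned above can be sketched under a common assumption: filter hours are the engine hours accumulated since the filter was last installed. The function and field names are illustrative, not the app's actual API.

```python
# Hypothetical filter-hour sketch, assuming filter hours = engine
# hours accumulated since the filter's last installation.

def filter_hours(engine_total_hours: float, hours_at_filter_install: float) -> float:
    """Hours on the current filter; raises on impossible inputs."""
    elapsed = engine_total_hours - hours_at_filter_install
    if elapsed < 0:
        # A negative value signals a data-entry error worth flagging
        # inline, before the sample is submitted.
        raise ValueError("Filter install hours exceed current engine hours")
    return elapsed
```

Even a calculation this small benefits from the same inline-validation stance: an impossible value is caught at entry, not at the lab.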
LIVE ON GOOGLE PLAY - ANDROID
The GoDirect SOAP app is available for Android on the Google Play Store.
From field research notes to live submission app on Google Play

What I'd do differently
Get into the field earlier. The most valuable insights came from watching technicians work in real conditions, but that happened later in the process than it should have. Earlier field access would have shaped the validation logic and filter guidance from the start. I'd also push for testing against real lab workflows before handoff, not after. Edge cases in sample routing and filter selection only surfaced once the system met actual submission conditions.




