
MagicEdit
Used machine learning to largely automate the part of the app people liked least: editing their round after playing.
Role
Lead Designer (UX & UI)
Team
Design, Data Science, Engineering, Product Management
Timeline
2 Months
Company
Arccos Golf
The Post-Round Problem
After finishing a round, golfers are supposed to review their shot data and fix anything the sensors got wrong. In practice, most people don't bother: editing means scrolling through every hole, checking every shot, and dragging things around on a map. After four hours of golf, nobody wants to do that.
So their stats end up inaccurate, which makes the whole product less useful. The data that's supposed to help you improve your game can't do its job if half of it is wrong.
What We Heard
We observed golfers going through their post-round routine and spent time with the support team to understand what people actually struggled with. A few patterns kept coming up.
The scorecard is what people trust. If their score looked right, they felt okay about the rest of the data. Individual shot positions could be off, but as long as the total added up, golfers moved on. The scorecard was their mental model for "correct."
Editing felt risky. People worried they'd accidentally delete a shot or move something and mess up their round. The undo path wasn't obvious, so most golfers just didn't touch anything. Better to have slightly wrong data than risk making it worse.
Scorecard-First
Instead of asking golfers to check every shot on every hole, we start with something they already understand: confirming their scorecard. It's the same motion as signing a card at the clubhouse. Quick, familiar, and low-stakes.
Once scores are locked in, the system uses ML to compare sensor data against those confirmed scores and cleans up the shots automatically. The golfer only gets pulled in when the system isn't confident about what happened.
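To make that concrete, here's a minimal sketch of what the cleanup step might look like. All names are hypothetical and the logic is simplified; this is an illustration of the idea, not Arccos's actual model or code. The idea is to trim low-confidence sensor detections until the shot count matches the score the golfer just confirmed.

```python
# Illustrative sketch only: hypothetical names, simplified logic,
# not Arccos's production system.

def reconcile(confirmed_score: int, shots: list[dict]) -> list[dict] | None:
    """Trim phantom detections so the shot count matches the confirmed score.

    Each shot dict carries the model's 'confidence' that the detection is a
    real stroke. Returns the cleaned shot list, or None when shots appear to
    be missing and the hole has to go back to the golfer.
    """
    if len(shots) < confirmed_score:
        return None  # never invent strokes; flag the hole for review
    # Keep the most plausible detections and drop the rest (likely
    # practice swings or bumps the sensors picked up).
    ranked = sorted(shots, key=lambda s: s["confidence"], reverse=True)
    kept = ranked[:confirmed_score]
    # Put the surviving shots back in on-course order.
    return sorted(kept, key=lambda s: s["timestamp"])
```

The important property here is the None branch: when data is missing, the system asks the golfer instead of guessing.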
The activity feed showing completed rounds. Most golfers hit View and never come back to Edit.
The prompt asking the golfer to confirm their scorecard before MagicEdit processes the round.
The score and putt confirmation grid where golfers verify their numbers hole by hole.
Only What Needs Attention
After the golfer confirms their scorecard, the system goes to work. It auto-verifies most holes by comparing sensor data against the confirmed scores. When it's done, a summary tells the golfer exactly how many holes still need a look. No guessing, no scrolling through 18 holes to find the problems.
The holes that need review show shot traces on the map with confidence indicators. The golfer can see exactly what the AI is unsure about and make a quick decision. Everything the system is confident about is already handled.
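Under the hood, the per-hole triage that produces that summary could be as simple as the sketch below. Again, every field name and the confidence cutoff are assumptions made for illustration, not the production implementation.

```python
# Sketch of the triage behind the "16 of 18 verified" summary.
# Hypothetical field names; the 0.9 cutoff is an assumed value.

THRESHOLD = 0.9

def triage(holes: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split a round into auto-verified holes and holes flagged for review."""
    verified, flagged = [], []
    for hole in holes:
        ok = (hole["cleaned_shots"] is not None
              and hole["min_confidence"] >= THRESHOLD)
        (verified if ok else flagged).append(hole)
    return verified, flagged

# Toy round: one hole the model handled, one it isn't sure about.
round_holes = [
    {"hole": 1, "cleaned_shots": ["drive", "approach", "putt"], "min_confidence": 0.97},
    {"hole": 2, "cleaned_shots": None, "min_confidence": 0.41},
]
verified, flagged = triage(round_holes)
print(f"{len(verified)} of {len(round_holes)} holes verified automatically, "
      f"{len(flagged)} flagged for review")
```

Gating on the weakest shot on a hole, rather than an average, keeps the promise honest: in this sketch a hole is only marked verified when every shot on it clears the bar.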
The summary modal showing 16 of 18 holes verified automatically, with 2 holes flagged for review.
A clean hole where the AI is confident about all shot positions. The golfer can confirm and move on.
A flagged hole with confidence indicators showing where the AI needs the golfer's input.
Why It Worked
The completion screen after MagicEdit finishes processing a round.
The redesigned activity feed with a prominent MagicEdit call-to-action on unedited rounds.
The scorecard-first approach worked because it anchored the automation to something golfers already do. Starting from that familiar moment of signing off on a score made the whole process feel trustworthy instead of opaque.
Editing went from a long repair job to a quick review. The system does the heavy lifting, and the golfer only steps in where it matters. Most rounds that used to go unedited get cleaned up in under a minute.