How an iPhone, a physics engine, and a neural network learned to read a green.
To teach a model to read any green, you have to show it every kind of green.
So we built them all. Procedurally generated surfaces — every undulation, slope, swale, ridge and shelf a real green might have — paired with a physics simulator that finds the optimal putt from any point. The model learns not by memorising, but by watching the physics play out, fifty thousand times.
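The pipeline above can be sketched in a few dozen lines. This is an illustrative toy, not the app's actual implementation: the surface model (a gentle tilt plus Gaussian bumps), the friction coefficient, and the grid-search putt solver are all assumptions made for the sketch.

```python
# Toy version of the training-data pipeline: a procedural green plus a
# rolling-ball simulator that searches for the best putt. All constants
# and function names here are illustrative assumptions.
import math
import random

G = 9.81    # gravity, m/s^2
MU = 0.07   # rolling-resistance coefficient (assumed)
DT = 0.01   # integration step, s

def make_green(seed=0, n_bumps=3):
    """Procedural surface: a gentle cross-slope plus a few Gaussian bumps."""
    rng = random.Random(seed)
    bumps = [(rng.uniform(0.5, 2.5), rng.uniform(-1, 1),
              rng.uniform(0.01, 0.02), rng.uniform(0.4, 0.8))
             for _ in range(n_bumps)]          # (cx, cy, amplitude, width)
    tilt = (0.0, 0.01)                         # 1% cross-slope

    def grad(x, y):
        """Analytic surface gradient at (x, y)."""
        gx, gy = tilt
        for cx, cy, a, s in bumps:
            e = a * math.exp(-((x - cx)**2 + (y - cy)**2) / (2 * s * s))
            gx += -e * (x - cx) / (s * s)
            gy += -e * (y - cy) / (s * s)
        return gx, gy
    return grad

def roll(grad, angle, speed, t_max=10.0):
    """Integrate a point-mass ball under slope gravity and rolling friction."""
    x, y = 0.0, 0.0
    vx, vy = speed * math.cos(angle), speed * math.sin(angle)
    t = 0.0
    while t < t_max:
        v = math.hypot(vx, vy)
        if v < 0.02:                           # ball has effectively stopped
            break
        gx, gy = grad(x, y)
        ax = -G * gx - MU * G * vx / v
        ay = -G * gy - MU * G * vy / v
        vx += ax * DT; vy += ay * DT
        x += vx * DT;  y += vy * DT
        t += DT
    return x, y

def best_putt(grad, hole=(3.0, 0.0)):
    """Grid-search aim angle and speed for the putt stopping nearest the hole."""
    best = None
    for deg10 in range(-100, 101, 5):          # aim: -10..10 deg, 0.5 deg steps
        for sp20 in range(20, 61):             # speed: 1.0..3.0 m/s, 0.05 steps
            x, y = roll(grad, math.radians(deg10 / 10), sp20 / 20)
            d = math.hypot(x - hole[0], y - hole[1])
            if best is None or d < best[0]:
                best = (d, deg10 / 10, sp20 / 20)
    return best                                # (miss distance, aim deg, speed)
```

Run the solver over many random greens and each `(surface, best_putt)` pair becomes one training example; the real system's fifty thousand simulations follow the same shape at far higher fidelity.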
The result: when you scan a real green, the network recognises it — not because it has seen that specific green, but because it has seen its physics.
The headline number is the average gap between the model's aim and the ball's true line. In context:
At ten feet, inside a golf-ball width. At sixty, inside the three-foot circle. Closer than your eyes can read.
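Those two claims are consistent with a single angular error. As a sanity check under a constant aim-error assumption (an illustration, not the app's published metric):

```python
# Check that one angular error explains both claims above: a golf ball is
# 1.68 in wide, and the "three-foot circle" has an 18 in radius.
import math

def miss_inches(distance_ft, error_deg):
    """Lateral miss, in inches, at a given distance for a given aim error."""
    return 12 * distance_ft * math.tan(math.radians(error_deg))

# An aim error of about 0.8 degrees stays inside a ball-width at 10 ft:
#   miss_inches(10, 0.8) ≈ 1.68 in
# and comfortably inside the 18 in circle at 60 ft:
#   miss_inches(60, 0.8) ≈ 10 in
```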
Two golfers miss the same putt differently. The model adapts — and the more you play, the more it knows about you.
Hole 7 at Trump International Dubai. Two and a half degrees of break, eighty-eight feet, uphill — and three callouts the model wrote after watching the last ten putts of the round.
Different golfer, different callouts. Happening on real greens, today.
iPhone 12 Pro or later with LiDAR. TestFlight invites rolling out now.