When we built Space Tracker back in 2019, we used ARKit for the AR satellite spotting feature. It was the obvious choice at the time. Apple handled the camera, the device pose, and the rendering of the overlay, and we focused on the part we actually had expertise in, which was computing where a satellite would be in the sky at a given moment. The two layers met cleanly: we told ARKit where in the world to put a marker, and ARKit figured out how to project that onto whatever the camera was seeing.

So when we started Flyby in 2026, we more or less assumed we'd do the same thing. The orbital propagation code had gotten more sophisticated since 2019, the visibility calculations were doing a lot more work, and the device we were targeting was much faster than anything that existed when we wrote Space Tracker. ARKit had also had seven more years of updates. If anything, the AR layer should have been more of a solved problem the second time around.

What we found, once we got far enough into development to actually test outside at night, was that ARKit really does not get along with the sky.

What ARKit is actually doing

ARKit, at its core, is a visual-inertial tracking system. It uses the camera to detect feature points in the environment around you, watches how those points move across frames, and fuses that visual signal with data from the device's IMU to estimate where the phone is and which way it's pointing. When it works, it works very well. It can plant a virtual object on your kitchen floor and have that object stay convincingly in place as you walk around the room.
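To make that concrete, here's a minimal sketch of a world-tracking session, hedged: the class name is ours, not Flyby's code, and this is the standard setup rather than anything exotic. Everything interesting happens inside ARKit; what comes out is a camera transform every frame.

```swift
import ARKit
import simd

// Minimal world-tracking session (sketch). ARKit fuses camera features
// with the IMU internally; all the app sees is the result, a 6-DoF
// camera transform on every frame.
final class TrackingSession: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        config.worldAlignment = .gravityAndHeading  // -z points true north
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // ARKit's estimate of where the phone is and which way it points.
        let transform = frame.camera.transform            // simd_float4x4
        // Column 2 is the camera's backward axis; negate it to get the
        // direction the camera is looking.
        let viewDirection = -simd_make_float3(transform.columns.2)
        _ = viewDirection  // feed into marker placement
    }
}
```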

The catch is in the first step. ARKit's accuracy depends on the camera seeing visual features it can lock onto. In a typical AR scenario you're indoors or on a city street, and the world is full of textured surfaces, edges, and corners. The system has plenty to work with.

The night sky has almost none of that. Stars are too dim and too small for the camera to register as reliable feature points, and once the phone is tilted upward there's no horizon in frame either. From ARKit's perspective the visual input is essentially noise, so it ends up leaning harder on the IMU. The IMU on its own drifts over time, and the result is an orientation estimate that slowly walks away from the truth. By the time a satellite pass reaches its peak, the marker can easily be off by ten degrees or more, which on a fast-moving target is the difference between actually seeing the ISS and staring at empty sky.
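ARKit will even tell you this is happening if you listen for it. A small addition to the delegate sketch above: pointed at a clear night sky, the tracking state tends to sit in the limited, insufficient-features case almost immediately.

```swift
// Belongs on the ARSessionDelegate from the previous sketch.
func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
    switch camera.trackingState {
    case .limited(.insufficientFeatures):
        // The sky case: no visual features, so ARKit is coasting on the
        // IMU alone and the estimate will drift.
        print("insufficient features -- orientation is inertial-only")
    case .limited(let reason):
        print("tracking limited: \(reason)")
    case .normal, .notAvailable:
        break
    }
}
```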

We hit this in 2019 too, but Space Tracker was a side project and we worked around it. Users could re-center the marker with a tap, passes were rarer back then, and the bar for "good enough" was lower. With Flyby we wanted something we'd actually trust, and that someone holding the phone for the first time wouldn't have to learn to compensate for.

ARKit pipeline: visual feature tracking (looks for edges and surfaces) → sky is featureless (nothing to lock onto) → marker drifts (no locks, constant recalculation).

Direct-sensor pipeline: IMU + GPS + magnetometer (raw gyro, accel, lat, lon, heading) → our orientation math (compute where the phone is pointed) → marker stays put (stable outdoors, in the dark).

The two pipelines. They have the same goal, but very different inputs.

Rebuilding it on raw sensors

We pulled ARKit out of the tracking layer entirely and rebuilt that layer directly on the device's sensor APIs. The IMU gives us angular velocity and linear acceleration. The magnetometer, combined with the device's location, gives us true heading. We compute device orientation from those signals ourselves, render the camera feed as a plain background, and place the satellite or launch marker based on our own math against our own state.
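The shape of that pipeline, sketched in Core Motion terms: this is not Flyby's actual filter (the class name is ours, and we've let the built-in .xTrueNorthZVertical reference frame stand in for the custom orientation math), but the inputs are exactly the ones described above.

```swift
import CoreMotion
import CoreLocation

// Direct-sensor pipeline, sketched. Core Motion's true-north reference
// frame stands in for a custom orientation filter; the raw signals it
// consumes (gyro, accelerometer, magnetometer, location) are the same.
final class SkyOrientationSource {
    private let motion = CMMotionManager()
    private let location = CLLocationManager()

    /// Latest device attitude, referenced to true north and gravity.
    private(set) var attitude: CMAttitude?

    func start() {
        // Location is what turns magnetic heading into true heading;
        // magnetic declination varies with where you're standing.
        location.requestWhenInUseAuthorization()
        location.startUpdatingLocation()

        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        // x -> true north, z -> vertical: a frame built for pointing a
        // phone at the sky.
        motion.startDeviceMotionUpdates(using: .xTrueNorthZVertical,
                                        to: .main) { [weak self] data, _ in
            guard let data else { return }
            self?.attitude = data.attitude
            // To run your own filter instead, the raw signals are here:
            // data.rotationRate, data.gravity, data.userAcceleration,
            // data.magneticField.
        }
    }
}
```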

The advantage isn't that any single component is doing something exotic. It's that we know exactly what's feeding the orientation estimate, we control how it's filtered, and we're not fighting a higher-level system that's trying to use information that isn't there. For outdoor sky tracking specifically, this turns out to be a much better fit. The sensors ARKit was falling back on are now the entire input, and we can tune them for the case we actually care about, which is a phone pointed at a dim sky with nothing on the horizon.
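The marker math itself is ordinary geometry, which is part of why owning it feels safe. A simplified sketch (function names ours): the satellite's azimuth and elevation become a unit vector in an east-north-up frame, and drift shows up directly as the angle between that vector and wherever the camera is pointed.

```swift
import Foundation
import simd

/// ENU unit vector (x = east, y = north, z = up) for a target at the
/// given azimuth (degrees clockwise from north) and elevation (degrees
/// above the horizon).
func enuDirection(azimuthDeg: Double, elevationDeg: Double) -> simd_double3 {
    let az = azimuthDeg * .pi / 180
    let el = elevationDeg * .pi / 180
    return simd_double3(sin(az) * cos(el),   // east
                        cos(az) * cos(el),   // north
                        sin(el))             // up
}

/// Angle in degrees between where the camera is looking and where the
/// satellite actually is. This is the number that drifts under ARKit.
func pointingError(cameraDir: simd_double3, satelliteDir: simd_double3) -> Double {
    let d = simd_dot(simd_normalize(cameraDir), simd_normalize(satelliteDir))
    return acos(min(max(d, -1.0), 1.0)) * 180 / .pi
}
```

A ten-degree pointing error at the peak of a pass is the gap between seeing the ISS and staring at empty sky.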

It also made debugging dramatically easier. When the marker is off, we can look at our pipeline and reason about why. There's no opaque visual SLAM system in the middle making decisions we don't have visibility into.

Closing thoughts

None of this is meant as a criticism of ARKit. It's a great piece of software for the problem it was designed to solve, which is room-scale AR in environments with visual texture. We use it elsewhere in Flyby for tasks it's well suited to. The point is just that pointing a phone at the sky is sufficiently far outside its design envelope that the abstractions break down, and you may not realize how badly until you're standing outside in the dark and the overlay is drifting in front of you.

If you're working on anything that needs accurate device orientation against the sky, it's worth considering whether you actually need a visual tracking system at all, or whether the sensors on their own get you most of the way there. For our case the answer was the second one. The sensor APIs are well-documented, the math is established, and the failure modes are far easier to reason about when you own the whole pipeline.

We probably would have arrived at this in 2019 if we'd had a reason to push on it then. We didn't, and so we used the obvious tool and accepted its limits. Seven years later the limits were the same, but the bar we were trying to clear was much higher, and that was enough to make us take the longer path.

See it in practice

Flyby is on the App Store. Take it outside on a clear night and tell us if the marker stays where it should.

Get Flyby