Before you draw a single polygon, the system now knows approximately how much lawn you're looking at.

Two APIs fire in parallel the moment an address is geocoded. Google's Solar API returns building footprint data — specifically wholeRoofStats.groundAreaMeters2, which gives the structure's ground-level footprint. A property boundary API returns the parcel polygon. Subtract the structure from the lot and you have an upper bound on treatable lawn area. The whole thing runs through a Supabase edge function using Promise.allSettled, so either API can fail independently without breaking the other.
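The failure-isolation idea can be sketched in a few lines of TypeScript. The helper parameters here (fetchSolar, fetchParcel) are hypothetical stand-ins for the real Solar API and parcel-boundary calls, but the Promise.allSettled shape is the point: each lookup settles on its own, and a rejected one simply becomes a null field.

```typescript
// Sketch of the parallel-fetch pattern inside the edge function.
// fetchSolar / fetchParcel are hypothetical stand-ins for the real
// API calls; each resolves to an area in square meters.
type PropertyInsights = {
  structureAreaM2: number | null; // from Solar API groundAreaMeters2
  parcelAreaM2: number | null;    // from the property boundary API
};

async function fetchPropertyInsights(
  fetchSolar: () => Promise<number>,
  fetchParcel: () => Promise<number>,
): Promise<PropertyInsights> {
  // allSettled (unlike Promise.all) never rejects as a whole:
  // either lookup can fail without sinking the other.
  const [solar, parcel] = await Promise.allSettled([
    fetchSolar(),
    fetchParcel(),
  ]);
  return {
    structureAreaM2: solar.status === "fulfilled" ? solar.value : null,
    parcelAreaM2: parcel.status === "fulfilled" ? parcel.value : null,
  };
}
```

With Promise.all, one failed request would reject the combined promise and throw away the other result; allSettled is what makes the two data sources genuinely independent.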

A "Property Estimate" card appears on the measurement control panel: lot size, minus structure footprint, equals maximum lawn. It's an estimate, not a measurement — prominently labeled as such. But it reframes the measurement task. Instead of staring at a satellite image and guessing where to draw, the operator starts with a number and draws to refine it. The parcel boundary renders as a white low-opacity polygon on the map, non-editable, visible just long enough to orient you before AI detection takes over.
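The card's arithmetic is simple, but the edge cases matter: either input can be missing, and noisy parcel or footprint data shouldn't produce a negative lawn. A minimal sketch, with a hypothetical function name:

```typescript
// Hypothetical card math: lot area minus structure footprint,
// floored at zero so bad data never yields a negative estimate.
function maxLawnEstimate(
  parcelAreaM2: number | null,
  structureAreaM2: number | null,
): number | null {
  if (parcelAreaM2 == null) return null; // no parcel, no estimate
  const footprint = structureAreaM2 ?? 0; // no footprint data: whole lot
  return Math.max(0, parcelAreaM2 - footprint);
}
```

The null return is deliberate: without a parcel area there is no upper bound to show, so the card simply doesn't render rather than displaying a guess.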

The fire-and-forget pattern keeps the experience fast. Property insights are supplementary, not blocking. The address geocode completes, the map centers, and the insights request launches in the background. If it comes back before you start drawing, great — there's a card with numbers. If it doesn't, you haven't waited for anything.
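The pattern above amounts to one awaited step and one deliberately un-awaited one. A sketch, with all the callback names hypothetical:

```typescript
// Sketch of fire-and-forget: geocoding blocks, insights do not.
// All parameter names are hypothetical, not the app's real API.
type LatLng = { lat: number; lng: number };

async function onAddressSubmit(
  geocode: (address: string) => Promise<LatLng>,
  fetchInsights: (loc: LatLng) => Promise<unknown>,
  centerMap: (loc: LatLng) => void,
  showCard: (insights: unknown) => void,
  address: string,
): Promise<void> {
  const loc = await geocode(address); // the only step the user waits on
  centerMap(loc);                     // map centers immediately
  // Launched without await: the card renders whenever the data lands,
  // and a failure is swallowed because insights are supplementary.
  fetchInsights(loc)
    .then(showCard)
    .catch(() => {
      /* supplementary data: ignore failures */
    });
}
```

The absence of an await on fetchInsights is the whole design: the function returns as soon as the map is centered, and the insights promise resolves (or fails) on its own timeline.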

The Solar API integration was surprisingly clean. Same GCP key, single endpoint, one field in the response that gives exactly what's needed. The property boundary API provides the lot perimeter. Neither is precise enough to be a measurement. Together, they're precise enough to be a starting point — and a starting point is what turns a blank satellite image into an informed workspace.