RobotX ran a crop-inspection pilot of the RX BRAIN spatial operating prototype in a California almond orchard this month, with the system mounted on a Unitree B1 quadruped. The run covered row traversal, per-tree perception capture, and 3D mesh reconstruction of the canopy for anomaly detection. The prototype hardware, firmware, and perception stack shipped to the field were the same ones we run on our warehouse and industrial chassis: no bespoke retraining, no chassis-specific firmware build.
In the field
The B1 walked the rows autonomously at a steady inspection pace, using RX BRAIN's onboard perception to hold the row centerline, avoid irrigation lines and low branches, and recover its gait over uneven ground. There was no orchard-specific tuning in the navigation loop: the same planner that handles warehouse aisles treats tree rows as structured corridors.
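Geometrically, a tree row and a warehouse aisle reduce to the same thing: two roughly parallel boundaries with a centerline between them. As a minimal sketch of how centerline holding can work under that framing (the names, frame conventions, and logic here are illustrative, not our actual planner interface):

```python
import numpy as np

def centerline_offset(left_trunks: np.ndarray, right_trunks: np.ndarray) -> float:
    """Estimate the robot's lateral offset from the row centerline.

    left_trunks / right_trunks: (N, 2) arrays of detected trunk positions
    in the robot frame (x forward, y left). A real planner would also
    handle missing detections and curved rows; this is the core idea only.
    """
    # Rows are near-straight over a local planning horizon, so a robust
    # median of each side's lateral position is enough to place the
    # two corridor boundaries.
    y_left = float(np.median(left_trunks[:, 1]))
    y_right = float(np.median(right_trunks[:, 1]))
    # The robot sits at y = 0 in its own frame, so the centerline's
    # lateral position is also the signed correction needed to re-center.
    return 0.5 * (y_left + y_right)
```

Using medians rather than a least-squares line fit keeps the estimate stable when the detector drops a trunk or briefly picks up an irrigation riser instead.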
At each tree, RX BRAIN captured a perception sweep of the canopy, associated the capture with a tree identifier, and flagged candidate anomalies (stressed foliage, irregular fruit set, broken scaffold limbs) against the reconstructed local geometry. Capture and tagging ran continuously across the traversal, with no stop-and-scan cycle.
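Conceptually, each tree's capture reduces to a small record that the tagging pass appends to as the robot walks. A hypothetical shape for that record, with field names chosen for illustration rather than taken from our schema:

```python
from dataclasses import dataclass, field

@dataclass
class Anomaly:
    kind: str            # e.g. "stressed_foliage", "irregular_fruit_set", "broken_limb"
    confidence: float    # detector score in [0, 1]
    location: tuple      # 3D point on the reconstructed canopy mesh

@dataclass
class TreeCapture:
    tree_id: str                        # e.g. "row04-tree17"
    pose: tuple                         # robot pose (x, y, yaw) at capture time
    mesh_ref: str                       # handle to the local mesh patch
    anomalies: list[Anomaly] = field(default_factory=list)
```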
What RX BRAIN sees
The images below are not renders or marketing visualizations. They are the live 3D mesh reconstruction RX BRAIN produces per frame from the onboard sensor stack — RobotX's proprietary vision output, generated on-device while the robot is walking. The canopy geometry, trunk structure, and inter-row spacing are all recovered directly from the same perception pass that drives navigation and anomaly tagging.
We produce a mesh-based reconstruction like this one for each real-world use case we deploy into. The orchard canopy here is the same class of artifact as the warehouse-aisle reconstruction, the solar-array reconstruction, or the retail-shelf reconstruction: one perception stack, one engineering workflow, per-domain outputs.
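Put schematically, the claim in the last two paragraphs is a fan-out: one reconstruction per sensor frame, with navigation, anomaly tagging, and the domain artifact all reading the same mesh. A minimal sketch, with every name an illustrative stand-in rather than an actual module of ours:

```python
def perception_tick(frame, reconstruct_mesh, planner, tagger, recorder):
    """One perception pass per sensor frame; every consumer reads the same mesh.

    All parameters are stand-ins: `reconstruct_mesh` for the on-device
    reconstruction step, and the three consumers for navigation, anomaly
    tagging, and the per-domain output artifact.
    """
    mesh = reconstruct_mesh(frame)   # single on-device reconstruction per frame
    planner.update(mesh)             # navigation: centerline hold, obstacle avoidance
    tagger.scan(mesh)                # anomaly tagging against the local geometry
    recorder.append(mesh)            # accumulates the domain artifact (canopy, aisle, shelf)
```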
Close-ups in motion
The close-up clips below were captured during the same traversal. They give a sense of footing behavior and how close the robot gets to the canopy during per-tree capture.
What universality buys us
The pilot is evidence for a claim we have been making since the RX BRAIN launch: the perception stack is the product, the chassis is a carrier. When the same binary, with the same trained weights and the same navigation primitives, runs on a legged agricultural platform in the morning and a warehouse AMR in the afternoon, the cost of entering a new vertical collapses toward the cost of integration work — not the cost of rebuilding perception.
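If the perception stack is the product and the chassis a carrier, the only per-platform code is a thin adapter at the motion interface. A simplified sketch of what that integration boundary looks like in principle (the interface and names below are illustrative, not our actual API):

```python
from abc import ABC, abstractmethod

class ChassisAdapter(ABC):
    """The per-platform integration boundary. Everything above it
    (perception, planning, tagging) is the same binary and weights."""

    @abstractmethod
    def send_velocity(self, vx: float, vy: float, wz: float) -> None:
        """Command body-frame forward, lateral, and angular velocity."""

    @abstractmethod
    def odometry(self) -> tuple:
        """Return the platform's latest (x, y, yaw) estimate."""

class QuadrupedAdapter(ChassisAdapter):
    def send_velocity(self, vx, vy, wz):
        pass  # translate into the legged platform's gait-controller command

    def odometry(self):
        return (0.0, 0.0, 0.0)  # stub: read back from the leg-odometry estimator

class WheeledAMRAdapter(ChassisAdapter):
    def send_velocity(self, vx, vy, wz):
        pass  # translate into differential-drive wheel velocities

    def odometry(self):
        return (0.0, 0.0, 0.0)  # stub: read back from wheel encoders and IMU
```

The cost of entering a new vertical is then the cost of writing and validating one such adapter, which is the collapse toward integration work described above.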
That is the commercial thesis in plain terms. A deployment surface like a solar-farm inspection loop, a warehouse aisle audit, a retail shelf-compliance sweep, or an indoor industrial inspection route is, to RX BRAIN, another instance of the same problem shape. The orchard run is one concrete example of the pattern. We expect to publish additional deployments on the same stack through the rest of the year.