The integration of augmented reality (AR), extended reality (XR), and virtual reality (VR) technologies in agriculture has shown significant promise for enhancing agricultural practices. Mobile robots have likewise been adopted as assessment tools in precision agriculture, improving economic efficiency and productivity while mitigating undesired factors such as weeds and pests. Despite considerable work on both fronts, combining the versatile user interface (UI) offered by an AR headset with direct interaction with, and control of, a mobile field robot has not yet been fully explored or standardized. This work addresses that gap by providing real-time data input and control output for a mobile precision-agriculture robot through a virtual environment rendered by an AR headset. The system builds on open-source computational tools and off-the-shelf hardware for effective integration. Distinctive case studies are presented in which growers or technicians interact with a legged robot through the AR headset's UI: users can teleoperate the robot to gather information in an area of interest, request real-time graphs of an area's status, or dispatch the robot to navigate autonomously to selected areas for measurement updates. The proposed system employs a custom local navigation method that combines a fixed holographic coordinate system with QR codes. This step toward fusing AR and robotics in agriculture aims to provide practical solutions for real-time data management and control through human-robot interaction. The implementation can be extended to a variety of robot applications in agriculture and beyond, promoting a unified framework for on-demand and autonomous robot operation in the field.
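As a rough illustration of how a QR-code anchor can tie the headset's holographic frame to the robot's map frame, the planar sketch below composes the QR code's pose as observed in each frame to transform a goal selected in the AR UI into robot map coordinates. All poses, names, and values here are assumptions for illustration, not details taken from the paper's implementation.

```python
import numpy as np

def pose_to_matrix(position, yaw):
    """Build a 2D homogeneous transform (3x3) from a position and heading.

    Hypothetical helper for illustration: planar (x, y, yaw) case only.
    """
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, position[0]],
                     [s,  c, position[1]],
                     [0.0, 0.0, 1.0]])

# Assumed example values: the same QR code's pose as observed in the
# headset's holographic frame and in the robot's map frame.
T_holo_qr = pose_to_matrix((2.0, 1.0), np.deg2rad(30))   # QR in holographic frame
T_map_qr = pose_to_matrix((5.0, -3.0), np.deg2rad(90))   # QR in robot map frame

# The shared QR anchor yields the holographic-to-map transform:
# T_map_holo = T_map_qr @ inv(T_holo_qr).
T_map_holo = T_map_qr @ np.linalg.inv(T_holo_qr)

# A navigation goal the user selects in the holographic UI (homogeneous coords).
goal_holo = np.array([4.0, 2.5, 1.0])
goal_map = T_map_holo @ goal_holo
print("Goal in robot map frame:", goal_map[:2])
```

Because both the headset and the robot can observe the same QR marker, an anchor of this kind lets each device keep its native coordinate system while still agreeing on where a selected goal lies in the field.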