Power mobile augmented reality experiences with ArcGIS Runtime!

We’re proud to announce support for building production-ready augmented reality (AR) experiences for iOS and Android devices with ArcGIS Runtime! The ArcGIS Runtime toolkits for Android, iOS, and .NET (Qt is coming in the near future) have been updated for version 100.6 to include a new AR view component (e.g., ArcGISARView) and API, built on the SceneView and 3D capabilities in ArcGIS Runtime. You can combine this AR view with device sensors, such as the compass and camera, to build flyover, tabletop, and world-scale AR experiences.

Flyover – Allows you to explore a scene using your device as a window into a virtual world. The viewpoint of the camera in the scene responds to changes in the position and orientation of the device. A typical flyover experience starts with the scene’s camera positioned over an area of interest. The user can touch to change position, or move and rotate the device to navigate the virtual world.

A scene that shows flying over an integrated mesh that represents surface structures and vegetation in a remote location.
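
On iOS, a minimal flyover setup might look like the following sketch. It assumes the open-source ArcGIS Runtime Toolkit’s ArcGISARView; the scene, coordinates, and translation factor are placeholders you would replace with your own area of interest.

```swift
import UIKit
import ArcGIS
import ArcGISToolkit

class FlyoverViewController: UIViewController {
    // Flyover doesn't show the real world, so skip the camera feed.
    let arView = ArcGISARView(renderVideoFeed: false)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // Placeholder scene; substitute a scene with your own layers,
        // such as an integrated mesh of the area of interest.
        arView.sceneView.scene = AGSScene(basemapType: .imagery)

        // Start the virtual camera above the area of interest (placeholder coordinates).
        arView.originCamera = AGSCamera(latitude: 45.0, longitude: -110.0,
                                        altitude: 600, heading: 0, pitch: 90, roll: 0)

        // Each meter of physical device movement moves the camera 1,000 m in the scene.
        arView.translationFactor = 1000
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Flyover ignores the device's real-world location; only motion and
        // orientation from the AR framework drive the scene camera.
        arView.startTracking(.ignore)
    }

    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        arView.stopTracking()
    }
}
```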


Tabletop – Visualize a scene at a relatively small scale, similar to a 3D model, anchored to a physical surface in the real world, such as the top of a table. Scene content overlays a camera feed that represents the real, physical world. Users can move around the tabletop to view the scene from different angles. This experience can be ideal for visualizing and collaborating around a 3D view of a remote location or work site for strategic planning, site selection, and remote assessment scenarios.

A scene showing buildings and imagery in a tabletop AR experience.
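
A tabletop experience adds two ingredients to the flyover setup: the camera feed shows through the scene, and the user taps a detected physical surface to anchor the scene on it. The sketch below assumes the iOS toolkit’s ArcGISARView; the scene URL and camera coordinates are hypothetical placeholders.

```swift
import UIKit
import ArcGIS
import ArcGISToolkit

class TabletopViewController: UIViewController {
    // Render the device camera feed behind the scene.
    let arView = ArcGISARView(renderVideoFeed: true)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // Placeholder web scene URL; substitute your own scene item.
        arView.sceneView.scene = AGSScene(
            url: URL(string: "https://www.arcgis.com/home/item.html?id=YOUR_SCENE_ID")!)

        // Let the camera feed show through and hide the atmosphere,
        // so only the scene content appears on the tabletop.
        arView.sceneView.spaceEffect = .transparent
        arView.sceneView.atmosphereEffect = .none

        // Anchor the virtual camera at the area of interest (placeholder
        // coordinates) and shrink the scene to model scale:
        // 1 m of physical movement = 1,000 m of virtual movement.
        arView.originCamera = AGSCamera(latitude: 34.05, longitude: -118.24,
                                        altitude: 0, heading: 0, pitch: 90, roll: 0)
        arView.translationFactor = 1000

        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        arView.addGestureRecognizer(tap)
    }

    // Tap a detected physical surface (e.g., a table) to place the scene on it.
    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        _ = arView.setInitialTransformation(using: gesture.location(in: arView))
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        arView.startTracking(.ignore) // tabletop doesn't need the device's GPS position
    }
}
```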


World-scale – Visualize a scene at world scale, where virtual content in the scene and physical content from a camera feed are represented at full scale, in the same real-world coordinate system. The location and orientation of the scene camera and device camera must match so that the real-world location and proportional scale of virtual and physical content match. This experience can be ideal for visualizing and interacting with geographic data on-site, in the field, in the real (3D) world. This includes workflows to support asset inspection and creation, navigation, resource assessment, and location intelligence.

A scene showing sub-surface utility infrastructure in a world-scale AR experience.
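
Configuring a world-scale experience differs from tabletop mainly in scale and location tracking: the translation factor is 1 (full scale) and the device’s position comes from a location data source. A hedged sketch, again assuming the iOS toolkit’s ArcGISARView and Core Location-based AGSCLLocationDataSource:

```swift
import ArcGIS
import ArcGISToolkit

func configureWorldScale(_ arView: ArcGISARView) {
    // Show the camera feed behind the scene and remove the atmosphere,
    // so virtual features overlay the physical world.
    arView.sceneView.spaceEffect = .transparent
    arView.sceneView.atmosphereEffect = .none

    // Full scale: 1 m of physical movement = 1 m of virtual movement.
    arView.translationFactor = 1

    // Position the scene camera from the device's location (GNSS via Core Location).
    arView.locationDataSource = AGSCLLocationDataSource()

    // .continuous keeps updating from the location data source; .initial takes
    // one fix and lets the AR framework track relative movement from there.
    arView.startTracking(.continuous)
}
```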


The most significant challenge in building and delivering a world-scale AR experience is maintaining an appropriate level of accuracy for the position and orientation of the device. Ideally, the overlay of virtual content (from the scene camera) and physical content (from the device camera) matches perfectly, and as the device moves in the real world, the position and orientation of both cameras remain in sync. In reality, limitations of devices and AR frameworks, and differences in the precision and availability of positioning systems, will usually require your AR workflows to include an initial calibration step to improve overlay accuracy. Fortunately, the ArcGIS Runtime toolkits are open source, so you, as the developer, can enhance the ArcGISARView component to use one or a combination of techniques to drive the calibration experience and maintain position and orientation. For example, you can use a device’s built-in GNSS (e.g., GPS) to get an initial position, provide a UI to adjust the position of the virtual surface and features so they overlay the real world, then use an AR framework (ARCore, ARKit) to handle changes in relative position. AR samples included with the ArcGIS Runtime SDKs provide initial examples of calibration experiences to help you get started.
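
One common calibration approach is to let the user nudge the origin camera’s heading and elevation (e.g., with slider controls) until virtual features line up with the real world. The helpers below are a hypothetical sketch of that idea for the iOS toolkit; the method names on AGSCamera (rotate(toHeading:pitch:roll:), elevate(withDeltaAltitude:)) come from the ArcGIS Runtime API, but the extension itself is illustrative.

```swift
import ArcGIS
import ArcGISToolkit

// Hypothetical calibration helpers, intended to be wired to UI controls.
// Adjusting the origin camera shifts all virtual content relative to the
// real world; the AR framework keeps tracking relative movement from there.
extension ArcGISARView {
    /// Rotate the origin camera's heading by `delta` degrees.
    func adjustHeading(by delta: Double) {
        let camera = originCamera
        originCamera = camera.rotate(toHeading: camera.heading + delta,
                                     pitch: camera.pitch,
                                     roll: camera.roll)
    }

    /// Raise or lower the origin camera by `delta` meters.
    func adjustElevation(by delta: Double) {
        originCamera = originCamera.elevate(withDeltaAltitude: delta)
    }
}
```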

It’s also important to consider the readiness of your geographic data for a world-scale AR experience. Since your geographic data needs to overlay the physical, 3D world, its accuracy in both horizontal (e.g., latitude/longitude) and vertical (e.g., elevation) space matters. In some cases, the location of features may have been established using coarse-grained workflows, and/or elevation information was not included or was derived from a low-resolution elevation surface. The accuracy required of your geographic data may differ depending on how it will be used. For example, finding the general location of a gas meter on a large property may only require ~10 m accuracy, whereas determining where it’s safe to dig to avoid a sub-surface gas pipeline may require centimeter-level accuracy. In short, consider what is needed to achieve the appropriate level of positional accuracy for both the device and your geographic data in any world-scale AR experience.

To get started, use the guides in the following ArcGIS Runtime SDKs. Each guide discusses AR capabilities, presents code examples, and directs you to the toolkits and samples that help you get up and running.

Also, we would like to thank the more than 350 participants in the ArcGIS Runtime AR/VR Beta program. Your feedback was critical to delivering a developer framework that powers AR experiences with ArcGIS Runtime. Thank you!

Enjoy!

About the authors

Rex Hansen

Rex Hansen is a Product Manager for ArcGIS Runtime. He has over 25 years of experience in GIS, spatial analytics, and computer mapping. Recently, he has helped guide the development of native solutions and technologies in the GIS industry to use authoritative geospatial data in immersive, extended reality experiences.

Nick Furness

Nick Furness is a Technical Product Manager for the ArcGIS Runtime SDKs for iOS and macOS. He's spent over 20 years working in the GIS space building projects ranging from small mom-and-pop solutions all the way up to enterprise utility and national government deployments. Nick presents at various Esri Developer Summits, the User Conference, and many other events, almost always talking about something to do with the Runtime SDKs although you might find the odd bit of JavaScript thrown in there.
