
Positions of GPS Satellites in 3D

Calculating the 3D positions of GPS satellites can be challenging due to the complex trigonometry involved in translating 2D coordinates plus angular directions into 3D space. In this article, let’s explore how to calculate the positions and display satellite locations in 3D, in real time, and even in Augmented Reality!

This is a revision of a previous post I wrote in the Esri Community – Native SDKs Blog in 2021. The main concepts remain the same, while some new features are added to the project, and the codebase is migrated to adopt the new 200.x ArcGIS Maps SDK for Swift.

Concepts

Handheld GPS/GNSS receivers are widely used devices that leverage signals from satellites to determine precise geographic locations. The receivers are used in a variety of settings such as navigation, outdoor recreation, surveying, and scientific research, and they connect easily to computers and phones. They often use the NMEA protocol to communicate location data and other satellite-derived information in a standardized format. This protocol allows GPS receivers to be integrated with mapping software, expanding their usability beyond standalone navigation to include real-time data analysis and visualization.

A typical representation of a satellite tracking view on a mobile device.

Many handheld GNSS/GPS receivers support tracking satellite positions in 2D (schematically demonstrated in the diagram above). They typically use $GPGSV sentences from the NMEA (National Marine Electronics Association) specifications, which describe satellite positions through elevation and azimuth angles. These angles enable the calculation of approximate satellite positions in the sky by factoring in the receiver’s location coordinates, Earth’s radius, GPS satellite orbit altitude, elevation angle, and azimuth angle. On certain devices, each satellite shows up as a signal bar; other devices use 2 concentric circles to depict the earth and the satellites’ orbit, and plot the satellites on that graph. These graphics are OK in terms of giving people a general sense of where the satellites are currently located, but they are neither intuitive nor informative in 3D space.
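For context, a $GPGSV sentence packs up to 4 satellites per sentence, each described by a PRN number, elevation, azimuth, and signal-to-noise ratio. Below is a minimal, illustrative Swift sketch of pulling those angles out of a single sentence; the SatelliteInView type and the sample sentence are for demonstration only, not the demo app’s actual NMEA handling.

import Foundation

/// An illustrative model of one satellite entry in a $GPGSV sentence.
struct SatelliteInView {
    let id: Int            // satellite PRN number
    let elevation: Double  // degrees above the horizon, 0–90
    let azimuth: Double    // degrees clockwise from true north, 0–359
}

/// Parses the satellite entries out of a single $GPGSV sentence.
/// Real NMEA handling also needs checksum validation and grouping of
/// multi-sentence cycles; this sketch skips both.
func parseGSV(_ sentence: String) -> [SatelliteInView] {
    // Drop the trailing checksum, then split into comma-separated fields.
    let payload = sentence.split(separator: "*").first ?? ""
    let fields = payload.split(separator: ",", omittingEmptySubsequences: false)
    guard fields.first == "$GPGSV", fields.count > 4 else { return [] }

    var satellites: [SatelliteInView] = []
    // After the 4 header fields, each satellite occupies 4 fields:
    // PRN, elevation, azimuth, SNR.
    var index = 4
    while index + 2 < fields.count {
        if let id = Int(fields[index]),
           let elevation = Double(fields[index + 1]),
           let azimuth = Double(fields[index + 2]) {
            satellites.append(SatelliteInView(id: id, elevation: elevation, azimuth: azimuth))
        }
        index += 4
    }
    return satellites
}

// A commonly cited example sentence: a cycle of 3 sentences, this is #1, 11 satellites in view.
let sample = "$GPGSV,3,1,11,03,03,111,00,04,15,270,00,06,01,010,00,13,06,292,00*74"
print(parseGSV(sample).map(\.id))  // [3, 4, 6, 13]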

To gain that 3D perspective, let’s look more at the elevation and azimuth angles provided in the $GPGSV sentences from the NMEA specs.

What are these angles? Imagine yourself standing in an open field (depicted by the gray plane in the image below), holding both a GPS receiver and a compass. Azimuth (the blue vector) tells us which direction to face, and Elevation (the red vector) tells us how high up in the sky to look. Turning from true north and raising your head, you can gaze in the approximate direction of the satellite in the sky with these 2 angles. These 2 values are simple to interpret for someone standing on the surface of the earth. However, things get a bit trickier when it comes to mapping software.

Concepts of Azimuth and Elevation angles in 3D space

How do we represent azimuth and elevation on a map, or a 3D globe? There are a few known variables. The device location is represented as a pair of latitude-longitude (lat-lon) coordinates, which is a point on the earth’s surface. We also know where to look with these 2 angles. In a simplified model, the earth can be treated as a sphere (rather than an ellipsoid, which would complicate the math by quite a bit). Thus, we don’t need to take the variation of the earth’s radius into account; it is treated as a constant 6371 km.

The real problem is – how to convert our lat-lon coordinates on the surface of the globe into the 3D coordinate system of a sphere. Some trigonometry has to be done.

Methods

For the technical discussion below, I’ll use Swift and related Apple APIs to demonstrate the idea.

Geometry, Matrices and Rotations

My first thought was to convert all the lat-lon coordinates into 3D Cartesian coordinates of the sphere. However, it is quite complicated and error-prone to transform all the angles into xyz‘s. The lat-lon coordinates aren’t too bad – just a few rectangular to polar conversions. The real pain is the azimuth and elevation angles – to represent them in Cartesian coordinates, a lot of vector and polar math needs to be done. In contrast, the problem can seemingly be expressed succinctly in a spherical coordinate system, which led me toward vector operations and matrices. The resulting coordinates of the satellite are the sum of 2 vectors: one from the center of the earth to the device location, and another from the device location to the satellite in orbit, expressed through the azimuth and elevation angles.

Finding it hard to visualize the spatial relationships of these vectors in the spherical coordinates, I decided to sketch the whole system in 3D to make it clearer. Using the online geometry sketching tool GeoGebra, I was able to create a spatial representation of the satellite and the earth.

Diagram to show what is an Elevation Angle

In order to calculate the coordinates, vanilla Cartesian transformations aren’t enough. Quaternions are introduced to express a position and orientation in space, and the SIMD library is used so that small vectors and matrices can be expressed easily and computed quickly.

Quaternions

Quaternions are defined by a scalar (real) part and three imaginary parts collectively called the vector part. They are often used in graphics programming as a compact representation of the spatial rotation of an object in three dimensions. One of the advantages is that they don’t suffer from singularity points (or gimbal locks), which are hard to avoid in Cartesian space when compounding rotations. Furthermore, they are quicker to compute with than matrix representations. For instance, the lat-lon coordinates can be effectively represented by quaternions.

Consider a unit vector of the earth’s radius, with its starting point at the center of the earth and terminal point at 0° East on the equatorial plane. By rotating this vector about the y-axis by the latitude degrees, and about the z-axis by the longitude degrees, we will have a vector pointing at the lat-lon position on the surface of the globe.

SIMD

Swift has a handy simd library. simd provides integer and floating-point types and functions for small vector and matrix computations. The functions provide basic arithmetic operations, element-wise mathematical operations, and geometric and linear algebra operations. It supports vectors and matrices up to 4×4 elements, which is a perfect fit for quaternion math.
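Putting the two ideas together, here is a minimal sketch of rotating the x unit vector to a lat-lon position with simd quaternions. It assumes a right-handed frame with the x axis through 0° N, 0° E and the z axis through the North Pole; the exact sign conventions in the project may differ.

import simd

/// Rotates the x unit vector (pointing at 0° N, 0° E) to a lat-lon position
/// on the unit sphere using two quaternion rotations.
func latLonUnitVector(latitudeDegrees: Double, longitudeDegrees: Double) -> simd_double3 {
    let latitude = latitudeDegrees * .pi / 180
    let longitude = longitudeDegrees * .pi / 180

    // Tilt up from the equatorial plane by the latitude (rotation about the y axis).
    let latitudeRotation = simd_quatd(angle: -latitude, axis: simd_double3(0, 1, 0))
    // Swing around the pole by the longitude (rotation about the z axis).
    let longitudeRotation = simd_quatd(angle: longitude, axis: simd_double3(0, 0, 1))

    // Apply the latitude rotation first, then the longitude rotation.
    return (longitudeRotation * latitudeRotation).act(simd_double3(1, 0, 0))
}

// Example: a point at 34° N, 117° W.
let v = latLonUnitVector(latitudeDegrees: 34, longitudeDegrees: -117)
// v.z equals sin(34°), and (v.x, v.y) points toward 117° W on the equatorial plane.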

The Quest to Find “The 3rd Axis”

Using quaternions, we are able to represent the lat-lon coordinates with 2 rotations, using the y and z axes as the first and second axes. To get the vector that points at the satellite from the lat-lon location, we need a third axis to incorporate the azimuth and elevation angles. This is the trickiest math in this problem. To get there, we’ll need to rotate a vector according to the elevation angle, and the axis for that rotation itself needs to be obtained by rotating another vector by the azimuth angle. To better denote it, I’m calling this vector the 3rd axis.

It is hard to describe it in words, so please use the visualization on GeoGebra to see it in action. Toggle the graphics on the left pane to see their spatial relationships.

With 3 axes, 4 quaternions and all these mysterious steps, how do we ensure the result is correct? Luckily, with the visuals created in GeoGebra, I can work backwards from the result to recover the azimuth and elevation values by specifying the other parameters, namely latitude, longitude, earth radius and orbit radius. So, if my calculation is correct, the resulting satellite coordinates should be the same as in GeoGebra.

Algorithm

import ArcGIS
import Foundation
import simd

/// Calculates a satellite's position from its elevation and azimuth angles.
/// - Parameters:
///   - deviceLocation: The location of the device that receives satellite info.
///   - elevation: The elevation angle.
///   - azimuth: The azimuth angle.
///   - satelliteAltitude: The satellite's altitude above ground level.
///   - earthRadius: The earth's radius.
/// - Returns: A point representing the position of the satellite.
func satellitePosition(
    deviceLocation: Point,
    elevation: Measurement<UnitAngle>,
    azimuth: Measurement<UnitAngle>,
    satelliteAltitude: Measurement<UnitLength> = .init(value: 20200, unit: .kilometers),
    earthRadius: Measurement<UnitLength> = .init(value: 6371, unit: .kilometers)
) -> Point {
 
    // Degree to radian conversions.
    let longitude = Measurement<UnitAngle>(value: deviceLocation.x, unit: .degrees).converted(to: .radians).value
    let latitude = Measurement<UnitAngle>(value: deviceLocation.y, unit: .degrees).converted(to: .radians).value
    let elevation = elevation.converted(to: .radians).value
    let azimuth = azimuth.converted(to: .radians).value

    // 1. Basic trigonometry
    let r1 = earthRadius.converted(to: .kilometers).value
    let r2 = r1 + satelliteAltitude.converted(to: .kilometers).value
    let elevationAngle = elevation + .pi / 2
    let inscribedAngle = asin(r1 * sin(elevationAngle) / r2) // Law of Sines
    let centralAngle = .pi - elevationAngle - inscribedAngle // Internal angles sum = pi

    // 2. Magical quaternions
    let latLonAxis = simd_normalize(
        sphericalToCartesian(r: r1, theta: .pi / 2 - latitude, phi: longitude)
    )
 
    // Find the vector between device location and North point (z axis)
    var thirdAxis = simd_normalize(
        simd_double3(0, 0, 1 / latLonAxis.z) - latLonAxis
    )
 
    // Rotate to the azimuth angle
    let quaternionAzimuth = simd_quatd(
        angle: 1.5 * .pi - azimuth,
        axis: latLonAxis
    )
    thirdAxis = quaternionAzimuth.act(thirdAxis)
 
    // Rotate the lat-lon vector to elevation angle, with the rotated third axis
    let quaternionElevation = simd_quatd(
        angle: -centralAngle,
        axis: thirdAxis
    )
    let resultVector = quaternionElevation.act(latLonAxis)

    // 3. Make result
    let result = cartesianToSpherical(vector: resultVector)
    let resultLatitude = Measurement(
        value: .pi / 2 - result.theta,
        unit: UnitAngle.radians
    ).converted(to: .degrees).value

    let resultLongitude = Measurement(
        value: result.phi,
        unit: UnitAngle.radians
    ).converted(to: .degrees).value

    return Point(
        x: resultLongitude,
        y: resultLatitude,
        z: satelliteAltitude.converted(to: .meters).value,
        spatialReference: .wgs84
    )
}

The method to get the satellite’s position can be split into 3 parts – basic trigonometry, quaternion rotation, and coordinate system conversion.

Annotated angles in the elevation tangent plane
Various angles for calculation, visualized

First, we need some basic trigonometry on the elevation plane, i.e. the plane where the elevation angle exists. Using the Law of Sines, we can get the value of the central angle, assuming both the earth’s radius and the satellite’s orbital distance are constants.
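As a quick, hypothetical sanity check of this step (the numbers below are illustrative, not from the article), plugging round values into the same relations shows the shape of the computation:

import Foundation

// Hypothetical worked example of the triangle formed by the earth's center,
// the device, and the satellite.
let r1 = 6371.0                    // earth radius, km
let r2 = r1 + 20200.0              // satellite orbit radius, km
let elevation = 45.0 * Double.pi / 180

// The interior angle of the triangle at the device is elevation + 90°.
let elevationAngle = elevation + Double.pi / 2
// Law of Sines: sin(inscribedAngle) / r1 == sin(elevationAngle) / r2.
let inscribedAngle = asin(r1 * sin(elevationAngle) / r2)
// Interior angles sum to π, so the remaining angle at the earth's center is:
let centralAngle = Double.pi - elevationAngle - inscribedAngle
print(centralAngle * 180 / Double.pi)  // roughly 35°: the arc between the device and the sub-satellite point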

Second, the rotations.

  • Rotate the x unit vector (0° East) to get the latLonAxis vector, which starts at the center of the earth and ends at our lat-lon point on the earth’s surface.
  • In order to find the axis for the elevation angle rotation, we first need to create the thirdAxis – a vector that runs perpendicular to the rotated azimuth vector. The azimuth vector itself is a line on a tangent plane of the earth (whose point of tangency is the lat-lon point) that, by definition, starts at the lat-lon position and points toward 0° North, where it intersects the z axis.
  • Ensure the sign of these angular values matches the direction of rotation.

Third, convert the Cartesian coordinates back to the (geographic) spherical coordinates.
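The satellitePosition function also relies on two small conversion helpers that aren’t shown above. A minimal sketch of what they presumably look like, consistent with how they are called (theta is the polar angle measured from the +z axis, i.e. π/2 minus the latitude, and phi is the longitude), follows:

import Foundation
import simd

/// Converts spherical coordinates (r, theta, phi) to a Cartesian vector.
/// theta is the polar angle from the +z axis; phi is the angle from the +x axis.
func sphericalToCartesian(r: Double, theta: Double, phi: Double) -> simd_double3 {
    simd_double3(
        r * sin(theta) * cos(phi),
        r * sin(theta) * sin(phi),
        r * cos(theta)
    )
}

/// Converts a Cartesian vector back to spherical coordinates.
func cartesianToSpherical(vector: simd_double3) -> (r: Double, theta: Double, phi: Double) {
    let r = simd_length(vector)
    return (r, acos(vector.z / r), atan2(vector.y, vector.x))
}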

There are a few possible ways to test the algorithm’s correctness. The most accurate and straightforward way is to run the algorithm and compare the result with the satellite coordinates from GeoGebra. One can also run some ad-hoc testing with real data from a GPS receiver. For example, we should expect all the satellites in view to be above, rather than below, the horizon. And if a satellite with a very small elevation angle does show up just above the horizon (and vice versa), then it’s very likely that we’ve done it right!
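Another quick ad-hoc check (a sketch that assumes, as in the function above, that the device Point’s x and y hold longitude and latitude in WGS84): a satellite reported straight overhead at 90° elevation should land essentially at the device’s own lat-lon.

import ArcGIS
import Foundation

// Approximate device location in Redlands, CA.
let device = Point(x: -117.19, y: 34.05, spatialReference: .wgs84)

// A satellite at 90° elevation is straight overhead, so the computed position
// should share (nearly) the same longitude and latitude as the device.
let overhead = satellitePosition(
    deviceLocation: device,
    elevation: Measurement(value: 90, unit: .degrees),
    azimuth: Measurement(value: 0, unit: .degrees)
)
print(overhead.x, overhead.y)  // expected to be very close to -117.19 and 34.05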

Edge Case: 0° Latitude

When the device’s lat-lon location is on the equatorial plane, where the latitude equals 0° (say we are in Quito, Ecuador), the tangent plane at that point is parallel to the z axis, which leads to a division-by-zero problem that introduces some ambiguity. Having searched online, I cannot find a definitive answer on what the azimuth angle is for a location on the equator. Thanks to one of the comments on the old article, Anthony proposed a workaround to add a small deviation when the latitude is zero, and I think it is a valid workaround.

var thirdAxis: simd_double3
if latLonAxis.z == 0 {
    // To avoid division by zero error.
    thirdAxis = simd_double3(0, 0, 1)
} else {
    thirdAxis = simd_normalize(simd_double3(0, 0, 1 / latLonAxis.z) - latLonAxis)
}

The other way is to handle the division by zero as a special case. As we approach the equator from the Northern Hemisphere, the limit of the third axis is approximately the z unit vector. If the azimuth angle reported by the GPS receiver is as expected, the result should be correct; it can also be totally wrong if the reported azimuth angle is merely a default fallback value for this edge case.
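For comparison, the small-deviation workaround mentioned earlier could be as simple as nudging the latitude slightly off the equator before building latLonAxis. This is only a sketch of that idea; the epsilon value is arbitrary.

// Inside satellitePosition(...), before computing latLonAxis:
let epsilon = 1e-9
let safeLatitude = latitude == 0 ? epsilon : latitude
// ...then pass safeLatitude (instead of latitude) to sphericalToCartesian.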

Results

The results are verified and displayed using the ArcGIS Maps SDK for Swift, which provides 3D and Augmented Reality (AR) visualization of the calculations. The demo app is tested with a Bad Elf GPS Pro+ GNSS surveyor. It also includes test NMEA data that features a 2-minute driving trip around Redlands, CA.

Satellite 3D demo app screenshot
Screenshots of the Satellite demo app in 3D (left) and real-time satellite locating by thumb using AR (right)!
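To give a sense of how the computed points can be put on screen, here is a heavily simplified, hypothetical sketch of a SwiftUI view that shows them as markers in a 3D scene. The view name, symbol, and setup are illustrative assumptions, not the demo app’s actual code; see the linked project for the real implementation.

import ArcGIS
import SwiftUI

struct SatelliteSceneView: View {
    /// Points produced by `satellitePosition(...)`, one per satellite in view.
    let satellitePoints: [Point]

    @State private var scene = ArcGIS.Scene(basemapStyle: .arcGISImagery)

    var body: some View {
        // One marker per satellite. A real app would also configure the overlay's
        // surface placement so the points' z values (orbit altitude) are honored.
        let overlay = GraphicsOverlay(
            graphics: satellitePoints.map { point in
                Graphic(
                    geometry: point,
                    symbol: SimpleMarkerSymbol(style: .circle, color: .red, size: 10)
                )
            }
        )
        SceneView(scene: scene, graphicsOverlays: [overlay])
    }
}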

When thinking about satellites, people often perceive them as unrelatable high-tech objects that are only accessible to top officials and scientists. In other words, only professionals have the technical know-how to operate them. And even for them, trying to visualize the satellites in 3D is laborious. One of the less explained topics is how to solve the mathematical problem, specifically the trigonometry, of converting receiver data (receiver lat-lon coordinates, Earth radius, GPS satellite orbit altitude, elevation angle, and azimuth angle) into lat-lon-z coordinates to represent them in 3D mapping, which is exactly what this article set out to do.

At first, transforming the azimuth-elevation representation of a satellite position into its lat-lon location seems like an insignificant chore. After all, there are only 2 values to be transformed – how hard can it be? It turns out to be a quite interesting topic that involves many concepts for translating one spatial representation into another. The resulting algorithm is still quite concise, thanks to the use of quaternions.

People love to see the satellites in a perceivable graphical representation! With today’s advancement in 3D mapping technology, it’s never been easier to visualize those remote objects in space at our fingertips. When I first presented the results, people were astounded by the fact that these satellites are just there, orbiting above our heads and working tirelessly to connect everyone, everywhere.

Best of all, you can easily try out the app shown in this article for free! Follow the link at the end to download the project and trace the stellar whispers of satellites as they dance across deep space, weaving secrets of the cosmos just for you.

Summary

In this article, we have explored the fascinating challenge of calculating and visualizing GPS satellite positions in 3D. We tackled unique technical hurdles, such as converting azimuth and elevation angles into 3D lat-lon coordinates, leveraging quaternions for rotations, and simplifying complex spatial math with Swift’s SIMD library. In the end, we created a SwiftUI-based app that uses ArcGIS Maps SDK for Swift to display satellite positions in an intuitive 3D scene. To dive deeper into the code and learn more, check out the GitHub repository and ArcGIS Maps SDK documentation.

If you’re interested in solving GIS-related technical problems like these, consider exploring opportunities at Esri Careers.

Bonus: Augmented Reality

Since we have done all the calculations to pinpoint the locations of the satellites flying in the sky, why not go one step further and show them in Augmented Reality? Point the device’s camera at the sky, and you can see the satellite positions rendered in real time with AR. It makes a great educational tool for people who want to get a better sense of GPS satellites.

In the app, I added leader lines between the device location and the satellites. People can look through the AR camera and use a finger to follow along the approximate direction of a satellite. While it might be less accurate due to the general limitations of AR today, it is still a fun experience and would be helpful if you are trying to observe the satellites with a telescope ;-).
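The leader line itself can be a simple line graphic between the two points. Here is a hypothetical sketch; the dash style and color are arbitrary choices, not necessarily what the demo app uses.

import ArcGIS
import UIKit

/// Builds a dashed leader-line graphic from the device location to a
/// satellite's computed position.
func leaderLineGraphic(from deviceLocation: Point, to satellitePoint: Point) -> Graphic {
    let line = Polyline(points: [deviceLocation, satellitePoint])
    return Graphic(
        geometry: line,
        symbol: SimpleLineSymbol(style: .dash, color: .cyan, width: 2)
    )
}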

About the author

Ting Chen is a Senior Product Engineer with the Maps SDK team at Esri, primarily focusing on software development with Swift on iOS and macOS. He also contributes to the Esri GeoNet Community and the ArcGIS Developer website.
