Available with Standard or Advanced license.
Available for an ArcGIS organization with the ArcGIS Reality license.
The data requirements and workflow below explain how to set up a Reality mapping workspace using satellite imagery.
General data requirements
The following are some general data requirements for images to be processed using ArcGIS Reality for ArcGIS Pro.
- Two or more images—The minimum requirement is two highly overlapping single images (not collected as a stereo pair), one stereo pair, or one tri-stereo pair. For increased accuracy, quality, and redundancy, more images are highly recommended.
- Cloud free—The images to be processed must be free of clouds over the project area.
- Highly overlapping images—The project area must be fully covered with overlapping imagery. For multi-sensor datasets, each sensor data type must fully cover the project area.
- Not orthorectified—ArcGIS Reality for ArcGIS Pro will not work with orthorectified images. The following are examples of vendor-specific imagery products suitable for orthorectification using ArcGIS Reality for ArcGIS Pro.
- Maxar—View Ready (Standard) OR2A mono, or View Ready Stereo (Standard) OR2A product.
- Airbus—Primary mono, Primary stereo, or Primary tri-stereo product.
- Associated Rational Polynomial Coefficient (RPC) file—The RPC file is a text file delivered with the satellite imagery by the provider. The RPC file is required to support the geometric correction of satellite images.
Note:
Depending on the satellite imagery vendor, the file may be called an RPC or RPB file, and may sometimes be a .txt file. They all refer to the same type of information—an abstraction of the satellite camera model.
- Single-band or multiband data, including single-band (panchromatic), 3-band (RGB) multispectral, pansharpened, or 4-band (RGB, NIR) multispectral plus panchromatic data to support pansharpening in ArcGIS Reality for ArcGIS Pro.
- 8-bit or 16-bit imagery.
- High sun elevation angle to minimize the effects of shadows in the derived products. A sun elevation angle of 60 degrees or higher is recommended.
- Supported spatial references include Geographic (WGS84), WGS84 UTM, and NAD83 UTM.
- Elevation source—This information provides an initial height reference for computing the block adjustment. This height reference can be derived from a digital elevation model (DEM) or the image metadata, or you can specify an average ground elevation or z-value.
Note:
If you need to perform an RPC adjustment using a DEM as the elevation source, it is recommended that you use a local DEM that has an EGM96 or WGS84 Vertical Coordinate System (VCS). Use the Project Raster tool to reproject your DEM if it has a VCS that is different from EGM96 or WGS84; a scripted example follows this list.
- Optionally, you can download the Global DTM of the Earth locally, which provides access to a global DEM that can be used to support this process. Once installed, this global DEM will replace the ArcGIS World Elevation service as the default DEM when processing satellite imagery using a Reality mapping workspace.
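The DEM reprojection mentioned in the note above can also be scripted with arcpy. The following is a minimal sketch assuming a hypothetical DEM path and an EGM96 vertical coordinate system identified by WKID 5773; verify the paths, WKIDs, and any required transformation against your data and the Project Raster tool documentation.

```python
import arcpy

# Hypothetical input DEM whose vertical coordinate system (VCS) is neither
# EGM96 nor WGS84, and a hypothetical output path for the reprojected DEM.
in_dem = r"C:\data\dem_local_vcs.tif"
out_dem = r"C:\data\dem_egm96.tif"

# Target spatial reference: WGS84 horizontal (WKID 4326) combined with an
# EGM96 geoid vertical coordinate system (WKID 5773, assumed here).
target_sr = arcpy.SpatialReference(4326, 5773)

# Reproject the DEM using the Project Raster geoprocessing tool. Depending on
# the source VCS, a geographic or vertical transformation may also be needed;
# see the Project Raster tool documentation for the available options.
arcpy.management.ProjectRaster(in_dem, out_dem, target_sr, "BILINEAR")
```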
Multi-sensor data requirements
Additional requirements for working with multi-sensor satellite imagery are listed below.
- Images to be processed must be from the same family of satellite sensors. For example, Maxar’s WorldView-3, WorldView-2, and GeoEye-1 imagery can be combined in a single project.
- Images must have the same spatial referencing.
- Images must have the same number of bands.
- Images must have the same bit-depth.
Data requirements for mesh generation
In addition to the above requirements, the following are recommended for mesh generation.
- Sensor angles—It is recommended that the acquired images have varying incidence angles, including images with incidence angles close to nadir (0-5 degrees) and oblique images (up to 20-28 degrees). This ensures good coverage of building facades. For optimal results, the convergence angle (the 3D angle between the incidence angles) of the stereo pairs should be close to 9 degrees.
- Target azimuth angle—A wide target azimuth range covering all look directions around the project area is required for good mesh texturing. See the target azimuth example below.
Note:
Target azimuth angle refers to the angle in degrees from the target to the sensor. The angles range from 0 degrees to 360 degrees in a clockwise direction.
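To illustrate the convention described in the note, the sketch below computes a target azimuth angle from the sensor's offset relative to the ground target in local east/north coordinates. This is a hypothetical helper for illustration only; it is not part of ArcGIS Reality, and the function name and sample offsets are assumptions.

```python
import math

def target_azimuth(d_east, d_north):
    """Return the azimuth from the target to the sensor in degrees,
    measured clockwise from north (0-360), given the sensor's offset
    from the target in local east/north coordinates."""
    return (math.degrees(math.atan2(d_east, d_north)) + 360.0) % 360.0

print(target_azimuth(1000.0, 0.0))       # sensor due east of the target -> 90.0
print(target_azimuth(-1000.0, -1000.0))  # sensor to the southwest -> 225.0
```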
Create a Reality mapping workspace
To create a Reality mapping workspace using satellite imagery, complete the following steps:
- On the Imagery tab, click New Workspace.
- On the Workspace Configuration page, type a name for the workspace.
- Ensure that the Workspace Type option is set to Reality Mapping.
- From the Sensor Data Type drop-down list, choose Satellite.
- Optionally, from the Basemap drop-down list, choose a basemap as a backdrop for the image collection.
- Optionally, check the Allow adjustment reset check box if you want to revert your workspace to a previous state.
You can also import and use an existing image collection or mosaic dataset for your workspace.
- Click Next.
The workflow wizard advances to the Image Collection pane.
- In the Image Collection pane, from the Sensor Type drop-down menu, choose an appropriate sensor type from the list of satellite sensors.
- Under Folder Containing Images, click the Browse button and browse to the folder on disk containing the imagery, select it, and click OK.
The supported Workspace Spatial Reference information will be automatically populated, and the option to add the spatial reference information manually will be unavailable. If the system is unable to automatically determine the appropriate spatial referencing, manual entry of this information will be enabled.
- If the system was unable to automatically determine the appropriate spatial referencing, set it manually. Under Spatial Reference, click the Browse button and set the Current XY and Current Z coordinates for the project.
When processing satellite imagery in Reality mapping, the planimetric (XY) coordinate system must be defined using the WGS84 UTM reference frame, and the Vertical Coordinate System must be WGS84.
- Click OK to close the Spatial Reference window.
- Click Next.
- If you're working with multi-sensor satellite imagery, click the Add Sensor button to add more sensor types.
- Repeat steps 8 and 9 to define the additional sensor parameters.
- Click Next.
The workflow wizard advances to the Data Loader Options pane.
- On the Data Loader tab, define the Elevation Source, or use the default DEM.
- If using a DEM as the Elevation source, set the Geoid Correction.
If your local DEM has ellipsoidal height, select None from the Geoid Correction drop-down list. If your DEM has orthometric height, select EGM96.
- Under Processing Template, choose the appropriate processing template based on your project requirements.
If you want to generate a DSM or DTM, choose the Panchromatic template. If you want to generate a True Ortho or DSM mesh, choose one of the following templates:
- Multispectral template if the data being processed is already pansharpened, or if the imagery consists of multispectral and panchromatic data but a multispectral product is required.
- Panchromatic template when processing panchromatic data.
- Pansharpened template if the data being processed consists of multispectral imagery with associated panchromatic data.
Note:
Pansharpening is an image fusion process that combines a high-resolution panchromatic image with a lower-resolution multispectral image to create a high-resolution multispectral image. A scripted sketch of this operation follows this list of templates.
- All Bands template when adding reflectance satellite imagery data to be used for Digital Surface Model (DSM), true ortho, or mesh generation.
- Multispectral Acomp template if the data being processed consists of multispectral reflectance data that will not be used for DSM, true ortho, or mesh creation.
- Panchromatic Acomp template when processing panchromatic reflectance data that will not be used to support Reality mapping-derived product generation.
- Pansharpened Acomp template if the data being processed consists of pansharpened reflectance data that will not be used for DSM, true ortho, or mesh creation.
Note:
The workspace creation process will fail if an incompatible processing template is used to add the imagery. It is recommended that you review the image metadata file to determine the imagery product type, which will guide your choice of processing template.
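If you need a pansharpened product outside the Reality mapping wizard, the fusion described in the pansharpening note above can also be produced with the Create Pansharpened Raster Dataset geoprocessing tool. The following is a minimal sketch; the file paths, band order, and the Gram-Schmidt method are assumptions for illustration and should be checked against your sensor and the tool documentation.

```python
import arcpy

# Hypothetical inputs: a 4-band (RGB, NIR) multispectral image and its
# higher-resolution panchromatic counterpart from the same acquisition.
multispectral = r"C:\data\scene_ms.tif"
panchromatic = r"C:\data\scene_pan.tif"
out_pansharpened = r"C:\data\scene_pansharpened.tif"

# Band indexes assume the order 1=red, 2=green, 3=blue, 4=NIR.
arcpy.management.CreatePansharpenedRasterDataset(
    multispectral,       # in_raster
    1, 2, 3, 4,          # red, green, blue, and infrared channels
    out_pansharpened,    # out_raster_dataset
    panchromatic,        # in_panchromatic_image
    "Gram-Schmidt")      # pansharpening_type (assumed here)
```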
- Expand the Advanced Options section. The available options will vary based on the Processing Template selected. For example, if an atmospheric compensation (Acomp) type template is selected, the Stretch and Gamma options will not be available; they appear for all other Processing Template types.
- Expand Gamma.
- For Gamma Stretch, select User Defined from the drop-down menu, and enter an appropriate value. For example, 1.7 works well with Maxar and Airbus imagery.
- Expand Pre-processing.
- Ensure the Calculate Statistics option is checked.
Note:
If statistics were previously calculated for the imagery using the Build Pyramids and Statistics geoprocessing tool, this step can be skipped. A scripted example of this tool is included after these steps.
- For Number of Columns to Skip and Number of Rows to Skip, ensure the value is 1.
- Accept all other defaults, and click Finish to create the workspace.
When the Reality mapping workspace is created, the image collection is loaded in the workspace and displayed on the map. You can now perform block adjustments and generate Reality mapping products.
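If you prefer to build pyramids and calculate statistics for the source imagery before creating the workspace, as mentioned in the note on the Calculate Statistics option, the Build Pyramids and Statistics geoprocessing tool can also be run with arcpy. This is a minimal sketch; the folder path is hypothetical, and the keyword values shown should be verified against the tool documentation.

```python
import arcpy

# Hypothetical folder containing the source satellite imagery.
image_folder = r"C:\data\satellite_imagery"

# Build pyramids and calculate statistics for every raster found in the
# folder and its subfolders before loading the imagery into a workspace.
arcpy.management.BuildPyramidsandStatistics(
    image_folder,
    "INCLUDE_SUBDIRECTORIES",
    "BUILD_PYRAMIDS",
    "CALCULATE_STATISTICS")
```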
Related topics
- Reality mapping in ArcGIS Pro
- Add ground control points to a Reality mapping workspace
- Manage tie points in a Reality mapping workspace
- Perform a Reality mapping block adjustment
- Generate multiple products using ArcGIS Reality for ArcGIS Pro
- Introduction to the ArcGIS Reality for ArcGIS Pro extension
- Frequently asked questions