
ArcGIS Video Server 11.4 system requirements

For a production environment, user and business needs for the software vary. Consider those needs when determining hardware requirements to meet performance and scalability expectations. The minimum requirements listed below will run the software but may not perform well.

Hardware requirements

Item | Supported and recommended
CPU | Minimum: 2 cores with simultaneous multithreading
Storage | Minimum: 200 GB of free space*
Memory/RAM | Minimum: 8 GB
Dedicated (not shared) graphics memory | Minimum: 6 GB or more

Simultaneous multithreading (hyperthreading) typically provides 2 threads per core, so a multithreaded 2-core CPU has 4 threads available for processing and a multithreaded 6-core CPU has 12 threads available for processing.

If you're using a notebook computer with an integrated GPU, consider increasing the system RAM to compensate for the use of shared memory.

*Storage requirements for Video Server depend on how video storage is configured; see the Video Storage section below.
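As a quick self-check against these minimums, the following Python sketch reads the logical CPU count, total RAM, and free disk space. It assumes the third-party psutil package is installed and that C:\ is the drive that will hold Video Server data; adjust both for your environment.

```python
# Minimal sketch: compare this machine against the stated minimums.
# Assumes the third-party psutil package is installed (pip install psutil)
# and that C:\ is the drive that will hold Video Server data; adjust as needed.
import os
import psutil

logical_cpus = os.cpu_count()              # threads, e.g. 4 on a 2-core SMT CPU
total_ram_gb = psutil.virtual_memory().total / 1024**3
free_disk_gb = psutil.disk_usage("C:\\").free / 1024**3

print(f"Logical CPUs : {logical_cpus} (a 2-core SMT CPU reports 4)")
print(f"RAM          : {total_ram_gb:.1f} GB (minimum 8 GB)")
print(f"Free disk    : {free_disk_gb:.1f} GB (minimum 200 GB)")

if logical_cpus < 4 or total_ram_gb < 8 or free_disk_gb < 200:
    print("Warning: this machine is below the documented minimums.")
```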

GPU requirements

It is highly recommended that you install ArcGIS Video Server in an NVIDIA GPU environment that supports NVENC (encoding) and NVDEC (decoding) for optimal publishing, encoding, use, and dissemination of video services. Driver version 531.61 or higher is required. See the NVIDIA support matrix for a complete list of GPU cards that support video encoding and decoding.

Video Server can be installed on a machine without a GPU, but certain functionality is lost and streaming performance is affected. Without a GPU, you cannot select output resolutions when publishing on-demand video, cannot publish video files encoded with codecs other than H.264 and H.265, and cannot rotate video from landscape to portrait orientation; you may also experience latency or buffering when viewing video services.

Item | Supported and recommended
GPU type | NVIDIA GPU with CUDA compute capability 12.1 or above. See the list of CUDA-enabled cards to determine the compute capability of a GPU.
GPU driver | NVIDIA GPU drivers: version 531.61 or later is required.
Dedicated graphics memory | Minimum: 6 GB or more.

Note:

An out-of-date GPU driver may cause encoding and decoding issues or cause the server to report that it has no GPU. Verify that you have up-to-date GPU drivers provided directly by NVIDIA.
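One way to confirm that a machine exposes a usable NVIDIA driver of the required version is to query nvidia-smi, which is installed alongside the driver. A minimal sketch, assuming nvidia-smi is on the PATH; it checks only the driver version, not NVENC/NVDEC support:

```python
# Minimal sketch: verify the NVIDIA driver version meets the 531.61 minimum.
# Assumes nvidia-smi (installed with the NVIDIA driver) is on the PATH.
import subprocess

MINIMUM = (531, 61)

try:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
except (FileNotFoundError, subprocess.CalledProcessError):
    print("No usable NVIDIA driver detected; Video Server will run without GPU features.")
else:
    version = tuple(int(part) for part in out.splitlines()[0].split("."))
    status = "OK" if version >= MINIMUM else "too old"
    print(f"NVIDIA driver {out.splitlines()[0]}: {status} (minimum 531.61)")
```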

Support and recommendations

There are numerous considerations when publishing video, ranging from supported video file formats to metadata telemetry requirements. Each of these affects how the video is published and stored.

Supported video file formats

The supported video formats, including high resolution 4K formats, are listed in the following table:

Description | Extension
MOV file | .mov
MPEG-2 Transport Stream | .ts
MPEG-2 Program Stream | .ps
MPEG file | .mpg
MPEG-2 file | .mpg2
MPEG-2 file | .mp2
MPEG file | .mpeg
VLC file (MPEG-2) | .mpeg2
MPEG-4 movie | .mp4
MPEG-4 file | .mpg4
H.264 video file | .h264
H.265 video file | .h265
VLC media file (MPEG-4) | .mpeg4
VLC media file (VOB) | .vob

Supported video codec formats

The supported video codecs are H.264 and H.265, which HLS supports natively and which do not require a GPU for encoding, along with AV1, MPEG-1, MPEG-2, MPEG-4, WMV, and MJPEG.
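Before uploading a video, you can check its codec with ffprobe (part of FFmpeg). A minimal sketch, assuming ffprobe is installed and on the PATH; the codec names it reports (for example, h264 and hevc) follow FFmpeg's conventions:

```python
# Minimal sketch: report a video file's codec using ffprobe (ships with FFmpeg).
# Assumes ffprobe is on the PATH; codec names follow FFmpeg's conventions.
import subprocess
import sys

def video_codec(path: str) -> str:
    out = subprocess.run(
        [
            "ffprobe", "-v", "error",
            "-select_streams", "v:0",
            "-show_entries", "stream=codec_name",
            "-of", "default=noprint_wrappers=1:nokey=1",
            path,
        ],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

# Codecs that HLS can stream without GPU transcoding, per the list above.
NO_GPU_NEEDED = {"h264", "hevc"}   # ffprobe reports H.265 as "hevc"

codec = video_codec(sys.argv[1])
note = "streams without GPU transcoding" if codec in NO_GPU_NEEDED else "requires a GPU to transcode"
print(f"{sys.argv[1]}: codec={codec} ({note})")
```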

Metadata telemetry requirements

To compute and display metadata telemetry information from the video on a map, the metadata fields below are required. Videos that contain only a subset of the metadata will still display partial telemetry information.

For example, if the video file or corresponding sidecar metadata files contain only the Precision Time Stamp, Sensor Latitude, and Sensor Longitude fields, the location of the sensor will be displayed on the map, but the footprint of the video frames will not be displayed, and some functionality, such as capturing a video frame, will not be supported.

Field Name | Description | Units

The following fields provide the Sensor Location - 3D and Sensor Trail - 3D telemetry. Only one of the two timestamp fields is required.

SensorLatitude | Sensor latitude based on the WGS84 ellipsoid, ranging from -90.0 to 90.0 | Degrees
SensorLongitude | Sensor longitude based on the WGS84 ellipsoid, ranging from -180.0 to 180.0 | Degrees
TimeStamp | Date and time stamp with optional milliseconds | String in the format YYYY-MM-DD HH-MM-SS.zzz
UnixTimeStamp | Coordinated Universal Time (UTC) | Microseconds since 1970 (Unix epoch)

The following fields provide the height for the Sensor Location - 3D and Sensor Trail - 3D telemetry. Only one of the two fields is required for 3D locations.

SensorEllipsoidHeight | Sensor ellipsoid height as measured from the reference WGS84 ellipsoid | Meters
SensorTrueAltitude | Altitude of the sensor as measured from mean sea level (MSL) | Meters

The following fields provide the Sensor Sight Line, Frame Outline (Footprint), and Frame Center (Footprint Centerpoint) telemetry.

PlatformHeading | Asset (platform) heading relative to true north, measured clockwise in the horizontal plane looking down, ranging from 0.0 to 360.0 | Degrees
PlatformPitch | Asset (platform) pitch relative to the horizontal plane, with positive angles for nose above the horizontal plane | Degrees
PlatformRoll | Asset (platform) roll angle relative to the horizontal plane, with positive angles for left wing above the horizontal plane | Degrees
SensorRelativeRoll | Relative roll angle of the sensor to the aircraft platform, where top of image level is 0 degrees and positive angles are clockwise when looking from behind the camera | Degrees
SensorRelativeElevation | Relative angle of the sensor pointing direction to the platform horizontal plane, with negative angles down | Degrees
SensorRelativeAzimuth | Relative angle of the sensor pointing direction to the platform longitudinal axis as seen from the platform, ranging from 0.0 to 360.0 | Degrees
HorizontalFOV | Horizontal field of view of the selected imaging sensor | Degrees
VerticalFOV | Vertical field of view of the selected imaging sensor | Degrees

Note:

Fields in the metadata file must reflect the field names above (field names are case, space, and dash insensitive). Field headers in the metadata file can be matched to the expected field names using a field mapping file. Field names are only supported in the EN locale.

When the metadata is complete and accurate, the application calculates the video frame corners and the size, shape, and position of the video frame outline, which can then be displayed on a map. The 12 field names above comprise the minimum metadata required to compute the transform between video and map, display the video footprint on the map, and enable other functionality.
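For illustration only, the sketch below writes a sidecar metadata table that carries the 12 field names with a single sample row. The CSV layout, column order, and sample values are assumptions made for this example; only the field names come from the table above.

```python
# Illustrative sketch: a sidecar metadata table carrying the 12 required field names.
# The CSV layout and the sample values are assumptions for illustration only.
import csv

FIELDS = [
    "SensorLatitude", "SensorLongitude", "TimeStamp", "SensorEllipsoidHeight",
    "PlatformHeading", "PlatformPitch", "PlatformRoll", "SensorRelativeRoll",
    "SensorRelativeElevation", "SensorRelativeAzimuth", "HorizontalFOV", "VerticalFOV",
]

sample_row = {
    "SensorLatitude": 34.056, "SensorLongitude": -117.195,
    "TimeStamp": "2024-05-01 16-30-00.000", "SensorEllipsoidHeight": 1250.0,
    "PlatformHeading": 270.0, "PlatformPitch": -2.5, "PlatformRoll": 0.8,
    "SensorRelativeRoll": 0.0, "SensorRelativeElevation": -45.0,
    "SensorRelativeAzimuth": 180.0, "HorizontalFOV": 30.0, "VerticalFOV": 17.0,
}

with open("my_video_metadata.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerow(sample_row)
```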

Field mapping metadata

If the original metadata file does not contain the 12 field names, a field mapping CSV file can be created following the schema below. The CSV table must contain two columns: a "Field Name" column for Video Server's expected metadata fields, and a "Metadata" column for the field names in your metadata file that map to them. Once created, include this field mapping file in the upload along with the video file and corresponding metadata file (see the sketch after the list below).

Field Name

SensorLatitude

SensorLongitude

TimeStamp

SensorEllipsoidHeight

PlatformHeading

PlatformPitch

PlatformRoll

SensorRelativeRoll

SensorRelativeElevation

SensorRelativeAzimuth

HorizontalFOV

VerticalFOV
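As an illustration of that schema, the following sketch writes a field mapping CSV with the two documented columns. The source field names on the right (lat, lon, and so on) are hypothetical placeholders; replace them with the headers that actually appear in your metadata file.

```python
# Illustrative sketch: build a field mapping CSV with the two documented columns,
# "Field Name" (Video Server's expected fields) and "Metadata" (your file's headers).
# The right-hand names below are hypothetical placeholders.
import csv

mapping = [
    ("SensorLatitude", "lat"),
    ("SensorLongitude", "lon"),
    ("TimeStamp", "gps_time"),
    ("SensorEllipsoidHeight", "alt_ellipsoid"),
    ("PlatformHeading", "heading"),
    ("PlatformPitch", "pitch"),
    ("PlatformRoll", "roll"),
    ("SensorRelativeRoll", "cam_roll"),
    ("SensorRelativeElevation", "cam_elev"),
    ("SensorRelativeAzimuth", "cam_azimuth"),
    ("HorizontalFOV", "hfov"),
    ("VerticalFOV", "vfov"),
]

with open("field_mapping.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Field Name", "Metadata"])
    writer.writerows(mapping)
```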

Time shifting metadata

For optimal metadata results, the video data and metadata should be time synchronous. If the time stamps linking the video and metadata are not accurately synchronized, the video footprint and sensor locations on the map will be offset from the view in the video player.

If the time shift is observable and consistent, a time shift CSV file can be used to adjust the timing of the metadata to match the video. The CSV should contain two columns, labeled ElapsedTime (the position in the video at which the time shift occurs) and TimeShift (the amount of time offset in seconds). If the time shift between the video and metadata is inconsistent, you can list multiple positions in the video, each with its associated time shift, in the CSV file. Once created, include this time shift file in the upload along with the video file and corresponding metadata file, if applicable; see the sketch after the schema below.

ElapsedTime | 0.00:00:00 (days.hours:minutes:seconds)
TimeShift | 0.00:00:00 (days.hours:minutes:seconds); video time relative to the metadata. Use negative values if the video footprint lags.
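For example, the sketch below writes a time shift CSV with two correction points. The elapsed times and offsets are placeholder values, with the offsets given in seconds as described above.

```python
# Illustrative sketch: a time shift CSV with the ElapsedTime and TimeShift columns.
# The elapsed-time positions and offsets below are placeholder values.
import csv

corrections = [
    ("0.00:00:00", 2.0),    # from the start of the video, metadata lags by 2 seconds
    ("0.00:05:30", 3.5),    # from 5 min 30 s onward, the offset grows to 3.5 seconds
]

with open("time_shift.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["ElapsedTime", "TimeShift"])
    writer.writerows(corrections)
```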

Video Storage

By default, ArcGIS Video Server stores all video and metadata files on the file system. Optionally, an organization can register an existing object data store to use as the output location of video and metadata.

During Video Server site creation, the config-store, directories, and logs locations can be specified. By default, these are all stored on the Video Server file system. The config-store can optionally be stored in a cloud store. The config-store contains a services folder and an uploads folder, which store JSON files describing the videos uploaded and the output services created.

When a video is published, it is uploaded to the Video Server file system or object store, depending on how the site is configured. During publishing, if the site is in a GPU environment and multiple output resolutions are selected, the uploaded video is transcoded into the additional resolutions and the output files are stored in the directories/arcgisvideoservices location, which is also where the original uploaded video is stored. The more output resolutions selected, the more storage the service requires to stream successfully and efficiently in each resolution.
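As a rough worked example of how output resolutions multiply storage, the sketch below estimates on-disk size from the video duration and per-resolution bitrates. The bitrates are illustrative assumptions, not values used by Video Server.

```python
# Rough worked example: estimate storage for one video published at several
# output resolutions. The bitrates are illustrative assumptions only.
duration_hours = 1.0
bitrates_mbps = {            # assumed average bitrates per output resolution
    "1080p": 8.0,
    "720p": 5.0,
    "480p": 2.5,
}

total_gb = 0.0
for resolution, mbps in bitrates_mbps.items():
    size_gb = mbps * duration_hours * 3600 / 8 / 1000   # Mbit/s -> GB
    total_gb += size_gb
    print(f"{resolution}: ~{size_gb:.1f} GB")

print(f"Total for all selected resolutions: ~{total_gb:.1f} GB (plus the original upload)")
```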

The directories/arcgisvideouploads location stores the original video file based on the fileId created during upload. When ArcGIS Excalibur is used to publish video services, the original uploaded video is deleted from this location because the original video remains in the arcgisvideoservices location.

Windows operating system requirements

Several internet host name specifications have designated the underscore character as nonstandard. Although Microsoft Windows allows the underscore in a machine name, it can cause problems when you interact with other servers and platforms. For this reason, ArcGIS Video Server will not install on servers that have an underscore in the host name.
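A quick preinstallation check for this restriction, assuming Python is available on the target machine:

```python
# Minimal sketch: fail fast if the machine name contains an underscore,
# since ArcGIS Video Server will not install on such hosts.
import socket

hostname = socket.gethostname()
if "_" in hostname:
    raise SystemExit(f"Host name '{hostname}' contains an underscore; rename the machine before installing.")
print(f"Host name '{hostname}' is OK.")
```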

The operating system of your ArcGIS Video Server machines can be different from those of the other machines in your ArcGIS Enterprise deployment.

ArcGIS Video Server is not supported on domain controllers. Installing ArcGIS Video Server on a domain controller may adversely affect functionality.

The following 64-bit operating systems satisfy the minimum operating system requirements. Support is not provided for 32-bit operating systems; the setup will only proceed if the operating system is 64 bit.

Supported operating system | Latest update or service pack tested
Windows Server 2022 Standard and Datacenter | SP (21H2)
Windows Server 2019 Standard and Datacenter | May 2022 update
Windows Server 2016 Standard and Datacenter | May 2022 update

Prior and future updates or service packs on these operating system versions are supported unless otherwise stated. The operating system version and updates must also be supported by the operating system provider. ArcGIS is only supported on 64-bit CPUs with x86-64 architecture. The Desktop Experience option is required on all versions of Windows Server.

Windows 11 is supported for basic testing and application development use only. It is not recommended for deployment in a production environment.

Cloud implementation

ArcGIS Video Server can be deployed manually on Microsoft Azure and Amazon Web Services. Deployment templates for Microsoft Azure and Amazon Web Services, such as CloudFormation templates, are not currently supported.

SSL certificates

ArcGIS Video Server is preconfigured with a self-signed certificate that allows the server to be initially tested and to help you quickly verify that the installation was successful.

You must request a certificate from a trusted certificate authority (CA) and configure ArcGIS Video Server to use it. This can be a domain certificate issued by your organization or a CA-signed certificate. The certificate must have a Subject Alternative Name (SAN) configured or ArcGIS Video Server will not work properly.

Note:

Certificates created using IIS do not have the option to include a SAN. Instead, use the script in Create a domain certificate, which includes a SAN in the certificate it creates.

Portal for ArcGIS also includes a preconfigured self-signed certificate. Because you'll federate an ArcGIS Video Server site with your portal, request a certificate from a trusted CA and configure the portal to use it.
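To confirm that the certificate a server presents includes a SAN, one option is to fetch it and inspect it with the openssl command-line tool. A minimal sketch; the host name and port below are placeholders, and openssl is assumed to be installed:

```python
# Minimal sketch: fetch a server's certificate and report whether it carries a
# Subject Alternative Name. Assumes the openssl command-line tool is installed.
import ssl
import subprocess

host, port = "videoserver.example.com", 6443   # placeholders; substitute your server's host and port

pem = ssl.get_server_certificate((host, port))
text = subprocess.run(
    ["openssl", "x509", "-noout", "-text"],
    input=pem, capture_output=True, text=True, check=True,
).stdout

if "Subject Alternative Name" in text:
    print("Certificate includes a SAN.")
else:
    print("Certificate has no SAN; ArcGIS Video Server will not work properly with it.")
```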

Software prerequisites

ArcGIS Video Server can be installed on its own or with other ArcGIS Enterprise 11.4 components.

Note:

Whether you are installing a new deployment or upgrading from an earlier version, ensure that all the software components—the ArcGIS Enterprise portal, an ArcGIS Server site, and ArcGIS Data Store—are installed and running ArcGIS Enterprise 11.4.

During ArcGIS Video Server setup, you will federate the new ArcGIS Video Server site with your Enterprise portal.

Domain name system and fully qualified domain name recommendations

It's recommended that you configure your organization's domain name system (DNS) to include fully qualified domain name (FQDN) entries for the ArcGIS Video Server site. The ArcGIS Enterprise portal will request the FQDN of your server site when you federate it.
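As a quick check that the FQDN you plan to supply during federation actually resolves, a minimal sketch (the host name is a placeholder):

```python
# Minimal sketch: confirm that the planned FQDN resolves in DNS.
# The host name below is a placeholder; substitute your Video Server's FQDN.
import socket

fqdn = "videoserver.example.com"
try:
    addresses = sorted({info[4][0] for info in socket.getaddrinfo(fqdn, None)})
    print(f"{fqdn} resolves to: {', '.join(addresses)}")
except socket.gaierror:
    print(f"{fqdn} does not resolve; add a DNS entry before federating.")
```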

Supported web browsers

The ArcGIS Video Server installation wizard is supported in current web browsers, including but not limited to the following:

  • Google Chrome version 122 and later
  • Microsoft Edge version 122 and later
  • Mozilla Firefox version 125 and later
  • Mozilla Firefox version 115 (ESR)
  • Safari version 16 and later