Command Line Tools

ARCSI is command line driven. Although it is written in Python and could, in principle, be run directly from Python using the arcsilib.arcsirun module, this is not documented; the command line interface also performs checking and sets up default values, so calling arcsilib.arcsirun from your own code would take a little effort and reference to the source code. This is something we plan to improve at a later date.

Running ARCSI

These commands are for running ARCSI to generate analysis ready data (ARD).

This is the main command, and may well be the only command you use within ARCSI; it is the command you use to process your data to produce ARD. For example:

    arcsi.py -s lstm -p CLOUDS DOSAOTSGL STDSREF SATURATE TOPOSHADOW FOOTPRINT METADATA \
        -o ./Outputs/ --stats --format KEA --tmpath ./tmp \
        --dem ./UKSRTM_90m.kea --cloudmethods LSMSK \
        -k clouds.kea meta.json sat.kea toposhad.kea valid.kea stdsref.kea \
        -i LT05_L1TP_203024_19950815_20180217_01_T1/LT05_L1TP_203024_19950815_20180217_01_T1_MTL.txt

The sensors available are:

  • lsmss - Landsat MSS 1, 2, 3, 4 and 5

  • lstm - Landsat TM 4 and 5

  • lsetm - Landsat ETM+ 7

  • lsoli - Landsat OLI 8 and 9

  • sen2 - Sentinel-2

A second command supports identical functionality but can be used with MPI on computational clusters to process a group of input images using multiple processing cores.

Downloading Data

These commands allow you to automate downloading your EO data, currently Landsat and Sentinel-2, from the Google Cloud.

This command creates a local copy of the Google database of Landsat acquisitions as an SQLite database:

    -f landsatdb_20180216.db

This command creates a local copy of the Google database of Sentinel-2 acquisitions as an SQLite database:

    -f sen2db_20180216.db

This command queries the Landsat database created above, generating a shell script for downloading the scenes that meet the query parameters:

    -f landsatdb_20180216.db -p 204 -r 24 --collection T1 \
        --cloudcover 70 --startdate 2000-01-01 --outpath ./Downloads/ --multi \
        --lstcmds -o
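Because the database is a plain SQLite file, the same kind of query can also be run by hand. Below is a minimal sketch; note that the table and column names (`landsat`, `wrs_path`, `wrs_row`, `cloud_cover`, `date_acquired`, `base_url`) are hypothetical illustrations of the idea, not the actual schema ARCSI creates.

```python
import sqlite3


def query_scenes(db_file, path, row, max_cloud, start_date):
    """Return download URLs for scenes matching the query parameters.

    NOTE: the table/column names used here are hypothetical and
    chosen for illustration; check the real database schema.
    """
    conn = sqlite3.connect(db_file)
    try:
        cur = conn.execute(
            "SELECT base_url FROM landsat "
            "WHERE wrs_path = ? AND wrs_row = ? "
            "AND cloud_cover <= ? AND date_acquired >= ?",
            (path, row, max_cloud, start_date),
        )
        return [r[0] for r in cur.fetchall()]
    finally:
        conn.close()
```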

This command queries the Sentinel-2 database created above, generating a shell script for downloading the scenes that meet the query parameters:

    --source GOOG -f sen2db_20180216.db -t 30UVD \
        --cloudcover 70 --startdate 2017-01-01 --enddate 2017-12-31 \
        --outpath ./Downloads/ --multi --lstcmds -o

If you don't use the --lstcmds option, you can use this command to download the list of URLs; it checks whether each download already exists and skips those that do.
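The skip-if-present behaviour described above can be sketched in a few lines of Python; the assumption here is simply that each URL's final path component is the file name it will be saved under.

```python
import os
from urllib.parse import urlparse


def urls_to_fetch(url_list, out_dir):
    """Filter a list of download URLs, skipping any whose target file
    already exists in out_dir (mirroring the skip behaviour above)."""
    needed = []
    for url in url_list:
        fname = os.path.basename(urlparse(url).path)
        if not os.path.exists(os.path.join(out_dir, fname)):
            needed.append(url)
    return needed
```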

Batch Processing

A useful feature is the ability to generate the processing commands for a large number of input files, which makes batch processing significantly easier. In addition to the commands listed here, the utilities in the later sections may also be useful.

This command allows you to build the commands for a large number of input files. This is really useful and can save a lot of time and many errors! You need to specify a search string with a wildcard for finding the input image header files:

    arcsibuildcmdslist.py -s lstm -f KEA --stats -p CLOUDS DOSAOTSGL STDSREF \
        --outpath ./Outputs --dem ../UKSRTM_90m.kea --cloudmethods LSMSK \
        --keepfileends stdsref.kea clouds.kea \
        --tmpath ./tmp -i ./Inputs -e "*MTL.txt" -o

A unique option within ARCSI is to process image files that form a path/scene as a single unit. For example, Sentinel-2 provides individual granules and Landsat scenes are cut into rows. ARCSI allows these images to be processed as a single job and tries to ensure that there isn't a boundary step between the individual input scenes/granules due to changes in parameterisation. To undertake this analysis you still use the commands above (and you can use the multiple core options), but a text file listing the header files for the scenes needs to be passed. This command provides the functionality to automatically build those text files from a directory of scenes you have downloaded (e.g., from Google).
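The grouping such text files perform can be sketched as follows. The sketch groups header files by acquisition date parsed from the file name; both the grouping key and the list-file naming are assumptions for illustration, not ARCSI's exact behaviour, so the date-extraction function is supplied by the caller.

```python
import glob
import os
from collections import defaultdict


def build_header_lists(input_dir, pattern, out_dir, date_from_name):
    """Group header files by acquisition date and write one text file
    per group, each listing the headers to process as a single job.

    date_from_name: caller-supplied function extracting the date
    string from a header file name (naming conventions vary).
    """
    groups = defaultdict(list)
    search = os.path.join(input_dir, "**", pattern)
    for hdr in sorted(glob.glob(search, recursive=True)):
        groups[date_from_name(os.path.basename(hdr))].append(hdr)
    os.makedirs(out_dir, exist_ok=True)
    out_files = []
    for date, hdrs in sorted(groups.items()):
        out_file = os.path.join(out_dir, f"scenes_{date}.txt")
        with open(out_file, "w") as f:
            f.write("\n".join(hdrs) + "\n")
        out_files.append(out_file)
    return out_files
```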

ARCSI creates a standard unique name for each sensor using the meta-data for that image. However, if you have batch processed a large number of scenes, then knowledge of which input dataset produced a particular output file might be lost. This command can build a look-up table with this information.
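A sketch of such a look-up table is shown below. It assumes, purely for illustration, that each header file sits in a directory named after its scene ID and that output files are prefixed with that scene ID; ARCSI's actual naming is derived from the image meta-data.

```python
import glob
import json
import os


def build_lut(input_headers, output_dir, lut_file):
    """Map each input header file to the output files whose names
    start with the input's scene ID (here taken to be the header's
    parent directory name, an assumption), saving the map as JSON."""
    lut = {}
    for hdr in input_headers:
        scene_id = os.path.basename(os.path.dirname(hdr))
        outs = sorted(glob.glob(os.path.join(output_dir, scene_id + "*")))
        lut[hdr] = outs
    with open(lut_file, "w") as f:
        json.dump(lut, f, indent=2)
    return lut
```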

Landsat Tools

These commands provide specific functionality for the Landsat sensors.

This command can sort a set of Landsat archives into a directory structure based on the sensor (i.e., Landsat 1, Landsat 2, … Landsat 5 MSS, Landsat 5 TM, … Landsat 8).
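The sorting decision rests on the fact that the sensor can be read from the Landsat product identifier prefix (e.g., LT05 for Landsat 5 TM, LE07 for Landsat 7 ETM+, LC08 for Landsat 8 OLI). A minimal sketch, where the destination directory names are illustrative rather than necessarily those ARCSI uses:

```python
# Map Collection-style product ID prefixes to sensor directories.
# The directory names are illustrative, not necessarily ARCSI's.
SENSOR_DIRS = {
    "LM01": "LandsatMSS1", "LM02": "LandsatMSS2", "LM03": "LandsatMSS3",
    "LM04": "LandsatMSS4", "LM05": "LandsatMSS5",
    "LT04": "LandsatTM4", "LT05": "LandsatTM5",
    "LE07": "LandsatETM7",
    "LC08": "LandsatOLI8", "LC09": "LandsatOLI9",
}


def sensor_dir(product_id):
    """Return the destination directory for a Landsat product ID,
    e.g. 'LT05_L1TP_203024_19950815_20180217_01_T1' -> 'LandsatTM5'."""
    prefix = product_id[:4]
    if prefix not in SENSOR_DIRS:
        raise ValueError(f"Unrecognised Landsat product ID: {product_id}")
    return SENSOR_DIRS[prefix]
```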

Error Checking

These functions can be useful for double-checking that everything has been processed correctly, particularly when you have a very large dataset that you cannot manually check.

This command uses the input data to check whether all the expected outputs have been produced.

This command will do a quick check as to whether an output file with the basename of the input data has been created.
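That basename check amounts to asking, for each input, whether any file in the output directory starts with the input's basename. A minimal sketch of the idea (the exact basename ARCSI derives from the meta-data may differ from a simple extension strip):

```python
import os


def find_unprocessed(input_files, output_dir):
    """Return the inputs for which no output file sharing the input's
    basename (file name without extension) exists in output_dir."""
    out_names = os.listdir(output_dir)
    missing = []
    for in_file in input_files:
        base = os.path.splitext(os.path.basename(in_file))[0]
        if not any(name.startswith(base) for name in out_names):
            missing.append(in_file)
    return missing
```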

This command aims to find duplicate input files so they can be removed before you undertake any processing.
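Duplicate detection can be sketched by hashing file contents: every file after the first with a given hash is a duplicate. This is an illustration of the idea, not ARCSI's implementation (which may compare scene names or meta-data instead).

```python
import hashlib


def find_duplicates(files):
    """Group files by the MD5 hash of their contents and return the
    duplicates: every file after the first in each group, with the
    files visited in sorted order for determinism."""
    seen = {}
    duplicates = []
    for path in sorted(files):
        with open(path, "rb") as f:
            digest = hashlib.md5(f.read()).hexdigest()
        if digest in seen:
            duplicates.append(path)
        else:
            seen[digest] = path
    return duplicates
```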

Other Utilities

Unless you have downloaded your data from Google, you will probably have a set of archives (e.g., tar.gz, zip) for your images. It is useful to extract these into their own directories. This command provides functionality for this and can extract all archives within a directory or just a single input file:

    -i ./InputDIR -o ./OutputDIR
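The extract-into-own-directory step can be sketched with the Python standard library; this is a minimal sketch of the idea for a single archive, not ARCSI's implementation.

```python
import os
import tarfile
import zipfile


def extract_archive(archive, out_dir):
    """Extract a .tar.gz/.tgz/.tar/.zip archive into a directory named
    after the archive (without its extension) inside out_dir."""
    name = os.path.basename(archive)
    for ext in (".tar.gz", ".tgz", ".tar", ".zip"):
        if name.endswith(ext):
            name = name[: -len(ext)]
            break
    dest = os.path.join(out_dir, name)
    os.makedirs(dest, exist_ok=True)
    if archive.endswith(".zip"):
        with zipfile.ZipFile(archive) as zf:
            zf.extractall(dest)
    else:
        with tarfile.open(archive) as tf:
            tf.extractall(dest)
    return dest
```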

Development Utilities

These tools are not expected to be useful for the average user but are very useful for someone adding a new sensor to the ARCSI source code.

This command can resample a set of spectral response functions to a new sampling interval; 6S requires spectral response functions to be sampled at 2.5 nm.
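Resampling onto a regular 2.5 nm grid can be sketched with linear interpolation; the 2.5 nm default comes from the 6S requirement stated above, and linear interpolation is an assumption about the resampling method.

```python
import numpy as np


def resample_srf(wavelengths, responses, step=2.5):
    """Linearly resample a spectral response function onto a regular
    wavelength grid (default 2.5 nm, as required by 6S).

    wavelengths: monotonically increasing sample wavelengths (nm).
    responses:   response values at those wavelengths.
    """
    new_wl = np.arange(wavelengths[0], wavelengths[-1] + step / 2.0, step)
    new_resp = np.interp(new_wl, wavelengths, responses)
    return new_wl, new_resp
```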

This command can use the spectral response functions to calculate the solar irradiance for each input band. This value is required for converting at-sensor radiance to at-sensor reflectance (also called top of atmosphere reflectance; TOA).
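The band solar irradiance is the exo-atmospheric solar spectrum weighted by the band's spectral response function, normalised by the response function's area. A minimal sketch, assuming both curves are sampled on the same wavelength grid (a small trapezoidal integrator is included to keep the sketch self-contained):

```python
import numpy as np


def _trapezoid(y, x):
    """Trapezoidal integration of y over x."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)


def band_solar_irradiance(wavelengths, srf, solar_spectrum):
    """Band-averaged solar irradiance: the solar spectrum weighted by
    the band's spectral response function (SRF), normalised by the
    SRF area. Both inputs must share the same wavelength grid."""
    num = _trapezoid(srf * solar_spectrum, wavelengths)
    den = _trapezoid(srf, wavelengths)
    return num / den
```

For a spectrally flat solar spectrum the result reduces to that flat value, which gives a quick sanity check on the weighting.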