
Archive robots

Systematic data processing: from raw data to data products

Data processing is one of the largest fields of responsibility in the IT industry, as it touches the most diverse areas of our everyday life. The photo filter of a messaging app processes our picture data to present it at its best. When we send an e-mail, the data are encrypted before being dispatched to the recipient. A car's vehicle control system, too, constantly processes data in order to identify dangerous situations and optimize operating procedures.

Wherever and whenever data are periodically produced in large amounts, an efficient mechanism is needed to ensure that processing tasks run smoothly.

Special challenges of satellite data processing

Scientific applications in particular offer many use cases for data processing: in order to understand the Earth's ecosystems and climate, scientists collect huge amounts of data across the whole world. In space, for example, satellites record data with sensors or cameras and send them to Earth as so-called payload data.

Every satellite produces up to 1 TB of raw data per day, all of which must be processed, stored and supplied to downstream systems. Day after day, this raw data is turned into data products by applying scientific algorithms with the support of auxiliary data. Because the data volumes are so enormous, efficient control of the workflows of systematic data processing is of utmost importance. The resulting large and ever-growing demand for storage capacity makes extensible data centers or cloud systems suitable locations for the data.
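The raw-to-product chain described above can be illustrated with a minimal sketch. The processing-level names and function signatures below (e.g. to_level1) are illustrative assumptions, not the actual mission processors:

```python
# Minimal sketch of a systematic processing chain: raw payload data is
# refined step by step into higher-level data products. Level names and
# transformations are illustrative assumptions only.

def to_level0(raw: bytes) -> dict:
    """Reconstruct time-ordered instrument data from raw telemetry (sketch)."""
    return {"level": "L0", "payload": raw}

def to_level1(l0: dict, auxiliary: dict) -> dict:
    """Calibrate using auxiliary data (e.g. orbit and attitude information)."""
    return {"level": "L1", "payload": l0["payload"], "calibration": auxiliary}

def to_level2(l1: dict) -> dict:
    """Derive geophysical quantities with a scientific algorithm (placeholder)."""
    return {"level": "L2", "derived_from": l1["level"]}

def process(raw: bytes, auxiliary: dict) -> dict:
    """Run the full chain: raw payload data in, data product out."""
    return to_level2(to_level1(to_level0(raw), auxiliary))

product = process(b"\x00\x01\x02", {"orbit": "nominal"})
```

In a real ground segment each step would be a separate processor orchestrated by the workflow platform, so that individual levels can be reprocessed independently.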

How to execute processing chains on Earth observation data efficiently

Efficient workflow control is guaranteed by suitable software platforms, which offer specific benefits depending on the field of application and are optimized for the system infrastructures and file structures typically encountered in satellite data processing. Even when the processing load varies strongly, e.g. during periods of exceptionally many data retrievals or when data products are reprocessed, the software can scale its performance by flexibly configuring cloud resources. This ensures fast and safe processing of large data quantities while avoiding unnecessary consumption of energy and infrastructure when it is not needed.
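The scaling behavior described above can be sketched as a toy policy that derives the number of processing workers from the current backlog. The thresholds and names are illustrative assumptions, not the API of any specific platform:

```python
# Toy sketch of demand-driven scaling: pick a worker count from the
# current job backlog, bounded by a configured minimum and maximum.
import math

def workers_needed(queued_jobs: int, jobs_per_worker: int = 50,
                   min_workers: int = 1, max_workers: int = 100) -> int:
    """Scale out for reprocessing campaigns, scale in during quiet periods."""
    wanted = math.ceil(queued_jobs / jobs_per_worker)
    return max(min_workers, min(max_workers, wanted))

quiet = workers_needed(0)        # quiet period: minimum footprint
campaign = workers_needed(5000)  # reprocessing campaign: scale out
```

Keeping a small minimum footprint while capping the maximum is what avoids paying for idle infrastructure without letting a reprocessing campaign overwhelm the system.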

We are well aware of the special challenges of systematic data processing and have delivered reliable solutions in many projects. A telling example of the use of our proprietary systematic data processing framework is the ground segment software for several of ESA's Earth Explorer missions such as Biomass, EarthCARE or SWARM. On behalf of ESA, we operate the Copernicus Long-Term Archive, which backs up Sentinel satellite data in a European cloud system. The actual archiving of files does not involve particularly complex processing operations, but given the large quantities of data, the archive service requires substantial computing and processing capacity to verify file contents, compress data and generate metadata for traceability purposes.
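The three archiving steps mentioned above, verifying file contents, compressing data and generating traceability metadata, can be sketched with standard-library tools. The field names in the metadata record are illustrative assumptions, not the actual archive schema:

```python
# Sketch of an archiving step: verify a file's content against an
# expected checksum, compress it, and record metadata for traceability.
import gzip
import hashlib
import json
import time

def archive_file(data: bytes, expected_sha256: str) -> dict:
    """Verify, compress and describe one file before it enters the archive."""
    digest = hashlib.sha256(data).hexdigest()
    if digest != expected_sha256:
        raise ValueError("content verification failed")
    compressed = gzip.compress(data)
    metadata = {
        "sha256": digest,
        "size_raw": len(data),
        "size_compressed": len(compressed),
        "archived_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    }
    return {"blob": compressed, "metadata": json.dumps(metadata)}

payload = b"satellite granule payload"
entry = archive_file(payload, hashlib.sha256(payload).hexdigest())
```

Recording the checksum alongside the compressed blob is what later allows the archive to prove that a retrieved file is bit-identical to what was ingested.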

Learn more about our EO Processing Service:

Go on to EO Processing Service