Over recent years, ESA has revisited its user-facing services and systems with the aim of refreshing the experience of data discovery, access and use. In this context, all main components of the previous Earthnet Online set-up have been renewed and replaced with new ones based on more modern technologies. Particular attention was given to the user perspective, with a focus on enabling and facilitating access to and use of the Earth Observation payload data that is published and presented.
The pursued ESA activities all shared the common objectives of simplifying, consolidating and improving the user experience on the basis of a modernized set of underlying technologies. In this context, ESA set out to renew and republish the complete set of Earth Online web pages, to replace a bespoke and obsolete user identity management solution with a more suitable modern product, and to deploy a specific ESA-controlled solution for user services and ticket management.
Overall, the pursued activities have established a full suite of new user-engaging components, from the information and data discovery functions, through the data access mechanisms and access authorization processes, to user service and support when needed. In addition to traditional data download services, an open science platform is also offered. The user is now given the opportunity to access and manipulate data at pixel level, and to upload and share algorithms and analysis results. The offering of services that now change more dynamically in response to user demand has driven ESA to establish mechanisms for capturing indicators that measure user engagement and satisfaction in a new and more holistic fashion.
This paper will present the various activities and their results within this overall refresh. It will describe how the processes for displaying information content, such as news and highlights of the supported missions, have been separated from the processes of publishing the data and making it and its associated services available for access. Interactive help and user support services, which likewise have dedicated processes, will also be presented. In addition, the overall integration into a coherent user front end will be discussed.
In particular, the paper will cover:
- The overall design and implementation of an integrated Earth Observation Data access portal with dedicated functions for publishing of contents, data access and user services and support.
- The primary considerations of the user perspective and achieving an implementation respecting personal data privacy measures in line with current best practices and regulations.
- The need to respect increasing attention to network and system security in order to ensure sustained system and service integrity, and the methods and tools eventually applied.
- The use of emerging technologies and the customization and configuration of off-the-shelf software and "Software as a Service" offerings.
The paper will also describe the design and management processes that were used to achieve the desired outcome. Agile methods and close cooperation between ESA and industry teams led to an iterative evolution of the solution. Small, incremental improvements were constantly added to an ever-expanding offering, both in terms of its scope of functionality and its depth of integration.
To conclude, the paper will draw lessons from the issues encountered and suggest improvements that can be applied to the overall methods and processes that were used. Particular attention will be given to areas that were initially not appropriately considered and needed more attention during the project.
These days, emerging technologies surround us more than ever. The total market value of the top 10 cryptocurrencies is almost equal to the market cap of gold. Top S&P companies invest in the metaverse, a mixed implementation of augmented reality and artificial intelligence. Thanks to voice assistants from Google, Apple and Amazon, more than 50% of Americans can easily ask questions such as 'Where is my iPhone?', 'What is the weather like?' or 'What interesting movie or series can I watch on Netflix?'
In addition, with the COVID-19 pandemic, more internet users started using social media and teleconferencing systems. TikTok – a social media platform that allows sharing short and funny videos – acquired 181 million users in only one year. The time spent on the internet increased dramatically.
All emerging technologies reduce the distance between us and simplify communication.
By contrast, the information describing Earth Observation data, algorithms, satellites and instruments is complex. Unravelling all differences between spacecraft structure, sensor parameters and technical specifications of the data transformation chain is challenging.
Our research on web usability and customer experience points to the most critical question: How do emerging technologies change the perception of complex information and the knowledge discovery process in large groups of Earth Observation data consumers?
In the research, we collected and analysed data from a wide range of public and commercial systems. Earth Observation Data access systems are the primary source of information. However, the ESA Space Safety portals and applications, local government portals (Warsaw City Hall) or commercial systems also provide information on how the end-user knowledge discovery patterns imperceptibly change.
This session will explain how digital users, affected by new emerging technologies, discover complex information such as Earth Observation content, data and products. It will explain how different user personas, differentiated by domain knowledge (amateur, general expert and scientific expert), execute information discovery. The presentation will highlight the steps of information discovery, including the use of Google, social media and virtual assistants, and how emerging technologies change them.
The summary of our research is applicable and valuable in building new information services for Earth Observation, such as portals, data cubes, catalogues and platforms, independently of the target audience. It allows those systems to be perceived from the perspective of digital users affected by new technologies and flooded by the information stream, and we hope that our findings will support businesses building such services.
About the author:
Bartosz Szkudlarek, CEO at Eversis
Bartosz's experience is focused on building highly scalable, content-driven portals and web applications. He leads a company which delivers content- and user-oriented portals and web applications, including Earth Online, the GEOSS Portal and the COS-2 application for the International Charter Space and Major Disasters. The company's experience helps deliver other projects for ESA (Space Safety Programme), the European Commission (Sentinel websites), the JRC, and commercial companies such as BNP Paribas, ING Bank and Orange.
As part of the European Space Agency's activities, multi-mission ground systems are operated to acquire, process, archive and distribute data from ESA missions, including the Heritage Missions and the Earth Explorers, as well as the Third Party Missions under specific agreements with NASA, JAXA and other data owners. As a joint effort of several departments, the dissemination services have been designed to provide reliable support for both interactive access (e.g. via a web client to explore data catalogues) and automated access (e.g. via APIs and machine-to-machine interfaces) to the available data collections.
Moreover, the large availability of data is driving users to request time series spanning almost 40 years, a span likely to increase even further in the future, in particular given the growing interest in monitoring global change and in policy-makers' decisions on observed changes in the climate system (atmosphere, ocean, cryosphere, carbon and other biogeochemical cycles, sea levels).
To ensure, enhance and facilitate discovery, access and exploitation of ESA mission data by a broader community of users, the European Space Agency has transferred the ESA PDGS Data Cube service into operation. The service (https://earth.esa.int/eogateway/news/esa-s-new-datacube-service-is-now-available) is based on a new paradigm that aims to unlock the full potential of EO data by providing access to the large spatio-temporal datasets stored in the ESA dissemination archives in an on-the-fly, analysis-ready form, and it has been designed to fit the ESA Common Services architecture.
By the end of 2022, a total of 24 different datasets representing ESA as well as Third Party Mission data will be available to the user community for discovery and access. Alongside these services, direct data exploitation through a JupyterLab instance is also at the user's disposal, implementing the full concept of a virtual lab: it frees the user from any local infrastructure costs, allows them to run their own code, and lets them download only the results of the processing.
Besides the overall ESA PDGS datacube platform, relevant use cases defined in connection with international initiatives (e.g. CEOS, GEOSS) are presented, together with examples of joint data exploitation enabled by the datacube technology, e.g. the on-the-fly computation of SMOS global anomalies and the synergistic use of CryoSat and GEDI for the assessment of inland water bodies.
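The on-the-fly anomaly computation mentioned above can be illustrated with a minimal sketch. All numbers, variable names and the simple monthly-climatology approach below are invented for illustration only; the actual PDGS Data Cube processing chain is not described in this abstract.

```python
from statistics import mean

# Synthetic monthly values for a single grid cell over three years; in a
# datacube these would be read lazily rather than downloaded in full.
observations = {
    (2019, 6): 0.24, (2020, 6): 0.26, (2021, 6): 0.31,
    (2019, 7): 0.20, (2020, 7): 0.22, (2021, 7): 0.18,
}

# Climatology: the long-term mean for each calendar month
by_month = {}
for (year, month), value in observations.items():
    by_month.setdefault(month, []).append(value)
climatology = {m: mean(vals) for m, vals in by_month.items()}

# Anomaly: departure of each observation from its monthly climatology
anomalies = {
    key: round(value - climatology[key[1]], 3)
    for key, value in observations.items()
}

print(anomalies[(2021, 6)])  # 0.04 (0.31 minus the June mean of 0.27)
```

The point of running such a computation server-side is that only the small anomaly result, not the full archive, needs to leave the platform.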
Large Earth Observation (EO) data archives are nowadays available on platforms that offer computing and storage resources (e.g., Amazon Web Services [1], Google Cloud [2], DIAS platforms [3]). The exploitation of EO data with a user-oriented focus can be realized by different interfaces (APIs) to ease processing and analysis tasks, such as xarray, Open Data Cube, OpenEO, or an implementation of the OGC (Open Geospatial Consortium) Application Deployment and Execution Service (ADES). Experiments on a Data Access and Processing API conducted as part of the OGC innovation program have shown benefits in simplifying data access and data analysis [4].
For analyzing EO data, user-oriented interfaces need to know what kind of data is available on the platform (e.g., different satellites and sensors). Additionally, the data need to be filtered by different parameters (e.g., spatial and temporal dimensions, cloud coverage). Thus, a metadata database containing the information needs to be queried either by the user or the API itself. EO data available on a platform are often registered in a metadata catalogue in order to be searchable by the users (e.g., CreoDIAS Finder application). However, specifications for such an EO metadata catalogue (e.g., OGC Catalogue Service for Web, OGC OpenSearch Extension for EO) previously focused mainly on data discovery rather than direct data access, which is needed for EO exploitation platforms. In most cases, such platform-dependent metadata catalogues were not connected to data analysis and visualization tools.
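The filtering step described above can be sketched as follows. Items are represented as plain dicts; the property names (`datetime`, `eo:cloud_cover`) follow common EO metadata conventions, but the query logic is a simplified illustration rather than any particular catalogue's implementation.

```python
from datetime import datetime

# A handful of catalogue entries, reduced to the fields needed for filtering
items = [
    {"id": "S2A_20210601", "properties": {"datetime": "2021-06-01T10:30:00+00:00",
                                          "eo:cloud_cover": 12.5}},
    {"id": "S2B_20210615", "properties": {"datetime": "2021-06-15T10:30:00+00:00",
                                          "eo:cloud_cover": 78.0}},
    {"id": "S2A_20210830", "properties": {"datetime": "2021-08-30T10:30:00+00:00",
                                          "eo:cloud_cover": 3.0}},
]

def search(catalogue, start, end, max_cloud):
    """Return ids of items inside the time window with acceptable cloud cover."""
    hits = []
    for item in catalogue:
        props = item["properties"]
        acquired = datetime.fromisoformat(props["datetime"])
        if start <= acquired <= end and props["eo:cloud_cover"] <= max_cloud:
            hits.append(item["id"])
    return hits

june_clear = search(items,
                    datetime.fromisoformat("2021-06-01T00:00:00+00:00"),
                    datetime.fromisoformat("2021-07-01T00:00:00+00:00"),
                    max_cloud=20.0)
print(june_clear)  # ['S2A_20210601']
```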
The SpatioTemporal Asset Catalog (STAC) specification [5] ushered in a new era, by not only making EO data discoverable, but accessible for data analysis as well as data visualization. The software ecosystem around STAC contains open source software for data discovery, data visualization, data catalogs, metadata creation, as well as integrations into already existing data analysis tools. As an example, the Open Data Cube software no longer needs its own database containing the available EO data. Instead, it can directly connect to a STAC API for data search and filtering. This allows a user to create an instance of the Open Data Cube with the results of a STAC API request on the fly and to use it for further analysis. Similar integrations exist for xarray (stackstac) and GRASS GIS (actinia-stac-plugin).
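As a concrete illustration of why STAC makes data accessible rather than merely discoverable, a minimal STAC Item can be sketched as a Python dict. The scene id, geometry and asset URLs below are invented, but the required fields follow the STAC 1.0.0 Item specification.

```python
# A minimal STAC Item: the "assets" section carries direct links to the data
# itself, which is what lets search results feed straight into analysis tools.
item = {
    "type": "Feature",
    "stac_version": "1.0.0",
    "id": "S2A_32UNE_20210601_example",  # hypothetical scene id
    "geometry": {"type": "Polygon",
                 "coordinates": [[[10.0, 50.0], [11.0, 50.0], [11.0, 51.0],
                                  [10.0, 51.0], [10.0, 50.0]]]},
    "bbox": [10.0, 50.0, 11.0, 51.0],
    "properties": {"datetime": "2021-06-01T10:30:00Z"},
    "assets": {
        "B04": {"href": "https://example.org/S2A/B04.tif",  # invented URL
                "type": "image/tiff; application=geotiff; profile=cloud-optimized"},
        "B08": {"href": "https://example.org/S2A/B08.tif",
                "type": "image/tiff; application=geotiff; profile=cloud-optimized"},
    },
    "links": [],
}

# An analysis tool (e.g. stackstac or Open Data Cube) resolves asset hrefs
# directly, with no separate download or ingestion step:
red_url = item["assets"]["B04"]["href"]
```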
We present a STAC-oriented architecture for EO exploitation platforms, which includes technical solutions for user-oriented discovery, access, visualization, and analysis of EO data. This architecture is the basis for DLR's EO exploitation platform terrabyte, which comprises 40 petabytes of online storage together with a large amount of computing resources. STAC metadata has initially been created for Sentinel-1 and Sentinel-2 data and registered in a STAC API. For data analysis, users can connect Open Data Cube, stackstac or the GRASS GIS-based OpenEO interface to the STAC API. The software titiler is used as a STAC-based visualization service.
From an EO data platform provider's perspective, this concept allows the provider to focus on a performant STAC API service without needing to synchronize metadata holdings across different additional services. In addition to the STAC API, services for data visualization and data access (e.g., sub-setting, time-series extraction) enable further on-the-fly exploitation of the provided geospatial data. Users of the platform can also create STAC metadata along with their processing and analysis results, which can then be used in all of the STAC-based platform services for on-the-fly data discovery, visualization, access and follow-up analysis. Finally, results of a comparison with traditional approaches, as well as future capabilities and activities, are presented from a data provider's perspective.
References:
[1] Amazon Web Services, “Registry of Open Data on AWS,” https://registry.opendata.aws, accessed on 2021-02-26.
[2] Google, “Google Cloud Public Datasets,” https://cloud.google.com/public-datasets, accessed on 2021-02-26.
[3] Copernicus, “Data and Information Access Service,” https://www.copernicus.eu/en/access-data/dias, accessed on 2021-02-26.
[4] Vretanos, Panagiotis (2021), “OGC Testbed-16: Data Access and Processing Engineering Report,” Open Geospatial Consortium – OGC-20-016, https://docs.ogc.org/per/20-016.html
[5] Radiant Earth, “SpatioTemporal Asset Catalogs,” https://stacspec.org, accessed on 2021-02-26.
Land Surface Temperature (LST) is a crucial biomarker for our planet's climate and ecosystems and is emerging as a key indicator in addressing the increasing challenges of climate change and population growth. Remote sensing measurements of LST appear as the most viable option to tackle these challenges by enabling us to determine key processes such as evapotranspiration at global scale. The wide range of applications for LST data goes beyond the monitoring of global warming and ecosystems to also provide a basis for decision making in fields such as precision agriculture, water management, urban planning and disaster prevention. Owing to their nature, many of these applications are constrained by the spatial resolution, revisit time and coverage of all current operational satellite missions.
To fill this gap, we’re developing a new platform, HeatR, to generate an easily accessible, fused LST product from various existing spaceborne sensors. It generates dense time series of homogenized, analysis-ready Level-2 data that combine the LST information from Landsat-5, -7, -8 and -9 as well as ECOSTRESS. Consequently, the user benefits from a maximum set of data points with broad spatial coverage and the highest possible temporal resolution; a basis that unlocks a wider range of applications.
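The core fusion idea described above, i.e. densifying the time series by combining observations from several sensors, can be sketched minimally. The dates and values are synthetic, and the plain concatenation shown here stands in for the actual homogenization, which involves cross-calibration, gridding and quality screening.

```python
from datetime import date

# Synthetic per-sensor time series of (acquisition date, LST in kelvin)
landsat8  = [(date(2021, 6, 1), 301.2), (date(2021, 6, 17), 305.8)]
landsat9  = [(date(2021, 6, 9), 303.1)]
ecostress = [(date(2021, 6, 4), 302.0), (date(2021, 6, 12), 304.5)]

def fuse(*series):
    """Merge per-sensor series into one chronologically ordered series."""
    merged = [obs for sensor in series for obs in sensor]
    merged.sort(key=lambda obs: obs[0])
    return merged

fused = fuse(landsat8, landsat9, ecostress)
print(len(fused))  # 5 observations in June instead of at most 2 per sensor
```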
The platform will offer both a web service with a user-friendly graphical user interface (GUI) and an application programming interface (API) allowing our homogenization framework to be embedded into existing workflows. To ensure easy interoperability with existing data, the fused LST product is highly customizable, e.g., in terms of output resolution, coordinate grid or projection. The backend code uses a Micro-Services Architecture (MSA) as a structural model for software interoperability and easy integration of new components. It is deployed on different Google Cloud computation environments to provide a high level of scalability and assure cost efficient data products.
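To make the customization options tangible, a request to such an API might be parameterized roughly as below. This is purely hypothetical: the product name, parameter names and values are all invented; the abstract only states that output resolution, coordinate grid and projection are customizable.

```python
# Hypothetical request payload for a customized fused LST product;
# every field name and value here is an illustrative assumption.
request = {
    "product": "fused-lst-l2",                  # hypothetical product id
    "bbox": [13.0, 52.3, 13.8, 52.7],           # area of interest (WGS84)
    "time_range": ["2021-06-01", "2021-08-31"],
    "output": {
        "resolution_m": 70,                     # target grid spacing in metres
        "crs": "EPSG:32633",                    # requested projection (UTM 33N)
    },
}
```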
The modular design of the platform enables continuous innovation through the effortless integration of the latest research on EO technologies and the incorporation of future satellite missions, further improving the data quality and density of the homogenized product over the coming years. A first addition will be the LisR sensor, set to be launched to the ISS in early 2022 by ConstellR, a German spin-off from Fraunhofer EMI. It serves as the first in-orbit demonstrator in preparation for ConstellR's upcoming HiVE constellation, initially consisting of five satellites planned for launch over the course of a year starting at the end of 2023. While LisR will add 80 m resolution thermal data to the dataset, HiVE will offer global, daily coverage of LST data at 50 m resolution.
Our work brings together state-of-the-art technology and know-how from one company, ConstellR, and two German research institutions, the GFZ and Fraunhofer EMI, to create a platform for a unique fused LST product. Its quality and cost efficiency enable a wider range of scientific and commercial applications to address the immense current and potential future challenges brought about by a changing climate.
There is a broad need, by a growing number of user communities, for utilization of increasingly available satellite-derived environmental data in support of research and numerous oceanographic applications. However, the potential of such Earth Observations for ocean science and applications for societal benefit has yet to be fully realized. This is largely due to issues encountered by less expert users, who are currently underserved, and which relate to complexities of data access, product selection, and working with large volume, heterogeneous datasets across agency repositories. The CEOS Ocean Variables Enabling Research and Applications for GEO (COVERAGE) initiative aims to address this critical gap. Via a user-focused approach, it seeks to provide improved access to multi-agency, multidisciplinary remote sensing data for the oceans that are better integrated with in-situ and biological observations that pose additional interoperability challenges.
COVERAGE is an international initiative and 3-year pilot project within the Committee on Earth Observation Satellites (CEOS) involving interagency participation. It aligns with the programmatic objectives of CEOS and the missions of GEO-MBON (Marine Biodiversity Observation Network), which are to advance and exploit synergies amongst marine observational programs. It focuses on implementing technologies, including cloud-based solutions, to provide an advanced yet accessible, data-rich, web-based platform for integrated ocean data delivery and access: multi-parameter observations, easily discoverable and usable, organized thematically, available in near real-time, collocated to a common grid and complemented by a set of value-added data services. COVERAGE development is organized around priority application use cases identified by agency partners, and is user-driven. Here we provide an overview of the initiative and the status of the Phase C technical implementation work that builds upon our current prototype. Emphasis is also placed on describing the associated ecosystem thematic demonstration application, which focuses on the dynamics of high-seas pelagic fish assemblages in relation to the environment, in support of emerging high-seas area-based management frameworks. International collaborative aspects of the project are discussed with the intent of soliciting community feedback.