<p>Digitalisation | https://bab.gv.at/index.php | 2024-03-29 | Bundesanstalt für Agrarwirtschaft und Bergbauernfragen (webmaster@bab.gv.at) | Joomla! - Open Source Content Management</p>
<h2>BAB 074/24: D-DOK: Research documentation departments</h2>
<p>Published 2024-01-04 by Michaela Hager | https://bab.gv.at/index.php?option=com_content&view=article&id=2296:bab-074-24-d-dok-research-documentation-departments&catid=112&lang=en&Itemid=413</p>
<p>Currently, the data on research and public relations activities are collected in simple lists at the research departments and the RZL, and activity reports are compiled manually. There are no standardized publication lists or archives.</p>
<h3>Objective</h3>
<p>Development of a web-based research documentation system for the departments.<br />The application will be used by the respective users to collect all relevant data that:</p>
<ul>
<li>is required for the creation of the RZLP, in particular for deriving key figures,</li>
<li>supports the targeted, citable publication and collection of all recorded data and documents,</li>
<li>can be evaluated automatically for annual reports,</li>
<li>is relevant for strategic decisions.</li>
</ul>
<p>With the introduction of D-ESS in October 2021, a tool was created that provides the necessary data, authorizations and evaluations relating to personnel, projects, time recording, allocation and time allocation. This tool has reduced the administrative effort for this recurring work and offers new, high-quality evaluation options.<br />Integrating publication documentation into the existing tool brings the following improvements:</p>
<p>The data obtained can be used to generate time series for the RZLP as well as decision bases, key figures and automatically generated publication lists (annual report, presentation on the homepage, personal publication lists, all project publications).</p>
<ul>
<li>All publications relevant to the department can be compiled in a documentation on site</li>
<li>These are prepared for citation and can also be assigned to several projects</li>
<li>Public relations activities can be documented and evaluated centrally</li>
<li>All recorded data can be entered and looked up at author, institute and institution level at any time in a familiar environment</li>
<li>Create an overview of all active projects and collaborators</li>
<li>The application has a modular structure and can be adapted to the requirements of the department</li>
</ul>
<h3>Planned work 2024</h3>
<p>WP1:</p>
<p>Create an application based on the D-ESS server:</p>
<ul>
<li>System authorization (same structure as D-ESS, but new authorizations)</li>
<li>Cost units (projects) from the D-ESS database</li>
<li>Users from the D-ESS database</li>
<li>New user groups specifically for D-DOK (data entry clerk, head of department, head of institute, controlling)</li>
<li>New server logs (user tracking) specifically for D-DOK</li>
<li>Support for database, Active Directory and OIDC authentication</li>
<li>Copy (and modify) existing relevant API endpoints from D-ESS</li>
<li>Create base code</li>
</ul>
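<p>The new user-group concept can be sketched as a simple role-to-permission mapping. The group names below are taken from the work plan; the permission sets and the helper function are illustrative assumptions, not D-DOK's actual implementation:</p>

```python
# Permissions granted to each D-DOK user group (permission names are
# hypothetical; only the group names come from the work plan).
GROUP_PERMISSIONS = {
    "data_entry_clerk": {"create_entry", "edit_own_entry"},
    "head_of_department": {"create_entry", "edit_own_entry", "approve_entry"},
    "head_of_institute": {"create_entry", "approve_entry", "view_all"},
    "controlling": {"view_all", "export_reports"},
}

def is_authorized(groups, permission):
    """Return True if any of the user's groups grants the permission."""
    return any(permission in GROUP_PERMISSIONS.get(g, set()) for g in groups)

print(is_authorized(["data_entry_clerk"], "approve_entry"))  # False
print(is_authorized(["controlling"], "export_reports"))      # True
```

In a real deployment the group membership would come from the D-ESS user database or the Active Directory/OIDC login, while the permission check stays server-side.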
<p>WP2:</p>
<ul>
<li>Research, install and configure MongoDB GridFS</li>
<li>Create API endpoints (and client-side views) to support uploading, saving and indexing PDF files</li>
<li>Implement client-side views to display file versioning and download of PDFs</li>
<li>Create views and a server API to link employees to "projects"</li>
<li>Create a view to link employees to PDF documents</li>
<li>Implement tables for full text search of uploaded PDFs</li>
</ul>
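<p>MongoDB GridFS, named in WP2, stores each uploaded file as fixed-size chunks (255 KiB by default) in the <code>fs.chunks</code> collection, with a metadata document in <code>fs.files</code>. A minimal pure-Python sketch of that chunking idea, requiring no MongoDB server; in the real application pymongo's <code>gridfs</code> module does this transparently on upload:</p>

```python
import hashlib

CHUNK_SIZE = 255 * 1024  # GridFS default chunk size (255 KiB)

def gridfs_style_chunks(data: bytes, filename: str):
    """Split a file into GridFS-style chunk documents plus one file document.

    Mirrors the storage layout GridFS uses: fs.files holds the metadata,
    fs.chunks holds the ordered binary chunks.
    """
    chunks = [
        {"n": i, "data": data[off:off + CHUNK_SIZE]}
        for i, off in enumerate(range(0, len(data), CHUNK_SIZE))
    ]
    file_doc = {
        "filename": filename,
        "length": len(data),
        "chunkSize": CHUNK_SIZE,
        "md5": hashlib.md5(data).hexdigest(),
    }
    return file_doc, chunks

pdf_bytes = b"%PDF-1.7 " + b"x" * 600_000  # stand-in for an uploaded PDF
file_doc, chunks = gridfs_style_chunks(pdf_bytes, "report.pdf")
print(file_doc["length"], len(chunks))  # 600009 bytes -> 3 chunks
```

File versioning then amounts to keeping several <code>fs.files</code> documents with the same filename and different upload dates, which is exactly what the planned versioning views would list.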
<h3>Schedule</h3>
<p>Project start: 01/2024<br />Project end: 12/2025</p>
<h2>BAB 073/24: Data modelling, management and consulting for the GAP Datapool</h2>
<p>Published 2024-01-04 by Michaela Hager | https://bab.gv.at/index.php?option=com_content&view=article&id=2294:bab-073-24-data-modelling-management-and-consulting-for-the-gap-datapool&catid=112&lang=en&Itemid=413</p>
<p>The European Union's Common Agricultural Policy (CAP) plays a central role in shaping agricultural policies that affect agriculture, the environment and rural communities. The evaluation of these policies is crucial to understanding their effectiveness, sustainability and potential adjustments.</p>
<p>The evaluation of CAP 23-27 poses new challenges for the evaluators involved. Increased expectations of the evaluation, as well as larger amounts of data, require targeted solutions and tools to solve current problems efficiently.</p>
<p>The data-related challenges include the integration of heterogeneous data sources from various agriculture-related areas such as environmental data, socio-economic indicators and more. These data are often available in different formats, structures and qualities. In preparation, methods of data integration, cleansing and analysis must be developed in order to ensure the quality, consistency and analytical usability of these data.</p>
<p>As part of the Datenpool project, the BAB has accumulated a wide range of knowledge in these areas. This knowledge, which can be summarized under the interdisciplinary term data science, is now to be used to support the evaluation of the CAP 23-27 measures.</p>
<h3>Objective</h3>
<p>Creation and design of the architecture of the GAP 23-27 data pool, including the conception and use cases of the evaluation and access systems. The aim is to create an efficient and accessible structure that provides the necessary data for the evaluation of CAP measures and at the same time enables easy use and analysis by authorized users.</p>
<p>Provide technical expertise and support to Division II/1 and those responsible for the CAP 23-27 measures in the area of data modeling and data management. This includes developing and managing an effective data model and implementing best practices for data management to ensure that the CAP data pool contains high-quality and reliable data. Provide technical advice and support to CAP intervention managers on sourcing data, seamless transfer of supplier data to the relevant interfaces, efficient storage, data processing and smooth delivery to the CAP data pool.</p>
<p>Establish methods for data security and integrity, including ways to comprehensively document all data stored under the CAP 23-27 measures.</p>
<p>The focus is on creating a solid database that supports evaluation in the best possible way and provides evaluators with relevant, well-documented information.</p>
<h3>Schedule</h3>
<p>Project start: 01/2024<br />Project duration: CAP Strategic Plan 2023-2027 accompanying long-term project</p>
<h2>BAB 071/24: DBZ – web-based dashboard for operational analysis</h2>
<p>Published 2024-01-04 by Michaela Hager | https://bab.gv.at/index.php?option=com_content&view=article&id=2290:bab-071-24-dbz-webbased-dashboard-for-operational-analysis&catid=112&lang=en&Itemid=413</p>
<p>A constantly growing amount of data, combined with increased demands on availability and user-friendliness, poses growing challenges for the BAB. Numerous interfaces to the public (e.g. funding processing) and the various independent institutions assigned to the BML create a very heterogeneous and widely ramified database infrastructure.</p>
<h3>Objective</h3>
<p>To keep these data as independent services under the sovereignty of their respective owners while visualizing them collectively, this project is intended to provide a dashboard as a central point of access to all data over which the BML has sovereignty.<br />Specifically, the result should be a query tool for potential users from Section II of the BML that enables individual farm analyses. In addition, the GDPR obligation to provide information is to be taken into account by allowing requests for information to be processed quickly and completely.<br />This represents the central first major step of the Digitaler Betriebszwilling project.</p>
<h3>Status of the project</h3>
<p>The project is scheduled to start in 2024. In November 2023, a workshop was held to roughly outline the project. The specific objectives were defined there, and it was agreed that the project would be carried out as a collaboration between the BAB and the BML.</p>
<h3>Planned work for 2024</h3>
<p>The first step is to review all data sources and the bodies responsible for them. This will reveal limitations as well as potential for further processing and merging the data.<br />The second step will focus on an architecture that is as performant, automated and simple as possible, while avoiding direct intervention in the source databases so as not to disrupt existing processes.<br />In the third step, initial ideas and implementations for the dashboard will be tested.</p>
<h3>Schedule</h3>
<p>Project start: 01/2024<br />Project end: 12/2025</p>
<h2>BAB 069/24: Open Source Data Pipeline & Database for HyDaMS (Hydrographisches Daten Management System)</h2>
<p>Published 2024-01-04 by Michaela Hager | https://bab.gv.at/index.php?option=com_content&view=article&id=2286:bab-069-24-open-source-data-pipeline-database-for-hydams-hydrographisches-daten-management-system&catid=112&lang=en&Itemid=413</p>
<p>As part of the cooperation between the federal and provincial governments required by the Water Act, a monitoring network is operated to determine the water balance in Austria. These data are collected by the federal states and transferred to the Hydrographic Data Management System (HyDaMS).</p>
<p>HyDaMS is a version of the TopoDesk software from Toposoft. This is proprietary software based on the AZUR programming language. In this software, over 300,000 time series with different parameters and time references are stored in a highly specialized format.</p>
<p>Under the INSPIRE Directive (2007; Austrian implementation: Geodata Infrastructure Act 2010), geodata must be described and published (metadata) and made available in a harmonized manner (defined common data structure) via data services.</p>
<p>In 2019, the old PSI Directive (Public Sector Information) was revised to become the Open Data Directive, which entered into force in Austria in 2022 as the Federal Information Reuse Act 2022. It contains minimum rules for the reuse of public data and introduces the open-by-default principle: public data should be open (open data) in principle, as long as nothing definitively speaks against it. In addition, dynamic data (sensor and time series data) must be made available via services/APIs.</p>
<p>The situation was tightened in February 2023 with the publication of the HVD Regulation (High Value Datasets Regulation). The datasets listed there must be made available free of charge (OpenData) via a service/API by June 2024 at the latest.</p>
<h3>Objective</h3>
<p>Modern big data analysis tools and dashboards cannot access the time series data within HyDaMS due to the special structure of the data format used to date.</p>
<p>In order to enable big data analyses and model calculations, the time series are to be mirrored automatically and periodically, with monitoring, into an open source database system via an interface yet to be developed. This concerns (i) the verified hydrography data, which is to be imported annually once a yearbook has been completed, and (ii) the current, remotely transmitted (and as yet unverified) data from the federal states, which is to be imported continuously.</p>
<p>A modern database system is also a prerequisite for numerous innovations planned by Division I/3 Water Balance, such as a national water balance model. The provision of data via the WebGIS portal eHYD is also no longer up to date: the time series and master data must first be exported from HyDaMS as text files and then manually integrated into eHYD. Because of the large data volumes there are limitations, so many time series are offered only in aggregated temporal resolution. With a database that has a modern interface, eHYD could be linked directly to the new database, giving users access to the entire hydrography data set.</p>
<p>The HyDaMS does not have a machine-readable interface (API) and, according to Toposoft, cannot be equipped with one. The present data pipeline and database project between the BML and the BAB is therefore also conditioned by legal requirements (see "Initial situation").</p>
<h3>Main goals</h3>
<ul>
<li>Stable export of data from HyDaMS (master data and time series)</li>
<li>Selection of a suitable open source database system</li>
<li>Development of data pipelines for the import of data from HyDaMS into a selected open source database</li>
<li>Development and operation of a test system</li>
<li>Provision of interfaces to dashboard and state-of-the-art evaluation tools</li>
</ul>
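<p>The core of such a data pipeline is parsing the HyDaMS text exports into rows that can be bulk-inserted into the target database. A hedged sketch, assuming a semicolon-separated export with German date and decimal formats; the real export layout may differ, and the station ID is a made-up example:</p>

```python
import csv
import io
from datetime import datetime

def parse_export(text, station_id):
    """Parse a hypothetical HyDaMS text export (one 'timestamp;value' line
    per measurement) into rows ready for bulk insert.
    The semicolon layout and date format are assumptions, not the real format."""
    rows = []
    for ts, value in csv.reader(io.StringIO(text), delimiter=";"):
        rows.append((
            station_id,
            datetime.strptime(ts, "%d.%m.%Y %H:%M"),
            float(value.replace(",", ".")),  # German decimal comma
        ))
    return rows

sample = "01.01.2024 00:00;1,25\n01.01.2024 00:15;1,30\n"
rows = parse_export(sample, station_id=204001)
print(rows[0])
```

In production, the resulting tuples would be handed to the chosen database driver's bulk-insert API, with logging and parallelization wrapped around this step as listed above.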
<h3>Planned procedure, implementation</h3>
<ul>
<li>BML Dept. I/3: Viewing and evaluating the existing data (master data and time series) so that only cleansed time series that are useful for further evaluation are processed</li>
<li>BAB: Testing the suitability and comparison of various open source databases with regard to the efficient management of a sensor time series big data test dataset (100 million data entries)</li>
<li>BAB: Implementation of a suitable database schema</li>
<li>BAB / BML Dept. I/2: Development of scripts for the following applications:
<ul>
<li>Linking the time series from HyDaMS/Callisto with the master data</li>
<li>Exporting to arrays with format conversions</li>
<li>Transfer of data to the open source database</li>
</ul>
</li>
<li>BAB: Ensuring the following functions:
<ul>
<li>Parallelization</li>
<li>Logging</li>
<li>Monitoring</li>
</ul>
</li>
<li>BAB: Time-controlled execution:
<ul>
<li>Open-source-based workflow management</li>
<li>Create, manage and monitor workflows</li>
<li>Map workflows with directed acyclic graphs</li>
</ul>
</li>
<li>BAB: Provision and hosting of a test DBMS</li>
<li>BAB: Export time series data from HyDaMS and Callisto</li>
<li>BAB: Import time series data into the selected test system</li>
<li>BAB / BML Dept. I/3: Data validation</li>
</ul>
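<p>The time-controlled execution via directed acyclic graphs can be illustrated with Python's standard library: a workflow scheduler runs each step only after its dependencies have finished, i.e. in topological order. The step names below loosely follow the procedure list; the dependency edges are assumptions for illustration:</p>

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Workflow steps as a DAG: each key maps to the set of steps it depends on.
dag = {
    "export_hydams": set(),
    "link_master_data": {"export_hydams"},
    "format_conversion": {"link_master_data"},
    "import_database": {"format_conversion"},
    "validate_data": {"import_database"},
}

# A workflow engine (e.g. an open-source tool like Apache Airflow) executes
# the steps in an order consistent with these dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

A dedicated workflow manager adds what this sketch omits: scheduled triggering, retries, logging and monitoring of each run.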
<h3>Schedule</h3>
<p>Project start: 01/2024<br />Project end: 12/2025</p>
<h2>BAB 050/21: Soil Data Analysis - Financial Soil Estimation and Soil Mapping (BODAT)</h2>
<p>Published 2021-01-04 by Michaela Hager | https://bab.gv.at/index.php?option=com_content&view=article&id=295:bab-050-21-soil-data-analysis-financial-soil-estimation-and-soil-mapping-bodat&catid=112&lang=en&Itemid=413</p>
<p>In the BODAT project, possible applications of Austria's two area-related soil data sets (financial soil valuation, soil map) were evaluated, and possible uses for agricultural management and agricultural policy measures were investigated. In addition, the possibility of merging the two data sets was examined, and the correlation of individual soil parameters with remote sensing data as well as results of the GIS-ELA project was analyzed.</p>
<p>For the project areas, remotely sensed parameters (leaf area index, NDVI vegetation index) that depict the current state of the vegetation were calculated from satellite imagery using BAB's Open Data Cube (ODC). Combining these satellite data, which are available in high temporal and spatial resolution, with the two Austrian soil information systems potentially opens up new possibilities for obtaining indirect information on soil properties or usefully complementing the soil data.</p>
<p>Merging the two area-related soil data sets themselves did not lead to a satisfactory result in the currently available data structure and is, in principle, not possible in an automated way. However, combining the soil data of the financial soil valuation with remote sensing data offers potential for producing map bases for precision agriculture.</p>
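<p>The NDVI mentioned above is computed per pixel from the near-infrared (NIR) and red reflectance bands as (NIR - Red) / (NIR + Red). A minimal sketch with invented toy reflectance values; real ODC workflows apply the same formula to full raster time series:</p>

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel.
    Returns 0.0 where both bands are zero to avoid division by zero."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

# Toy reflectance values for three pixels (illustrative, not real data):
# dense vegetation reflects strongly in NIR, bare soil much less so.
nir_band = [0.45, 0.50, 0.30]
red_band = [0.05, 0.10, 0.25]
print([round(ndvi(n, r), 3) for n, r in zip(nir_band, red_band)])
```

Values near 1 indicate dense, healthy vegetation; values near 0 indicate bare soil, which is why the index supports indirect inferences about soil properties.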
<p>Project start: 13.10.2020<br />Project end: 31.8.2022</p>
<p>{rsfiles path="Publikationen/BAB/Abschlussberichte/abschlussbericht_bodat_bab_050_22.pdf"}</p>
<h2>BAB 049/21: Open Data Cube as data repository and analysis tool for drought monitoring</h2>
<p>Published 2021-01-01 by Michaela Hager | https://bab.gv.at/index.php?option=com_content&view=article&id=293:bab-049-21-open-data-cube-as-data-repository-and-analysis-tool-for-drought-monitoring&catid=112&lang=en&Itemid=413</p>
<h2>Development of analysis tools for recurring agricultural policy issues based on the example of drought monitoring</h2>
<h3>Initial situation</h3>
<p>A data cube (ODC) was set up at the Federal Institute of Agricultural Economics and Mountain Farming (BAB) in order to efficiently manage and analyze the constantly growing amount of raster data. The unique feature of the BAB's Open Data Cube is the extension of the technology originally intended for satellite images so that other data can also be indexed and loaded as time series into this multidimensional data cube. This will make it possible to intersect and evaluate a variety of raster and rasterized vector data (including ALS, INVEKOS and climate data) and satellite images together in one system. In addition to the purely spatial analysis, the temporal dimension can also be taken into account in the calculations and evaluations in the multidimensional data cube, which enables high-performance analyses of time series.</p>
<h3>Objective</h3>
<p>The aim is to establish the ODC as an analysis tool at the BAB so that recurring data evaluations, which serve as a basis for decision-making on agricultural policy issues, can in future be carried out in a much more targeted and, above all, more performant manner. The ODC should serve as a networked data center and be open for data integration via cloud object storage interfaces (e.g. S3). This is intended to replace part of the existing geodata infrastructure. This solution minimizes sources of error, makes the most up-to-date data immediately available to all users and is managed decentrally. The aim is for users to be able to carry out the necessary analyses independently via a web browser on the underlying infrastructure. Existing analysis functions are to be expanded, and recurring evaluations of various issues can be updated. As an example application, monitoring of Austria's climatic development is being developed specifically for agricultural areas. Over the further course of the project, the ODC is to be transferred from a pilot environment to a scalable, fail-safe production environment.</p>
<h3>Status of the project</h3>
<p>The Jupyterhub platform around the ODC was migrated from a test environment to a Kubernetes cluster (system for orchestrating container applications), which ensures high reliability and rapid scalability.<br />In 2023, the infrastructure was further optimized and expanded into a dask cluster.This makes it possible to parallelize processes and make the best possible use of the hardware resources of the Kubernetes cluster in order to increase performance.In addition, login to the platform was replaced by single sign-on (SSO).<br /><br />As a methodological use case, defined climate parameters (e.g. the climatic water balance, heat days and dry periods of 10 or more days) were calculated for each cadastral municipality in Austria for the normal climate period 1961-1990 and 1991-2020.The aim was to show climate changes and to create a basis that allows these or similar recurring questions to be answered at short notice with current data if required.<br /><br />The ODC was presented and discussed at the GI-Salzburg 2023.</p>
<h3>Planned work 2024</h3>
<p>Due to the positive experience gained in the course of the ODC project and the possibility of storing the growing amounts of raster data (satellite, ALS, climate data, etc.) in a structured way and being able to evaluate them in combination for many questions, the ODC and the Jupyterhub environment will be further developed in 2024.The results of analyses already carried out, such as the calculation of climate parameters for arable and grassland areas, are to be updated once up-to-date data has been acquired. The long-term trend is now moving towards decentralized evaluation, where the large volumes of data no longer need to be stored.For this reason, the infrastructure will be further developed and data will increasingly be integrated using STAC (Spatio Temporal Asset Catalogs) in 2024. This is a specification that enables interoperable access to a wide range of global data without having to download this data yourself.<br /><br />At the current request of the BML, the monitoring of climatic development is to be expanded. With a focus on agricultural areas, current climatic indices (based on climate data) and vegetation indices (based on satellite data) are to be calculated and compared with long-term mean values.</p>
<h3>Schedule</h3>
<p>Project start: 01/2021<br />Project end: 12/2024</p>
<p> </p>
<h2>Development of analysis tools for recurring agricultural policy issues based on the example of drought monitoring</h2>
<h3>Initial situation</h3>
<p>An Open Data Cube (ODC) was set up at the Federal Institute of Agricultural Economics and Mountain Farming (BAB) in order to manage and analyze the constantly growing volume of raster data efficiently. The distinctive feature of the BAB's Open Data Cube is that the technology, originally intended for satellite imagery, has been extended so that other data can also be indexed and loaded into the multidimensional data cube as time series. This makes it possible to intersect and evaluate a variety of raster and rasterized vector data (including ALS, INVEKOS and climate data) together with satellite imagery in a single system. In addition to purely spatial analysis, the temporal dimension can also be taken into account in calculations and evaluations, which enables high-performance analyses of time series.</p>
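The spatial-plus-temporal access pattern described above can be sketched in a few lines. This is a minimal conceptual illustration with synthetic NumPy data, not the actual ODC API; the array shapes and values are hypothetical.

```python
import numpy as np

# Minimal sketch of a multidimensional "data cube": raster layers stacked
# along a time axis, so spatial and temporal analyses run in one structure.
# Axes: (time, y, x). The values are illustrative, not real BAB data.
rng = np.random.default_rng(0)
cube = rng.uniform(0.0, 1.0, size=(12, 4, 5))  # 12 monthly rasters, 4x5 cells

# Purely spatial view: mean raster over all time steps.
mean_raster = cube.mean(axis=0)            # shape (4, 5)

# Temporal view: the per-pixel time series of a single cell.
series = cube[:, 2, 3]                     # shape (12,)

# Combined spatio-temporal query: per-pixel linear trend over the year,
# the kind of intersection a data cube makes cheap.
months = np.arange(12)
trend = np.polyfit(months, cube.reshape(12, -1), deg=1)[0].reshape(4, 5)

print(mean_raster.shape, series.shape, trend.shape)
```

A real cube adds indexing, coordinate reference systems and lazy loading on top of this core idea.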
<h3>Objective</h3>
<p>The aim is to establish the ODC as an analysis tool at the BAB so that recurring data evaluations, which serve as a basis for decision-making on agricultural policy issues, can be carried out in a more targeted and, above all, more performant manner in the future. The ODC should serve as a networked data center and remain open for data integration via cloud object storage interfaces (e.g. S3). It is intended to replace part of the existing geodata infrastructure. This solution minimizes sources of error, makes the most up-to-date data immediately available to all users and is managed decentrally. Users should be able to carry out the necessary analyses independently via a web browser on the underlying infrastructure. Existing analysis functions are to be expanded, and recurring evaluations of various issues can be kept up to date. As an example application, monitoring of Austria's climatic development is being developed specifically for agricultural areas. In the further course of the project, the ODC is to be transferred from a pilot environment to a scalable, fail-safe production environment.</p>
<h3>Status of the project</h3>
<p>The JupyterHub platform around the ODC was migrated from a test environment to a Kubernetes cluster (a system for orchestrating container applications), which ensures high reliability and rapid scalability.<br />In 2023, the infrastructure was further optimized and expanded into a Dask cluster. This makes it possible to parallelize processes and make the best possible use of the hardware resources of the Kubernetes cluster in order to increase performance. In addition, login to the platform was switched to single sign-on (SSO).<br /><br />As a methodological use case, defined climate parameters (e.g. the climatic water balance, heat days, and dry periods of 10 or more days) were calculated for each cadastral municipality in Austria for the climate normal periods 1961-1990 and 1991-2020. The aim was to show climatic changes and to create a basis that allows these or similar recurring questions to be answered at short notice with current data when required.<br /><br />The ODC was presented and discussed at the GI-Salzburg 2023.</p>
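Two of the climate parameters named above (heat days and dry periods of 10 or more days) reduce to simple counting rules on daily series. The sketch below uses short hypothetical series and common threshold definitions (heat day: Tmax of 30 degC or more; dry day here: precipitation below 1 mm), not the project's actual input data.

```python
import numpy as np

# Hypothetical, independent daily series for one cadastral municipality.
tmax = np.array([28.5, 31.2, 33.0, 29.9, 30.0, 34.1, 27.0])      # degC
precip = np.array([0.0, 0.0, 2.4] + [0.0] * 12 + [5.1, 0.2])     # mm/day

# Heat days: days with Tmax at or above 30 degC.
heat_days = int(np.sum(tmax >= 30.0))

def longest_dry_spell(p, threshold=1.0):
    """Length of the longest run of consecutive days with precip < threshold."""
    longest = current = 0
    for value in p:
        current = current + 1 if value < threshold else 0
        longest = max(longest, current)
    return longest

dry_spell = longest_dry_spell(precip)
print(heat_days, dry_spell)  # -> 4 12  (a dry period of 10+ days is present)
```

Applied per pixel or per cadastral municipality across a normal period, these counts yield exactly the kind of comparable key figures the use case describes.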
<h3>Planned work 2024</h3>
<p>Based on the positive experience gained in the course of the ODC project, and given the possibility of storing the growing volumes of raster data (satellite, ALS, climate data, etc.) in a structured way and evaluating them in combination for many questions, the ODC and the JupyterHub environment will be developed further in 2024. The results of analyses already carried out, such as the calculation of climate parameters for arable and grassland areas, are to be updated once current data has been acquired. The long-term trend is moving towards decentralized evaluation, in which the large data volumes no longer need to be stored locally. For this reason, the infrastructure will be developed further in 2024 and data will increasingly be integrated using STAC (SpatioTemporal Asset Catalog), a specification that enables interoperable access to a wide range of global data without having to download the data itself.<br /><br />At the current request of the BML, the monitoring of climatic development is to be expanded. With a focus on agricultural areas, current climatic indices (based on climate data) and vegetation indices (based on satellite data) are to be calculated and compared with long-term mean values.</p>
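The planned comparison of current indices against long-term mean values is, at its core, an anomaly calculation per pixel. The following sketch uses synthetic values; in the project the inputs would be climatic or vegetation indices, and the raster sizes here are hypothetical.

```python
import numpy as np

# Synthetic long-term record: 30 annual index rasters of 4x4 pixels.
rng = np.random.default_rng(1)
longterm = rng.uniform(0.3, 0.7, size=(30, 4, 4))
# A synthetic "current year" that is uniformly 0.1 above the long-term mean.
current = longterm.mean(axis=0) + 0.1

mean = longterm.mean(axis=0)
std = longterm.std(axis=0)

anomaly = current - mean     # absolute deviation from the long-term mean
z_score = anomaly / std      # deviation expressed in standard deviations

print(float(anomaly.mean().round(3)))  # -> 0.1
```

Standardizing the anomaly (the z-score) makes pixels with different natural variability comparable, which matters when climatic and vegetation indices are viewed side by side.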
<h3>Schedule</h3>
<p>Project start: 01/2021<br />Project end: 12/2024</p>
<p> </p>
BAB 048/21: Image matching of aerial imagery using graphics processors to create up-to-date surface models.2021-01-01T11:45:07+01:002021-01-01T11:45:07+01:00https://bab.gv.at/index.php?option=com_content&view=article&id=291:bab-048-21-image-matching-of-aerial-imagery-using-graphics-processors-to-create-up-to-date-surface-models&catid=112&lang=en&Itemid=413Michaela Hager<h3>Initial situation</h3>
<p>The aerial images for the entire Austrian federal territory are updated in a three-year cycle, with around one third of the national territory reflown each year. These aerial surveys are commissioned as part of a federal-state cooperation and are primarily used to create two-dimensional orthophotos (rectified, true-to-scale aerial images) for a variety of applications. The original aerial images are recorded with transverse and longitudinal overlap, which makes it possible to generate high-resolution, three-dimensional data (point clouds, meshes, surface models) of the earth's surface using the image matching method. From these results, central spatial information such as vegetation heights, vegetation structures, terrain changes and building heights can be derived, providing a valuable basis for numerous application and research projects. At present, the calculation for one third of the national territory takes about one year of computing time and is carried out at the Federal Office of Metrology and Surveying (BEV), optimized for urban areas, and at the Federal Research Centre for Forests (BFW), optimized for forest areas (forest inventory). The image matching products created to date are computed on stand-alone machines and are only available after a long delay. The contracted flight companies supply aerial image data on hard disks, generating around 150 terabytes of data per year.</p>
<h3>Objective</h3>
<p>The main objective of this project is to accelerate the creation of high-resolution digital surface models (DSMs) using image matching. This acceleration is to be achieved with a server architecture based on graphics processors in cluster operation. While the calculation takes about a year using current methods, the aim is to shorten this to a few months so that the computed surface models can be used in the year following the flight. A further aim is to identify potential savings in the development of the processing infrastructure (hardware and software). In addition to accelerating the process, optimizing the process parameters and results for use in agriculture and forestry (determining the usability of the results for agricultural funding as well as for other application and research projects) is also central. In 2022, the Federal Ministry of Agriculture, Forestry, Environment and Water Management (BML) issued an order to ensure easy access to the aerial images and the long-term usability of the raw data; to this end, the aerial images are archived in their original format and resolution on LTO-8 magnetic tapes for future calculations and research. In addition, it was requested that the test infrastructure created in the course of the project be kept in operation and maintained for the time being. In line with the recommendations of the BMNT's digitization report (2018), the methodological competence for the use of new geodata sources is to be expanded. It is not the aim of the project to turn the developed or tested products and procedures into a regular operation or service. Based on the valuable data basis and the infrastructure and methodology created, calculation tests will also be carried out to create and evaluate follow-up products with great future potential (3D mesh models and point clouds). The know-how built up will be made available to the entire department, the GIS coordination department of the BML and cooperating agencies, and communicated to interested parties in the form of presentations and a final report. The objective was slightly adapted during the course of the project, in light of the findings and in consultation with the BML.</p>
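The geometric core of deriving surface models from overlapping aerial images can be illustrated with the stereo normal case: once a point is matched in two images, its distance follows from Z = f * B / p (focal length f, air base B, x-parallax p). The numbers below are hypothetical, chosen only to make the formula concrete; they are not taken from the Austrian flight configuration.

```python
# Sketch of the stereo normal case behind image matching. All values are
# illustrative assumptions: a 100 mm focal length, a 600 m air base, and
# parallaxes of a ground point and a treetop.

def depth_from_parallax(focal_m, base_m, parallax_m):
    """Object distance (flying height above the matched point), stereo normal case."""
    if parallax_m <= 0:
        raise ValueError("parallax must be positive for a valid match")
    return focal_m * base_m / parallax_m

f = 0.10       # focal length in metres (100 mm)
B = 600.0      # air base between the two exposures, metres

z_ground = depth_from_parallax(f, B, 0.0200)  # about 3000 m above ground point
z_canopy = depth_from_parallax(f, B, 0.0202)  # larger parallax: point is closer

tree_height = z_ground - z_canopy             # about 29.7 m of vegetation height
print(round(z_ground, 1), round(tree_height, 2))
```

Dense image matching repeats this match-and-intersect step for millions of pixels per image pair, which is why the computation parallelizes so well on graphics processors.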
<h3>Work 2024</h3>
<p>The cooperation and exchange of experience with the Federal Office of Metrology and Surveying (BEV) will be continued and, if possible, the Federal Research Centre for Forests (BFW) will also be involved, in order to further optimize the workflow and processing parameters and to provide results more quickly. Following the successful optimization of the surface models, calculation tests of mesh models and point clouds will be carried out in 2024. The results and findings of the project will be published in a final report and presented at the GIS-Jour-Fixe. The long-term backup of the raw aerial image data on LTO tapes will be continued; the effort increases the further back the flight data reach, as a larger number of data carriers were used per flight block and older hard disks have already partially lost their functionality. The management of the data carriers will be transferred to the library system.</p>
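A quick back-of-the-envelope check puts the archiving workload in perspective: roughly 150 TB of raw imagery per year (the figure stated earlier) written to LTO-8 tapes with a native, uncompressed capacity of 12 TB. The calculation is only an estimate; copies, compression, and tape-filling strategy would change the real count.

```python
import math

# Figures from the text (150 TB/year) and the LTO-8 native capacity (12 TB);
# no compression or redundancy assumed in this estimate.
data_per_year_tb = 150
lto8_native_tb = 12

tapes_per_year = math.ceil(data_per_year_tb / lto8_native_tb)
print(tapes_per_year)  # -> 13
```

So on the order of a dozen or more tapes per flight year accumulate, which is why transferring data-carrier management to the library system is a practical step.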
<h3>Schedule</h3>
<p>Project start: 01/2021<br />Project end: 12/2024<br /><br /></p>
BAB 031/19: Reports on and simulation of revenues and costs subject to changes in prices and quantities2019-05-01T16:39:22+02:002019-05-01T16:39:22+02:00https://bab.gv.at/index.php?option=com_content&view=article&id=256:bab-031-19-reports-on-and-simulation-of-revenues-and-costs-subject-to-changes-in-prices-and-quantities&catid=112&lang=en&Itemid=413Michaela Hager<h3>Initial situation</h3>
<p>In Austria there are currently market-based solutions (options, futures, other derivatives) for stabilising the producer prices of certain agricultural products. However, there are no market-based (e.g. insurance) or tax-based instruments to stabilise total agricultural income. Relevant information and data on agricultural markets are sometimes not available to farm managers, or are only available with difficulty or from different providers; this in turn increases their workload in the course of independent income stabilisation activities.</p>
<h3>Objective</h3>
<p>The objective is to provide digital information on developments in the agricultural sector in a consolidated, compact and user-friendly form in order to support farm managers in their individual income stabilisation measures. Such digital information services could include price reports with price forecasts, an information platform, or an application for simulating fluctuations in income and contribution margins caused by price and quantity fluctuations.</p>
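The simulation application mentioned above boils down to propagating price and quantity fluctuations into a contribution margin. A minimal Monte Carlo sketch follows; all distribution parameters (mean price, yield, variable costs) are hypothetical and not calibrated to Austrian farm data.

```python
import numpy as np

# Contribution margin per hectare = price * quantity - variable costs.
# Prices and yields fluctuate; costs are held fixed here for simplicity.
rng = np.random.default_rng(42)
n = 10_000

price = rng.normal(200.0, 30.0, n)     # EUR/t: assumed mean 200, sd 30
quantity = rng.normal(6.0, 1.0, n)     # t/ha: assumed mean 6, sd 1
variable_costs = 700.0                 # EUR/ha, assumed constant

margin = price * quantity - variable_costs

mean_margin = margin.mean()
p5, p95 = np.percentile(margin, [5, 95])  # the spread a farm manager faces
print(round(mean_margin), round(p5), round(p95))
```

Presenting the 5th-95th percentile band rather than only the mean is what turns such a tool into a risk-communication aid for income stabilisation decisions.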
BAB 022/19: Data pool: Platform for natural area controlling2019-01-02T12:49:43+01:002019-01-02T12:49:43+01:00https://bab.gv.at/index.php?option=com_content&view=article&id=2032:bab-022-19-data-pool-platform-for-natural-area-controlling&catid=112&lang=en&Itemid=413Michaela Hager<h3>Objective</h3>
<p>Storage, processing and further use of project data from ACube and BMon in the data pool, and provision of an operational platform for linking and analyzing these data in the course of natural area controlling.</p>
<h3>Initial situation</h3>
<p>Approximately 12 TB of data from the Sentinel-1 and Sentinel-2 satellites arrive at the EODC every day and are archived there long-term. In the course of the ACube (Austrian Data Cube) and BMon (Soil Moisture Monitor) projects (see BMNT-UW.3.3.3/0015-IV/6/2018), these data are prepared and processed. To merge the results of these projects with the in-house data of the BMLRT, or to carry out Austria-wide analyses with the "Protected Water Analysis Tool" (SwwAT), BMLRT Department I/10 requires the corresponding infrastructure, as this is not feasible with the resources currently available at the central office.</p>
<p>SwwAT: In SwwAT, the BMLRT cooperation partners are the provincial offices of the Federal Water Engineering Administration (indirect federal administration). BMLRT Department I/10 approves, collects and analyses their flood protection data and</p>
<ul>
<li>reports these within the framework of the EU Floods Directive and of INSPIRE to the European Commission, and</li>
<li>controls the entire financing system on the basis of its own evaluations in accordance with the Water Structures Promotion Act (EUR 70-90 million per year).</li>
</ul>
<p>Linking SwwAT GDI and ODC: Within the framework of FLOOD RISK MANAGEMENT (EU Directive), the use of basic data (ALS, DKM, GWR ...) from the GDI for Austria-wide analyses is to be increased. For this, BMLRT Department I/10 needs a high-performance infrastructure that is above all "close" to the GDI.</p>
<p>FLOOD RISK AWARENESS SYSTEM - Objective: BMLRT Department I/10 combines the SwwAT information with the Copernicus data of the EODC, the ZAMG and the hydrographic service of the BMLRT within the framework of continuous FLOOD RISK MONITORING in order to monitor the current status (flood situation picture). In order to be able to act internally and independently, the federal Copernicus products will be transferred to the OGD and thus integrated into the monitoring via SwwAT. For this purpose, BMLRT Department I/10 needs a powerful internal infrastructure that is close to the data sources and the analysis tools and that would otherwise be very expensive and time-consuming to procure.</p>
<h3>Work in 2019</h3>
<ul>
<li>Development of a PostgreSQL database with GIS extension</li>
<li>Connection of the BMLRT</li>
<li>Training and data migration (MSGIS)</li>
</ul>
<h3>Work 2020</h3>
<ul>
<li>Operation of the PostgreSQL database with GIS extension</li>
<li>Data migration (MSGIS)</li>
<li>Application and evaluation by Dept. I/10</li>
</ul>
BAB 023/19: Data pool: Sentinel-2 Data processing with open source2019-01-02T12:38:08+01:002019-01-02T12:38:08+01:00https://bab.gv.at/index.php?option=com_content&view=article&id=2030:bab-023-19-data-pool-sentinel-2-data-processing-with-open-source&catid=112&lang=en&Itemid=413Michaela Hager<p>The Sentinels are a fleet of seven types of satellites that provide various freely accessible data and images for the European Commission's Copernicus programme. One of these satellites is Sentinel-2, whose data was used in this project.</p>
<p>The main mission objectives of Sentinel-2, which consists of the two satellites Sentinel-2A and Sentinel-2B, are to acquire global high-resolution multispectral imagery with a high repetition rate and to provide information in the form of land cover maps, land change detection maps and geophysical variables. Sentinel-2 thus contributes directly to global land monitoring, disaster management and security services.</p>
<p>At the Federal Institute of Agricultural Economics and Mountain Farming (BAB), Sentinel-2 data for the whole of Austria for the years 2017 and 2018 were obtained in a test run, and a terrain and atmospheric correction was carried out successfully. In total, more than 3,000 scenes were processed with the open-source software Sen2-Agri. In a further step, the following "Essential Climate Variables" were calculated for the whole of Austria at a spatial resolution of 10 m:</p>
<ul>
<li>Leaf Area Index</li>
<li>Fraction of Green Vegetation Cover</li>
<li>Fraction of Absorbed Photosynthetically Active Radiation</li>
</ul>
<p>The data were successfully used for the monitoring of Austrian biotope types.</p>
<p>Based on the experience gained with raster data (storage space, performance), work began on moving data into an OpenDataCube. The OpenDataCube is open-source software developed for processing and storing large amounts of raster data. The data and results held in the OpenDataCube can also be distributed and provided via OGC-compliant services such as WMS and WCS.</p>
<p>When ingesting data into the cube, special emphasis is placed on keeping all raster layers in a unit grid such as the European Statistical Grid (ETRS89 LAEA). This has the great advantage that every raster cell in the cube refers to the same origin, so no inaccuracies can arise in analyses within the cube. Such grid systems also play an important role in spatial statistics.</p>
<p>The know-how gathered while working with the OpenDataCube should subsequently enable a new provisioning basis for raster data in the BMLRT departmental GIS using data cube technology. To date, the data has been distributed among the departments via the GDS-GDI (hard disks), with this geodata collection released every six months with updates to the data. A further advantage of providing OGC-compliant services via the OpenDataCube is therefore the immediate availability of the raster data.</p>
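The unit-grid idea used during ingest can be sketched with a small snapping function: every layer is aligned to the same origin and cell size, so identical cells in different layers cover exactly the same ground. The 10 m cell size matches the resolution mentioned above; the origin value and coordinates are purely illustrative, not the actual ETRS89 LAEA parameters.

```python
# Sketch of aligning coordinates to a common "unit grid" before ingest.
# Origin and coordinates are illustrative assumptions, not real grid values.
GRID_ORIGIN = (0.0, 0.0)   # common grid origin, illustrative
CELL_SIZE = 10.0           # metres, matching the 10 m product resolution

def snap_to_grid(x, y, origin=GRID_ORIGIN, cell=CELL_SIZE):
    """Snap a coordinate to the lower-left corner of its grid cell."""
    gx = origin[0] + ((x - origin[0]) // cell) * cell
    gy = origin[1] + ((y - origin[1]) // cell) * cell
    return gx, gy

# Two deliveries with slightly shifted coordinates land in the same cell:
print(snap_to_grid(4321567.3, 2765432.8))  # -> (4321560.0, 2765430.0)
print(snap_to_grid(4321563.9, 2765439.9))  # -> (4321560.0, 2765430.0)
```

Because every layer is snapped to the same origin before ingest, per-cell intersections across layers never need resampling, which is the consistency property the paragraph above describes.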