Smart Farming is driving a revolution in agriculture, aiming at more productive and sustainable production through precise, resource-efficient decision making, with additional applications in forest and rangeland management. Remotely sensed Big Data from satellite, small unmanned aerial, airborne, in situ and proximal systems bring both challenges and opportunities, and call for high spatial resolution and near real-time mapping capabilities. Successes have been achieved in crop health monitoring, stress identification, soil mapping, fertilizer and irrigation advisories, yield prediction, ecosystem services, and more. This session seeks contributions from government, university, private, and nonprofit organizations. It focuses on research methodologies and applications that use high-spatial-resolution or high-temporal-frequency remotely sensed Big Data for Smart Farming and land management. We invite findings from across the chain of data collection, storage, transfer, transformation, and analytics, and welcome discussion of how to achieve more productive and sustainable agricultural production.
Many of the promises of smart farming centre on assisting farmers to monitor their fields throughout the growing season. Having precise field boundaries has thus become a prerequisite for field-level assessment. When farmers are signed up by agricultural service providers, they are often asked for precise digital records of their boundaries. Unfortunately, this process remains largely manual, time-consuming, and prone to errors, which creates disincentives. There are also increasing applications in which remote monitoring of crops using earth observation is used to estimate planted area and forecast yield. Automating the extraction of field boundaries would facilitate bringing farmers on board, hence fostering wider adoption of these services, and would also improve the products and services that remote sensing can provide. Several methods to extract field boundaries from satellite imagery have been proposed, but the apparent lack of field boundary data sets seems to indicate low uptake, presumably because of expensive image preprocessing requirements and local, often arbitrary, tuning. Here, we introduce a novel approach with low image preprocessing requirements to extract field boundaries from satellite imagery. It poses the problem as a semantic segmentation problem with three tasks designed to answer the following questions: 1) Does a given pixel belong to a field? 2) Is that pixel part of a field boundary? 3) What is the distance from that pixel to the closest field boundary? Closed field boundaries and individual fields can then be extracted by combining the answers to these three questions. The tasks are performed with ResUNet-a, a deep convolutional neural network with a fully connected UNet backbone that features dilated convolutions and conditioned inference. First, we characterise the model's performance at the local scale.
Using a single composite image from Sentinel-2 over South Africa, the model is highly accurate in mapping field extent, field boundaries, and, consequently, individual fields. Replacing the monthly composite with a single-date image close to the compositing period only marginally decreases accuracy. We then show that, without recalibration, ResUNet-a generalises well across resolutions (10 m to 30 m), sensors (Sentinel-2 to Landsat-8), space, and time. Averaging model predictions from at least four images well distributed across the season is the key to coping with temporal variations in accuracy. Finally, we apply the lessons learned from these experiments to extract field boundaries for the whole Australian cropping region. To that end, we compare three ResUNet-a models trained with different data sets: field boundaries from Australia, field boundaries from overseas, and field boundaries from both Australia and overseas (transfer learning). By minimising image preprocessing requirements and replacing local arbitrary decisions with data-driven ones, our approach is expected to facilitate the adoption of smart farming services and improve land management at scale.
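The step of combining the three task outputs into individual fields can be sketched as follows. This is a minimal illustration, not the authors' exact post-processing: pixels that are confidently "field" but not "boundary" are grouped into connected components, so that closed boundaries separate adjacent fields. The thresholds are assumptions.

```python
import numpy as np
from scipy import ndimage

def extract_fields(extent_prob, boundary_prob, t_ext=0.5, t_bnd=0.5):
    """Combine the extent and boundary task outputs into individual fields.

    Minimal sketch: threshold the two probability maps, keep interior
    (field AND not boundary) pixels, and label connected components.
    """
    interior = (extent_prob >= t_ext) & (boundary_prob < t_bnd)
    labels, n_fields = ndimage.label(interior)
    return labels, n_fields

# Toy example: two 3x3 fields separated by a one-pixel boundary column.
extent = np.ones((3, 7))
boundary = np.zeros((3, 7))
boundary[:, 3] = 1.0
labels, n = extract_fields(extent, boundary)
```

In the full pipeline the third output (distance to the closest boundary) could additionally seed a watershed so that touching fields with weak boundary evidence are still split.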
How to cite:
Waldner, F. and Diakogiannis, F.: Extracting field boundaries from satellite imagery with a convolutional neural network to enable smart farming at scale, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-102, https://doi.org/10.5194/egusphere-egu2020-102, 2020.
Monitoring the nitrogen status of spring wheat under different nitrogen treatments is important for effective nitrogen management in precision agriculture. Unmanned aerial vehicles (UAVs) integrated with machine learning techniques are a promising tool for precisely estimating crop nitrogen content. In this study, UAV-based spectral indices, an artificial neural network (ANN) model, and a support vector machine (SVM) were used to estimate spring wheat nitrogen content. The experiment was conducted on 144 spring wheat plots at the Parma Research and Extension Center, ID, covering six spring wheat varieties and six nitrogen rates. A rotary-wing UAV equipped with a multispectral sensor (RedEdge-M, MicaSense) was used to acquire very high spatial resolution images of the plots. The methods were validated using a cross-validation procedure and three statistical indicators: R², RMSE, and relative RMSE. The cross-validated results show that all models provide accurate estimates of canopy nitrogen content in spring wheat.
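The cross-validation procedure and the three indicators can be sketched in Python. This is an illustration only: a plain least-squares regressor stands in for the ANN/SVM, and the index values and nitrogen contents are synthetic, not the study's data.

```python
import numpy as np

def kfold_cv_metrics(X, y, k=5, seed=0):
    """K-fold cross-validation of a least-squares regressor, reporting
    the three indicators used in the study: R^2, RMSE, and relative RMSE.
    """
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)
    preds = np.empty_like(y, dtype=float)
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        A = np.c_[X[train], np.ones(len(train))]      # design matrix + bias
        coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)
        preds[test] = np.c_[X[test], np.ones(len(test))] @ coef
    rmse = float(np.sqrt(np.mean((preds - y) ** 2)))
    r2 = float(1 - np.sum((preds - y) ** 2) / np.sum((y - y.mean()) ** 2))
    rrmse = rmse / float(y.mean())                    # relative RMSE
    return r2, rmse, rrmse

# Synthetic spectral index vs. canopy nitrogen content, 144 plots as in the trial
rng = np.random.default_rng(1)
index = rng.uniform(0.2, 0.8, 144)
nitrogen = 2.0 + 3.5 * index + rng.normal(0, 0.05, 144)
r2, rmse, rrmse = kfold_cv_metrics(index[:, None], nitrogen)
```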
How to cite:
Shafian, S. and Walsh, O.: Evaluating Different Learning Tools for Spring Wheat Nitrogen Content Estimation from UAV-Remote Sensing Data, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-2457, https://doi.org/10.5194/egusphere-egu2020-2457, 2020.
Ilaria Cantini, Benedetto Allotta, Luca Bini, and Marco Vieri
The Internet of Things (IoT) has revolutionized many fields of everyday life. It addresses many aspects of data management, storage, and connectivity.
The main objective of this project is the application of IoT to a low-cost field system for monitoring plant-life parameters (humidity, temperature, rain, solar radiation, etc.) in crop growth control, viticulture, pest prevention for olive groves, greenhouse automation, and other agricultural applications.
Additional applications lie in urban environments (where extreme weather phenomena cause major problems) and in integration with existing trusted networks for better characterization of weather phenomena on very limited space and time scales. Adaptation strategies must start from knowledge and the availability of additional information.
In a previous project (EGU2018), an Arduino Uno-based control board was used. The fully automatic equipment transmitted real-time data using an external ESP8266 Wi-Fi module.
In the new version, a LoLin board is used: an Arduino-compatible board with an integrated ESP8266 and RTC, programmed with a few lines of Lua script. The board simplifies the design-and-development phase and reduces overall costs.
The proposed system uses wireless sensors placed in the open and stores the collected information on a cloud server. Deploying a large number of sensors is made possible by low-cost sensors and technologies. The new target for this project is to develop a microcontroller system based on the Wi-Fi protocol, with the ESP8266 connected in station mode for data collection, and on the LoRa protocol for interconnecting multiple systems that cannot be reached by Wi-Fi.
The system has been fully developed at the University of Florence and at a high school under the supervision of teachers, involving potential stakeholders interested in the use of low-cost sensors in agriculture. Some traditional sensors (tipping-bucket rain gauges, magnetic-reed anemometers, capacitive/resistive thermo-hygrometers) and an innovative piezo-element impact rain gauge have been adapted to build the meteorological station.
During 2020, the LoRa protocol will be implemented on the new system to interconnect multiple systems in the absence of Wi-Fi coverage.
Despite the low nominal cost of data collection, use in precision and smart agriculture, as well as in climate change monitoring and adaptation, is possible only after substantial sensor calibration work to reach WMO standards. Even in the absence of absolute calibration, quantifying measurement uncertainties is mandatory to give value to amateur network observations.
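Such a calibration step can be sketched for one of the station's sensors, the tipping-bucket rain gauge: the mm-per-tip factor is estimated against a reference gauge, and the residual bias and RMSE quantify the measurement uncertainty. The tip counts and reference values below are hypothetical illustrative numbers, not the project's data.

```python
import numpy as np

def calibrate_tipping_bucket(tips, reference_mm):
    """Estimate the mm-per-tip factor of a low-cost tipping-bucket rain
    gauge against a reference gauge, and quantify residual uncertainty."""
    tips = np.asarray(tips, dtype=float)
    reference_mm = np.asarray(reference_mm, dtype=float)
    # Least-squares slope through the origin: rain_mm ~= factor * tips
    factor = float(tips @ reference_mm / (tips @ tips))
    residuals = reference_mm - factor * tips
    bias = float(residuals.mean())
    rmse = float(np.sqrt(np.mean(residuals ** 2)))
    return factor, bias, rmse

# Hypothetical co-located rain events: tip counts vs. reference rainfall (mm)
tips = [10, 25, 40, 80, 120]
ref = [2.1, 5.0, 7.9, 16.2, 23.8]
factor, bias, rmse = calibrate_tipping_bucket(tips, ref)
```

Reporting `bias` and `rmse` alongside each observation is one simple way to give the amateur network's data the traceable uncertainty the WMO standards call for.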
All these aspects are included in the presented project, an attempt to develop a low-cost weather monitoring system for educational purposes, with the side effect of raising awareness among students.
How to cite:
Cantini, I., Allotta, B., Bini, L., and Vieri, M.: Advance in a meteorological station, developed in educational environment, for agricultural and urban purposes, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-6117, https://doi.org/10.5194/egusphere-egu2020-6117, 2020.
Muhammad Usman, Talha Mahmood, and Christopher Conrad
Textile products made with cotton produced in Pakistan, Turkey, and Uzbekistan are largely imported to European markets. This drives high virtual water imports from these countries and puts immense pressure on their water resources, a pressure further aggravated by climate change and population growth. The solution is, on one hand, to cut water usage for cotton irrigation and, on the other, to increase water productivity. The biggest challenge in this regard is the correct quantification of consumptive water use, cotton yield, and crop water productivity at fine spatial resolution on regional levels, which is now possible using remote sensing (RS) data and approaches. RS also facilitates comparing regions of interest, as in this study of Pakistan, Turkey, and Uzbekistan, by utilizing similar data and techniques. For the current study, MODIS data along with various climatic variables were used to estimate consumptive water use and cotton yield by employing the SEBAL and Light Use Efficiency (LUE) models, respectively. These estimates were then used to work out the water productivities of different regions of the selected countries as case studies. The results show that the study area in Turkey achieved the highest cotton water productivity (0.75–1.2 kg/m³), followed by those in Uzbekistan (0.05–0.85 kg/m³) and Pakistan (0.04–0.23 kg/m³). The variability is higher for Uzbekistan, possibly due to the agricultural transition of the post-Soviet era. In the case of Pakistan, the lower cotton water productivities are mainly attributed to lower crop yields (400–1200 kg/ha) in comparison to Turkey (3850–5800 kg/ha) and Uzbekistan (450–2500 kg/ha). Although the highest crop water productivity is achieved in the Turkish study region, there is still potential for further improvement by introducing on-farm water management.
In the other two countries, especially Pakistan, major improvements are possible by maximizing crop yields. The next steps include comparing the results in terms of economic returns.
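The water productivity figures above follow from a simple ratio of yield to seasonal consumptive water use, using the fact that 1 mm of ET over 1 ha corresponds to 10 m³ of water. A minimal sketch with illustrative (not the study's) numbers:

```python
def water_productivity(yield_kg_per_ha, et_mm):
    """Crop water productivity in kg per m^3 of consumptive water use.
    1 mm of seasonal ET over 1 ha equals 10 m^3 of water, so
    WP = yield [kg/ha] / (ET [mm] * 10 [m^3/ha per mm])."""
    return yield_kg_per_ha / (et_mm * 10.0)

# Illustrative seasonal values (assumed, for demonstration only)
wp_low = water_productivity(1200, 900)   # 1200 kg/ha yield, 900 mm ET
wp_high = water_productivity(5000, 650)  # 5000 kg/ha yield, 650 mm ET
```

With these assumed inputs the low-yield case lands near the Pakistani range and the high-yield case near the Turkish range reported above, showing how both the yield numerator and the ET denominator drive the contrast.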
How to cite:
Usman, M., Mahmood, T., and Conrad, C.: Remote Sensing based comparative analysis of cotton irrigation and water productivity in Pakistan, Turkey and Uzbekistan, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-8191, https://doi.org/10.5194/egusphere-egu2020-8191, 2020.
Katerina Trepekli, Andreas Westergaard-Nielsen, and Thomas Friborg
With rising demand for increasing the yield potential of agricultural products and for reducing greenhouse gas emissions during food production, strengthening our scientific and technological capacity to monitor crop growth and above-ground biomass (AGB) is indispensable for moving towards more sustainable management of agricultural resources. Pivotal to this goal is the application of high-throughput field-phenotyping tools, such as drone-borne Light Detection and Ranging (Lidar) systems, for accurate, fine-grained, rapid and labor-saving measurements of vegetation growth parameters. Our objective is to develop and assess a workflow to estimate AGB, leaf area index (LAI), plant height (PH) and plant volume in a homogeneous, highly dense agricultural field using UAV-Lidar technology. The experimental site is located in Denmark and planted with potato. Aerial campaigns and field experiments, including destructive biomass sampling and measurements of LAI and plant geometrical characteristics on 1 m² plots, were performed once per month during the vegetation growth period (May–September 2019). The high-resolution (3.6 cm) canopy height model (CHM) is generated by first evaluating the performance of different filtering algorithms that separate ground points from the Lidar-derived point clouds. To extract the geometrical parameters of individual crop plants, we delineate the CHM by applying segmentation directly to the Lidar point cloud rather than to the CHM as an interpolated raster surface. The PH obtained by the Lidar scanner is highly correlated with the field-measured PH (R² = 0.89 and RMSE = 0.028 m), implying that the point cloud processing evaluated here is efficient and able to generate serviceably accurate CHMs for agricultural sites with similar vegetation structures.
Throughout the observed growth period, AGB can be quantified with high accuracy when treated as a function of plant volume (R² = 0.81 and RMSE = 31.65%) rather than of PH, as the latter approximates an exponential relationship with AGB. Height and density Lidar metrics were more effective in predicting in situ LAI measurements than remotely sensed LAI calculated directly from Lidar vegetation points following the Beer–Lambert law. The predictive frameworks emerging from this approach indicate the applicability of drone-borne Lidar systems for obtaining agricultural crop growth parameters at high spatial and temporal resolution.
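The core of a CHM workflow of this kind can be sketched with a simple grid-based stand-in for the ground-filtering step: per cell, the highest return approximates the canopy surface and the lowest return approximates the ground, and their difference gives canopy height. This is a minimal illustration, not the filtering algorithms evaluated in the study, and the point cloud below is a toy example.

```python
import numpy as np

def canopy_height_model(points, cell=0.5):
    """Rasterize a Lidar point cloud (x, y, z) into a simple canopy
    height model: per-cell highest return minus per-cell lowest
    (assumed ground) return."""
    x, y, z = points.T
    ix = ((x - x.min()) / cell).astype(int)
    iy = ((y - y.min()) / cell).astype(int)
    shape = (ix.max() + 1, iy.max() + 1)
    dsm = np.full(shape, -np.inf)   # top of canopy per cell
    dtm = np.full(shape, np.inf)    # assumed ground per cell
    np.maximum.at(dsm, (ix, iy), z)
    np.minimum.at(dtm, (ix, iy), z)
    # Empty cells (no returns) get height 0
    chm = np.where(np.isfinite(dsm) & np.isfinite(dtm), dsm - dtm, 0.0)
    return chm

# Toy cloud: flat ground at z = 0 plus one 0.6 m plant in a single cell
pts = np.array([[0.1, 0.1, 0.0], [0.2, 0.2, 0.6],
                [0.8, 0.1, 0.0], [0.1, 0.8, 0.0], [0.8, 0.8, 0.0]])
chm = canopy_height_model(pts, cell=0.5)
```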
How to cite:
Trepekli, K., Westergaard-Nielsen, A., and Friborg, T.: Application of drone borne LiDAR technology for monitoring agricultural biomass and plant growth., EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-9802, https://doi.org/10.5194/egusphere-egu2020-9802, 2020.
Seungtaek Jeong, Jonghan Ko, Gwanyong Jeong, and Myungjin Choi
A satellite image-based classification of crop types can provide information on arable land area and its changes over time. The classified information is also useful as a base dataset for various geospatial projects that retrieve crop growth and production over wide areas. Convolutional neural network (CNN) algorithms based on deep learning have frequently been applied to land cover classification using satellite images with high spatial resolution, producing consistent classification outcomes. However, it remains challenging to adopt coarse-resolution images, such as those from the Moderate Resolution Imaging Spectroradiometer (MODIS), for classification, mainly because of the uncertainty of mixed pixels, which complicates collecting and labeling actual land cover data. Nevertheless, using coarse images is a very efficient approach for obtaining temporally dense, continuous land spectral information over comparatively extensive areas (e.g., at national and continental scales). In this study, we will classify paddy fields by applying a CNN algorithm to MODIS images in Northeast Asia. Time-series features of vegetation indices that appear only in paddy fields will be encoded as 2-dimensional images and used as inputs to the classification algorithm. We will use high-spatial-resolution reference land cover maps from Korea and Japan as training and test datasets, with data identified in person for validation. This research effort suggests that a CNN-based classification approach using coarse-spatial-resolution images can be applicable and reliable for land cover classification at a continental scale, and points toward addressing the sources of error in satellite images with low spatial resolution.
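One way to encode per-pixel vegetation-index time series as 2-D images for a CNN is to let rows represent time steps and columns represent quantized index values, so that a paddy field's flooding-then-greening signature (high water index early, rising greenness later) becomes a distinctive spatial pattern. The encoding and index names below are a hypothetical illustration, not necessarily the authors' feature construction.

```python
import numpy as np

def vi_feature_image(evi, lswi, n_bins=16):
    """Encode two vegetation-index time series as one binary 2-D image:
    row = time step, column = quantized index value in [-1, 1]."""
    series = np.stack([evi, lswi])                       # (2 indices, T steps)
    img = np.zeros((series.shape[1], n_bins))
    for vi in series:
        cols = np.clip(((vi + 1) / 2 * (n_bins - 1)).astype(int),
                       0, n_bins - 1)
        img[np.arange(len(vi)), cols] = 1.0              # mark value per date
    return img

# Toy 8-date series: water index high early (flooding), greenness rises later
evi = np.array([0.1, 0.1, 0.2, 0.4, 0.6, 0.7, 0.5, 0.3])
lswi = np.array([0.3, 0.4, 0.3, 0.2, 0.2, 0.1, 0.1, 0.0])
img = vi_feature_image(evi, lswi)
```

A standard 2-D CNN can then consume such images directly, which is the appeal of this representation for coarse-resolution time series.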
How to cite:
Jeong, S., Ko, J., Jeong, G., and Choi, M.: Classification of Paddy Fields using Convolutional Neural Network with MODIS imagery in Northeast Asia, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-13335, https://doi.org/10.5194/egusphere-egu2020-13335, 2020.
Asmaa Abdelbaki, Martin Schlerf, Jochem Verrelst, and Thomas Udelhoven
Unmanned aerial vehicle (UAV)-based hyperspectral imagery is of great value for estimating crop attributes at the landscape scale, as required by many environmental and agricultural applications. Multiple methods, such as empirical regression, radiative transfer modelling, and hybrid models, have been proposed to derive the target variables (leaf area index (LAI), canopy chlorophyll content (CCC), and fractional vegetation cover (fCover)). Yet it remains a challenge to select the most suitable method, since each has its respective advantages and disadvantages. In this study, a hybrid strategy is proposed: it combines the flexibility of regression with the universality of radiative transfer models (RTMs), and is compared with other retrieval methods in terms of model accuracy and computational efficiency under varying sample sizes and different levels of artificial noise. Two datasets of canopy spectra were simulated from two types of look-up tables (LUTs), each generating a range of canopy reflectances from a set of input parameters of a soil-leaf-canopy (SLC) RTM. The first type (LUTstd) was derived from independent input parameters, while the other (LUTreg) imposed correlations among variables using the Cholesky algorithm. The LUTs were used to train linear and non-linear non-parametric regression algorithms to estimate the relevant parameters characterizing 27 potato plots. Subsequently, the best-performing non-parametric regression approach was applied to the UAV-based hyperspectral data to map crop properties.
Results showed that, for LAI and fCover, principal component regression, partial least squares regression, and the least-squares regression line (PCR, PLSR, and LSLR) outperformed the machine learning regression algorithms (MLRAs) and the LUT-inversion approaches. Moreover, analysis of LUT sizes ranging from 1000 to 17280 revealed that 1000 simulations were sufficient for training, and adding 1% noise to the simulations was adequate to imitate the uncertainty of the UAV data. Using independent ground data for validation, the PCR and PLSR methods yielded the lowest errors with LUTreg (R² = 0.81, NRMSE = 11.47%) compared with LUTstd (R² = 0.51, NRMSE = 22.61%). For fCover, the accuracy of the linear non-parametric and LUT-inversion approaches with LUTreg (R² = 0.75 and NRMSE = 14.53% for PLSR; R² = 0.78 and NRMSE = 14.37% for LUT inversion) was slightly higher than that of the MLRAs (R² = 0.76 and NRMSE = 14.74% for kernel ridge regression (KRR)). For CCC, the best results were obtained with random forest tree bagger (RFTB) and fit ensemble (RFFE) for both LUTs; the accuracy of LUTreg did not improve as much as that of LUTstd when changing sample sizes (R² = 0.80, NRMSE = 14.71% for LUTreg and R² = 0.81, NRMSE = 13.93% for LUTstd). In terms of processing speed, the linear non-parametric methods were the fastest (PCR = 0.0097 s and PLSR = 0.013 s) compared with the MLRAs. In conclusion, compared with the two analyzed hybrid strategies (linear and non-linear non-parametric regression), LUT inversion is not recommended for large images because of its lower prediction accuracy and slower processing speed.
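The hybrid idea of training a PCR on LUT-simulated spectra and applying it to observed spectra can be sketched with plain numpy. This is a minimal stand-in, not the study's SLC-based pipeline: the "LUT" here is a synthetic set of 10-band spectra driven by a latent LAI value.

```python
import numpy as np

def pcr_fit_predict(X_train, y_train, X_test, n_components=3):
    """Principal component regression: project spectra onto the leading
    PCA components, then solve ordinary least squares in that subspace."""
    mu = X_train.mean(axis=0)
    Xc = X_train - mu
    # PCA via SVD of the centred training spectra
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T                        # loadings (bands x comps)
    T = Xc @ P                                     # scores
    A = np.c_[T, np.ones(len(T))]
    coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)
    T_test = (X_test - mu) @ P
    return np.c_[T_test, np.ones(len(T_test))] @ coef

# Synthetic LUT: 200 ten-band 'spectra' driven by a latent LAI value
rng = np.random.default_rng(0)
lai = rng.uniform(0.5, 6.0, 200)
loadings = rng.normal(0, 1, 10)
X = lai[:, None] * loadings + rng.normal(0, 0.05, (200, 10))
pred = pcr_fit_predict(X[:150], lai[:150], X[150:])
```

The low-dimensional projection is also what makes PCR and PLSR so fast relative to the MLRAs in the timing comparison above.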
Keywords: SLC model, LUT inversion, linear non-parametric regression, machine learning, hybrid model, leaf area index, fractional vegetation cover, leaf and canopy chlorophyll content.
How to cite:
Abdelbaki, A., Schlerf, M., Verrelst, J., and Udelhoven, T.: Optimizing hybrid retrieval strategies for crop attribute retrieval using hyperspectral UAV data, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-18406, https://doi.org/10.5194/egusphere-egu2020-18406, 2020.
With rising demand for food in Sub-Saharan Africa (SSA), cropland expansion is the main strategy for boosting agricultural production. However, cropland expansion is not a sustainable form of agricultural development, as arable land in SSA is limited and soil degradation is increasing. Expansion therefore needs to be monitored in order to focus interventions and propose alternatives. In this study, we monitor agricultural expansion over the past decades across Malawi using Landsat satellite data, and we explore factors that can explain expansion using the Malawi integrated household survey data. The preliminary results show that cropland expansion has occurred widely across the country, and that newly expanded croplands have higher productivity than croplands with a long cultivation history. We also found that estate agricultural land contributes 40% of the expanded area and that the level of irrigation is negatively correlated with expansion; these are the dominant factors associated with expansion in Malawi. The results will further help to provide localized information for policy making and to develop strategies for conserving land.
How to cite:
Li, C. and Dash, J.: Cropland expansion and productivity reduction in Malawi monitored by using Satellite data, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-20286, https://doi.org/10.5194/egusphere-egu2020-20286, 2020.
Nikolaos-Christos Vavlas, Toby Waine, Jeroen Meersmans, and Goetz Richter
Synthetic Aperture Radar (SAR) is sensitive to surface structure as well as dielectric properties, so it can be used to quantify canopy characteristics and surface moisture. High-temporal-frequency SAR backscatter data are useful for quantifying crop phenological development, growth, and yield formation. The aim of this research is to identify the growth dynamics of winter wheat from SAR at the field scale, validated using farm sites with different productivity across two years (2018–2019). We identify and explore the parameters that characterize crop performance from SAR temporal curves and use these to improve and automate the monitoring of wheat fields. Our novel methodology extracts crop indicators from the temporal curve of the Sentinel-1 VH/VV ratio: sigmoid curve fitting is used to simulate the VH/VV response, and the extracted parameters are related to field development. The results show that specific indicators, such as the duration of high vegetation cover (stem elongation to dough development) and the timing of the booting stage, correlate significantly with final yield. Other indicators can provide information about the canopy characteristics of wheat (e.g., above-ground biomass and plant water content). Combining selected indicators provides a more robust analysis of the fields. These results demonstrate the potential of SAR to remotely quantify yield without using any management data from the farm.
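The sigmoid-fitting step can be sketched as follows: a logistic curve is fitted to a VH/VV time series, and its midpoint parameter approximates the timing of rapid canopy development. This is a single-rise sketch of the curve-fitting idea, not the authors' full methodology, and the time series below is synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(t, lo, hi, t_mid, rate):
    """Logistic curve used to model the seasonal rise of the VH/VV ratio."""
    return lo + (hi - lo) / (1 + np.exp(-rate * (t - t_mid)))

def fit_vhvv_rise(doy, vhvv):
    """Fit the sigmoid to a VH/VV series; t_mid is the inflection day,
    an indicator of the timing of rapid canopy development."""
    p0 = [vhvv.min(), vhvv.max(), float(np.median(doy)), 0.1]
    popt, _ = curve_fit(sigmoid, doy, vhvv, p0=p0, maxfev=10000)
    return popt

# Synthetic Sentinel-1-like series (dB): ratio rises around day 120
doy = np.arange(60, 181, 6, dtype=float)
vhvv = sigmoid(doy, -18.0, -10.0, 120.0, 0.15)
vhvv += np.random.default_rng(2).normal(0, 0.1, len(doy))
lo, hi, t_mid, rate = fit_vhvv_rise(doy, vhvv)
```

Fitting a second, falling sigmoid later in the season would then bracket the "high vegetation" period whose duration correlates with yield.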
How to cite:
Vavlas, N.-C., Waine, T., Meersmans, J., and Richter, G.: Winter wheat growth dynamics and their relationship with the field productivity using Sentinel-1 SAR polarimetry, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-20800, https://doi.org/10.5194/egusphere-egu2020-20800, 2020.