- 1Department of Architecture, Built Environment and Construction Engineering, Politecnico di Milano, Piazza Leonardo da Vinci, 32, 20133 Milan, Italy (romanos.ioannidis@polimi.it)
- 2Department of Water Resources and Environmental Engineering, School of Civil Engineering, National Technical University of Athens, 15780 Zographou, Greece
Modern digital technologies and geoinformatics have experienced rapid growth, offering powerful tools to bridge the gap between scientific communities and society in landscape assessment and mapping. This research details the application of a crowdsourcing scheme that utilizes a dedicated mobile application to facilitate direct public participation in quantifying perceptions of urban landscapes and architecture. Initially developed as an educational tool, the methodology has been tested by university students across Italy, Greece, and France, providing a foundational phase for assessing landscape quality and urban typologies. Building upon these educational pilot studies, the work explores the evolution of this methodology into a broader, multicultural citizen science initiative designed to improve the quality and quantity of available landscape perception data.
A significant technical advancement in this research involves the integration of automated image analysis to process the novel data generated by participants from any location. The photographic material was examined using stochastic image analysis based on climacograms, in which images are treated as two-dimensional grayscale intensity fields and analyzed across multiple spatial scales. The method enables the comparison of image patterns based on the visual complexity of the uploaded photographs. A primary challenge addressed was the algorithm's performance when processing real-world, non-curated smartphone images. The analysis therefore also assessed how the methodology handles environmental noise, such as sky, trees, and unconventional capture angles, which is inherent to bottom-up crowdsourcing schemes.
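To illustrate the scale-based analysis described above, the following is a minimal sketch of a two-dimensional climacogram: the variance of block-averaged grayscale intensities computed at successively coarser spatial scales. This is an illustrative reimplementation of the general technique, not the authors' code; the function name, scale set, and NumPy-based block averaging are assumptions for the example.

```python
import numpy as np

def climacogram_2d(gray, scales=(1, 2, 4, 8, 16, 32)):
    """Illustrative 2-D climacogram: variance of block-averaged
    grayscale intensities at each spatial scale k.

    gray   : 2-D array of grayscale intensities
    scales : block side lengths (in pixels) to average over
    Returns an array of variances, one per scale.
    """
    h, w = gray.shape
    variances = []
    for k in scales:
        # Crop so the image divides evenly into k-by-k blocks.
        hh, ww = (h // k) * k, (w // k) * k
        # Average each k-by-k block, then take the variance of the block means.
        blocks = gray[:hh, :ww].reshape(hh // k, k, ww // k, k).mean(axis=(1, 3))
        variances.append(blocks.var())
    return np.array(variances)
```

Visually complex images (fine decorative detail) tend to retain variance at coarse scales more slowly or quickly depending on their spatial structure, so the shape of this variance-versus-scale curve is what allows the group-level comparisons of architectural styles discussed below.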
The early results indicate that the method can reveal group-level tendencies associated with differing architectural characteristics, particularly in relation to visual complexity, while not supporting reliable classification at the level of individual images. In detail, the findings indicate a trend towards two categorizations: firstly, modernist-type movements, characterized by minimal elements and lower measured complexity; and secondly, eclectic or decorative movements, which exhibited higher measured complexity. However, this behaviour was not observed universally across all analyzed movements. The stochastic analysis also indicated theoretical overlaps between certain movements, such as Postmodernism and Eclecticism, based on shared decorative patterns. While the results highlight that environmental factors can influence the analysis of individual photographs, the method presents potential for distinguishing movement-level trends with logical consistency even from unfiltered data.
Scientifically, this yield of quantitative data lays the groundwork for improved research in the humanities and culture, showing a strong correlation with established landscape quality indices. Socially, the project provides a scalable model for participatory mapping that fosters critical thinking about urban quality, creating new conditions for communication between universities and the broader public. Overall, the presented work reports on the early-stage results of this methodological exploration and aims to evaluate the combined use of participatory mobile data collection and exploratory image-based analysis for landscape and architectural studies, while identifying key challenges related to data quality, interpretation, and future methodological refinement.
How to cite: Kopelia, S., Tepetidis, N., Tzortzi, J. N., Sargentis, G.-F., and Ioannidis, R.: Integrating Participatory Perception-Mapping Data and Stochastic Image Analysis for Urban Landscape Assessment, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-13270, https://doi.org/10.5194/egusphere-egu26-13270, 2026.