The quantification of land-use dynamics necessitates a spatiotemporal framework that ensures categorical stability over long-term observation windows. The ESRI 10-Meter Global Land Cover time series, accessible through the ArcGIS Living Atlas, provides a harmonized baseline for this purpose, derived from the dense temporal stack of the ESA Sentinel-2 mission.
Who Gets to See the War? Satellite Imagery Censorship and the Copernicus Alternative
A private company, operating under a US federal license, was effectively told by the Trump administration to go dark over an active war zone. Planet complied. Vantor and BlackSky — both heavily dependent on US defense revenue — said they hadn’t even received such a request, because they were already tightly controlling access. The selective pressure lands precisely on the most open, most commercially accessible provider. That is not a coincidence.
From Overture Maps to GPKG in minutes: Building a Geospatial Data Extractor with R and DuckDB
Modern geospatial workflows increasingly depend on fast, reliable access to city-scale vector data — building footprints, road networks, land use polygons, points of interest, address databases. Whether you are designing a 5G radio network, modelling urban heat islands, planning last-mile logistics, or simulating emergency response coverage, you almost always start from the same question: “How do I get clean, structured geodata for this city, right now, without spending two days on it?”
The Overture Maps Extractor is my answer to that question. It is a Shiny application written in R that lets any GIS professional extract multiple thematic layers from the Overture Maps Foundation dataset — for any city in the world — in a matter of minutes, with zero command-line interaction and zero manual data wrangling.
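As a sketch of the kind of query such an extractor runs under the hood: the app itself is written in R, but the DuckDB SQL it assembles is language-agnostic. The S3 release tag and the `bbox` column layout below are assumptions to check against the current Overture Maps documentation, not the app's actual code.

```python
def overture_buildings_sql(xmin, ymin, xmax, ymax,
                           release="2024-07-22.0"):
    """Assemble a DuckDB SQL query that pulls Overture building
    footprints for a bounding box from the public GeoParquet release.

    The release tag and S3 path layout are illustrative; Overture
    publishes new releases regularly, so the path must be updated.
    The bbox struct columns allow DuckDB to prune row groups without
    reading full geometries.
    """
    path = (f"s3://overturemaps-us-west-2/release/{release}"
            f"/theme=buildings/type=building/*")
    return (
        "SELECT id, names, geometry\n"
        f"FROM read_parquet('{path}', hive_partitioning=1)\n"
        f"WHERE bbox.xmin > {xmin} AND bbox.xmax < {xmax}\n"
        f"  AND bbox.ymin > {ymin} AND bbox.ymax < {ymax}"
    )

# Hypothetical bounding box roughly covering central Madrid
sql = overture_buildings_sql(-3.8, 40.3, -3.5, 40.5)
```

Filtering on the precomputed `bbox` columns rather than the geometry itself is what makes city-scale extraction feasible in minutes: DuckDB can skip most of the planet-wide Parquet files before any spatial operation runs.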
Urban development in Madrid from the mid-19th century to the present day
All existing buildings in Madrid listed in the Land Registry (Catastro) database have their year of construction recorded. This map shows, decade by decade, where the bulk of that urban development took place: the Salamanca district in the 1920s, Chamartín in the 1930s… with development shifting from the city centre to the outskirts.
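The decade bucketing behind such a map is simple to sketch. The construction years below are invented, not real cadastral records:

```python
from collections import Counter

def decade_counts(construction_years):
    """Group building construction years into decades,
    e.g. 1923 -> 1920, 1936 -> 1930."""
    return Counter((year // 10) * 10 for year in construction_years)

# Hypothetical sample of cadastral construction years
sample = [1923, 1927, 1931, 1936, 1958, 1961, 1964, 1969]
counts = decade_counts(sample)
```

Joining these per-building decades back onto the cadastral footprints is then a straightforward attribute join, after which each decade can be rendered as its own map layer.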
Population Estimation through Dynamic LULC-Based Settlement Validation
The foundational step of this methodology is the deployment of a centralized processing interface within the Google Earth Engine (GEE) environment. The visualization captures the core interface of the custom GEE application, which serves as the hub for the multi-sensor LULC validation pipeline. Within this dashboard, users define a specific Area of Interest (AOI), highlighted here over the Iberian Peninsula and North Africa, and configure key parameters, including the temporal ranges for acquiring the Sentinel-derived products.

Crucially, the interface loads and compares two primary datasets simultaneously: Dynamic World (near real-time, probability-based LULC) and ESA WorldCover (structured LULC at 10 m resolution). Their contrasting classification schemes are represented by the legends on the left and right of the map view, which illustrate how differently the two products define ‘Built-up’ and urban areas. Establishing this visual and statistical comparison at the application level is the prerequisite for calculating the spatial disagreement threshold, or delta, that guides the subsequent merging and population estimation phases.
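As a toy illustration of that disagreement delta, the per-pixel comparison can be sketched in plain Python. The real pipeline operates on Earth Engine images, and the class labels here are invented stand-ins:

```python
def disagreement_delta(grid_a, grid_b, target_class):
    """Fraction of pixels where exactly one of two LULC products
    assigns `target_class` (e.g. built-up). Grids are equal-sized
    2-D lists of class labels standing in for classified rasters."""
    total = 0
    disagree = 0
    for row_a, row_b in zip(grid_a, grid_b):
        for a, b in zip(row_a, row_b):
            total += 1
            # XOR: exactly one product calls this pixel built-up
            if (a == target_class) != (b == target_class):
                disagree += 1
    return disagree / total if total else 0.0

# Hypothetical 2x2 classified patches from the two products
dw = [["built", "water"], ["built", "trees"]]
wc = [["built", "water"], ["trees", "trees"]]
delta = disagreement_delta(dw, wc, "built")  # 1 of 4 pixels differs
```

In Earth Engine the same logic would be an `ee.Image` boolean expression reduced over the AOI, but the comparison itself is exactly this exclusive-or on the built-up masks.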
Precision agriculture (II). An app for integration with the rural Catastro in Spain
The convergence of geospatial Big Data and public administration offers an unprecedented opportunity for agronomic optimisation. The processing power of Google Earth Engine (GEE), linked to the vector cartography of the rural Catastro, makes it possible to turn time series from missions such as Sentinel-2 into diagnostic tools applied directly at parcel level. This approach shifts the analysis from purely visual observation to quantitative monitoring based on the spectral response of the crops.

The core of the application is the geometric intersection of cadastral parcels with multispectral image collections. Using the GEE JavaScript API, it automates the calculation of critical biophysical indicators such as NDVI (Normalized Difference Vegetation Index), NDWI (Normalized Difference Water Index), EVI (Enhanced Vegetation Index) and SAVI (Soil-Adjusted Vegetation Index). These indices not only reflect photosynthetic vigour; they also reveal growth anomalies, water stress and variations in leaf density that are invisible to the human eye in the early stages of the phenological cycle.
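Under the common definitions on Sentinel-2 surface reflectance (NIR = B8, Red = B4, Green = B3, Blue = B2, values scaled to 0–1), the four indices can be sketched as plain functions. The EVI coefficients and the SAVI soil factor L = 0.5 below are the widely used defaults, not necessarily the exact values used in the app:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: photosynthetic vigour."""
    return (nir - red) / (nir + red)

def ndwi(green, nir):
    """Normalized Difference Water Index (McFeeters formulation):
    positive over open water, negative over vegetation."""
    return (green - nir) / (green + nir)

def evi(nir, red, blue):
    """Enhanced Vegetation Index with the standard MODIS-derived
    coefficients (G=2.5, C1=6, C2=7.5, L=1)."""
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

def savi(nir, red, l=0.5):
    """Soil-Adjusted Vegetation Index; l=0.5 suits intermediate
    vegetation cover and damps soil-brightness effects."""
    return (1.0 + l) * (nir - red) / (nir + red + l)
```

In GEE these become per-pixel band expressions mapped over the image collection, then reduced (e.g. mean per date) over each cadastral parcel geometry to produce the time series.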
1 m super-resolution over Madrid from 10 m Sentinel-2. Magic! NRT tracking!
Moving from 10 m to 1 m resolution radically changes the perspective of agricultural monitoring: instead of looking at a parcel as a whole, you observe what is happening inside the crop rows themselves. This qualitative leap is made possible by the S2DR3 algorithm, a deep-learning model that does not merely enlarge the pixels but reconstructs the missing information. By exploiting the correlations between the different Sentinel-2 spectral bands and training on very-high-resolution imagery, the AI manages to synthesise a 1 m/pixel image of surprising accuracy.
GHSL ESTIMATES vs INE 2025
I have built this GHSL vs INE 2025 PADRÓN POPULATION COMPARATOR in JavaScript/Google Earth Engine, which cross-references satellite-derived population estimates with the official Spanish census figures, municipality by municipality.
The tool lets you select any Spanish province and municipality and visualise the spatial distribution of the GHSL-estimated population against the latest official INE 2025 figure, detecting municipalities with high tourist pressure, genuine depopulation, or unregistered population.
It has direct applications in infrastructure planning, emergency management and territorial-cohesion analysis, wherever the population register does not reflect the actual occupation of the territory.
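The core municipality-level comparison reduces to a signed relative deviation. A minimal sketch follows; the actual tool runs in Earth Engine JavaScript, and the figures below are invented:

```python
def population_deviation(ghsl_estimate, ine_register):
    """Signed percentage deviation of the satellite-derived estimate
    from the official register. Positive values suggest more people
    on the ground than registered (tourist pressure, unregistered
    population); negative values suggest the opposite."""
    return 100.0 * (ghsl_estimate - ine_register) / ine_register

# Hypothetical municipality: GHSL sees 1200 residents, padrón lists 1000
dev = population_deviation(1200, 1000)  # +20 % over-occupation
```

In the Earth Engine version, the GHSL estimate is the sum of the population raster reduced over each municipal boundary, so the same formula is evaluated per feature in a `map` over the municipality collection.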
URBAN ATLAS 2018 + WORLDPOP 100m/GHSL 100m estimates over Madrid
Urban Atlas (UA) is the gold standard within the Copernicus Land Monitoring Service (CLMS) for analysing urban morphology in Europe. Unlike Corine Land Cover, UA offers drastically higher thematic and spatial resolution (a Minimum Mapping Unit of 0.25 ha for urban classes), making it possible to discriminate between continuous and discontinuous urban fabric across density bands from 10% to 80%.
Mapping Something Unthinkable: Flood Risk in Madrid using Open Data
Don't be misled if you see the AI-generated background showing our handsome mayor almost flashing his best smile at Cibeles/Correos: it is only there to catch your attention (only if you need it, though!). Flooding in urban environments is not a speculative hazard but something we can quantify. In the case of Madrid, the intersection of fairly mountainous terrain (it may surprise you that there is a 2,000 m difference between the highest point in the province of Madrid, Pico Peñalara at 2,428 m, and the Alberche river valley in some areas, at 430 m) and urban expansion presents a scenario of significant risk, particularly when analysed through the lens of open high-resolution geospatial data. This study intersects the building footprints from the BTN (Base Topográfica Nacional), provided by the Spanish IGN through the CNIG, with the official flood hazard maps for a 100-year return period (T=100) published by the Ministry for the Ecological Transition and the Demographic Challenge (MITECO). The T=100 scenario is the most representative for evaluating long-term flood exposure, as it reflects events with a 1% annual probability: rare but not improbable, and certainly not negligible.
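The "1% annual probability" framing is worth making concrete: assuming independent years, the chance of at least one T=100 event over a multi-decade horizon is far from negligible. A minimal sketch of the standard return-period arithmetic:

```python
def exceedance_probability(return_period_years, horizon_years):
    """Probability of at least one event of the given return period
    occurring within the horizon, assuming independent years.
    P = 1 - (1 - 1/T)^n."""
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

# T=100 flood over a 30-year horizon (e.g. a typical mortgage term)
p_30 = exceedance_probability(100, 30)
```

Over 30 years this comes out at roughly 26%, which is why buildings inside the T=100 envelope are the right population to flag even though any single year looks safe.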