

Allocation analysis: Attaching customers to facilities

2016/09/20

Allocates a set of demand points (customers) to a user-specified number of supply points (facilities), chosen from a candidate facilities point dataset, based on the Euclidean distance between the customers and facilities.

customers

100 customers anywhere in the world

In other words, the function selects N facilities out of K candidates to serve a set of M customer locations in such a way that each customer is allocated to a single facility (based on Euclidean distance) and the total distance between the customers and their selected facilities is minimized.

asignation

Customers attached to 3 pre-defined facilities

Put more simply: take a bunch of customers and assign each one its closest facility, using Euclidean distance (the "ordinary", i.e. straight-line, distance between two points). In this particular theoretical analysis I have also set a maximum range of 5,000 meters, so anything beyond that is not taken into consideration.

Questions: Am I providing proper service with the facilities I have already deployed? Are any of them so far away that we cannot provide service at all? Are any of them overpopulated, so that in the end we cannot provide proper service? If you happen to come across any other question, please add it in the comments so I can update the post.
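If you want to reproduce the allocation step outside ArcGIS/ET GeoWizards, a minimal pure-Python sketch looks like this (my own illustration, not the ET GeoWizards algorithm; it assumes projected coordinates in metres, and the function names are invented for this post):

```python
import math

def allocate(customers, facilities, max_range=5000.0):
    """Assign each customer (x, y) to its nearest facility by Euclidean distance.
    Customers farther than max_range from every facility stay unallocated (None)."""
    result = {}
    for cid, (cx, cy) in customers.items():
        best, best_d = None, max_range
        for fid, (fx, fy) in facilities.items():
            d = math.hypot(cx - fx, cy - fy)
            if d <= best_d:
                best, best_d = fid, d
        result[cid] = (best, best_d if best is not None else None)
    return result

def facility_stats(alloc):
    """Per-facility summary: (Num_Alloc, Max_Dist, Total_Dist)."""
    stats = {}
    for fid, d in (v for v in alloc.values() if v[0] is not None):
        n, mx, tot = stats.get(fid, (0, 0.0, 0.0))
        stats[fid] = (n + 1, max(mx, d), tot + d)
    return stats
```

`facility_stats` mirrors the Num_Alloc, Max_Dist and Total_Dist columns of the result table below.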

Result table:

FID  Shape  Id  FacilityID  Facility  Type      Num_Alloc  Max_Dist  Total_Dist
0    Point  0   2           2         Selected  4          4852.68   15362.93
1    Point  0   1           1         Selected  11         4110.57   37839.93
2    Point  0   0           0         Selected  18         4991.27   73591.27

This ArcGIS video sheds some light on this type of analysis:

This link shows how to create a network dataset:
http://desktop.arcgis.com/en/arcmap/latest/extensions/network-analyst/exercise-1-creating-a-network-dataset.htm

 

Software used: ArcGIS 10.3; ET Geowizards 11.1

Hope you guys have liked it; if so, share it or let me know.

Alberto CONCEJAL
Geographer and MSc GIS


Creating a density map to check the location of Nantes Métropole's above-ground collection containers

2016/09/12

Location and characteristics of Nantes Métropole's above-ground voluntary drop-off containers used for waste collection.

  1. Visualization tool: Global Mapper 17
  2. Format: SHP
  3. Field: VOLUME

http://data.paysdelaloire.fr/donnees/detail/localisation-des-colonnes-aeriennes-de-nantes-metropole/

colonnes-airiennes-01
These containers are installed across the whole territory and are intended for collecting glass and recyclable packaging (paper, cardboard, plastic).

colonnes-airiennes-02

It's great fun to play around with Open Data; I hope you enjoyed it.

Alberto
MSc GIS and remote sensing

Interview on 'Soy Data'

2016/08/30

SOYDATA.net came to interview me about topics related to geovisualization, such as Big Data and Open Data and their implications for quality control and free software. Thanks to Jorge Ubero of SoyData for the interview.

I hope you like it; as always, if you have any comments or anything to say, I will be happy to answer.

Alberto

Alberto Concejal: new challenges in geovisualization

To review some of the challenges and new applications we are encountering in this sector, we interviewed a veteran expert in geomatics, Alberto Concejal.

Alberto Concejal:

Geographer, with a master's degree in remote sensing and GIS (geographic information systems). He has worked for more than 15 years in the geospatial sector. He began his career as an aerial photographer and over the years has specialized in combining design and geovisualization. At the same time he has pursued a parallel career as a photographer and traveller, his great passions (see 'Un viajero de colores' on the portal viajeros.com).

alberto-fotografo-aereo

He currently works as quality-control manager at a multinational mapping company. Every day, Alberto makes complex processes easier to understand.

Podcast: geotechnologies and new challenges

"The adaptations needed to carry out a merger of two companies are quite a challenge from the point of view of quality assessment."

"We cannot approach the current scenario with the old systems, given the capacity and the dynamic reality of geolocated data generation today."

"Open data is the ideal framework for advancing technology in today's big data environment."

"The cloud is something recent; we are still adapting. It is one part of the whole."

"There is a new scenario every five years, and it must be embraced dynamically."

"The customer has shifted from companies and institutions to ordinary people, who are in turn data generators whose data must be validated and taken into account."

New applications in geotechnologies: Carto and Tableau

Carto (formerly known as CartoDB) and Tableau have made geotechnologies enormously more accessible. Simplicity is one of the main keys to their success.

–> At the SoyData Academy you have courses available to get up to speed with these geovisualization tools in an agile, simple and affordable way.

–> On the blog geovisualization.net, Alberto details some key ideas and case studies that will certainly be of interest. We invite you all to visit it.

Creating value through Open Data

2016/02/19

The benefits of Open Data are diverse, ranging from improved efficiency of public administrations and economic growth in the private sector to wider social welfare.

(Source: http://www.europeandataportal.eu/)

Open Data can enhance performance and contribute to improving the efficiency of public services. Greater efficiency in processes and delivery of public services can be achieved thanks to cross-sector sharing of data, which can, for example, provide an overview of unnecessary spending.

The economy can benefit from an easier access to information, content and knowledge in turn contributing to the development of innovative services and the creation of new business models.

Social welfare can be improved as society benefits from information that is more transparent and accessible. Open Data enhances collaboration, participation and social innovation.

clip_image004


For 2016, the direct market size of Open Data is expected to be 55.3 bn EUR for the EU 28+. Between 2016 and 2020, the market size increases by 36.9%, to a value of 75.7 bn EUR in 2020, including inflation corrections. For the period 2016-2020, the cumulative direct market size is estimated at 325 bn EUR.

picture_1

New jobs are created through the stimulation of the economy and a higher demand for personnel with the skills to work with data. In 2016, there will be 75,000 Open Data jobs within the EU 28+ private sector. By 2020, this number will increase to just under 100,000 Open Data jobs, creating almost 25,000 new direct Open Data jobs by 2020.

picture_2

Public sector performance can be enhanced by Open Data. Greater efficiency in processes and delivery of public services can be achieved thanks to cross-sector sharing of data, providing faster access to information. The accumulated cost savings for the EU28+ in 2020 are forecasted to equal 1.7 bn EUR.

picture_3

Open Data results in efficiency gains as real-time data is used that enables easy access to information that improves individual decision-making. Three case studies are assessed in more detail: how Open Data can save lives, how it can be used to save time, and how Open Data helps achieve environmental benefits. For example, Open Data has the potential to save 7,000 lives a year by enabling earlier resuscitation. Furthermore, applying Open Data to traffic can save 629 million hours of unnecessary waiting time on the roads in the EU.

picture_4

Economic benefits are primarily derived from the re-use of Open Data. Value is there. The question is how big?

The European Union has adopted legislation to foster the re-use of Open (Government) Data. The expected impact of this legislation, combined with the development of data portals, is to drive economic benefits and further transparency.

Thus, the European Commission, within the context of the launch of the European Data Portal, wished to obtain further evidence of the quantitative impact of re-use of Public Data Resources. A study was carried out with the aim to collect, assess and aggregate all economic evidence to forecast the benefits of the re-use of Open Data for all 28 European Member States and the EFTA countries, further referred to as EU 28+, for the period 2016-2020.

Direct benefits are monetised benefits realised in market transactions in the form of revenues and Gross Value Added (GVA), the number of jobs involved in producing a service or product, and cost savings. Indirect economic benefits include, for example, new goods and services, time savings for users of applications using Open Data, knowledge-economy growth, increased efficiency in public services, and growth of related markets.

The market volume exhibits the totality of the realised sales volume of a specific market: the value added. A distinction can be made between the direct market size and the indirect market size; together they form the total market size for Open Data. As noted above, the direct market size of Open Data for 2016 is expected to be 55.3 bn EUR for the EU 28+, growing by 36.9% to a value of 75.7 bn EUR in 2020 (including inflation corrections), with a cumulative direct market size of 325 bn EUR for the period 2016-2020.

In 2016, there will be 75,000 Open Data jobs within the EU 28+ private sector. By 2020, this number is forecasted to increase to just under 100,000 Open Data jobs, a 32% growth over a five-year period. Thus, in the period 2016-2020, almost 25,000 new direct Open Data jobs will be created.
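As a quick arithmetic sanity check on the figures quoted above (the numbers themselves come from the European Data Portal study; the snippet below only verifies that they are internally consistent):

```python
# Direct market size: 55.3 bn EUR in 2016, growing 36.9% by 2020.
direct_2016 = 55.3
direct_2020 = direct_2016 * (1 + 0.369)
print(round(direct_2020, 1))  # 75.7, matching the forecast value

# Open Data jobs: 75,000 in 2016, +32% over the five-year period.
jobs_2020 = 75_000 * 1.32
print(int(jobs_2020))  # 99000, i.e. "just under 100,000"
```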

Based on the forecasted EU28+ GDP for 2020, whilst taking into account the countries’ respective government expenditure averages, the cost savings per country can be calculated. The accumulated cost savings for the EU28+ in 2020 are forecasted to equal 1.7 bn EUR.

The aim of efficiency is to improve resource allocation so that waste is minimised and the outcome value is maximised, given the same amount of resources. Open Data can help in achieving such efficiency. The study offers a combination of insights around the efficiency gains of Open Data and real-life examples. Three exemplar indicators are assessed in more detail: how Open Data can save lives, how it can be used to save time, and how Open Data helps achieve environmental benefits. For example, Open Data has the potential to save 1,425 lives a year (i.e. 5.5% of European road fatalities). Furthermore, applying Open Data to traffic can save 629 million hours of unnecessary waiting time on the roads in the EU.

The majority of studies performed previously are ex-ante estimations. These are mostly established on the basis of surveys or indirect research and provide for a wide range of different calculations. No comprehensive and detailed ex-post evaluations of the materialised costs and benefits of Open Data are available. Now that governments have defined Open Data policies, the success of these initiatives should be measured. The study offers several recommendations for doing so.

The report goes into further detail on how Open Data has gained importance in the last several years. Furthermore, the report provides insight into how Open Data can be used, and how this re-use differs around Europe. These insights are used to develop a methodology for measuring the value created by Open Data. The resulting values are presented in a graphical way, providing insight in the potential of Open Data for the EU28+ up to 2020.

 

(Source: http://www.europeandataportal.eu/)

 

DTM comparison using Global Mapper 17.0.1

2016/02/12

Let's do something simple today: compare two DTMs, first qualitatively (visually) and then quantitatively. On one side we choose a very common source, SRTM at 3 arc-sec (approximately 90 m); on the other, a DTM derived from stereo photogrammetry.

  • QUALITATIVE comparison (i.e. visual)
  • QUANTITATIVE comparison (i.e. RMSE)

First we open a DTM whose source is SRTM; in this case I connected via WMS (Web Map Service) through the online data available within Global Mapper itself (File/Download Online Imagery/data). The resolution is approximately 90 m (3 arc-sec).

DTM-COMPARISON-20160212

On the other hand, I found this DTM whose source I do know (stereo photogrammetry). The resolution is 5 m.

DTM-COMPARISON-20160212-02

Using the digitizer tool (Tools/Digitizer), we select a line drawn at random over both DTMs. Right mouse button -> Analysis/Measurement/Path Profile. I export both images (in Path Setup it is important to define the same minimum and maximum so they can be compared properly).

With Photoshop I overlay both images (layer blending mode: Multiply) and see how different they are.

DTM-COMPARISON-20160212-03

This gives us a first idea of the comparison, but let's go a bit further: what is the RMSE (Root Mean Square Error) between the two datasets?

DTM-COMPARISON-20160212-04

This is a measure of deviation that characterizes the difference much more precisely than a simple visualization. You can see this point developed further at this link on this same site:

https://geovisualization.net/2010/06/30/using-excel-to-calculate-the-rmse-for-lidar-vertical-ground-control-points/
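For those who prefer to script it, the RMSE between two equally sampled elevation profiles is simple to compute; here is a minimal Python sketch (a generic helper of my own, not a Global Mapper function):

```python
import math

def rmse(reference, test):
    """Root Mean Square Error between two aligned height samples."""
    if len(reference) != len(test):
        raise ValueError("profiles must be sampled at the same points")
    sq = sum((r - t) ** 2 for r, t in zip(reference, test))
    return math.sqrt(sq / len(reference))
```

Feed it the heights sampled along the same path on the SRTM profile and on the photogrammetric profile.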

DTM-COMPARISON-20160212-05

Now we just have to verify that this figure is correct, taking into account the accuracy values promised for the delivery.

I hope you found it interesting; if so, don't forget to comment, share or simply say hello. Any of these options is appreciated.

Best regards,
Alberto CONCEJAL
MSc GIS and Remote Sensing

Downloads from the CNIG. Open source done right!

2016/02/08

Hello GIS friends,
For work reasons that are beside the point, I have had to dive systematically into the CNIG downloads website. http://centrodedescargas.cnig.es/CentroDescargas/inicio.do
It is wonderful.

cnig-20160208-01

For reasons that are also beside the point, I have to do the same thing from time to time at cartographic institutes all over the world, and the CNIG's is without doubt the one I find easiest to use, the one with the most logical data model and the most reliable links anywhere. The only obligation is mandatory attribution of the data. That's not too much to ask, is it? Since 27 December, IGN data has been free under CC BY 4.0.
https://creativecommons.org/licenses/by/4.0/

It is therefore mandatory to mention the source below the image, in the credits, etc., above all in publications, commercial uses, articles and so on (for example you can write "<such data> CC BY Instituto Geográfico Nacional", or rather "derived from <such data> CC BY ign.es", or similar).

cnig-20160208-02

Whether we need PNOA imagery (Plan Nacional de Ortofotografía Aérea), a high-resolution digital terrain model or historical images of our home town, we only have to dig a little into the geodata catalogue of the Instituto Geográfico Nacional (Centro Nacional de Información Geográfica) and we will find them.

For example, last week I had to find data on several Spanish cities to build a number of 3D scenes for a client. Here I found a 5 m DSM produced from LIDAR sources; from Cartociudad I downloaded the linear vectors and city blocks; and then from the CATASTRO website (https://www.sedecatastro.gob.es/OVCFrames.aspx?TIPO=TIT&a=masiv) I downloaded the geometries of all the buildings in the city (which I plan to geoprocess to remove unwanted shapes and to assign precise heights thanks to the previously downloaded LIDAR).

Why not add geometries from OpenStreetMap (https://www.openstreetmap.org/export) or from the Base Topográfica Nacional BTN25 itself to complete the scene?

barcelona-bldg-osm-capture-20160112
MADRID-GISDATA

The truth, friends, is that since Open Data got going, geographers and related professionals have plenty to 'play' with in our analyses.
http://idee.es/

I hope you find it interesting.

Best regards,

Alberto
Geographer / MSc GIS (UAH) / Multimedia designer

Change detection – Detecting changes in polygons

2015/10/22

change-detection-bogota-telemediciones-20151023-02
THE IDEA: DEMONSTRATING HOW DYNAMIC A CITY IS, AND THUS HOW IMPORTANT IT IS TO HAVE AN UP-TO-DATE DATASET
bogota-change-detection-20151105-02

THE FACTS: THE CITY OF BOGOTÁ IN COLOMBIA 2012-2014

Overall growth rate: -0.12%, taking into account only the difference in buildings captured between 2012 and 2014 (we can do this because we used the same data-capture model in both years).

(According to the cadastral census, in 2015 the city added 51,531 new urban properties. In total there are 2,402,581 properties in the city; of these, 266.9 million square metres are fully built-up area. Source: http://www.eltiempo.com/bogota/crecimiento-bogota-/15394797)

bogota-change-detection-20151105

THE PROCEDURE: take the centroids of the buildings and perform a spatial join showing presence/absence, using a 10 m accuracy threshold: if a centroid has not moved more than 10 m, it is the same building. If a 2012 centroid is not present in 2014, the building is considered demolished; if a new centroid appears, it is considered a new building.
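The procedure above can be sketched in a few lines of Python (a simplified brute-force illustration of the presence/absence logic, not the actual GIS spatial join; coordinates are assumed to be projected, in metres):

```python
import math

def detect_changes(centroids_t1, centroids_t2, threshold=10.0):
    """Compare two sets of building centroids.
    A t1 centroid with no t2 centroid within `threshold` is 'demolished';
    a t2 centroid with no t1 centroid within `threshold` is 'new'."""
    def has_match(pt, others):
        return any(math.hypot(pt[0] - o[0], pt[1] - o[1]) <= threshold
                   for o in others)
    demolished = [p for p in centroids_t1 if not has_match(p, centroids_t2)]
    new = [p for p in centroids_t2 if not has_match(p, centroids_t1)]
    return demolished, new
```

For city-sized datasets you would replace the brute-force scan with a spatial index, but the classification logic stays the same.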

DENSITY MAPS + 3D BUILDINGS
Help to quickly focus on the highlights.
bogota-change-detection-news-20151021

 

DTM validation using Google Earth (and RMSE extraction)

2015/03/10

Hi guys,

Surfing the internet is great when you need to figure something out. I needed to validate some DTMs from unknown sources against another unknown (but at least somewhat reliable) source: Google Earth.

All we need is:

  • Google Earth
  • TCX converter
  • ArcGIS
  • Excel

This is the procedure I have followed:

  1. First of all, we draw a path over our AOI using Google Earth and save it as KML,
  2. We open this KML in TCX Converter, add the heights and export it as CSV,
  3. We import the CSV into ArcGIS,
  4. We use the ‘Extract Multi Values to Points’ tool to get the values of our DTM and the values from Google Earth into the same table,
  5. We use Excel to calculate the RMSE and obtain a quantitative result.
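Steps 4 and 5 can also be reproduced in a short script once the joined attribute table is exported to CSV; here is a minimal sketch (the column names GE_HEIGHT and DTM_HEIGHT are hypothetical; rename them to match your own table):

```python
import math

def rmse_from_rows(rows, col_a="GE_HEIGHT", col_b="DTM_HEIGHT"):
    """RMSE between the Google Earth heights and our DTM heights,
    read from the joined attribute table (step 4) as dict rows."""
    diffs = [float(r[col_a]) - float(r[col_b]) for r in rows]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

With the table on disk you would call it as `rmse_from_rows(csv.DictReader(open("joined.csv")))` and compare the result against your acceptance threshold.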

These are the values in our DTM

dtm-validation-02

This is the path we have to draw in Google Earth

dtm-validation-03

Using TCX converter we get the heights out of Google Earth’s DTM

dtm-validation-01

Using the tool ‘extract multi values to points‘ we get the heights out of our DTM

dtm-validation-04

We measure the differences and extract the RMSE.
Are we within our acceptance threshold or expected level of accuracy?

You guys have to figure this out for yourselves!!!

Lost regarding the RMSE calculation? Then take a look at this other post.

dtm-validation-05

dtm-validation-06

Hope you guys have enjoyed this post; if so, don't forget to share it.

Alberto Concejal
MSc GIS and QCQA expert (well, this is my post and I say what I want :-))