(source: http://dominoc925.blogspot.com/)

The height accuracy of collected LiDAR data can be verified by comparing it against independently surveyed ground control points on hard, flat, open surfaces. Essentially, you calculate the height difference at each control point and then determine the *root mean squared error* (RMSE) of those differences. Most LiDAR processing software has this reporting function built in, but plain **Microsoft Excel** can also do the job (except for extracting the elevations from the LiDAR data).
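The verification amounts to: for each control point, subtract the surveyed elevation from the LiDAR-derived elevation, then take the RMSE of those differences. A minimal sketch of that calculation, using made-up elevation values purely for illustration:

```python
import math

# Hypothetical paired elevations in metres (illustration only):
# LiDAR-derived values vs. independently surveyed control point values.
lidar_z = [102.31, 98.77, 105.10, 99.42]
surveyed_z = [102.25, 98.85, 105.02, 99.50]

# Height difference (delta Z) at each control point.
delta_z = [l - s for l, s in zip(lidar_z, surveyed_z)]

# Root mean squared error of the differences.
rmse = math.sqrt(sum(d * d for d in delta_z) / len(delta_z))
print(f"RMSE: {rmse:.4f} m")
```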

Assume that you have calculated the height differences for all the control points and placed them in a spreadsheet as shown in the figure below, with the delta Z values in column A.

To calculate the RMS value of the elevation differences, do the following.

- In a cell, type in the formula `=SQRT(SUMSQ(A2:A18)/COUNTA(A2:A18))`.
*Here A2:A18 refers to cells A2 through A18 in the spreadsheet. Simply replace this range with the actual locations in your spreadsheet*.
- Press **RETURN**.

*The RMSE value is calculated*.
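The Excel formula above combines `SUMSQ` (sum of squares) and `COUNTA` (count of non-empty cells). A sketch of the same calculation in Python, with a hypothetical delta-Z column standing in for the A2:A18 range, can serve as a cross-check of the spreadsheet result:

```python
import math

def rmse(values):
    # Equivalent of Excel's =SQRT(SUMSQ(range)/COUNTA(range)):
    # sum of squares divided by the count, then the square root.
    return math.sqrt(sum(v * v for v in values) / len(values))

# Hypothetical delta-Z values (metres), standing in for cells A2:A18.
delta_z = [0.02, -0.05, 0.11, -0.08, 0.04, 0.00, -0.03]
print(f"RMSE: {rmse(delta_z):.4f} m")
```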



Tags: 3d, 3d geovisualization, alberto concejal, Excel, geography, GIS, ground control points, lidar, mapas, MDE, MDT, modelos digitales de elevaciones, modelos digitales del terreno, RMSE, sig

This entry was posted on 2010/06/30 at 12:22 pm and is filed under Flujos de trabajo /workflows, GIS / SIG, lidar, Modelado 3D.
