
This page covers part 2 of my process for dealing with shapefile topography data from the National Map Viewer in Leaflet.

Previously, I covered how to use the National Map Viewer tileset in Leaflet: http://charlesreid1.com/wiki/Leaflet_NationalMap. That provides contour information, but only as rendered images (a tileset), not as geometry.

I also covered how to download a shapefile containing contour (topography) information at the geometry level: http://charlesreid1.com/wiki/Topo_Map

This page covers the combination of these two pieces of data from the National Map Viewer page. I will be creating a map with the National Map contour set, and on top of that, plotting a handful of contour lines from the contour shapefile.

The process will look something like this:

1. Obtain a topography shapefile from the National Map Viewer (covered here: http://charlesreid1.com/wiki/Topo_Map)

2. Convert the shapefiles into GeoJson or TopoJson

3. Parse Json data (it is large, so reduce it to the data of interest)

4. Display the GeoJson or TopoJson geometries in Leaflet

5. Add a National Map Viewer tileset layer to the Leaflet map (covered here: http://charlesreid1.com/wiki/Leaflet_NationalMap)

Converting National Map Shapefile

Shapefile Format

A shapefile is actually a collection of files sharing a base name: the .shp file holds the geometries themselves, the .shx file is an index into them, the .dbf file holds the attributes attached to each geometry (like contour elevation), and an optional .prj file describes the map projection. Conversion tools read the .shp file and its sidecar files together, so keep them all in the same directory.

Shapefile to TopoJson

TopoJson is an extension of GeoJson that encodes topology: boundaries shared between geometries are stored once, as arcs, instead of being duplicated in every feature. The reference implementation ships a command-line converter as a Node.js package, so (assuming Node and npm are installed) it can be installed globally with:

$ npm install -g topojson

The version 1.x command-line tool reads shapefiles directly. The conversion looks something like the following (the file names here are illustrative, and the -p flag names the attribute properties to keep, since by default properties are stripped):

$ topojson -o barrygoldwater.json -p CONTOURELE Elev_Contour.shp

Alternatively, GDAL's ogr2ogr tool can convert a shapefile to plain GeoJson:

$ ogr2ogr -f GeoJSON barrygoldwater.geojson Elev_Contour.shp

TopoJson vs GeoJson

We could use GeoJson directly, but GeoJson stores every coordinate of every feature independently, so the files carry a lot of redundant information. TopoJson stores shared arcs once and quantizes coordinates, so it is typically much more compact.

How much smaller is TopoJson? The ratio depends on how much geometry the features share, but for a dense contour set like this one the savings can be substantial.
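A quick way to check the savings on your own files is to compare sizes on disk. Here's a small helper (the file names are whatever your conversions produced):

```python
import os

def compare_sizes(geojson_path, topojson_path):
    """Compare the on-disk size of a GeoJson file and its TopoJson equivalent."""
    geo = os.path.getsize(geojson_path)
    topo = os.path.getsize(topojson_path)
    print("GeoJson:  %d bytes" % geo)
    print("TopoJson: %d bytes" % topo)
    print("TopoJson is %.1f%% the size of the GeoJson" % (100.0 * topo / geo))
    return geo, topo
```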

Python and Json

Json maps naturally onto Python's built-in types: Json objects become dictionaries, Json arrays become lists, and the standard library's json module converts between the two. This makes it easy to pull a Json file into Python, manipulate it, and push it back out.
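As a minimal illustration of the round trip:

```python
import json

# A Json document parses into plain dictionaries and lists...
doc = json.loads('{"type": "FeatureCollection", "features": []}')
print(doc['type'])       # FeatureCollection
print(doc['features'])   # []

# ...and any dictionary of basic types serializes straight back to Json text.
doc['features'].append({'type': 'Feature', 'properties': {'CONTOURELE': 500.0}})
print(json.dumps(doc))
```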

Parsing National Map Data

Format of the Json File

Loading the converted file into an interactive IPython session shows its structure:

In [16]: import json

In [17]: z = json.load(open('barrygoldwater.json'))

In [18]: print len(z)
2

In [19]: print z.keys()
['type', 'features']

In [20]: print len(z['features'])
31221

In [22]: print z['features'][0]
{'geometry': {'type': 'MultiLineString', 'coordinates': [[[-113.99991898140638, 32.24971033119198], [-113.9998726845314, 32.24973639785861], [-113.99984953661482, 32.249755296816886], [-113.99982638869818, 32.24978467181688], [-113.99979359390653, 32.249849537441776], [-113.99976721578162, 32.249942130149975], [-113.99974703036497, 32.249978512441544], [-113.99972483349, 32.24999999994151]], [[-113.99997861161461, 32.24999999994151], [-114.0000000001562, 32.24991548744168]], [[-114.0000000001562, 32.24970645202529], [-113.99998842515623, 32.24969827598363], [-113.9999652772396, 32.24969447181701], [-113.99991898140638, 32.24971033119198]]]}, 'type': 'Feature', 'properties': {'LOADDATE': '2014/07/31', 'SOURCE_DAT': '9a7e71eb-e11b-445e-b5d0-472f2c7e9612', 'DISTRIBUTI': 'E4', 'CONTOURELE': 1040.0, 'FCODE': 10101, 'CONTOURINT': 20, 'SOURCE_ORI': 'US Geological Survey', 'SHAPE_LENG': 0.000536, 'SOURCE_D_1': 'Contours derived from the National Elevation Dataset', 'PERMANENT_': '691a4f09-c4b9-4472-bfc9-9ac5b80c5fd2', 'SOURCE_FEA': None, 'DATA_SECUR': 5, 'CONTOURUNI': 1}}

In [24]: print z['features'][0]['properties']
{'LOADDATE': '2014/07/31', 'SOURCE_DAT': '9a7e71eb-e11b-445e-b5d0-472f2c7e9612', 'DISTRIBUTI': 'E4', 'CONTOURELE': 1040.0, 'FCODE': 10101, 'CONTOURINT': 20, 'SOURCE_ORI': 'US Geological Survey', 'SHAPE_LENG': 0.000536, 'SOURCE_D_1': 'Contours derived from the National Elevation Dataset', 'PERMANENT_': '691a4f09-c4b9-4472-bfc9-9ac5b80c5fd2', 'SOURCE_FEA': None, 'DATA_SECUR': 5, 'CONTOURUNI': 1}

What we're interested in here is the "features" component of our Json file. This contains a whole bunch of lines that compose a detailed contour map. In particular, we're interested in the properties of each feature, and most of all, the "CONTOURELE" key, which tells us the elevation corresponding to this particular contour. If we want to filter out some of the (densely-packed) contour lines, this is the parameter we will use.

Here's a one-liner to print out all the unique contour line elevations that are contained in the file (unique here is numpy's, available unqualified because this IPython session was started in pylab mode):

In [26]: print unique([z['features'][i]['properties']['CONTOURELE'] for i in range(len(z['features']))])
[  250.   260.   270.   280.   290.   300.   310.   320.   330.   340.
   350.   360.   370.   380.   390.   400.   410.   420.   430.   440.
   450.   460.   470.   480.   490.   500.   510.   520.   530.   540.
   550.   560.   570.   580.   590.   600.   610.   620.   630.   640.
   650.   660.   670.   680.   690.   700.   710.   720.   730.   740.
   750.   760.   770.   780.   790.   800.   810.   820.   830.   840.
   850.   860.   870.   880.   890.   900.   910.   920.   930.   940.
   950.   960.   970.   980.   990.  1000.  1010.  1020.  1030.  1040.
  1050.  1060.  1070.  1080.  1090.  1100.  1110.  1120.  1130.  1140.
  1150.  1160.  1170.  1180.  1190.  1200.  1210.  1220.  1230.  1240.
  1250.  1260.  1270.  1280.  1290.  1300.  1310.  1320.  1330.  1340.
  1350.  1360.  1370.  1380.  1390.  1400.  1410.  1420.  1430.  1440.
  1450.  1460.  1470.  1480.  1490.  1500.  1510.  1520.  1530.  1540.
  1550.  1560.  1570.  1580.  1590.  1600.  1610.  1620.  1630.  1640.
  1650.  1660.  1670.  1680.  1690.  1700.  1710.  1720.  1730.  1740.
  1750.  1760.  1770.  1780.  1790.  1800.  1810.  1820.  1840.  1860.
  1880.  1900.  1920.  1940.  1960.  1980.  2000.  2020.  2040.  2060.
  2080.  2100.  2120.  2140.  2160.  2180.  2200.  2220.  2240.  2260.
  2280.  2300.  2320.  2340.  2360.  2380.  2400.  2420.  2440.  2460.
  2480.  2500.  2520.  2540.  2560.  2580.  2600.  2620.  2640.  2660.
  2680.  2700.  2720.  2740.  2760.  2780.  2800.  2820.  2840.  2860.
  2880.  2900.  2920.  2940.  2960.  2980.  3000.  3020.  3040.  3060.
  3080.  3100.  3120.  3140.  3160.  3180.  3200.  3220.  3240.]

We've got contour lines every 10 meters (every 20 meters above 1820), which is far denser than we need; we could cut that down to one line every 50-100 meters.

We have two choices for how we can parse our data.


Parse Live/On The Fly

This is the more difficult way to parse the data: load the full Json file in the browser and down-select and filter it on the fly with Javascript. It is also likely to be significantly slower, since every page load has to fetch and process several MB of data.

Parse Offline/Ahead of Time

The easy way is to parse the data offline, ahead of time: do all the filtering in Python, where everything is straightforward, and ship only a small, portable file to the browser. This is the approach I'll be using.
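A minimal sketch of the offline approach, assuming the FeatureCollection structure shown above (the file names and the 100-meter interval are just examples):

```python
import json

def filter_contours(infile, outfile, interval=100):
    """Down-select a contour FeatureCollection to one line every `interval` meters."""
    with open(infile) as f:
        data = json.load(f)
    total = len(data['features'])
    # Keep only contours whose elevation is a multiple of the chosen interval.
    data['features'] = [feat for feat in data['features']
                        if feat['properties']['CONTOURELE'] % interval == 0]
    with open(outfile, 'w') as f:
        json.dump(data, f)
    print("kept %d of %d features" % (len(data['features']), total))
```

Running this once over the full file leaves a much smaller Json file that Leaflet can load directly as a GeoJson layer.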