NIR Camera Test

I got an exciting new toy yesterday – a digital camera modified to capture near-infrared (NIR) in addition to visible light. As I mentioned in a previous post, the combination of NIR and visible light allows for vegetation extraction on a pixel-by-pixel basis, after applying the NDVI formula. See an example of a raw NIR photo I took this morning, compared to the processed NDVI image, here. It’s not perfect, but it shows promise.

As a side note, my example makes use of the TwentyTwenty JavaScript widget, which makes sliding image overlays extremely easy.

Interactive Pixel Masking: NDVI

I’ve been thinking a lot recently about how to visualize NIR (Near Infrared) photography, and here’s where I’m at.

In brief, NIR photos (combined with a regular visible-light photo) allow us to calculate an index of plant health – more technically, the fraction of absorbed photosynthetically active radiation. Healthy plants reflect most near-infrared radiation and absorb a high percentage of red visible light; this is why healthy plants appear green (they absorb red and blue and reflect green light).

The most commonly used formula for interpreting NIR photos is NDVI (Normalized Difference Vegetation Index): NDVI = (NIR - Red) / (NIR + Red), with values ranging from -1 to +1. I calculated NDVI in ArcGIS, exporting to 8-bit unsigned PNG format (in retrospect, I should have exported at a higher bit depth in order to retain the original NDVI values), which resulted in values ranging from 0 – 255.
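For a concrete picture of that calculation, here is a minimal numpy sketch of the per-pixel NDVI computation and the 8-bit rescaling described above (the random arrays are just stand-ins for real red and NIR rasters):

import numpy as np

# stand-in arrays for the red and NIR bands of the same scene
red = np.random.rand(480, 640).astype("float32")
nir = np.random.rand(480, 640).astype("float32")

# NDVI = (NIR - Red) / (NIR + Red), ranging from -1 to +1
ndvi = (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero

# rescale to 8-bit unsigned (0 - 255), as in the ArcGIS export;
# this is the step that discards the original NDVI values
ndvi_8bit = np.round((ndvi + 1.0) / 2.0 * 255).astype("uint8")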

The end objective was to calculate percent coverage of vegetation in photographs, whether healthy or not. My preliminary website seems to do an adequate job, although it surely needs refinement. The jQuery UI slider controls which pixels to mask as “vegetation.” The HTML5 Canvas elements take care of the rest, manipulating the NDVI image at the pixel level, coloring those above the threshold level (set in the slider) red.
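The live page does this masking client-side with JavaScript and Canvas; for readers who prefer Python, here is an equivalent thresholding sketch (the file name and threshold value are hypothetical):

import numpy as np
from PIL import Image

# load the 8-bit NDVI image exported from ArcGIS (hypothetical file name)
ndvi = np.array(Image.open("ndvi_8bit.png").convert("L"))

threshold = 150  # the slider value: pixels above this count as vegetation
mask = ndvi > threshold

# paint masked pixels red on a greyscale copy, as the Canvas code does
out = np.stack([ndvi] * 3, axis=-1)
out[mask] = [255, 0, 0]
Image.fromarray(out).save("ndvi_masked.png")

# percent coverage of vegetation at this threshold
print("vegetation: %.1f%%" % (100.0 * mask.mean()))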

Original NIR and RGB images found here: Image and Visual Representation Group

Arcpy: Polygons to Centroids (within Polygons, and with all Attributes)

Here’s a fairly simple Python script that creates centroid points (constrained to fall within their polygons) and attaches all the attributes (plus a link field, “ORIG_ID”) from the polygons to the centroids. Notably, it uses cursors to read raw geometry objects – no geoprocessing tools (except CreateFeatureclass and AddField). Also, while Advanced licensees may use the Feature to Point tool for this procedure, the method presented here is completely free, using no extensions or non-basic licensing – I don’t know of another free way to achieve this in ArcGIS.

Wouldn’t it be easier to run a spatial join on the points to attach the attributes? Yes, but spatial join is slow. In my test of 1000 polygons, the cursor method completed in about 2 seconds, while adding a spatial join call extended the run time to more than 10 seconds. This could mean big savings on large datasets.
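For reference, the spatial-join alternative I timed against looks roughly like this (dataset names are hypothetical):

import arcpy

# attach the polygon attributes to already-created centroid points
# with a spatial join instead of reading them through the cursor
arcpy.SpatialJoin_analysis("centroids", "polygons", "centroids_joined",
                           "JOIN_ONE_TO_ONE", "KEEP_ALL",
                           match_option="WITHIN")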

# import libraries
import arcpy, os

# set input/output parameters
polyFC = arcpy.GetParameterAsText(0)
outCentroids = arcpy.GetParameterAsText(1)

# set overwrite environment
arcpy.env.overwriteOutput = True

# if the output file does not exist, create it, then add an "ORIG_ID" field
if not arcpy.Exists(outCentroids):
    arcpy.CreateFeatureclass_management(os.path.dirname(outCentroids),
                                        os.path.basename(outCentroids),
                                        "POINT",
                                        polyFC,
                                        "",
                                        "",
                                        polyFC)
    arcpy.AddField_management(outCentroids, "ORIG_ID", "LONG")

# create an InsertCursor containing all the fields in outCentroids,
# which are all the fields in polyFC plus "ORIG_ID"
cursor = arcpy.da.InsertCursor(outCentroids, ["*"])

# read all features in polyFC
for row in arcpy.da.SearchCursor(polyFC, ["SHAPE@", "*", "OID@"]):
    # create an array to hold a new, modified row
    rowArray = []
    # read all the fields except "SHAPE@"
    for fieldnum in range(1, len(row)):
        # if this is the second field [i.e. SHAPE], replace it
        # with the centroid coordinates
        if fieldnum == 2:
            rowArray.append(row[0].centroid)
        else:
            rowArray.append(row[fieldnum])
    # write the new row to the cursor
    cursor.insertRow(rowArray)

# release the insert cursor's lock on the output
del cursor

Looping through the fields this way (the inner for loop over fieldnum) feels cumbersome – if anyone knows a better way to do this, please let me know!
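One possible simplification, since da cursor rows are plain tuples, is to slice the row and overwrite the shape slot directly – an untested sketch using the same variables as above:

for row in arcpy.da.SearchCursor(polyFC, ["SHAPE@", "*", "OID@"]):
    rowArray = list(row[1:])       # everything except the geometry token
    rowArray[1] = row[0].centroid  # slot 2 of the full row is the shape field
    cursor.insertRow(rowArray)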

Using ArcObjects in Python

Recently, I found the need for more fine-grained control over a map document than the Arcpy library alone could afford, so I turned to the comtypes library (downloaded here) and the associated Snippets file (ArcGIS 10.1 version here), which together give Python access to the COM-based ArcObjects. The Mark Cederholm presentation I followed is from the 2010 ESRI Dev Summit, so it is possible that this method is somewhat outdated, but it does exactly what I need.

Here is a simple example showing how to add some guides to the layout rulers in a map document, which is not possible solely using Arcpy:

# Import the modules
from comtypes.client import GetModule, CreateObject
from Snippets import GetStandaloneModules, InitStandalone

# First time through, need to import the "StandaloneModules". Can comment out later.
# GetStandaloneModules()
InitStandalone()

# Get the Carto module
esriCarto = GetModule(r"C:\Program Files (x86)\ArcGIS\Desktop10.1\com\esriCarto.olb")

# Create a map document object
mxdObject = CreateObject(esriCarto.MapDocument, interface=esriCarto.IMapDocument)

# Create new mxd file
mxdObject.New(r"map.mxd")

# Create some new layout guides
for i in range(1,21):
    mxdObject.PageLayout.VerticalSnapGuides.AddGuide(i)
    mxdObject.PageLayout.HorizontalSnapGuides.AddGuide(i)

# Change current view to layout
mxdObject.SetActiveView(mxdObject.PageLayout)

# Save the mxd
mxdObject.Save()

# Delete the map document object
del mxdObject

…and the result:

[Screenshot: layout view showing the new ruler guides]

Solar Radiation: free method

This is a follow-up post to Tuesday’s Garden Placement Using Publicly Available LiDAR, which relied on ArcGIS’ Spatial Analyst extension. Huge hat tip to the LAStools LinkedIn conversation, and particularly Daniel Grohmann, for mentioning the freely available SAGA GIS tool, Potential Incoming Solar Radiation. I should note that this is truly my first attempt at using SAGA, so apologies in advance if I’m leading you astray in some way.

To recap, we had created a digital elevation model (DEM) from publicly available LiDAR from the City of Prince George, and were ready to calculate solar radiation values for a city block in order to optimally locate a garden plot.

  1. Download, install, and open SAGA GIS (it’s free)
  2. Import your DEM into SAGA. I used the GDAL: Import Raster module (Modules -> File -> GDAL/OGR -> GDAL: Import Raster). Double-click the layer in the Data workspace window to view the raster and ensure it imported correctly.
  3. Open the Potential Incoming Solar Radiation tool (Modules -> Terrain Analysis -> Lighting -> Potential Incoming Solar Radiation). Here is how I set up my run (a scripted version of these steps is sketched after this list):
    [Screenshot: Potential Incoming Solar Radiation module settings]
  4. After changing the colour ramp to the default (blue -> red), we are left with a very similar, if not identical, solar radiation map compared to that produced by Spatial Analyst:
    [Screenshot: the SAGA solar radiation map]
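If you would rather script it than click through the dialogs, SAGA modules can also be run headless through saga_cmd. A rough sketch (the file names are hypothetical, and the module numbers and flag names vary between SAGA versions – run saga_cmd with just a library name to list them):

import subprocess

# import the DEM via GDAL, then run Potential Incoming Solar Radiation;
# 'ta_lighting 2' was the module id in the SAGA version I checked
subprocess.check_call(["saga_cmd", "io_gdal", "0",
                       "-FILES", "dem.tif", "-GRIDS", "dem.sgrd"])
subprocess.check_call(["saga_cmd", "ta_lighting", "2",
                       "-GRD_DEM", "dem.sgrd",
                       "-GRD_TOTAL", "solar_total.sgrd"])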

Garden Placement Using Publicly Available LiDAR

Here’s a quick method for finding the sunniest spot on your property in Prince George, BC (and elsewhere if you can find the data). Unfortunately, this method is not free from start to finish (it requires ArcGIS Spatial Analyst), but there are a fair number of free tools and data used.

  1. Get the data. Specifically, you need the raw LiDAR (.las file) for your area, which is not available for download from the City of PG’s Open Data Catalogue. While you’re at it, also ask for the orthophotos (aerial photos) and cadastral data (parcel boundaries, road files, etc.). Get in touch with the City of Prince George’s GIS department to make your data request: gisinfo@city.pg.bc.ca
  2. Load your orthophoto into your GIS program (I’m using ArcGIS, but you can use QGIS for the time being [it’s free]).
  3. Create a new polygon feature class and draw a polygon to narrow down your area of interest. [Screenshot: area-of-interest polygon over the orthophoto]
  4. Download LAStools (it’s free for the tools you need). Bring the LAStools toolbox into your GIS (there are ArcGIS and QGIS versions).
  5. Run the lasclip tool to ignore the millions of LiDAR points that you’re not interested in.
  6. Run las2dem to create an elevation raster of your LiDAR data. [Screenshot: elevation raster from las2dem]
  7. Consult your favourite source, like the Farmer’s Almanac, to determine the timing of your growing season.
  8. Here is where I used a Spatial Analyst tool called Area Solar Radiation, which is not free (in fact, it’s darn expensive). Run the tool using your latitude (PG is about 53.914) and frost-free start and end dates (for PG, June 4 to Sept. 3); an arcpy sketch follows this list. You should end up with something like this, which my legend tells me ranges from blue (353 watt-hours per square metre) to red (883,061 Wh/m^2): [Screenshot: solar radiation raster]
  9. Now, it’s a matter of overlaying parcel boundaries, finding your property, and seeing what kind of sunlight you can expect. I’ve circled a few good candidates on this block with lots of sun in the backyard, and (surprise!) some of them are existing gardens. [Screenshots: parcel boundaries over the solar radiation map, with candidate yards circled]
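For completeness, here is a minimal arcpy sketch of step 8 (file names are hypothetical; days 155 and 246 are June 4 and Sept. 3 in a non-leap year):

import arcpy
from arcpy.sa import AreaSolarRadiation, TimeMultiDays

arcpy.CheckOutExtension("Spatial")  # requires the Spatial Analyst license

# DEM from las2dem, latitude for PG, default sky size, frost-free window
solar = AreaSolarRadiation("block_dem.tif", 53.914, 200,
                           TimeMultiDays(2013, 155, 246))
solar.save("solar_wh_per_m2.tif")  # output is in watt-hours per square metre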

Bonus: another cool thing you can do (for free) is load your monochrome elevation model in Blender, extrude the terrain heights, drape the aerial imagery overtop, and create a 3D animation like this one.

PaperJS Voronoi Diagrams

I’ve been tinkering around with Voronoi diagrams, inspired by the example on the PaperJS site. My example (here) uses Raymond Hill’s Voronoi JavaScript implementation and PaperJS to derive the diagram from a point distribution. Note: the redraw animation is a bit backwards, as the point locations are determined first, driving the placement of the Voronoi diagram.
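The demo itself runs in the browser, but the same points-first idea is easy to reproduce in Python with scipy, for anyone who wants to experiment outside PaperJS:

import numpy as np
from scipy.spatial import Voronoi, voronoi_plot_2d
import matplotlib.pyplot as plt

# a random point distribution drives the diagram, as in the PaperJS demo
points = np.random.rand(40, 2)
vor = Voronoi(points)

# draw the cell edges and the generating points
voronoi_plot_2d(vor)
plt.show()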

Another, more festive example can be found here, showing the distribution of lights on our Christmas tree. Falalalala-la-la-la-la.