Tag Archives: home range

PaperJS-based Minimum Convex Polygon


Even though I’m at home sick today (or maybe due to the boredom of it), I’ve been working on improving my PaperJS-based minimum convex polygon (MCP) script (now found here, previously found here). Now, the algorithm is as follows:

  1. Find the point that is the farthest right
  2. Draw a large circle that encompasses the entire possible field, centred on that point. My circle has a radius of 720px (the field is 500x500px). The circumference of the circle is split every 18px. When those vertices are connected to the centre, there are 252 spokes.
  3. Loop through the spokes, starting with the spoke pointing straight down and moving clockwise. Does the spoke pass within 5px of another point? No? Move to the next spoke. Yes? Connect the circle centre to that point (a new boundary line of the MCP), move the circle and all spokes to that point, and continue with the next spoke to find the next MCP boundary point.

This method isn’t perfect, but it works fairly well. Problems arise when two or more points fall within the tolerance of the spoke hit test (5px). But like I said, I’m sick, so instead of fixing it, I’m going to watch Community.
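For reference, the spoke sweep described above can be sketched in plain JavaScript without PaperJS. Everything here is illustrative, not the post's actual script: points are plain {x, y} objects, and the 5px tolerance, 720px spoke length, and 252 spokes simply mirror the numbers in the post:

```javascript
// Sketch of the spoke-sweep MCP described above -- essentially gift wrapping
// with a discretized sweep angle. Plain JavaScript, no PaperJS; points are
// plain {x, y} objects in screen coordinates (+y points down).

// Shortest distance from point p to the segment a->b.
function pointToSegment(p, a, b) {
  const dx = b.x - a.x, dy = b.y - a.y;
  const len2 = dx * dx + dy * dy;
  let t = len2 === 0 ? 0 : ((p.x - a.x) * dx + (p.y - a.y) * dy) / len2;
  t = Math.max(0, Math.min(1, t));
  return Math.hypot(p.x - (a.x + t * dx), p.y - (a.y + t * dy));
}

function minimumConvexPolygon(points, { spokes = 252, radius = 720, tol = 5 } = {}) {
  // 1. Start at the right-most point.
  const start = points.reduce((a, b) => (b.x > a.x ? b : a));
  const hull = [start];
  let current = start;
  let angle = Math.PI / 2;              // straight down in screen coordinates
  const step = (2 * Math.PI) / spokes;  // clockwise sweep
  while (hull.length <= points.length) {
    let found = null;
    // 3. Sweep the spokes; stop at the first one passing within tol of a point.
    for (let i = 1; i <= spokes && !found; i++) {
      const a = angle + i * step;
      const tip = { x: current.x + radius * Math.cos(a),
                    y: current.y + radius * Math.sin(a) };
      for (const p of points) {
        if (p === current) continue;
        if (pointToSegment(p, current, tip) <= tol) {
          found = p;
          angle = a;  // the next sweep continues from this spoke
          break;
        }
      }
    }
    if (!found || found === start) break;  // hull closed (or nothing hit)
    hull.push(found);                      // new boundary point; move the circle here
    current = found;
  }
  return hull;
}
```

As in the post, when two or more points fall within the tolerance of one spoke, whichever happens to be checked first wins.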

edit: I’m very grateful to SparkGeo’s lab, which finally clued me in to how to interact with a map underneath a canvas element. Hint: canvas CSS = “z-index: 1; pointer-events:none”
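A minimal sketch of that trick, assuming the map lives in a div underneath the canvas (the ids and sizes are illustrative):

```html
<!-- Canvas stacked over the map; pointer-events: none lets clicks and drags
     fall through to the map below. Ids and dimensions are illustrative. -->
<div id="map" style="position: absolute; width: 500px; height: 500px;"></div>
<canvas id="overlay" width="500" height="500"
        style="position: absolute; z-index: 1; pointer-events: none;"></canvas>
```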

PaperJS Minimum Convex Polygon


I continued my obsession with PaperJS today. Ultimately, my goal is to produce sharp-looking, location-based visualization tools. So, how about a dynamic home-range (minimum convex polygon) visualization, derived from GPS locations, that can be overlaid on Google Maps? Sounds good, and here’s what I came up with today. Next step is to get it talking in lat/long coords for the Google Maps API.

Home Range Calculation Success/Failure

Utilization distribution (95%) as calculated using the reference bandwidth.

This is an example of how I wanted to calculate Grizzly Bear home ranges.

Success: Remembered how to write R code, and figured out the adehabitatHR package in two hours.

Failure: Unfortunately, using the Least Squares Cross Validation smoothing parameter requires “cross-validation to be minimized,” and none of my tests satisfied this, so we’re going to use another method. The above picture was made using the “reference bandwidth” (substitute h="href" for h="LSCV" in the code below) – it tends to overestimate the area used by the individual in question, so we’re not using that, either.

Anyway, here’s the code, as much a reminder for me how to write R code as anything:

1. Install and load adehabitatHR and maptools packages.

install.packages(pkgs=c("adehabitatHR","maptools"), repos="http://cran.r-project.org")
library(adehabitatHR)
library(maptools)

2. Read a shapefile into a SpatialPointsDataFrame.

shape <- readShapePoints("H:/GIS_Data/bear")

3. Calculate use distribution, by bear (identified in the first column, shape[,1]), using the Least Squares Cross Validation smoothing parameter.

kud <- kernelUD(shape[,1], h="LSCV", grid=500)

4. Make a polygon (SpatialPolygonsDataFrame) of the area of 95% probability.

hr <- getverticeshr(kud, percent=95)

5. Save the polygon to a shapefile.
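The code for step 5 didn’t make it into the post; presumably it would use maptools’ writePolyShape. A sketch, with the output path purely a placeholder:

```r
# Write the 95% home-range polygons to a shapefile.
# "H:/GIS_Data/bear_hr" is a hypothetical output path.
writePolyShape(hr, "H:/GIS_Data/bear_hr")
```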