I recently left the Allen Coral Atlas team after five great years building with them. It’s been quite a journey from the early demo and my being star-struck about National Geographic coverage, through moving institutional homes, the team going fully remote, and announcing that we had reached global coverage. Here’s what we’ve done since that last update:
A full remapping of the world incorporating feedback from more than 90 regional experts and 150 Atlas users, to make our automated classification a better fit for what locals know is in their waters.
I had a chance to go back to some web mapping work from a couple of years ago and improve it:
Much of the improvement came from the client: updated data with more coverage & more fields to filter on, and a design refresh. But it’s always a pleasure to work through refinements, and it was a chance to clean up and simplify the code that runs it.
Soon after moving to Victoria, I started volunteering with Friends of Bowker Creek. In fact, the first week after the 2-week COVID quarantine that was mandatory at the time, my partner and I joined one of their habitat restoration work parties. It’s fair to say that I caught the bug.
The group’s vision is to restore this heavily culverted creek and its heavily paved urban/suburban watershed to a condition that can support salmon and trout again, as it did within living memory, albeit not for some decades. We’re under no illusions about being able to fully undo 150 years of careless development, and we’re well aware that the state of the creek before that was not the “perfect Eden” that settlers thought they were seeing, but a waterway carefully managed by the lək̓ʷəŋən & W̱SÁNEĆ people to serve their needs. But we take inspiration from that management, and in it find hope that the creek can support many people and fish at the same time.
An important part of this is the quality of the water itself, which FoBC sends volunteers out to measure a couple of times a month. We need this data to track our own progress, to advocate for municipal water management changes, and to convince the Federal Department of Fisheries & Oceans to provide the allowance of chum salmon eggs which we’ve started placing in the stream. After about a year, the person who had been leading that effort had to step down, and I took over coordinating it. I have some big ideas about integrating the manually collected data with a couple of continuous loggers we also maintain, and with the quite comprehensive water quality data that our regional government publishes. But my first priority was to replace our old system of paper data sheets transcribed into a hard-to-access database.
Water Rangers came to the rescue on both counts. We now enter data on our phones at the creek, so there are no more piles of forms on my desk waiting to be transcribed. And they host a site where anyone can see our data and learn about the state of the creek for themselves.
Of course it means nothing if it doesn’t spur change, and the thing that I’m proudest of is that we have convinced one of the municipalities this creek flows through to make some changes in how their own Public Works depot handles wastewater. More of next winter’s chum eggs will manage to hatch because of that. It’s just one baby step, but each brings us a little closer to seeing salmon return to spawn here again.
When I last wrote about the Allen Coral Atlas, I was new on the team and we had just launched a site mapping a handful of test reefs. A lot has changed since then:
As of today, we have completed mapping the world’s shallow tropical coral reefs.
This summer I had the pleasure of working with CORE GIS to build a web map for the World Justice Project. One of their ongoing programs is to aggregate surveys of unmet justice needs from all over the world, and we made them an overview map to present the data:
The map and filters are automatically populated from a file that WJP maintains, giving them a straightforward way to update the data without having to engage a consultant each time. The underlying data is more impressive than anything we built, though: the screenshot really is highlighting a 129,000-person face-to-face survey conducted the year before COVID.
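A data-driven setup like that can be quite small. As a rough sketch (the real file format, column names, and values below are all invented for illustration; WJP’s actual file is not shown here), the filter options can be derived directly from the data file itself, so new rows automatically become new filter choices:

```python
import csv
import io

# Hypothetical extract of the kind of file a client might maintain;
# these columns and values are made up, not WJP's real schema.
DATA = """country,year,need_category,share_with_unmet_need
Colombia,2019,housing,0.31
Colombia,2019,employment,0.24
Nigeria,2018,housing,0.42
"""

def build_filters(rows, fields):
    """Collect the distinct values of each filter field, sorted."""
    return {f: sorted({row[f] for row in rows}) for f in fields}

rows = list(csv.DictReader(io.StringIO(DATA)))
filters = build_filters(rows, ["country", "year", "need_category"])
# filters["country"] -> ['Colombia', 'Nigeria']
```

The same idea works whether the front end is built in JavaScript or the options are baked in at publish time: the data file stays the single source of truth.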
Explore the map, or read WJP’s intro to the project.
A/B Street is both a game and a very powerful tool that allows you to download a detailed street map of an area and test out changes to the configuration of roads:
It uses OpenStreetMap as a data source. OSM has impressively rich data on things like the number of lanes a street has, where bike lanes and turn restrictions go, and so on, but one thing it lacks is detailed elevation data. The developers noticed that it was suggesting absurd routes for cyclists in Seattle, including routes that no-one who’s actually tried cycling would ever repeat because of the hills involved. So they brought me in to figure out how to add elevation data.
I wrote a simple Python tool that reads in paths as plain lists of coordinates, and writes out statistics for each one: the start and end elevations, and the cumulative elevation gain and loss along each path. A/B Street incorporates this into its data loading by exporting each road segment as a series of points every metre along the way, getting as fine-grained a picture of a route’s hilliness as the source data allows.
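The statistics step itself is straightforward. Here’s a minimal sketch of the idea (not the actual elevation_lookups code), assuming elevations have already been looked up for sample points at regular intervals along a path:

```python
def path_elevation_stats(elevations):
    """Summarise a path from its sampled elevations (metres), ordered
    from the start of the path to its end."""
    gain = 0.0
    loss = 0.0
    for prev, curr in zip(elevations, elevations[1:]):
        delta = curr - prev
        if delta > 0:
            gain += delta   # climbing
        else:
            loss -= delta   # descending (store as a positive total)
    return {"start": elevations[0], "end": elevations[-1],
            "gain": gain, "loss": loss}

# A path that climbs 12 m, descends 5 m, then climbs 3 m:
stats = path_elevation_stats([100, 106, 112, 110, 107, 110])
# -> {'start': 100, 'end': 110, 'gain': 15.0, 'loss': 5.0}
```

Tracking gain and loss separately matters for cycling: a route that climbs and descends 50 m is a very different ride from a flat one, even though the net change is zero.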
By default, elevation_lookups uses data from the Shuttle Radar Topography Mission, because that source provides fairly high quality data for most of the world’s land. I learned about SRTM while making this tool, and I’m still sort of star-struck that we have a global dataset like this, freely available to anyone who has a use for it. But it is also relatively low resolution, so I built in the ability to override the data source with higher resolution sources where known. It comes preconfigured with examples of both raster (LIDAR source) and vector (contour source) datasets for the Seattle area.
This is an open-source project. I hope it can be useful for other applications, and I have more ideas for it than I have time to implement. I’d love contributions from anyone this appeals to, and have some suggested starting points (not all requiring programming skills!).
I also ran into some technical surprises, which are below the fold in case the information is useful to anyone else.
Before the pandemic, I was working on a major update for Washington Hometown’s recreation mapping work. They’ve long been a major provider of outdoor recreation data, but wanted to put more of that data into public-facing maps that they host. They partnered with TOTAGO for mobile maps, and I built a map generator for their desktop maps so that they would be able to quickly publish new thematic maps by just updating some configuration files rather than having to write code each time.
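To give a flavour of the approach (the real configuration schema belongs to Washington Hometown, so the structure, field names, and data below are invented), a config-driven generator boils down to looping over layer definitions and filtering source features, with no code changes needed to define a new theme:

```python
import json

# A made-up theme config in the spirit of the generator's inputs.
THEME = json.loads("""{
  "title": "Spring Hiking",
  "layers": [
    {"source": "trails.geojson", "filter": {"season": "spring"}},
    {"source": "trailheads.geojson"}
  ]
}""")

def matches(props, flt):
    """A feature passes an absent filter, or one whose keys all match."""
    return all(props.get(k) == v for k, v in (flt or {}).items())

def build_map(theme, load_source):
    """Assemble a map spec: one layer per config entry, features filtered."""
    layers = []
    for layer in theme["layers"]:
        features = [f for f in load_source(layer["source"])
                    if matches(f["properties"], layer.get("filter"))]
        layers.append({"source": layer["source"], "features": features})
    return {"title": theme["title"], "layers": layers}

# Stand-in for reading GeoJSON files from disk:
SOURCES = {
    "trails.geojson": [
        {"properties": {"season": "spring", "name": "Creek Loop"}},
        {"properties": {"season": "winter", "name": "Snow Route"}},
    ],
    "trailheads.geojson": [{"properties": {"name": "Main Lot"}}],
}
spec = build_map(THEME, SOURCES.get)
# spec["layers"][0] keeps only the spring trail
```

Publishing a new thematic map then means writing a new config file, which is exactly the kind of change a data-focused client can make themselves.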
This flexibility turned out to be more important than any of us had anticipated, because before the project wrapped up the pandemic hit. In the first months of Washington State’s lockdown, a lot of public land was either entirely closed to the public or had very limited services. Suddenly being able to publish a spring hiking map just didn’t seem relevant or even appropriate any more. But WHT’s speciality is keeping data current as things change, and they applied the same mindset to the map themes themselves. At the height of public confusion about where to find COVID tests, sanitiser supplies, and so on, they released a “crisis” map with that information updated daily.
As things calmed down, they quickly pivoted away from the crisis map (quickly enough that I didn’t even get a screenshot of the working map!), and started focusing on the ever-changing list of which public lands were open, closed, or somewhere in between. Now that we all know that outdoor activities are relatively safe, there are far fewer closures, but still enough that it’s valuable to have someone keeping track.
I was impressed with my client’s ability to keep this project relevant when I was afraid that the pandemic would sink it. And in the end it’s been a great validation of the map generator itself, which has helped them to stay agile.
Seattle is suffering from a deep housing affordability crisis, with more and more people being priced out of living there. At the same time, it’s been deeply resistant to changes in zoning that would allow enough new housing to be built. One example: the city has a program called “Encouraging Backyard Cottages” and has gone through at least two rounds of legal reform to support it, yet figuring out whether a cottage can be built on any given lot still involves working through a long checklist about the exact dimensions of the site and the intricacies of zoning.
In theory, DADUs (Detached Accessory Dwelling Units, the much less appealing legal term for “backyard cottages”) allow a lot of small, affordable housing to be added to single-family zoned neighbourhoods and spread out enough to not feel like a radical change in the streetscape. In practice, the complexity of the process adds enough of a barrier that relatively few have been built so far. Hatchback Cottages has a plan to solve this with a set of ready-to-build designs and a package of support to help people through the process.
Even with their expertise, assessing a site under the arcane rules is a time-consuming process. But computers are good at applying lots of rules and calculating all the measurements, so Hatchback contracted me to run a GIS analysis assessing every residential lot in Seattle for suitability.
Fortunately for us, Seattle and King County publish very comprehensive and regularly updated open data about zoning and development, so I had a lot to work with. The analysis takes into account existing building footprints, lot characteristics, and potential complicating factors like steep slopes and landslide hazard areas. It will never be a complete replacement for a knowledgeable human looking at the site, but by ruling out all the sites that definitely won’t work, it saves my client a lot of time. The experts can now focus solely on sites that have a relatively good chance of working out.
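The general shape of such a screening pass is a checklist of disqualifying rules applied to each lot. This sketch is illustrative only: the thresholds and attribute names below are made up, and the real rules in Seattle’s land use code are far more involved:

```python
# Each rule pairs a human-readable reason with a test that returns True
# when the lot fails. All thresholds here are invented for illustration.
RULES = [
    ("lot too small",
     lambda lot: lot["area_sqft"] < 3200),
    ("too much of lot already built",
     lambda lot: lot["footprint_sqft"] / lot["area_sqft"] > 0.35),
    ("steep slope on lot",
     lambda lot: lot["max_slope_pct"] > 40),
    ("landslide hazard area",
     lambda lot: lot["landslide_hazard"]),
]

def screen_lot(lot):
    """Return the reasons a lot fails screening; empty means it passes."""
    return [reason for reason, fails in RULES if fails(lot)]

lot = {"area_sqft": 5000, "footprint_sqft": 1500,
       "max_slope_pct": 12, "landslide_hazard": False}
screen_lot(lot)  # -> [] : worth a closer human look
```

Keeping the reasons alongside the pass/fail result is useful in practice: a lot that fails on a single marginal rule may still be worth a human second opinion.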
As a child, I adored the National Geographic magazines. I collected them for long enough to fill a bookshelf, and I think it’s fair to say that they had a pretty big influence on how I’ve ended up making a living. So it was particularly satisfying to see a project I’ve been involved with since the summer get a writeup in National Geographic:
Inside the daring plan to map every coral reef from space
It’s a wonderfully ambitious project—using imagery that wasn’t available 5 years ago—and terrifyingly urgent. Coral reefs are particularly sensitive to the impacts of climate change and ocean acidification, and massively important as both habitat and shoreline protection. We have very little time left to figure out ways to make them more resilient; otherwise we will lose a major source of protein and watch storms do more and more damage as the reefs’ protection disappears.
The project also lost two of its champions this year: Paul Allen and Ruth Gates. I wish I had had the chance to get to know Dr. Gates. Among other things, she set a great example of how to engage with work this sad and frightening and never be crushed by the weight of it. The project continues, and feels like a fitting memorial to both.
My role is to integrate the processed data we get from two different research groups: one infers depth from the satellite imagery, and the other classifies areas of reef by types of sea floor and what’s growing on them. I write automation that turns the depth data into false-colour imagery, and prepares everything to be displayed in the web map. Here’s a screenshot of that depth imagery:
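The false-colour step is conceptually just a colour ramp over depth. Here’s a minimal sketch with an invented palette (the Atlas’s actual colours, depth range, and nodata handling are not shown):

```python
def depth_to_rgb(depth_m, max_depth=15.0,
                 shallow=(180, 235, 240), deep=(8, 40, 120)):
    """Map a depth in metres onto a light-to-dark blue ramp.

    Depths are clamped to [0, max_depth] so outliers don't wrap the
    palette. All colour and depth values here are illustrative.
    """
    t = min(max(depth_m / max_depth, 0.0), 1.0)  # normalise and clamp
    return tuple(round(s + t * (d - s)) for s, d in zip(shallow, deep))

depth_to_rgb(0)   # -> (180, 235, 240), the shallow end of the ramp
depth_to_rgb(15)  # -> (8, 40, 120), the deep end
```

Running a function like this over every pixel of a depth raster (in practice with vectorised array operations rather than a per-pixel loop) yields an image a web map can serve as ordinary RGB tiles.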
So far we just have a single snapshot in time of 6 reefs, but the real challenge will be scaling that to all the reefs in the world, updated regularly. You can explore it yourself and read about the methods and partnership at http://allencoralatlas.org/.
Because it’s a limited preview so far and most areas aren’t covered yet, panning and zooming around the globe isn’t very satisfying. I recommend clicking on the place names in the “Mapped Areas” list to see where we actually have data. And for a first look I prefer to turn all of the data layers off, zoom to a location, and then turn them back on one at a time starting from the bottom of the list.
I got to collaborate with CORE GIS again this year, on a couple of maps for a Texas education advocacy group. The first was an internal tool for their staff use only, but the second was public facing and is now available:
I am often a bit skeptical—perhaps more than you might expect from a GIS consultant—of the value of displaying information on a map rather than a chart or a table. Sometimes clients ask for maps simply because they look good, without thinking about whether geography is a useful dimension for the questions we want to answer. These two were interesting cases because geography is relevant, but not for the most obvious reasons, and this influenced their design:
- The internal tool is for lobbyists to show to state politicians, so the design is very focussed on zooming to individual districts and showing how they compare to others. In the big picture that’s not exactly the ideal way for politicians to make choices, but we all know that they do it, so it’s realistic for an advocacy group to appeal to this bias.
- The public map is a way of showing just how big a public funding advantage charter schools have over public schools. The message would have been lost in a chart or table, because the same unlevel playing field benefits rural school districts (which don’t generally have charter schools nearby) over urban ones. Putting the map together helps us to compare like with like. I don’t know anywhere near enough about Texas education politics to know if either bias is deliberate, but assigning blame is outside the scope of a map anyway. It’s enough that it shows the effect of a policy.