Seattle’s Department of Transportation (SDOT) put up a new speed limit map last week. A couple of years ago, Seattle changed local law so that the default speed limit dropped to 25 mph on arterials and 20 mph on non-arterials. But officially changing every road will take a long time: the process, as described to me in email, involves evaluating a handful of urban village streets per year. After looking at the data a bit and confirming there are large differences in speed limits across districts, I think a better and more equitable process would be to just lower nearly all arterial streets now (per the intent of the law), then measure speeds and impacts, and adjust as needed.
So, the data! I used the Seattle Streets dataset, which drives the map, and the City Council District boundary data. I ingested the Seattle Streets data into a little sqlite3 database. Each short road segment (usually a block) has information about what kind of road it is, its textual name, the speed limit (duh), total segment length and width (in feet), and what GIS software calls a “line string”. A line string is a list of two or more latitude/longitude pairs that describe the approximate path of that road segment. Most segments have two points, but some have more. I used the Ruby library rgeo to ingest the shapefiles for the council district boundaries (very complex, as you might imagine!), and then for each road segment in the Seattle Streets dataset I tried to classify its line string as lying predominantly in one district.
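To make the line-string representation concrete, here’s a small plain-Ruby sketch. The hash fields and coordinates below are illustrative stand-ins, not the dataset’s actual schema or real data:

```ruby
# Hypothetical sketch: a road segment's "line string" as an array of
# [longitude, latitude] pairs, plus a helper that picks a representative
# point for the segment. Field names and numbers are made up for
# illustration; they are not the Seattle Streets dataset's real schema.
segment = {
  name: "RAINIER AVE S",
  speed_limit: 30,
  linestring: [[-122.2866, 47.5580], [-122.2858, 47.5595]]
}

# Average the two endpoints. For the common two-point segment this is the
# midpoint; for longer line strings it's a rough but serviceable stand-in.
def representative_point(linestring)
  first = linestring.first
  last  = linestring.last
  [(first[0] + last[0]) / 2.0, (first[1] + last[1]) / 2.0]
end

lon, lat = representative_point(segment[:linestring])
```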
Assigning a council district to each road segment is probably the most arbitrary decision here. Most segments are cleanly in one district, but along the boundaries it’s hard to programmatically decide whether to assign a road to both districts equally (if it’s right on the boundary) or to the district it’s mostly in. So road segments along the council boundaries end up arbitrarily in one district, the other, or possibly both (which means I may have double-counted some segments). I’ll probably revise my classification script (not posted yet, sorry) to be less arbitrary and to cleanly detect “this road is on the council boundary line”. But looking at the data, and knowing which big/fast roads align with the district boundaries near me, I don’t think it changes the overall themes I’m seeing. With that, here’s some summary data!
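The actual classification used rgeo against the district shapefiles. As a rough illustration of the idea, here is a self-contained Ruby sketch that assigns a segment to whichever district polygon contains the majority of its line-string points, using a plain ray-casting point-in-polygon test in place of rgeo. The district polygon and coordinates are made up, not real boundary data:

```ruby
# Simplified stand-in for the rgeo-based classification: a ray-casting
# point-in-polygon test, then majority vote over a segment's points.
def point_in_polygon?(point, polygon)
  x, y = point
  inside = false
  j = polygon.length - 1
  polygon.each_with_index do |(xi, yi), i|
    xj, yj = polygon[j]
    # Toggle "inside" each time a ray from the point crosses an edge.
    if (yi > y) != (yj > y) &&
       x < (xj - xi) * (y - yi) / (yj - yi) + xi
      inside = !inside
    end
    j = i
  end
  inside
end

# Classify a segment by where the majority of its vertices fall.
# Boundary-straddling segments can still tie or double-count, which is
# exactly the arbitrariness described above.
def district_for(linestring, districts)
  counts = Hash.new(0)
  linestring.each do |pt|
    districts.each do |name, polygon|
      counts[name] += 1 if point_in_polygon?(pt, polygon)
    end
  end
  counts.max_by { |_, n| n }&.first
end

# A toy square "district" for illustration.
districts = { "D2" => [[0, 0], [10, 0], [10, 10], [0, 10]] }
district_for([[1, 1], [2, 2]], districts)  # => "D2"
```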
Total road segments at each speed in each district
Percent of total (city-wide) road distance at each speed in each district
Percent of in-district road distance at each speed in each district
But these cover all speeds, which makes it hard to compare slow vs. fast roads. So here are some rollups showing the percent of road length at 25 mph and below versus 30 mph and above. All of this data includes freeways and highways, which is why some districts have a lot more 55+ mph distance: they contain more of I-5 or I-90.
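As a sketch of how such a rollup can be computed, here’s a short Ruby example that groups segments by district and totals their lengths on either side of the 25/30 mph split. The sample rows are made up; the real numbers come from the Seattle Streets dataset:

```ruby
# Made-up sample rows standing in for the real segment data.
segments = [
  { district: 1, speed: 25, length: 400 },
  { district: 1, speed: 30, length: 600 },
  { district: 2, speed: 35, length: 1000 },
]

# City-wide total distance (feet), used as the denominator.
total = segments.sum { |s| s[:length] }.to_f

# Per district: share of city-wide distance at 25 mph and below vs.
# 30 mph and above.
rollup = segments.group_by { |s| s[:district] }.map do |district, rows|
  slow = rows.select { |s| s[:speed] <= 25 }.sum { |s| s[:length] }
  fast = rows.select { |s| s[:speed] >= 30 }.sum { |s| s[:length] }
  [district, { slow_pct: 100.0 * slow / total, fast_pct: 100.0 * fast / total }]
end.to_h
# rollup[1] => { slow_pct: 20.0, fast_pct: 30.0 }
```

Swapping the denominator from the city-wide total to each district’s own total gives the in-district percentages shown below.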
Percent out of city-wide road distance in each district
| District | 25 and below | 30 and above |
| -------- | ------------ | ------------ |
Percent out of in-district road distance in each district
| District | 25 and below | 30 and above |
| -------- | ------------ | ------------ |
So … what?
Keeping in mind that my data is a bit noisy, it seems clear that some districts, notably District 2 in southeast Seattle, have more roads overall and more roads with higher speed limits, both in absolute terms and as a share of the roads in the city (or in the district). Higher road speeds are a major cause of collisions and fatalities, especially for people not in cars. The greater distance of road at higher speeds also suggests higher traffic volumes, which mean more air and noise pollution.
Part of the reason I started digging into this was that I noticed some streets I thought of as “quiet neighborhood streets” actually had 30 mph speed limits (signed, even). Roads like Genesee or 50th Ave S, for example, should be slow streets, as they run past parks and schools and houses. A short distance away, on Rainier, a short segment has been “calmed” and reduced to 25 mph. The results demonstrate that the slower speeds decrease collisions and fatalities dramatically. Why we haven’t quickly rolled this out to the rest of Rainier is a moral failure: if calming Rainier Ave S through Columbia City were a medical study, we’d have had to end the experiment early and give all the patients the clearly effective treatment.
Unfortunately, the process for lowering other streets (including the rest of Rainier) will take years: the email I got from SDOT about how they are evaluating speed limits throughout the city describes a process that won’t even get to most of the urban villages before 2020. Meanwhile, more people will die.
I think we should just reduce all the speed limits now, then start measuring. Instead of studying and improving each corridor as funds become available, we should reverse the process. Lower speed limits on every corridor we can immediately – signs are pretty cheap – then start measuring and studying. If we find that drivers don’t observe the new limit somewhere, do spot improvements first to calm that segment. We can be innovative and use cheap fixes like paint-and-post calming bulb-outs (see Beacon Ave & McClellan for an example), planter boxes lining roadways, or inexpensive, movable bike lanes as they did in Winnipeg. The current process, while orderly, is expensive and doesn’t force us to fix the worst corridors first. I was told the urban villages currently being adjusted are Greenwood/Phinney, then Green Lake, Roosevelt, U-District, U-Campus, and North Beacon Hill next year. Calming all of Rainier is currently planned to take years. Instead, let’s just lower all of them now, measure them, and fix the worst sections. Then keep measuring and fixing until 25 mph or lower is a reality everywhere.