Battling blight with big data
As Midwestern Rust Belt cities grapple with painful economic transitions, housing blight threatens to choke out once-thriving urban centers.
Detroit, for instance, has seen its population decline by nearly 30 percent over the past 15 years, leaving more than 50,000 homes vacant, according to the real estate firm RealtyTrac.
Identifying blighted homes typically requires a major commitment of manpower and public funds, resources that are often scarce in cities like Detroit. Big data analytics offers city leaders a more efficient and cost-effective way to find and fight blight, according to a recent paper by Qian Wan, a mechanical engineering Ph.D. candidate at the Harvard John A. Paulson School of Engineering and Applied Sciences, and co-author Bradley Pough, a J.D. candidate at Harvard Law School.
Their paper, “Digital Analytics and the Fight Against Blight: A Guide for Local Leaders,” examines the problem of urban housing blight, identifies best practice uses of data analytics, and provides data-driven recommendations for municipal officials.
“Blight spreads like a disease,” Wan said. “Neighbors of blighted homes that have fallen into disrepair may pay less attention to their own properties, or the blighted homes erode the housing market so residents pack up and move to better areas. Once it starts, it becomes harder to control. But city officials can do something about it if they can find it early.”
One potential solution involves the use of mobile app-based reporting systems to crowdsource blight identification. In New Orleans, for instance, the creation of a public website and blight identification app helped city officials reduce the number of blighted homes by more than 10,000 between 2010 and 2013, according to the Greater New Orleans Community Data Center.
Identifying existing housing blight is a step in the right direction, but data analytics can also help predict and prevent blight, said Wan.
By training a machine-learning algorithm based on variables that are proxies for blighted homes, such as water and electricity shut-offs, code violations, and mail stoppage, officials can gather predictive analytics on which homes are most likely to become blighted. Armed with that knowledge, municipalities can intervene through code enforcement to prevent blight before it takes root.
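The approach described above can be sketched in a few lines. The snippet below trains a tiny logistic-regression classifier, by hand in plain Python, on invented proxy-variable records (water shutoff, open code violations, mail stoppage); the addresses, data, and feature choices are all hypothetical illustrations, not the authors' actual model.

```python
import math

# Hypothetical training rows: (water_shutoff, open_code_violations, mail_stopped)
# with a 0/1 label for whether the home later became blighted. All data invented.
X = [
    (1, 3, 1), (1, 2, 1), (0, 4, 1), (1, 1, 0),
    (0, 0, 0), (0, 1, 0), (1, 0, 0), (0, 0, 1),
]
y = [1, 1, 1, 1, 0, 0, 0, 0]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fit a minimal logistic-regression model with plain stochastic gradient descent.
w = [0.0, 0.0, 0.0]
b = 0.0
lr = 0.5
for _ in range(2000):
    for xi, yi in zip(X, y):
        p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
        err = p - yi
        w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
        b -= lr * err

def blight_risk(home):
    """Estimated probability that a home with these indicators becomes blighted."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, home)) + b)

# Rank homes so inspectors can visit the highest-risk properties first.
homes = {"412 Elm St": (1, 3, 1), "9 Oak Ave": (0, 0, 0)}
ranked = sorted(homes, key=lambda h: blight_risk(homes[h]), reverse=True)
```

In practice a city would use an off-the-shelf library and far richer features, but the shape is the same: historical indicators in, a risk score out, and an inspection list ordered by that score.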
Cincinnati, for example, has begun using a machine-learning algorithm built from 50 different variables; the algorithm correctly predicted blight in 78 percent of cases, compared with a 53 percent prediction accuracy rate for code inspectors, according to the students’ research.
The biggest obstacle to launching these systems is often the data itself, Wan said.
“Utility data and code violations are all indicators of blight, but they are stored in different places and in different formats,” she said. “Once the information is organized in the same database, at the same scale, in the same units, and the same intensity, even the simplest machine-learning algorithm could draw some patterns.”
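The kind of consolidation Wan describes, pulling records stored in different places and formats into one table, might look like the following sketch. The two record formats, field names, and addresses are invented for illustration.

```python
# Hypothetical raw records from two separate city systems, stored in different
# shapes: utility shutoffs keyed by account, code violations keyed by parcel.
shutoffs = [
    {"account": "A-1001", "address": "412 ELM ST", "service": "water", "status": "OFF"},
]
violations = [
    {"parcel": "17-042", "addr": "412 Elm Street", "open_violations": 3},
]

def normalize_address(s):
    """Collapse the two systems' address styles into one comparable key."""
    s = s.upper().strip()
    for full, abbrev in [("STREET", "ST"), ("AVENUE", "AVE")]:
        s = s.replace(full, abbrev)
    return s

# Merge into one record per address, with consistent field names and units,
# so even a simple learning algorithm sees each home as one row of features.
merged = {}
for row in shutoffs:
    key = normalize_address(row["address"])
    merged.setdefault(key, {"water_off": 0, "violations": 0})
    merged[key]["water_off"] = 1 if row["status"] == "OFF" else 0
for row in violations:
    key = normalize_address(row["addr"])
    merged.setdefault(key, {"water_off": 0, "violations": 0})
    merged[key]["violations"] = row["open_violations"]
```

Most of the real effort is in the `normalize_address` step: records that refer to the same property must resolve to the same key before any pattern can be drawn from them.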
Before beginning to organize data, Wan and Pough recommend officials establish parameters based on their city’s unique definition of blight. Regulations in some municipalities call a house blighted if it lacks a clear owner, while others focus on code violations rather than deeds.
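One way to capture such city-specific definitions is to express each one as a rule that can be applied uniformly when labeling data. The rule names and thresholds below are hypothetical, standing in for whatever a given municipality's regulations actually specify.

```python
# Hypothetical city-specific blight definitions expressed as predicates, so
# the same labeling pipeline can be reused with each municipality's own rule.
DEFINITIONS = {
    "ownership": lambda home: home["owner"] is None,            # no clear owner
    "violations": lambda home: home["open_violations"] >= 2,    # code-based rule
}

def label_blighted(home, city_rule):
    """Apply the chosen city's definition of blight to one property record."""
    return DEFINITIONS[city_rule](home)

home = {"owner": None, "open_violations": 1}
label_blighted(home, "ownership")   # blighted under an ownership-based rule
label_blighted(home, "violations")  # not blighted under a violations-based rule
```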
Once the data have been identified and labeled, creating crowdsourcing applications and developing predictive algorithms can go a long way toward streamlining what is currently a costly and time-consuming process.
In addition to providing tips and resources to help local governments battle blight, Wan hopes the paper inspires engineering students to think outside the box.
“Engineers often think of going into academia or industry, but there is a lot of potential to make a major impact by working in the public sector,” she said. “We can use technology to make the world a better place.”