Algorithms are now used throughout the public and private sectors, informing decisions on everything from education and employment to criminal justice. But when they turn into “black boxes” that don’t offer up their secrets, we can’t hold them accountable. There is growing evidence that some algorithms and analytics are opaque, making it difficult to determine when their outputs may be biased.
Algorithms and the data that drive them are designed and created by people. Even for techniques such as genetic algorithms that evolve on their own, or machine-learning algorithms where the resulting model was not hand-crafted by a person, results are shaped by human-made design decisions, rules about what to optimize, and choices about what training data to use. “The algorithm did it” is not an acceptable excuse if algorithmic systems make mistakes or have undesired consequences.
This is exactly why a group of professors has come up with five principles to hold algorithms accountable.
Having said that, let’s take a look at some exciting progress in creating algorithms for better buildings and cities. Most of the work in this area is driven by data-analytics and machine-learning startups in collaboration with private and public agencies. As architects and planners, it is extremely important that we understand how the algorithms that shape our buildings and cities are created.
P.S. If you want to get in touch, simply reply to this email. Oh, and by the way, I wouldn’t mind if you gave TGIC some love on LinkedIn.