Author: Aarathi Krishnan

Publisher: TED Talks

Publication Year: 2021

Summary: This speech discusses how humanitarians have embraced digitalization. The use of AI, big data, drones, and more may seem logical and necessary, but it amounts to the deployment of untested technologies on vulnerable populations without their consent. Humanitarian technology innovations are inherently colonial, and good intentions alone do not prevent harm; in fact, good intentions alone can cause harm. Without due consideration of power, data collected on vulnerable people can be used against them, posing great risk to them, their families, and their communities. This happened recently when data collected by the Afghan government fell into the hands of the Taliban. Technologies need to be designed so that the inequities of the past are not built into digital futures. Indigenous AI, Nia Tero, and the Satellite Sentinel Project are working hard to do things right. In the absence of legally binding regulation, an ethical framework is needed to guide this work.

The speech leaves the audience with a set of guiding questions. Ask: Which groups of humans will be harmed by this, and when? Assess: Who does this solution actually benefit? Interrogate: Was appropriate consent obtained from the end users? Consider: What must we gracefully exit from to be fit for these futures? And imagine: What future good might we foreclose if we implemented this action today?