Why IoT should become the ‘internet of transparency’

Algorithms are essential to IoT.

Connected devices autopilot our cars; control the light, heat, and security of our homes; and shop for us. Wearable devices monitor our heart rates and oxygen levels, tell us when to get up and how to move, and keep detailed records of our whereabouts. Smart cities, powered by a host of IoT devices and applications, shape the lives of millions of people around the world by directing traffic, sanitation, public administration, and security. The scope and influence of IoT in our daily lives would be inconceivable without algorithms, but how much do we know about their function, logic, and security?

Most algorithms operate at computational speeds and complexities that preclude effective human review. They work in a black box. On top of that, most IoT algorithms are proprietary, which adds a second layer of opacity: a double black box. This status quo might be acceptable if the results were consistently positive and the algorithms did no harm. Unfortunately, this is not always the case.

When black box algorithms fail and cause material, physical, social, or economic damage, they also damage the IoT movement. Such mistakes undermine the social and political trust the industry needs to ensure broader adoption of smart devices, which is key to moving the field forward.

Opaque algorithms can be expensive, even deadly

Black box algorithms can give rise to significant real-world problems. For example, there is a nondescript stretch of highway in Yosemite Valley, California, that consistently confuses driverless cars, and we still don’t have an answer as to why. The open road is naturally full of risks and dangers, but what about your own home? Smart assistants are there to listen to your voice and fulfill your wishes and commands regarding shopping, heating, security, and just about any other home feature that lends itself to automation. But what happens when the smart assistant stops listening to you and starts taking orders from the TV instead?

There’s an anecdote circulating on the web about smart home assistants initiating unwanted online purchases after CW6 News anchor Jim Patton in San Diego uttered the line, “Alexa ordered me a dollhouse,” on air. Whether this actually happened on a large scale is beside the point. The real problem is that the dollhouse incident sounds entirely plausible and, again, casts doubt on the inner workings of the IoT devices we have entrusted with much of our daily life, comfort, and security.

From an IoT perspective, the intangible damage from such occurrences is considerable. When an autonomous vehicle fails, the reputation of all autonomous vehicles suffers. When a smart home assistant does stupid things, the intelligence of all smart home assistants is called into question.

The data elephant in the room

Every time an algorithm makes a wrong decision, its providers promise a thorough investigation and a quick fix. However, because these algorithms are proprietary and built for profit, authorities and the general public have no way of verifying what improvements, if any, have been made. In the end, we must take the companies’ word for it, and repeated failures make that word increasingly hard to accept.

One of the main reasons companies don’t reveal the inner workings of their algorithms (to the extent that they understand them themselves) is that they don’t want to expose every operation they perform on our data. Autonomous cars keep detailed records of every trip. Home assistants keep track of activities in the house; record temperature, light, and volume settings; and maintain a constantly updated shopping list. All of this personally identifiable information is collected centrally so the algorithms can learn from it, and it then feeds targeted advertising, detailed consumer profiles, behavioral nudges, and outright manipulation.

Think about the time Cambridge Analytica effectively weaponized the profile information of 87 million unsuspecting social media users to misinform voters, possibly helping to swing an entire US presidential election. If your friends list and a few online discussion groups are enough for an algorithm to identify the best ways to influence your beliefs and behaviors, what deeper and stronger manipulation could detailed recordings of your heart rate, movements, and sleep patterns allow?

Companies have a vested interest in keeping algorithms opaque because it allows them to tweak them for profit and amass huge centralized databases of sensitive user data along the way. As more and more users come to this painful realization, IoT adoption and development risk stalling at a plateau while skepticism toward algorithmic progress keeps mounting. What are we going to do?

The transition to the ‘internet of transparency’

The most urgent focus must be on making what algorithms do more understandable and transparent. To maximize trust and eliminate the adverse effects of algorithmic opacity, IoT must become the “internet of transparency.” The industry can create transparency by decoupling AI from centralized data collection and making as many algorithms open source as possible. Technologies such as masked federated learning and edge AI enable these positive steps. We need the will to pursue them. It won’t be easy, and some big tech companies won’t go down without a fight, but we’re all better off on the other side.
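To make the federated approach concrete, here is a minimal sketch of federated averaging in Python. It assumes a simple linear model, synthetic per-device data, and illustrative names such as local_update and federated_round, none of which come from any particular product; masked or secure aggregation, which would also hide individual updates from the coordinating server, is left out for brevity. The point is simply that each device trains on its own data and shares only model weights, so raw sensor readings never leave the device.

```python
# Minimal sketch of federated averaging (illustrative, not a production design):
# each device trains on its own data and shares only model weights,
# never the raw sensor readings.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=5):
    """Train locally on the device; the raw data (X, y) never leaves it."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # gradient of mean squared error
        w -= lr * grad
    return w

def federated_round(global_weights, devices):
    """One round: devices compute updates, the server averages the weights."""
    updates = [local_update(global_weights, X, y) for X, y in devices]
    return np.mean(updates, axis=0)         # only model weights are aggregated

# Synthetic "IoT devices", each holding its own private data shard.
true_w = np.array([2.0, -1.0])
devices = []
for _ in range(5):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    devices.append((X, y))

weights = np.zeros(2)
for _ in range(20):
    weights = federated_round(weights, devices)

print("learned weights:", weights)          # approaches true_w without pooling data
```

Edge AI pushes the same idea further by running inference on the device itself, so even the model’s outputs need not be sent to a central server.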

About the Author

Leif-Nissen Lundbæk, PhD, is co-founder and CEO of Xayn. His work focuses primarily on algorithms and applications for privacy-preserving AI. In 2017, he co-founded the privacy technology company with professor and research director Michael Huth and COO Felix Hahmann. The Xayn mobile app is a private search and discovery browser that combines a search engine, discovery feed, and mobile browser with a focus on privacy, personalization, and intuitive design. Winner of the first Porsche Innovation Competition, the Berlin-based artificial intelligence company has worked with Porsche, Daimler, Deutsche Bahn, and Siemens.