
We’re making sensors smarter and more complicated – Stacey on IoT

Written by ga_dahmani

During the pandemic, we have seen major sensor vendors add more intelligence to their sensors. In 2020, Sony released a vision sensor that can track people or detect objects locally, and last year Bosch announced its BHI260AP sensor, which pairs accelerometers and gyroscopes with a processor preloaded with activity-tracking algorithms that can actively learn to recognize new movements.

There is a clear desire from sensor manufacturers to increase the price and capabilities of their sensors, along with demand from customers who want to incorporate a smarter sensor into a product without adding more components and integration work on their end. And making all this possible is TinyML.

From left to right: Abbas Ataya, TDK InvenSense; Victor Pankratius, Bosch Sensortec; Andrea Onetti, STMicroelectronics. Image courtesy of Ira Feldman at the tinyML foundation.

Speaking at this week’s tinyML summit in San Francisco, Victor Pankratius, with Bosch Sensortec, explained that the demand for greater energy efficiency in overall systems means that deeper integrations between sensors and processors make sense. He called for computer programs, which currently rely on a concept called the memory hierarchy, to add a power hierarchy.

With the memory hierarchy, designers allocate memory resources based on the response times of different memory formats. With a power hierarchy, designers would have to design systems that made trade-offs around where to put data processing based on the amount of power used. According to Pankratius, each step closer to the sensor where the data is processed results in a 10-fold reduction in power consumption.
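Pankratius's rule of thumb can be sketched in a few lines. This is an illustration only, not from the talk: the tier names and the base cost are assumptions, and the clean factor of 10 per step is his stated heuristic, not a measured figure.

```python
# Illustrative model of a "power hierarchy": each processing step farther
# from the sensor costs roughly 10x more energy per sample (Pankratius's
# rule of thumb). Tier names here are assumed for illustration.

TIERS = ["sensor", "microcontroller", "application processor", "cloud"]

def relative_power(tier: str, base: float = 1.0) -> float:
    """Relative energy cost of processing one sample at a given tier,
    assuming a 10x increase per step away from the sensor."""
    steps = TIERS.index(tier)
    return base * (10 ** steps)

for tier in TIERS:
    print(f"{tier}: {relative_power(tier):,.0f}x")
```

Under this model, running inference on the sensor itself is three orders of magnitude cheaper per sample than shipping raw data to the cloud, which is why designers would weigh where to place each stage of processing the same way they now weigh cache versus main memory.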

This is a radically different way of thinking about system architecture, but Pankratius is not alone. His co-panelist, Andrea Onetti of STMicroelectronics, agreed. Onetti’s company is also pushing more intelligence into the sensor with the goal of moving the intelligence to the farthest node. STMicro has created a new line of sensors called intelligent sensor processing units, or ISPUs, which are designed for this view of the connected world.

It wasn’t just that panel promoting smarter sensors. In another presentation, a speaker outlined plans for TinyML algorithms that would run on a sensor and then, based on the results of that algorithm, trigger a second, higher-powered system to provide more detail. An example could be a security camera that only activates if an image sensor recognizes a person. If it does, then it could trigger a larger processor to find out if the person is a stranger or not.
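The cascaded design described above can be sketched as a two-stage pipeline. This is a hypothetical illustration of the pattern, not anyone's actual product: the detector and classifier functions are stand-ins you would replace with real TinyML and full-size models.

```python
# Hypothetical sketch of a cascaded smart-sensor design: a tiny always-on
# model runs on the image sensor, and only when it fires does the larger,
# power-hungry processor wake up to do the expensive classification.
from typing import Callable

def cascade(frame: str,
            tiny_person_detector: Callable[[str], bool],
            big_face_classifier: Callable[[str], str]) -> str:
    # Stage 1: cheap, always-on check running on the sensor itself.
    if not tiny_person_detector(frame):
        return "idle"  # the big processor stays asleep
    # Stage 2: wake the larger processor only when stage 1 triggers.
    return big_face_classifier(frame)

# Usage with dummy stand-in models:
result = cascade("frame-with-person",
                 tiny_person_detector=lambda f: "person" in f,
                 big_face_classifier=lambda f: "known resident")
```

The power savings come from the early return: most frames never reach stage 2, so the expensive model's cost is paid only on the rare frames that matter.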

This would reduce power consumption, but such designs also introduce complexity. An attendee from Microsoft pointed that out during the Q&A after the smart sensor panel. He said that when he works with machine learning, he already has to worry about model creation and drift (that’s when a machine learning model becomes less accurate over time), in addition to managing the model itself. But with a sensor running its own model, he said, “suddenly my life becomes a lot more complicated.”

He is not wrong. Application and product developers would not only have to manage more models, they would also have to design a system that handles the possible interactions between the two models. And smarter sensors are more expensive sensors. Theoretically, if a designer can cut power consumption tenfold by buying a smarter sensor, they could also spend less on a smaller battery. But that’s usually not how those tradeoffs work.

I can see why sensor manufacturers want to turn what has been a commodity into a high-value part, but I can also see why people will reject that. It will be more expensive, for one thing. It will also be more complicated. But the energy efficiency, privacy, and reduced latency arguments for processing at the farthest edge are compelling. And beyond the big sensor manufacturers, I see other companies following this model, including startups that have been pushing the idea for quite some time. SensiML, for example, is built on the idea that data should be processed in the sensor itself.

So while it’s more complicated, the history of computing is littered with examples of once-separate hardware and software being integrated into more complex systems. And by doing so, costs come down, energy use drops, and performance improves.
