New IoT Chips See, Think & Act Autonomously
Rolling out the Internet of Things means using devices as our eyes and ears and even asking them to make decisions for us. The chips at the heart of those devices play critical roles, and recently some of them got better at their jobs.
While ARM introduced two minuscule processor architectures with security features borrowed from larger chips, Intel unveiled its Atom E3900 chips with improved computer vision and industrial-grade timing.
The E3900s are designed for a wide range of applications, including manufacturing and surveillance, and they’ll soon be joined by a version specifically for vehicles, called the A3900.
Intel is working to help machines evolve from accurately sensing what’s going on around them to acting on those senses. For example, if a device can see defective parts moving down an assembly line, it can alert someone or even stop the line. Cameras inside a car could detect that the driver is drowsy and set off an alarm, while cameras pointed ahead of the vehicle could, if their vision were accurate enough, distinguish a pedestrian from a shadow and stop the car.
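To give a rough sense of what that kind of on-device decision might look like in software, here is a minimal sketch. It assumes an OpenCV-style frame pipeline and a caller-supplied stop_line() hook, both hypothetical illustrations rather than Intel's toolchain, and flags parts whose silhouette deviates too far from a known-good reference:

```python
# Minimal sketch (not Intel's actual pipeline): flag assembly-line frames whose
# foreground area deviates sharply from a known-good reference, then call a
# hypothetical controller hook to halt the line.
import cv2

REFERENCE_AREA = 12_000        # assumed pixel area of a good part's silhouette
TOLERANCE = 0.15               # allow +/-15% deviation before flagging

def part_area(frame) -> float:
    """Return the area of the largest bright blob in a BGR frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max((cv2.contourArea(c) for c in contours), default=0.0)

def inspect(frame, stop_line) -> bool:
    """Stop the line (via the caller-supplied hook) if the part looks defective."""
    area = part_area(frame)
    defective = abs(area - REFERENCE_AREA) / REFERENCE_AREA > TOLERANCE
    if defective:
        stop_line()            # e.g. toggle a GPIO output or send a PLC command
    return defective
```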
Rival Qualcomm also recently improved its chips for IoT vision. The E3900s have up to 1.7 times the computing power of their predecessors, along with faster memory and greater memory bandwidth. But they also have better graphics and vision: 3D graphics performance is 2.9 times higher than in the previous generation, and the new chips can render 4K Ultra HD video on as many as three independent displays, Intel says.
Those three screens could be the virtual dashboard of a car and two seat-back displays for passengers to watch videos. By controlling each separately, the chip could make sure the dash display isn’t affected by the rendering activity happening on the entertainment screens, said Ken Caviasca, vice president of Intel’s IoT group.
The new chips are also better at capturing and processing images. They have four vector image processing units to perform video noise reduction, improve low-light image quality, and preserve more color and detail.
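In software terms, the temporal noise reduction that such units accelerate can be approximated by blending each new frame into a running average. The sketch below is only an illustration of that idea, not Intel's hardware pipeline:

```python
# Software approximation of temporal video noise reduction: blend each new
# frame into a running average so random per-pixel noise cancels over time.
import numpy as np

class TemporalDenoiser:
    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha     # weight given to the newest frame
        self.state = None      # running average, kept in float for precision

    def push(self, frame: np.ndarray) -> np.ndarray:
        f = frame.astype(np.float32)
        if self.state is None:
            self.state = f
        else:
            self.state = (1 - self.alpha) * self.state + self.alpha * f
        return self.state.astype(frame.dtype)
```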
In a network video recorder, an E3900 could take 1080p video streams from 15 cameras and display their feeds simultaneously at 30 frames per second on a video wall, Caviasca said.
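Some back-of-envelope arithmetic shows the scale of that workload. The stream parameters here (1920x1080 frames, 30 fps, 12 bits per pixel for 4:2:0 video) are assumptions for illustration rather than Intel's figures:

```python
# Rough throughput arithmetic for the NVR scenario described above
# (assumed: 15 cameras, 1080p, 30 fps, 12 bits/pixel for planar 4:2:0).
streams = 15
width, height, fps = 1920, 1080, 30
bits_per_pixel = 12

pixels_per_sec = streams * width * height * fps
raw_bits_per_sec = pixels_per_sec * bits_per_pixel

print(f"{pixels_per_sec / 1e9:.2f} Gpixel/s decoded")        # ~0.93 Gpixel/s
print(f"{raw_bits_per_sec / 1e9:.1f} Gbit/s uncompressed")   # ~11.2 Gbit/s
```

In other words, the chip would be decoding and compositing nearly a gigapixel of video every second before anything reaches the wall.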
Visual processing needs to keep getting better as technology evolves from rendering images to decoding content and on to image processing. The last step is computer vision, where machines understand what they see well enough to make decisions.
“What people are wanting is a processor that can sense like we do in an environment,” Caviasca said. And rather than just report back to humans, it can take action.
For industrial uses, the E3900 series gets Intel’s TCC (Time Coordinated Computing) technology. This feature lets the chip tightly control the timing of a device’s actions.
Some industrial systems rely on precise timing to be productive. For example, a robotic arm that takes parts off a conveyor belt needs to act when each object comes along. The more tightly the arm is synchronized with the rest of the system, the faster the assembly line can run.
Adding TCC, which wasn’t in the E3800 series, cuts the maximum delay to about one-tenth of what it would otherwise be. There are also uses for this technology in the automotive world, Caviasca said.
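A rough way to see why a tighter bound on that delay matters, using assumed numbers rather than anything from Intel: the belt can only run as fast as the worst-case delay still lets a part stay inside the arm's pick window.

```python
# Back-of-envelope view of why tighter timing raises line speed (assumed
# numbers, not Intel's specification): the pick window must absorb both the
# gripper's own cycle time and the worst-case actuation delay.
def max_line_speed_mm_s(window_mm: float, jitter_s: float, base_margin_s: float) -> float:
    """Belt speed at which a part still spends enough time inside the pick window."""
    return window_mm / (base_margin_s + jitter_s)

window = 50.0                  # mm of belt over which the arm can grab the part
base = 0.020                   # seconds the gripper itself needs

for jitter in (0.010, 0.001):  # worst-case delay of 10 ms vs. roughly one-tenth of that
    speed = max_line_speed_mm_s(window, jitter, base)
    print(f"worst-case delay {jitter * 1000:g} ms -> max belt speed {speed:.0f} mm/s")
# worst-case delay 10 ms -> max belt speed 1667 mm/s
# worst-case delay 1 ms -> max belt speed 2381 mm/s
```

Under those assumed numbers, cutting the worst-case delay by a factor of ten lets the same arm keep up with a belt running roughly 40 percent faster.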