As the Internet of Things (IoT) evolves into the Internet of Everything (IoE) and expands into virtually every domain, high-speed data processing, resiliency, analytics and shorter response times are more necessary than ever. Meeting these requirements is difficult with the current centralized model: retrieving data at the edge, transporting and processing it in the cloud, returning results to the edge and only then taking action.

The Problem With The Cloud

We are seeing explosive growth in connecting physical things and operational technology (OT) to analytics and machine learning applications, which can help glean insights from device-generated data and enable devices to make “smart” decisions without human intervention. Many “smart” (using quotes somewhat sarcastically here) applications require that sensor and machine data be sent from the edge device to a cloud service provider, where the data is transformed and processed, then presented back to the customer across another network and, ultimately, on their device of choice.

This centralized model has three problems. First, the compute and storage resources already available on these edge devices are tremendously under-utilized. Each device has significant computational power on its own, and pooling these resources together can deliver substantial performance, reliability and security improvements over “traditional” centralized cloud architectures. Second, the centralized approach requires transferring large volumes of data to the cloud across one or many network service providers. As the IoT/IoE continues to scale into the billions of connected devices, “sending all the data all the time” to the cloud becomes prohibitively expensive, with that money flowing to the network service providers. Third, round-tripping data from the edge device to the cloud, executing computations on it there and sending results back to the edge introduces additional points of failure, latency and potential security and data-validity problems.
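
To make the scale argument concrete, here is a minimal back-of-the-envelope sketch in Python of how monthly upstream data volume grows with device count under a "send everything to the cloud" model. The device counts, per-sample payload size, sampling rate and per-gigabyte transfer price are illustrative assumptions, not measured figures.

```python
# Rough estimate of monthly upstream data volume for a "send all the data all
# the time" architecture. All numbers below are illustrative assumptions.

def monthly_upstream_gb(devices: int, samples_per_sec: float, bytes_per_sample: int) -> float:
    """Gigabytes sent to the cloud per month if every sample is transmitted."""
    seconds_per_month = 60 * 60 * 24 * 30
    total_bytes = devices * samples_per_sec * bytes_per_sample * seconds_per_month
    return total_bytes / 1e9

if __name__ == "__main__":
    for devices in (1_000, 1_000_000, 1_000_000_000):
        gb = monthly_upstream_gb(devices, samples_per_sec=1.0, bytes_per_sample=200)
        # Assumed transfer price of $0.05/GB, purely for illustration.
        print(f"{devices:>13,} devices -> {gb:,.0f} GB/month (~${gb * 0.05:,.0f}/month)")
```

Even with these modest assumptions (one 200-byte sample per second per device), the volume grows linearly with device count, reaching hundreds of petabytes per month at a billion devices, which is the cost curve that motivates processing data at the edge instead.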

SenseOps Solution

SenseOps is building technologies that enable very high-speed data acquisition from sensors and legacy systems and provide affordable data storage on low-cost edge hardware devices, leveraging distributed web approaches. With data available in a decentralized architecture that brings computing resources and application services closer to the edge, a wide variety of distributed, secure and resilient applications can be developed and deployed.

SenseOps software is deployed on edge compute gateway devices that connect to sensors and assets. For some applications, commercial off-the-shelf gateways are perfectly suitable; for more specialized use cases (need a mil-spec or Class I, Division 2 enclosure?), SenseOps can contract-manufacture and supply use-case-specific gateway devices at low to medium volumes. In addition, SenseOps “curates” sensor packages for specific use cases and pre-packages gateway/sensor combinations suited to the edge deployment requirements. These sensors and gateways are designed to be installed by technicians, not engineers. Once installed, the SenseOps software running on the local edge device is pre-configured to acquire, contextualize and store the sensor data and to immediately enable other advanced applications.
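
As a rough illustration of the acquire-contextualize-store pattern described above, the following Python sketch shows one way an edge gateway might poll a sensor, attach context metadata and persist readings locally (here in SQLite) so that downstream applications can query them without a cloud round trip. The function names, fields and schema are hypothetical assumptions for illustration, not the SenseOps API.

```python
import json
import random
import sqlite3
import time

# Hypothetical sketch of an edge gateway's acquire -> contextualize -> store
# loop; names and schema are assumptions, not the SenseOps software itself.

DB_PATH = "edge_readings.db"

def read_sensor() -> float:
    """Stand-in for a real sensor driver call; returns a simulated temperature."""
    return 20.0 + random.random() * 5.0

def contextualize(value: float) -> dict:
    """Attach asset/site metadata and a timestamp to a raw reading."""
    return {
        "asset_id": "pump-07",   # assumed asset identifier
        "site": "plant-a",       # assumed site label
        "unit": "degC",
        "value": value,
        "ts": time.time(),
    }

def store(conn: sqlite3.Connection, reading: dict) -> None:
    """Persist the contextualized reading locally on the gateway."""
    conn.execute(
        "INSERT INTO readings (ts, payload) VALUES (?, ?)",
        (reading["ts"], json.dumps(reading)),
    )
    conn.commit()

def main(samples: int = 5, interval_s: float = 1.0) -> None:
    conn = sqlite3.connect(DB_PATH)
    conn.execute("CREATE TABLE IF NOT EXISTS readings (ts REAL, payload TEXT)")
    for _ in range(samples):
        store(conn, contextualize(read_sensor()))
        time.sleep(interval_s)
    conn.close()

if __name__ == "__main__":
    main()
```

Because the readings land in a local store on the gateway, local applications can analyze or act on them immediately, and only summaries or exceptions need to traverse the network.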