Internet of Things (IoT) technology is on the rise. It has been for a number of years, albeit largely in niche areas. However, the prevalence of IoT devices now appears to be growing in more general-purpose areas as well.
Smart offices, smart cities, and smart supply chains are all seeing growing adoption of sensors, and this growing demand is forcing vendors to address challenges around scale and operational viability.
Vendors have been working for a number of years to address challenges such as battery life, firmware updates, and re-charging. Success in these areas is one of the key advances making the adoption of IoT technology more widespread. However, as vendors seek to make their systems more ubiquitous, they are in danger of making their IoT sensors too smart.
We are starting to see genuinely smart devices: devices that can store information locally and communicate it alongside conditional data from their various sensors, devices with more sophisticated communication algorithms, and even devices with smart physical interfaces.
Whilst some of this added intelligence is absolutely necessary, and will be key to unlocking adoption in certain industries and use cases, there is a real danger that intelligence baked into devices and relied on within larger systems reduces the flexibility of the system, creates more potential points of failure (including single points of failure), and shortens the technical lifespan of systems and solutions.
As a rule of thumb, I believe IoT devices should be made as ‘dumb’ as possible. As much intelligence as possible should live in the cloud, and any functionality beyond the collection of specific data points through sensors should also operate in the cloud.
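To make the split concrete, here is a minimal sketch of that division of labour. It is illustrative only: the names (`device_payload`, `cloud_classify`) and the alert threshold are hypothetical, not any real product's API.

```python
import json
import time


def device_payload(sensor_id: str, value: float) -> str:
    """The 'dumb' device's entire job: timestamp a raw reading and ship it."""
    return json.dumps({"sensor": sensor_id, "value": value, "ts": time.time()})


def cloud_classify(payload: str, threshold: float = 30.0) -> str:
    """All interpretation lives server-side, where it can be changed freely
    without touching firmware on an estate of deployed sensors."""
    reading = json.loads(payload)
    return "alert" if reading["value"] > threshold else "ok"


print(cloud_classify(device_payload("temp-01", 34.2)))  # -> alert
```

Because the threshold and classification logic sit in the cloud, changing them is a server deployment rather than a fleet-wide firmware rollout.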
I advocate this approach so strongly for a number of reasons, namely: long- and short-term flexibility; future-proofing; reducing the potential for faults; and reducing single points of failure.
To illustrate this, imagine using a device to store inventory locally, uploading a packet of data that represents that inventory. The device has enough local memory (plus some headroom, maybe an additional 50%) to store a large inventory, say, a pallet of vehicles. That works perfectly for a couple of years; then the inventory system advances, giving more detail on the inventory, increasing the packet sizes, and filling up the local memory much faster, potentially making the process non-viable.
What do you do? Buy new hardware?
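The arithmetic behind that failure mode can be sketched with hypothetical numbers; the device memory size, items per pallet, and record sizes below are all illustrative:

```python
# Illustrative figures only: a fixed on-device store, and inventory
# records whose size grows when the inventory system adds detail.
DEVICE_MEMORY_BYTES = 512 * 1024  # fixed local storage on the sensor
PALLET_ITEMS = 1_500              # inventory records per pallet


def pallets_storable(record_bytes: int) -> int:
    """How many complete pallets of records fit in local memory."""
    return DEVICE_MEMORY_BYTES // (record_bytes * PALLET_ITEMS)


print(pallets_storable(64))   # compact record at launch -> 5 pallets
print(pallets_storable(256))  # richer record years later -> 1 pallet
```

The hardware has not changed, but a software-side change to the record format has quietly eaten the headroom, which is exactly the trap that on-device storage sets.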
By leveraging the cloud, you have the flexibility to scale and adapt systems in this way. You also have centralised intelligence, which can be upgraded far more easily than an estate of sensors, even one with Over the Air (OTA) update capabilities.