One of the core assumptions about IBM's Smarter Planet initiative is that everything that can be measured will be. Driving that assumption is the continuing drop in the cost of sensors such as RFID tags, which in turn will make it cost effective to deploy sensors just about everywhere.
The missing part of this equation, however, is how all this data will be aggregated. What we're really talking about is billions, maybe trillions, of sensors continuously sending small amounts of information all day long. That data needs to be aggregated at some point and then turned into a format that can be consumed by analytic applications.
That's a role that companies such as Lantronix are hoping to play as more and more sensors get rolled out. By deploying Lantronix's recently released 32-bit XPort Pro embedded network server, IT organizations can feed the data they collect from various endpoints into a device that converts it into an XML format that applications can consume.
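To make that aggregation step concrete, here is a minimal Python sketch of the general idea: collecting raw sensor readings and rolling them up into a single XML payload an analytic application could consume. The element names (`sensorData`, `reading`) and the reading fields are hypothetical illustrations, not the XPort Pro's actual output schema.

```python
import xml.etree.ElementTree as ET

def readings_to_xml(readings):
    """Aggregate raw sensor readings into one XML document.

    `readings` is a list of dicts with hypothetical keys
    (sensor_id, type, value, timestamp); a real device's
    schema will differ.
    """
    root = ET.Element("sensorData")
    for r in readings:
        node = ET.SubElement(root, "reading",
                             sensorId=r["sensor_id"],
                             type=r["type"])
        ET.SubElement(node, "value").text = str(r["value"])
        ET.SubElement(node, "timestamp").text = r["timestamp"]
    return ET.tostring(root, encoding="unicode")

# Example: two RFID temperature readings rolled into one payload.
payload = readings_to_xml([
    {"sensor_id": "rfid-001", "type": "temperature",
     "value": 21.5, "timestamp": "2010-03-01T08:00:00Z"},
    {"sensor_id": "rfid-002", "type": "temperature",
     "value": 22.1, "timestamp": "2010-03-01T08:00:05Z"},
])
print(payload)
```

However the formatting is done, the design point is the same: the endpoints stay dumb and cheap, and a dedicated aggregator handles the translation into something an application can parse.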
Traditionally, organizations that needed to access information from various embedded systems built their own aggregators. But with customers looking to deploy billions of sensors, it may make more sense to look to offerings from commercial vendors that are specifically designed to scale up to meet these new requirements.
We're still in the early stages of this whole Smarter Planet phenomenon. And surely there will be other variations of the concept beyond IBM's. But what is clear is that businesses want to track their processes with as much granular data streaming into analytic applications in as close to real time as possible.
And exactly how that gets accomplished is going to take a little extra work, given the current state of most IT infrastructure.