Bob Emmerson | June 27, 2013 |

Analyze M2M Data in Real Time

New systems offer easy-to-configure interfaces to M2M and Big Data. The result should be better decisions by end users.

M2M solutions turn event and parameter data into real-time information that is transformed into a corporate asset when it is integrated into the enterprise environment and processed to meet a specific set of end-user requirements. But the really big benefits accrue when visual analytics is employed--when the resulting real-time intelligence is presented in an easy-to-understand, graphical interface.

It's an exciting, innovative prospect--the ability to transition from raw device data into decision-making processes based on customized dashboards. Such dashboards allow operational and financial trends and issues to be pinpointed more easily in real time. Moreover, it's a logical development that addresses a generic issue: organizations lack real-time insight into the critical aspects of their business--aspects that are getting increasingly complex in today's highly competitive marketplace.

M2M has a solid track record as a logical way of enabling this transition. It has the proven ability to cut costs, save time, improve operational efficiency and enhance customer service. This is evidenced by the M2M industry's steady growth during a sustained period of economic uncertainty. Therefore, if you've got something that's delivering a tangible ROI, it makes sense to derive additional value by transferring data sets into mainstream business applications, databases and enterprise service buses.

There are challenges. Systems integration requires specialist know-how and experience in both M2M and enterprise environments; but it's not only doable, it's been done: see "M2M As An Integral Part Of The Enterprise".

It's clear that dumping vast amounts of seriously Big Data into the environment isn't going to cut it with IT. The following schematic, which comes from Axeda, was taken from the earlier integration article. It illustrates how traffic coming in from the Axeda M2M system--traffic that Axeda calls "Big Machine Data"--can be conceptually mapped to enterprise systems.


This schematic is a neat illustration of the way that data emanating from different M2M solutions can be leveraged when it is integrated into an enterprise's systems, e.g. CRM and ERP. Alarms, for example, can generate a ticket, which improves customer and field service. Data could also be sent to a billing or supply chain management system in order to eliminate the mistakes that can come with manual processing. One Axeda customer collects hundreds of readings per minute on their machines in order to monitor early indicators of a potential failure and thereby proactively schedule maintenance or part replacement.
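The routing logic described above--alarms become service tickets, usage data flows to billing--can be sketched in a few lines. This is purely illustrative: the event fields and the function names (`route_event`, `crm_create_ticket`, `billing_record_usage`) are invented for the example and do not represent Axeda's actual API.

```python
# Hypothetical sketch of mapping incoming machine events to enterprise systems.
# All names and event fields are illustrative, not from any vendor API.

def crm_create_ticket(event):
    """Stand-in for a CRM integration call: open a service ticket."""
    return {"ticket": f"Alarm on {event['device']}: {event['detail']}"}

def billing_record_usage(event):
    """Stand-in for a billing-system call: record a usage line item."""
    return {"invoice_line": (event["device"], event["value"])}

def route_event(event):
    """Send each machine event to the system that can act on it."""
    if event["type"] == "alarm":
        return crm_create_ticket(event)
    elif event["type"] == "usage":
        return billing_record_usage(event)
    return None  # events no enterprise system subscribes to are dropped

print(route_event({"type": "alarm", "device": "pump-7", "detail": "overheat"}))
```

In a real deployment the two stand-in functions would call the CRM and billing integrations, but the dispatch pattern is the same: the M2M platform classifies the event, and the mapping decides which enterprise system receives it.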

The Big Challenge
So far we've only shown how M2M data can map to specific systems, but that still leaves IT with the task of delivering real-time information to different parties, e.g. C-level management, business analysts and operations engineers: information that allows them to react immediately if necessary.

However, that's a challenge too far: it consumes precious IT resources, takes too much time and costs too much. What's needed are tools that can be used by business users to assemble their own data visualizations--queries and dashboards that allow them to conduct analytics in real time. In a nutshell, users at all levels in an organization want a Google-type experience, i.e. effortless ways of finding the requisite information: that is the expectation bar.

Customer service staff want to make on-the-spot decisions based on the profitability of customers, using data from CRM, transactional and data warehouse systems. Operations executives want to be able to prioritize production orders, taking into account the scheduling and forecasting data that's at their fingertips.

These are not "dream-on" scenarios. Here's a real-world example: A large publisher was unable to quickly correlate its book inventory (held in the ERP system and corporate database) with real-time trends on the Web. But by implementing a tool called Presto from JackBe, the company could predict successful promotions based on real-time public interest. Details of this real-world use case and several others can be found on the JackBe website.

JackBe and Axeda are partnering in order to leverage their complementary technologies and thereby increase the value of M2M data. As shown here, multiple views of related data sets can be displayed on tablets and smartphones as well as desktop portals. These devices therefore become mobile, decision-making tools.

However, conducting analytics in real time is not limited to M2M. It's a new addition to the Big Data domain, which comprises historic and transactional data. When all these components are mashed up with live M2M data, management and other interested parties get the full 360-degree picture.
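One concrete way to picture that 360-degree mashup is to compare a live machine reading against its historical baseline from the data warehouse. The data, machine names and 10% tolerance below are invented for the sketch; the point is the pattern of joining the two data sets.

```python
# Illustrative mashup: live M2M readings checked against historical averages.
# Values, machine names and the tolerance threshold are invented.

historical_avg = {"press-1": 72.0, "press-2": 65.0}   # e.g. from a data warehouse
live_readings = [("press-1", 71.5), ("press-2", 80.2)]  # streaming M2M values

def flag_outliers(live, baseline, tolerance=0.1):
    """Return machines whose live reading deviates more than 10% from history."""
    return [machine for machine, value in live
            if abs(value - baseline[machine]) / baseline[machine] > tolerance]

print(flag_outliers(live_readings, historical_avg))  # → ['press-2']
```

A dashboard built on this kind of join is what lets an operations engineer see, in one view, that press-2 is running 23% hotter than its historical norm and react immediately.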

This do-it-yourself approach to analytics reminds me of the early days of PCs, when employees downloaded mainframe data into spreadsheets like Lotus 1-2-3 to create financial and other models in a few days, versus the several months that had previously been required. In today's data-driven business climate it's very similar: users cannot rely on IT to unlock data resources.

Now it Gets Really Interesting
Big Data is seriously big: it runs into terabytes and petabytes and it keeps on flowing 24/7 into enterprises, from numerous sources. Applying analytics therefore requires the latest, fastest technologies, and this is where in-memory technology enters the equation.

Data processing involves the transfer of data between a computer's random-access memory (RAM) and disk storage, but this established technique doesn't cut it for Big Data: it's too slow, and the disk would thrash constantly and wear out prematurely. Therefore, the processing has to be done in RAM, which is around 1,000 times faster than the equivalent disk-based approach.
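A toy experiment makes the gap tangible: sum a million values once by reading them back from a file and once from a list already resident in RAM. The absolute timings depend on the machine (and the OS file cache flatters the disk path), but the in-memory pass reliably wins because no value crosses the I/O boundary.

```python
# Toy comparison of disk-based vs in-memory aggregation.
import os
import tempfile
import time

values = list(range(1_000_000))

# Write the dataset to disk, one value per line.
path = os.path.join(tempfile.gettempdir(), "m2m_demo.txt")
with open(path, "w") as f:
    f.write("\n".join(map(str, values)))

t0 = time.perf_counter()
with open(path) as f:
    disk_sum = sum(int(line) for line in f)   # every value crosses the I/O path
disk_time = time.perf_counter() - t0

t0 = time.perf_counter()
ram_sum = sum(values)                         # dataset already resident in RAM
ram_time = time.perf_counter() - t0

assert disk_sum == ram_sum                    # same answer either way
print(f"disk: {disk_time:.4f}s  ram: {ram_time:.4f}s")
```

In-memory products scale this same idea to terabytes: keep the whole working set resident so that queries never wait on the disk.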

A massive amount of RAM is needed--it can run into terabytes--but it's affordable. Today it costs around $1 per gigabyte and this figure continues to head south. Back in the mainframe-computing era, 1 gigabyte would have cost $512 billion.

In-memory technology deployed on computer servers therefore allows high-speed processing of M2M data that has been integrated into the enterprise environment. Moreover, it can be combined with data emanating from mainstream processes such as CRM and ERP.

The ability to process entire datasets in high-speed memory opens the way for more sophisticated market analysis, what-if analysis, data mining, and predictive analytics. And of course the results can be visualized, which is the way we remember information. In addition, fast, easy-to-run analytics frees up end users' imaginations, enabling them to pose questions they wouldn't even have thought of asking before. This indicates that in-memory analytics is more than a new technology. It's a disruptive game-changer.

JackBe Presto & Terracotta BigMemory
JackBe calls itself a provider of real-time actionable intelligence. The company's enterprise platform Presto mashes, analyzes and presents data in live dashboards that run on desktops, mobile devices, portals, and MS SharePoint. And as indicated earlier, it gives data users the requisite tools to create their own dashboards with minimal help from IT.

A dashboard is a set of applications: experienced Java programmers could create them, but that takes time and it costs money. In order to leverage their integration technology, Axeda needed a much easier way, hence the partnership agreement with JackBe, and the easier way is enabled by the drag & drop process, as illustrated below.

JackBe Presto has a tool called "Wires" that enables users to create mashups visually. Wires blocks define the information source and the data content. The wires, shown in blue, define the information flow. They allow users to connect data sources and manipulate them. The arrows between the blocks indicate how data flows and is combined as it moves through the system. This is a very basic outline of the process; there is a video on the JackBe site that gives full details.
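In code, that blocks-and-wires model amounts to composing small functions: sources emit records, filters pass matching records through, and a merge combines flows. The sketch below mimics that structure with invented names and data--it is not JackBe's actual API, just the dataflow idea behind it.

```python
# A "Wires"-style mashup expressed as composable steps.
# Block names and sample records are illustrative, not JackBe's API.

def source(records):
    """A source block: emits raw records."""
    return list(records)

def filter_block(records, predicate):
    """A filter block: passes through only matching records."""
    return [r for r in records if predicate(r)]

def merge_block(flow_a, flow_b):
    """A merge block: combines two flows into one output."""
    return flow_a + flow_b

# Wire two sources through a filter into one dashboard feed.
sensors = source([{"id": 1, "temp": 40}, {"id": 2, "temp": 95}])
crm = source([{"id": 2, "customer": "Acme"}])
hot = filter_block(sensors, lambda r: r["temp"] > 90)
dashboard_feed = merge_block(hot, crm)
print(dashboard_feed)
```

The value of the visual tool is that a business user draws exactly this pipeline by dragging blocks and connecting wires, instead of asking a Java programmer to write it.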

Another company, Terracotta, provides Big Data management solutions for the enterprise. Its flagship product, BigMemory, is an in-memory Big Data solution that delivers low-latency, millisecond access to up to hundreds of terabytes of data.

Conclusions
The ability to analyze M2M data in real-time and visualize the result is an exciting concept. It takes the industry out of the relatively narrow confines of sector-specific information and positions it at the center of the enterprise environment. Individuals can create their own dashboards and obtain multiple views of related data sets on tablets, smartphones and desktop portals. In turn this allows management at all relevant levels to make faster, better decisions wherever they are.


