M2M: Looking Further Ahead

In earlier articles on M2M (machine-to-machine) communications the focus was on developments that are enabling real-time, actionable information to be deployed within mainstream business processes such as ERP. The single most important development right now is the creation of M2M solutions that are standards-based, open, and cloud-centric. However, in order for these solutions to be accepted as an integral, desirable and robust constituent of an enterprise's environment, more is needed.

IT and communications managers will obviously want to be able to manage this new constituent, and ideally they would welcome a platform that also enables solutions to be created in house. That said, the work would have to be done by programmers who do not have M2M know-how and experience, and the exercise would be pointless unless it could be completed in short time frames. Do that and development costs come down and the return on investment arrives much sooner.

But before looking under the hood of a solution that maps to these objectives, let's remind ourselves why M2M is becoming such an important development. Solutions are deployed in order to reduce costs, save time, and improve both operational efficiency and customer service. That is a powerful combination that boosts bottom lines. M2M has the proven ability to deliver these benefits. And deployment in enterprises allows management decisions to be based on real-time information.

Those enterprise requirements sound like a very tall order, but they are doable. For example, a company called Eurotech has created the requisite software framework, but it is clear that this relatively small firm cannot carry this innovative concept forward on its own, nor can it undertake projects that require systems integration skills and resources. What's needed is a community effort--a new M2M ecosystem that includes heavy hitters like IBM and Intel: companies that operate in the enterprise space and that are respected and trusted.

One such community effort is the M2M Industry Working Group at the Eclipse Foundation, a not-for-profit, open source software foundation. IBM and Eurotech are members, and the inclusion of IBM's systems integration resources and experience is significant. Another effort is the Eclipse Koneki project, the goal of which is to provide M2M solution developers with tools that ease the development, simulation, testing/debugging and deployment of solutions. The initial open source contributions provide a common set of tools and APIs.

Rapid Creation and Deployment
Cloud computing facilitates the rapid creation and deployment of new services and business processes, offering the ability to join up individual components that have been tested and deployed in other apps and processes--wheels don't have to be reinvented.

Something similar to virtualization and decomposition is taking place in the M2M space. The common application interface in Figure 1 is where baseline functionality that is employed in many if not most M2M applications resides. Solution providers can tap into this resource and employ the requisite functions, which clearly speeds up and simplifies the development process.


Figure 1. This schematic visualizes M2M's transition from a vertical, stand-alone architecture to one that is in line with the horizontal model of an enterprise environment. Note that the common application interface plays a key role in the rapid creation of new applications.

Eurotech employs a similar concept but takes it to a higher level. Figure 2 illustrates the company's "software framework". In this case the common application interface is known as the Foundation Layer: it comprises over 20 generic components that are employed in most applications. They include device configuration, management and virtualization. In addition there are component bundles that address the typical need of specific vertical market applications. That is one of the keys that enables rapid creation.


Figure 2. This software framework is a programming environment that wraps the complexity of low-level device management in high-level constructs, an approach that enables simpler, faster programming and shorter, easier-to-read code.

As illustrated, the current offering targets healthcare, transportation, logistics, industry and defense. These software bundles have been tested and deployed, and they can be combined using an SOA-type mechanism. In a somewhat simplistic nutshell, the combination of generic and industry-specific components adds up to a set of baseline solutions that developers can customize in order to meet the specific requirements of their customers.

The Rest of the Stack
The inclusion of an OSGi (Open Services Gateway initiative) layer in the software framework allows enterprises to add, amend and drop M2M services in line with changing requirements. Components or bundles can be remotely installed, started, stopped, updated and uninstalled without requiring a reboot.
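As a rough sketch of what that dynamism looks like in practice, the snippet below uses the standard OSGi framework API to install, start and retire a service bundle at runtime; the class name, bundle location and "metering" service are illustrative placeholders rather than part of any particular product.

    import org.osgi.framework.Bundle;
    import org.osgi.framework.BundleContext;
    import org.osgi.framework.BundleException;

    // Illustrative only: management code that already holds a BundleContext can
    // swap an M2M service bundle at runtime; the bundle location is a placeholder.
    public class BundleSwapper {

        public Bundle deployMeteringService(BundleContext context) throws BundleException {
            // Install the new bundle and start it -- no JVM restart or reboot needed.
            Bundle metering = context.installBundle("file:/opt/m2m/bundles/metering-2.0.jar");
            metering.start();
            return metering;
        }

        public void retireMeteringService(Bundle metering) throws BundleException {
            metering.stop();        // services it registered are withdrawn
            metering.uninstall();   // bundle is removed, again without a reboot
        }
    }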

The Java Virtual Machine (JVM) interface sits between the operating system and the OSGi bundles. Java is a popular, high-level programming language. All the customer has to do to realize a customized solution is provide the business logic and get a college graduate to program it in Java.
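To make that concrete, here is a minimal, hypothetical sketch of the kind of business logic a customer might supply in Java; the class name, threshold and cold-chain scenario are illustrative assumptions, and how readings are fed in would depend on the framework.

    // Hypothetical business logic: flag refrigerated-cargo readings that are out of range.
    public class ColdChainPolicy {

        private static final double MAX_TEMP_C = 8.0;   // illustrative threshold

        // Decide whether a reported reading should raise an alert.
        public boolean isOutOfRange(double temperatureC) {
            return temperatureC > MAX_TEMP_C;
        }

        public static void main(String[] args) {
            ColdChainPolicy policy = new ColdChainPolicy();
            System.out.println(policy.isOutOfRange(9.2));   // prints "true": raise an alert
        }
    }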

Finally, on the right-hand side of the framework there is a cloud client add-on, which is a product that delivers actionable M2M data from the field to downstream applications and business processes, dashboards, and reports. This device-to-cloud concept also maps to the needs of SMBs, which are unlikely to have the requisite technical resources, i.e., staff who are embedded systems developers or who can wrestle with C++ code.

Communications
M2M applications are based on monitoring and measuring parameter data, converting it into IP packets for transmission over a network (wireless in most cases) and then processing the data into information.

Many developers that entered the M2M and embedded computing space had an IT background. Therefore they gravitated toward the communication protocols they already used, e.g., HTTP (Hypertext Transfer Protocol). However, HTTP can add hundreds or even thousands of bytes of header to each message, while in a typical M2M app the parameter data amounts to a few tens of bytes; it's clearly inefficient. This hasn't been a problem up to now, but mobile data traffic is going through the roof and networks are becoming congested.

MQTT (Message Queue Telemetry Transport) is a lightweight, broker-based publish/subscribe messaging protocol. Its transport overhead is minimal (the fixed-length header is just 2 bytes), and protocol exchanges are kept to a minimum in order to reduce network traffic. However, MQTT is not just an efficient protocol; it should be seen as a technology that adds important communications functionality to this rapid creation and deployment model.
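As a minimal sketch of how little code and overhead is involved, the example below publishes a single sensor reading using the open source Eclipse Paho Java client; the broker address, client ID and topic name are placeholders.

    import org.eclipse.paho.client.mqttv3.MqttClient;
    import org.eclipse.paho.client.mqttv3.MqttConnectOptions;
    import org.eclipse.paho.client.mqttv3.MqttMessage;
    import org.eclipse.paho.client.mqttv3.persist.MemoryPersistence;

    public class SensorPublisher {
        public static void main(String[] args) throws Exception {
            // Broker URL, client ID and topic are illustrative placeholders.
            MqttClient client = new MqttClient("tcp://broker.example.com:1883",
                                               "sensor-42", new MemoryPersistence());
            MqttConnectOptions opts = new MqttConnectOptions();
            opts.setCleanSession(true);
            client.connect(opts);

            // The payload is a few bytes; MQTT adds only a 2-byte fixed header
            // plus a short variable header, so the message stays small on the wire.
            MqttMessage msg = new MqttMessage("21.7".getBytes());
            msg.setQos(1);
            client.publish("plant/boiler/temperature", msg);

            client.disconnect();
        }
    }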

MQTT sends messages: they could be sensor data, but they can also be IM-type messages. Multiple clients subscribe to the topics in which they are interested, and there is a simple, common interface to which everything can be connected. In an M2M scenario this allows communications between interested parties to be established instantly when a critical event occurs.
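The subscribing side is equally compact. The sketch below, again using the Paho Java client with placeholder broker and topic names, registers a callback that fires as soon as a critical event is published; any number of clients can subscribe to the same topic.

    import org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;
    import org.eclipse.paho.client.mqttv3.MqttCallback;
    import org.eclipse.paho.client.mqttv3.MqttClient;
    import org.eclipse.paho.client.mqttv3.MqttMessage;

    public class AlarmSubscriber {
        public static void main(String[] args) throws Exception {
            MqttClient client = new MqttClient("tcp://broker.example.com:1883", "ops-console");
            client.setCallback(new MqttCallback() {
                @Override public void connectionLost(Throwable cause) { }
                @Override public void messageArrived(String topic, MqttMessage message) {
                    // React the moment a critical event arrives on the topic.
                    System.out.println(topic + ": " + new String(message.getPayload()));
                }
                @Override public void deliveryComplete(IMqttDeliveryToken token) { }
            });
            client.connect();
            client.subscribe("plant/boiler/alarms", 1);
        }
    }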

Interim Summary
M2M solutions are adopting mainstream data communications technologies in order to enable seamless integration with mainstream business processes. They currently comprise:

* A cloud computing model, with tested components integrated into applications using SOA
* OSGi, a service platform for Java that implements a dynamic component model
* Java, used to customize industry-specific component bundles
* MQTT, a multi-functional publish/subscribe messaging protocol

Data-Centric Design
The adoption of this concept, together with DDS (Data Distribution Service) and MQTT, will enable recent and upcoming M2M developments, such as M2M local area networks, to be mapped onto the enterprise environment.

Data-centric design is becoming the preferred way to build data-critical embedded solutions. This development has been enabled by data collection "edge" devices such as smartphones, the availability of high performance messaging and database technologies, and the increasing adoption of SOA and Web Services in the enterprise world. Processing and storage costs have declined much faster than network costs, so it makes sense to move computing resources into local environments and to employ data distribution technology to move data around.

Data-centric design is the key to systems in which: (1) participants are distributed; (2) interactions between participants are data-centric; (3) data is critical because of large volumes, or predictable delivery requirements; (4) computation is time sensitive and may be critically dependent on the predictable delivery of data.

DDS is networking middleware that simplifies complex network programming. It implements a publish/subscribe model for sending and receiving data, events, and commands among the nodes. Nodes that are producing information (i.e., publishers) create "topics" (e.g., temperature, location, pressure) and publish "samples." DDS takes care of delivering the sample to all subscribers that declare an interest in that topic. Does that sound familiar? MQTT is the enabling delivery technology.

Conclusions
We've outlined the use of mainstream computing concepts and technologies in enterprise-centric M2M solutions. This development is taking place because the business case is compelling: costs are reduced, time is saved, operational efficiency is boosted, and customer service is improved. Therefore IT and communications managers want to make M2M solutions an integral part of the enterprise environment.

These solutions deliver real-time, actionable information on the physical topics that impact the enterprise's performance. In addition, MQTT enables communications between interested parties to be established instantly when there is a critical event.

Communications management wants to be able to create customized applications in house and realize a rapid return on investment. This is enabled by adding the business logic to generic solutions.

Cloud computing facilitates the integration of real-time information into mainstream business processes. This architecture is the only way to realize the full potential of the mobile enterprise, which is being driven by the functionality that's enabled by smartphones, because it allows real-time information coming from devices and mobile employees to keep the information content of those processes up to date.