Insider's Guide: Architecting for Edge Success, Pt. 1
By Lucas Beeler, Senior Architect, Hazelcast

The phrase "digital transformation" is thrown around a lot in the technology industry and especially amongst enterprise businesses. While this move is certainly happening within most organizations, I think people would be surprised to learn it's not happening at the pace they anticipated.

This industry advances so quickly that, as a business leader, it can be tough to keep up. One technology that is already taking off, and bound to take over before executive teams know it, is edge computing. In this two-part series we'll explore the business value of edge computing as it relates to data capture, and what it takes to set up a successful edge-to-cloud architecture.

Analytical Chemistry: A Blast from the Past

When I was in college, I took an upper-level course in analytical chemistry requiring the use of the department's gas chromatograph. Keep in mind that gas chromatography (GC) machines are complex and pricey pieces of equipment that university departments, often short on funds, are loath to replace. So, what I'm really trying to say is that this piece of equipment was old.

In fact, this particular machine had been acquired in the early 1990s and was extensively upgraded about a decade later. It was attached to a computer for process control and the display of analyses. What shocked my lab partner and me was the kind of computer the GC machine was connected to: an Apple IIe. This was 2006, and we both assumed everyone wanted the latest and greatest of everything. We were off base, to say the least.

As strange as the Apple IIe may have seemed to us then, it worked! It proved to us that the shiniest, newest piece of scientific or industrial equipment isn't always necessary in the internet of things (IoT). GCs, aircraft engines, fleet vehicles, and assembly-line robots are purchased when plants are built and budgets are available. They're used to the fullest until there is some compelling reason, and an adequate budget, to replace them.

Fast forward 14 years: far from my days as a college student, and now serving as a senior architect at Hazelcast, I quickly came to learn that the world of IT is much different from that of analytical chemistry. Servers in data centers are refreshed every three years, and you can be certain whatever you're building will run on the latest version of Linux installed on an x86 server. In the operational technology (OT) world, however, where computing is concerned with monitoring and controlling physical processes, you can't always count on standard processors or standard data formats.

In a recent survey addressing new and persistent digital challenges to organizations, nearly half of respondents named updating operational systems as a priority over the next two years. That reality hits hard when you begin to implement edge-to-cloud use cases, because you will often run into legacy technologies from decades past: not always the GC machine and Apple IIe I dabbled with years ago, but not far off. As a result, canonicalizing the data, that is, translating every device's output into one consistent, useful representation, presents real challenges. This is where in-memory streaming solutions can dramatically simplify the data transformation process.
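To make that concrete, here is a minimal, dependency-free Java sketch of what a canonicalization step might look like. The record fields, device IDs, and wire formats are hypothetical, and in a streaming engine such as Hazelcast Jet this logic would typically live inside a map stage of the pipeline.

```java
import java.time.Instant;

public class Canonicalizer {

    // Hypothetical canonical record that every downstream consumer agrees on.
    public record SensorReading(String deviceId, Instant timestamp, String metric, double value) {}

    // A 1990s-era instrument might emit comma-separated text over a serial link,
    // e.g. "GC01,retention_time,4.82". The device supplies no timestamp, so we
    // stamp the reading at the moment of capture.
    public static SensorReading fromLegacyCsv(String line) {
        String[] parts = line.trim().split(",");
        return new SensorReading(parts[0], Instant.now(), parts[1], Double.parseDouble(parts[2]));
    }

    // A modern device might publish JSON such as
    // {"id":"PRESS-7","ts":"2020-07-28T07:34:00Z","metric":"psi","value":101.3}.
    // Parsing is stubbed out here to keep the sketch dependency-free.
    public static SensorReading fromModernJson(String json) {
        throw new UnsupportedOperationException("wire up your JSON parser of choice here");
    }
}
```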

The Value of IoT Data for Businesses

Organizations recognize the value that real-time data brings to their operations, and the data generated by physical devices and sensors is an immense source of it. Coupled with the widespread application of machine learning and artificial intelligence, the information generated by IoT devices can inform business decisions quickly: predicting equipment failures, sensing and remediating product defects on assembly lines, and detecting violations of business rules by people or machines.

To unlock the value of data produced at the edge, three key objectives must be achieved:

  1. Data must be aggregated and canonicalized as soon as possible
  2. Data must be moved from the edge of the business back to a central data center, either a physical, on-premises data center or a virtual data center in a public cloud
  3. Once in the data center, it must be possible to query, enrich, visualize, transform, and otherwise make the data useful to business decision-makers, as well as to data science and data engineering teams

These steps correspond roughly to the stages of what we call the edge-to-cloud pipeline: data is acquired from IoT sensors and devices, aggregated and transformed, then moved into the cloud (or to an on-prem data center) to unlock its business value.
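As a rough sketch of how the first two stages might look in code on a gateway, here is a minimal Java example that batches canonicalized readings and forwards them to a cloud ingestion endpoint over HTTPS. The endpoint URL, batch size, and stubbed reading are all placeholders, and in practice the transport might just as well be MQTT, Kafka, or a Hazelcast cluster rather than plain HTTP.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.util.ArrayList;
import java.util.List;

public class GatewayForwarder {

    // Hypothetical cloud ingestion endpoint (placeholder URL).
    private static final URI INGEST_URI = URI.create("https://ingest.example.com/readings");
    private static final int BATCH_SIZE = 100;

    private final HttpClient http = HttpClient.newBuilder()
            .connectTimeout(Duration.ofSeconds(5))
            .build();

    // Stage 1: acquire and canonicalize readings (stubbed below).
    // Stage 2: batch them up and move them off the edge.
    public void run() throws Exception {
        List<String> batch = new ArrayList<>();
        while (true) {
            batch.add(readCanonicalizedReading());
            if (batch.size() >= BATCH_SIZE) {
                HttpRequest request = HttpRequest.newBuilder(INGEST_URI)
                        .header("Content-Type", "application/json")
                        .POST(HttpRequest.BodyPublishers.ofString(String.join("\n", batch)))
                        .build();
                http.send(request, HttpResponse.BodyHandlers.discarding());
                batch.clear();   // stage 3 (query, enrich, visualize) happens in the data center
            }
        }
    }

    private String readCanonicalizedReading() {
        // In a real deployment this would pull from the device connectors
        // described in the next section; here it is a stand-in value.
        return "{\"deviceId\":\"GC01\",\"metric\":\"retention_time\",\"value\":4.82}";
    }
}
```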

Entering the Gateway: Inside the Edge-to-Cloud Pipeline

Many people picture the cloud as "imaginary" because it has no material presence, but the architecture of the pipeline is very much physical: each of its stages runs in a specific location.

Stages one and two generally run on an IoT gateway device: a miniaturized, industry-standard, x86- or ARM-based computer, often housed in a ruggedized enclosure with a wide variety of connectors. Curious what a miniaturized version of the industry standard looks like? Two well-known examples are the GL20 IoT Gateway from Hewlett Packard Enterprise and the Edge Gateway 3001 for IoT Applications from Dell, Inc.

True to their product names, these gateway devices are deployed at the outer "edge" of a business, far away from any central data center or cloud. They come with a variety of connectors spanning past, present, and future, including legacy RS-232 serial ports that would let one connect to an Apple IIe controlling, you guessed it, a gas chromatograph!
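If you're curious what talking to one of those legacy ports looks like in software, below is a small sketch that reads line-oriented output from an RS-232 device on a gateway. It assumes the open-source jSerialComm library, and the port name, baud rate, and line settings are placeholders you would match to the actual instrument.

```java
import com.fazecast.jSerialComm.SerialPort;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class LegacySerialReader {

    public static void main(String[] args) throws Exception {
        // Placeholder port name and settings; match these to the instrument.
        SerialPort port = SerialPort.getCommPort("/dev/ttyS0");
        port.setComPortParameters(9600, 8, SerialPort.ONE_STOP_BIT, SerialPort.NO_PARITY);
        port.setComPortTimeouts(SerialPort.TIMEOUT_READ_BLOCKING, 0, 0);

        if (!port.openPort()) {
            throw new IllegalStateException("could not open serial port");
        }
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(port.getInputStream(), StandardCharsets.US_ASCII))) {
            String line;
            while ((line = in.readLine()) != null) {
                // Each raw line would then be handed to the canonicalization step above.
                System.out.println("raw reading: " + line);
            }
        } finally {
            port.closePort();
        }
    }
}
```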

Stay Tuned for Next Time

We've barely scratched the surface of the architecture needed to build success at the edge. If you're itching for more, stay tuned for part two, where we'll discuss these three stages in depth, starting with data capture.

##

About the Author

Lucas Beeler 

Lucas is a senior architect at Hazelcast, where he helps Hazelcast's most demanding customers architect, design, and operationalize enterprise software systems based around Hazelcast IMDG and Jet. Before joining Hazelcast, Lucas held similar positions at GigaSpaces and GridGain, giving him a uniquely broad and deep understanding of the in-memory platform space. Lucas holds a B.S.E. in computer science from the University of Michigan.
Published Tuesday, July 28, 2020 7:34 AM by David Marshall