
A Guide to Using BI Apps With Edge Computing

Here's what you need to know about business intelligence (BI) applications and edge computing because companies can't profit by buzz alone.

April 25, 2018

Everybody's talking about edge computing these days, but few understand what it is, much less what to do with it. Succinctly, edge computing means processing close to the source of the data, either on the sensor or close to the gateway. If you'd like to know how IT can best manage edge computing, then check out "IT Needs to Start Thinking About 5G and Edge Cloud Computing," a column by Wayne Rash, my colleague and PCMag IT Watch contributor. But for the purposes of this article, we can start with an explanation from market research firm IDC, which defines edge computing as a "mesh network of micro data centers" that have "a footprint of less than 100 square feet."

As with most new terms in the technology space, "edge computing" is broadly used and has been linked with a variety of other buzzword technologies, including blockchain, content delivery networks (CDNs), grid computing, mesh computing, and peer-to-peer computing. The common task, whichever tech is employed in conjunction with edge computing, is to speed data analysis and related actions by shortening the distance between where the data is processed and where the results of that processing take effect.

When it comes to turning hard-won business intelligence (BI) data into actionable insights, that's a key consideration. But even though BI (especially low-latency analytics) and edge computing seem to be a match made in tech heaven, there's a lot to consider before combining the two.

Analytics at the Edge vs. Streaming Analytics

Edge computing's significance to analytics is clear once you realize there's no other practical way to transfer an ongoing tsunami of Internet of Things (IoT) data to the cloud without creating untenable latency and one heck of a network traffic jam. That latency issue can prove fatal in many emerging analytics applications, such as autonomous driving. The data overflow will take you from broadband to bottleneck in less time than it takes to say "Stream it up, Scotty."

Yes, streaming analytics was touted just a couple of years ago as a latency-sensitive panacea for fetching a real-time read on IoT data. But, while streaming analytics still has plenty of upsides, it hasn't been able to change physics. Huge data transfers are slowed by numerous router hops, virtualization packet delays, dropped connections, and other physical constraints in a network. In the case of IoT in remote areas, getting a network connection at all is a mighty iffy proposition on any given day.

It doesn't help matters that all of these problems are magnified by the physical distance between the data and the computing processes. For these reasons and others, streaming analytics tends to be in "near-real-time" rather than real time. That delay—no matter how small—is a huge problem if, say, you need the outputs in time for an autonomous car to brake and avoid a collision. It's an even bigger problem if you want all of the cars on that highway to brake at once.

In short, Star Trek and real-life data transporters have their limits and there's nothing much any Scotty in IT can do about that. There is simply too much IoT data for current-day networks to handle and the volume is still growing at a breathtaking rate. The big takeaway here: Edge computing stems the tide of information over the network and provides faster analytics outputs, too.

Edge Cloud vs. Cloud

Since these micro data centers can be, and often are, joined together in collaborative, communicative, or interdependent functions, some people like to use the term "edge cloud."

For example, modern-day cars have hundreds of embedded computers that are designed for managing individual systems but are also connected to each other so that the systems can communicate with one another and adapt as needed. In other words, they individually, collectively, and heavily use edge computing to complete a variety of complex functions.

"Not only do they respond to the observed conditions but they learn and adapt over time," said Johnathan Vee Cree, PhD., Embedded and Wireless Systems Scientist/Engineer at the US Department of Energy's Pacific Northwest National Laboratory (PNNL). "For example, modern fuel injection systems will observe the car's driving patterns in order to optimize for power and fuel efficiency. The real-time nature of this data would make it impossible to process anywhere other than at the edge."

Even with multi-system onboard interdependence, the term "edge cloud" tends to muddy understanding further because it's imprecise.

"When talking about IoT devices, the considerations are nearly opposite of the cloud," said Vee Cree. "IoT devices typically have limited storage and processing power, potentially intermittent connectivity to the outside world, and may be powered by a battery. The key value in these devices is their ability to transform the raw sensor values available to them into meaningful data."

(Edge Computing Devices graphic reprinted with permission from TECHnalysis Research.)

However, edge computing and cloud computing are not mutually exclusive. Indeed, they are intertwined in the most successful IoT data strategies. That's not likely to change any time soon.

"An example of the combination of edge and cloud computing comes from Tesla's autopilot features. The autopilot system must sense and react to ever-changing driving conditions. It does this through the use of machine learning [ML] algorithms that are able to detect and avoid hazards while controlling the car. While this data is used to make decisions in real time, it is also shared with the cloud and used to improve the autopilot feature for all drivers," explained William Moeglein, a Software Engineer at PNNL.

The edge and cloud combo play is common simply because it works; it leverages the best of both worlds. But it's not the only game in town. In fact, 36 percent of edge analytics are located in the corporate data center, 34 percent on the edge, and 29 percent in the cloud, according to "Computing on the Edge: Survey Highlights," a report by Bob O'Donnell, President and Chief Analyst at TECHnalysis Research. This means there are options in how edge analytics are implemented. The choice depends entirely on what you're trying to do and the conditions under which you're trying to accomplish that goal.

"The tradeoff between computing power and energy usage can be a limiting factor when devices are run from a battery. In cases where power consumption is important, decisions may be made based on small samples of data despite having access to continuous sensor readings," said PNNL's Moeglein.

"Edge computing enables feedback for devices in the field where communications are not guaranteed, are one-way, or are limited," Moeglein continued. "In cases where systems are expected to operate for years or decades on batteries, edge computing can be used to provide longer device lifetime by reducing the data being transmitted."

(What Fog Computing Is: graphic reprinted with permission from Cisco Systems, Inc.)

De-Fogging the Edge Cloud

Automation to manage and optimize where and how the analytics are done soon followed, leading to "fog computing," a term coined by IT and networking vendor Cisco Systems. In this strategy, as Cisco explains in a white paper, "developers either port or write IoT applications for fog nodes at the network edge. The fog nodes closest to the network edge ingest the data from IoT devices. Then—and this is crucial—the fog IoT application directs different types of data to the optimal place for analysis." As depicted in the graphic above, in Cisco's view, fog computing extends the cloud closer to the actual devices doing the data collection. By putting fog nodes in close proximity to IoT devices, Cisco seeks to speed analytics while decreasing latency.
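The routing decision at the heart of that description can be sketched in a few lines. The tiers and thresholds below are illustrative assumptions, not Cisco's implementation; the point is simply that a fog node classifies each message and sends it to the place best suited to analyze it.

```python
# Illustrative fog-node routing: pick an analysis tier for each IoT message.
def route(message: dict) -> str:
    if message.get("safety_critical"):
        return "local"        # act within milliseconds on the nearest fog node
    if message.get("max_latency_s", 3600) < 60:
        return "aggregation"  # seconds-to-minutes analysis on a regional node
    return "cloud"            # batch, historical, and BI-style analytics

# A vibration alarm stays local; a daily meter reading can wait for the cloud.
print(route({"safety_critical": True}))             # -> local
print(route({"max_latency_s": 86400, "kwh": 12.4})) # -> cloud
```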

Some say it's easier to think of this as cloud computing pushed to the edge (decentralized, in other words), as opposed to edge computing, which is computing at the edge of the network, often on the IoT device itself. A very nuanced difference, to be sure.

Often people use "edge computing" and "fog computing" interchangeably as the two concepts are very similar. It is fog computing's ability to sort and route data to various locations for analysis that sets it apart. That, and fog computing is most often "near edge" (i.e., a gateway) rather than truly on the edge such as on an IoT device.

In short, there is no consensus on what, precisely, edge computing is, but there are plenty of folks who say fogging up the issue isn't helping any. According to the aforementioned TECHnalysis Research report, "more people think edge computing is made of endpoints (29.8 percent) than gateways (13.2 percent), but 44 percent think it's both."

In any case, "the end-use application ultimately drives the system needs and aims to find a balance between the benefits of processing at the edge or the cloud," said PNNL's Vee Cree.

There's only one rule of thumb here: If you need a decision in near- or real-time, then do the processing as close to the data source as possible. Edge computing is the choice to eliminate latency, lower energy spend, and reduce network traffic.

APIs, Apps, and Ecosystems


In general, apps used in conjunction with edge computing are aimed at achieving speed and efficiency. Here you are less likely to find standalone BI apps and more likely to find embedded BI functions and, of course, application programming interfaces (APIs) that join IoT data to existing BI apps and frameworks in the cloud.
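In practice, that API glue is often nothing more exotic than an HTTP call that pushes edge-computed summaries into a cloud BI platform's ingestion endpoint. The sketch below assumes a hypothetical REST endpoint, token, and payload shape; check your BI vendor's actual API.

```python
# Sketch: push an edge-computed summary to a (hypothetical) cloud BI ingestion API.
import requests

BI_INGEST_URL = "https://bi.example.com/api/v1/ingest"  # hypothetical endpoint
API_TOKEN = "REPLACE_ME"

def push_edge_summary(device_id: str, metrics: dict) -> bool:
    """Send aggregated metrics from the edge to the cloud BI tool."""
    resp = requests.post(
        BI_INGEST_URL,
        json={"device_id": device_id, "metrics": metrics},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    return resp.ok

# e.g., push_edge_summary("press-07", {"cycles": 1420, "mean_temp_c": 71.3})
```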

"The concept of edge computing helps companies embrace the advantages of cloud computing even in scenarios where latency and connectivity are issues. Some applications deal with a size of data or a speed requirement that prohibit round tripping to the cloud and, in such cases, Tableau analytics embedded in the local applications provide insights quickly," said Mark Jewett, Vice President of Product Marketing at Tableau Software.

"In other cases, edge computing offers a way to deal with scenarios where connectivity is not dependable or is expensive or periodic. Examples like things that move, such as ships, things that are remote, such as oil platforms or mines, or even situations where connectivity is good but not worth taking a risk on interruptions, such as manufacturing plant systems where downtime is extremely expensive. Analysts and other users in the field, who may not have access to a full workstation, still want the same power of analytics they have come to know."

Tableau is not the only BI vendor working on or with data at the edge. Microsoft pointed to one of its customers, Schneider Electric, as a case study. Schneider Electric has an edge app that does predictive maintenance on an oil rod, using Azure Machine Learning and Azure IoT Edge to improve safety and reduce incidents in remote areas, a Microsoft spokesperson said. The data processing is done on the device itself, which is accomplished by bringing cloud intelligence (ML models trained in the cloud) to the edge device. This enables faster detection of anomalies based on the large training data set.
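The general pattern is worth spelling out: train a model in the cloud on the full historical data set, then ship it to the device so scoring happens locally. The sketch below is a generic illustration of that train-in-the-cloud, score-at-the-edge pattern (using a scikit-learn IsolationForest and made-up feature names), not Schneider Electric's or Microsoft's actual implementation.

```python
# Generic "train in the cloud, score at the edge" sketch; the model file and
# feature names are illustrative assumptions.
import joblib
import numpy as np

model = joblib.load("model.joblib")  # anomaly detector trained in the cloud

def is_anomalous(pressure: float, temperature: float, vibration: float) -> bool:
    """Score one reading locally; IsolationForest.predict() returns -1 for outliers."""
    features = np.array([[pressure, temperature, vibration]])
    return model.predict(features)[0] == -1

# Anomalies can trigger immediate local action and be forwarded to the cloud;
# routine readings can be summarized or dropped to save bandwidth.
```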

Meanwhile, IBM Watson is reporting myriad use cases, including ambient and device voice and conversation analytics, drone image and video analytics, and maintenance and safety acoustic analytics.

"In all of these cases, edge analytics is enabling improved performance, cost, and privacy by operating locally in devices," said Bret Greenstein, Vice President of IBM Watson IoT, Consumer Offerings. "The growth is exciting as computing power at the edge grows, and ML matures and creates more specialized use cases.

"Devices can 'understand' what they see and hear, and use that understanding to provide better service and make better choices. This is happening in real time. And since the actual data can be converted to insights in the edge device, you may not have to send the data to the cloud, which improves cost and helps enable new forms of privacy protection."

Adding new layers of privacy protection could go a long way toward reducing company liabilities while still affording the data companies need to thrive.

Edge Computing Apps by the Numbers


Keeping in mind that edge computing is in its infancy, it's not surprising that only a minority of edge computing apps are new (39 percent), according to TECHnalysis Research; the majority (61 percent) are migrated cloud apps. That said, the following are the top edge computing apps:

  1. Operations analytics (44 percent)

  2. Process monitoring (35 percent)

  3. Employee monitoring (32 percent)

  4. Remote asset monitoring (28 percent)

  5. Workplace/safety compliance (24 percent)

  6. Predictive maintenance (22 percent)

  7. Physical asset tracking onsite (20 percent)

The top five reasons for migrating cloud apps to the edge, according to that same TECHnalysis Research report, are to improve security, reduce costs, reduce latency, improve local control, and reduce network traffic.

Mobile Edge Computing

Through the lens of BI, edge computing enhances both efficiencies and opportunities. Therefore, it makes sense to first migrate the cloud apps, or embed analytics in the existing IoT apps, that put you in the best position the fastest. For example, instead of streaming and analyzing all of the data from a robotics unit on the factory floor, you can jettison the flotsam: the seemingly endless stream of repetitive information generated by the sensor.

Instead, edge computing can be used to note and analyze only the "change data," meaning the data that is different in some way from the other data streaming from the same source. For example, imagine a windmill in the Arctic Circle reporting: "I'm fine. I'm fine. I'm fine. Blade stuck for two seconds. I'm fine. I'm fine. I'm fine." The bit about the blade sticking would be the change data. So would "wind shift," which could trigger the machine to turn and gather more energy. Change data are the data points with the most significance precisely because they note a change.

In such cases, apps at the edge are working only with relevant data; some would call it "smart data." Why boil the sea when important details can be readily seen? Smart data apps make data usable at the collection point and can also decide which data to ship to the cloud for further blending and analysis in traditional BI apps. In this way, data mining is optimized for maximum business effect.
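A change-data filter like the windmill example can be a few lines of code at the edge. The sketch below is a minimal illustration; the status values and the forward_to_cloud() hook are assumptions standing in for whatever upload mechanism the device actually uses.

```python
# Minimal change-data filter: forward only readings that differ from the last one.
_last_status = None

def forward_to_cloud(event: dict) -> None:
    """Stand-in for the real upload path (MQTT, HTTPS, etc.)."""
    print("forwarding:", event)

def handle_status(status: str, timestamp: float) -> None:
    """Drop repetitive 'I'm fine' reports; forward only the change data."""
    global _last_status
    if status != _last_status:
        forward_to_cloud({"status": status, "ts": timestamp})
        _last_status = status

# A thousand "I'm fine" reports produce one upload; "blade stuck" and the
# recovery back to "I'm fine" each produce one more -- the change data.
```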

4 Tips for Your BI and Edge Computing Strategy

It's relatively easy to jump on board the edge computing trend and decide to begin with migrating apps from the cloud. But springing into action without a strategy would be a serious mistake. Remember the early days of IoT when random things such as toasters were quickly connected to the internet and then proudly displayed at the next CES?

Even smart data can't help you if your strategy is nonsensical or missing. So, here are four considerations to keep in mind when forming your BI and edge strategy.

1. Reevaluate your current IoT play for additional data mining opportunities. For example, a grocer or manufacturer might want to use data from its supply chain, such as refrigeration and trucking sensors, to establish or validate the source of the raw materials. Such information added to a sustainability blockchain can be used in marketing to attract environmentally conscious consumers.

A retailer might use computer vision and edge computing in its store to scan consumers to show a 3D on-the-spot representation of how the clothing the shopper is looking at will actually fit them. This could improve sales as well as eliminate the need for dressing rooms and the associated security and privacy issues. But the data can also be sent to the cloud to be blended with other consumer data to inform the company's larger strategy.

Look for opportunities to get more out of the IoT you have. What else can you do with the data it generates? What other data can you use it to collect and process?

2. Decide which apps you need at the edge. You might need to migrate an app, embed some analytics, or even write a custom app; it all depends on what you're trying to do. Let your business goals guide you in selecting apps.

A good place to learn more about developing apps for the edge is an OpenDev conference, organized by the OpenStack Foundation. OpenStack is the open-source cloud computing project, and it just so happens that edge computing is a hot topic there. It also happens that open source is hot in edge computing, as it is in almost all computing. You can also consider apps offered by edge computing vendors and embedded analytics provided by BI app vendors.

3. Select new tech you want to use. You can ask vendors to give you a demo so you can get a feel for which tech you want to use, what apps are available, and some guidance on developing apps for it. For example, Amazon Web Services (AWS) Lambda@Edge and AWS Greengrass, Microsoft Azure IoT Edge, and Cisco and IBM Watson IoT offer a blend of tech as well as analytics and apps for IoT edge computing.

You can also check out a wide variety of blockchain, CDN, peer-to-peer, and other pure-play vendors. But don't overlook the tech giants such as Dell Inc., IBM Corp., and Hewlett Packard Enterprise (HPE), which have all been adding storage, compute, and analytics capabilities to their hardware to transform it into edge devices.

Get a feel for your options before you start seriously evaluating vendors. Also, take an inventory of the types of IoT tech your company currently uses and the types it would like to add, before you start talking to vendors. That way, you're more likely to stay on track.

4. Plan for the evolution. All immature tech and trends follow a familiar path to maturity. Expect that same evolution to occur with BI and the edge. So, yes, there will likely be a consolidation of vendors at some point; keep that in mind.

Also look for the decoupling of cloud technologies from the cloud proper so that they can be used at the edge, too. Such a decoupling will give you maximum flexibility in choosing between cloud and edge, and it will likely drive down costs and drive up efficiencies through smarter apps from a diverse ecosystem rather than from a single vendor. Make your plan both short-term and long-term to ensure you can adapt to foreseeable changes without a large loss on previous investments.


About Pam Baker

A prolific writer, Pam Baker has published work in many leading print and online publications, including Institutional Investor magazine, CIO, President & CEO magazine, IT World, and ReadWriteWeb. Formerly Managing Editor for Wireless IQ and Telematics Journal at NY-based ABI Research, she has also served as a freelance analyst for London-based VisionGain and a freelance market research report writer for Evans Data Corporation.
