It’s a familiar scenario: you’re responsible for sales campaigns at an ecommerce or online retailer and have devised a promotion offering an attractive bundle of products at a knockdown price. The bundle comprises a mix of standard off-the-shelf items that, grouped together, make an unusual one-off offer. On launch, the bundle flies off the shelves and within 24 hours stock is exhausted. A roaring success!

Later, analysis of the online visitor information showed that the offer had garnered even greater interest and was many times oversubscribed, leaving multiple abandoned shopping carts. A check on inventory the following day revealed that the individual items from the promotional bundle were actually still available, held separately at different locations around the company. Knowing this 24 hours later made the campaign feel bittersweet. Yes, it had been a success and the promotion had sold out in a matter of hours, but the unmet demand meant it could have been so much better. If only your company had known the whereabouts of all its products in real time and married that with ‘live’ purchasing information, sales teams could have made adjustments to re-route sales: a case of not having true real-time insights costing you business.

The future of business lies in real time, and businesses that can close the time gap between customer demand and fulfilment can expect to flourish. Like it or not, real-time tools are changing business forever and are set to become even more important in an increasingly complex, data-intensive world. IDC estimates that by 2020, business transactions on the Internet will reach £310 billion per day. This is driven by many things, but perhaps by nothing greater than the era of the Internet of Things. And this transition will require a new way of handling the exponential increase in data.

The Internet of Things will require us to rethink both how we process data and how we make real-time decisions within the business intelligence systems that rely on that data. Today, many enterprise analytics solutions aren’t delivering a view of things in ‘true’ real time. The user may be able to interact with the data in real time, but if the data itself is old, the insights are not a true reflection of events. It’s great to get actionable insights from your data, but the action often needs to take place almost immediately.

Real time is becoming genuinely ‘real-time’, but within the context of its environment. It used to be that when a customer or prospect asked you for real time, what they really wanted was something faster than their apps and systems were performing at the time – so maybe this meant the same day (instead of overnight) or within an hour (instead of hours later). In a sales situation you may want to know about an abandoned shopping cart within minutes to understand why the purchase never completed – real time should therefore be as fast as it needs to be and appropriate to the contextual environment.

In our ecommerce/online retailer scenario, the business had created a campaign based on the assumption that there was sufficient stock to meet demand. Having to wait 24 hours for an accurate report on the campaign’s success resulted in lost sales, frustrated customers and an embarrassed marketer.

For the Internet of Things, with its increased volume and velocity of data, real time is about the interaction and what you can do with it. Data systems must be able to ingest and process this data in real time, or the ability to make relevant decisions will be lost to time.

A true real-time data processing system has to be able to handle data of any type and from any source – in other words, Big Data. Data growth is expected to continue on its explosive trajectory: a recent report from IBM stated that 90% of the world’s data has been created in the last two years. The volumes are staggering, with 90% captured in an unstructured format, i.e. in the form of conversations being generated on Facebook, Twitter, WhatsApp and the like. ‘Any source’ might mean a website, a smartphone, a refrigerator or a sensor in a car; there will be billions of potential use cases and applications. All of these different data sources need to be aggregated in real time for analysis to feed action systems instantly, placing demands on data ingestion, data storage and data query.

Rapid ingestion of data is crucial if a streaming pipeline is to be as efficient as possible, and the problem with many of today’s analytics solutions is that they rely on batch processing to provide what are purported to be real-time reports for business intelligence.
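To make the distinction concrete, here is a minimal sketch – not any particular vendor’s pipeline – contrasting a batch-style report with a per-event streaming monitor. The simulated event source, field names and alert threshold are illustrative assumptions.

```python
# Minimal sketch: batch reporting vs. per-event streaming (simulated data).
import time
import random
from collections import defaultdict

def simulated_events(n=10):
    """Stand-in for a live event stream (e.g. cart events from a website)."""
    skus = ["bundle-001", "sku-42", "sku-77"]
    for _ in range(n):
        yield {"sku": random.choice(skus),
               "action": random.choice(["add_to_cart", "purchase", "abandon"]),
               "ts": time.time()}

def batch_report(events):
    """Batch approach: collect everything, summarise later (the 24-hour lag problem)."""
    counts = defaultdict(int)
    for e in events:
        counts[e["action"]] += 1
    return dict(counts)

def stream_monitor(events, alert_threshold=3):
    """Streaming approach: update the running picture as each event arrives,
    so abandoned carts are visible within seconds rather than a day later."""
    abandoned = 0
    for e in events:
        if e["action"] == "abandon":
            abandoned += 1
            if abandoned >= alert_threshold:
                print(f"ALERT: {abandoned} abandoned carts so far - check stock levels")
    return abandoned

if __name__ == "__main__":
    print("Batch summary:", batch_report(simulated_events()))
    stream_monitor(simulated_events())
```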

Poor data storage also holds back companies’ analytics strategies. A data lake is increasingly one of the most common approaches for storing large amounts of data in one place. It works on the principle of creating a vast reservoir of all your data that can be accessed equally by everyone in the business, without any need to specially prepare it. The contents are often likely to reveal important online patterns of behaviour and potentially identify individual user activity. Consumer data could reveal anything from age, gender and income to demographics and even the propensity to purchase. Insights on behaviour can help you improve customer service, tailor product offerings and provide personalised treatments. The right information also makes it easier to identify and resolve problems. This rich, granular information enables the discovery of insights without knowing in advance what questions need to be asked. There should be no limits to what you can collect and store. You should also be able to inspect and ask questions of your historical data using any set of measures, date ranges and event details at any time.
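As a rough illustration of that principle, the sketch below keeps every raw event, unprepared, in a local ‘lake’ and answers an ad hoc question afterwards. The directory layout, field names and JSON-lines format are assumptions for the example, not a description of any specific product.

```python
# Minimal data-lake sketch: store raw events as-is, decide the questions later.
import json
import os
from datetime import datetime, timezone

LAKE_DIR = "lake"  # stand-in for object storage such as S3 or HDFS

def store_event(event):
    """Append the raw event unchanged; no schema or aggregation decided up front."""
    os.makedirs(LAKE_DIR, exist_ok=True)
    day = event["ts"][:10]  # partition by date only, everything else stays raw
    with open(os.path.join(LAKE_DIR, f"{day}.jsonl"), "a") as f:
        f.write(json.dumps(event) + "\n")

def query(predicate):
    """Scan the raw events with any ad hoc condition, defined after the fact."""
    for name in sorted(os.listdir(LAKE_DIR)):
        with open(os.path.join(LAKE_DIR, name)) as f:
            for line in f:
                event = json.loads(line)
                if predicate(event):
                    yield event

if __name__ == "__main__":
    now = datetime.now(timezone.utc).isoformat()
    store_event({"ts": now, "user": "u1", "action": "abandon", "sku": "bundle-001"})
    store_event({"ts": now, "user": "u2", "action": "purchase", "sku": "bundle-001"})
    # A question nobody thought to ask at collection time:
    abandons = list(query(lambda e: e["action"] == "abandon" and e["sku"] == "bundle-001"))
    print(f"{len(abandons)} abandoned carts for the bundle")
```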

The final piece of the jigsaw is the ability to query any data across any timeframe without the need for predefined measures or time buckets. Ideally, each data point would be preserved to enable unlimited segmentation and inspection, so that no event detail is ever lost in processing or reporting. When analysis happens on the fly, we don’t need to know the questions in advance.
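A small sketch of what that on-the-fly segmentation might look like: because each event keeps its own timestamp and details, the time bucket is chosen at query time rather than when the data was collected. The sample data and bucketing helper are purely illustrative.

```python
# Minimal sketch: arbitrary time buckets chosen at query time over raw events.
from collections import Counter
from datetime import datetime, timedelta

events = [  # raw, event-level records (illustrative data)
    {"ts": datetime(2016, 6, 1, 9, 5),  "action": "purchase"},
    {"ts": datetime(2016, 6, 1, 9, 40), "action": "abandon"},
    {"ts": datetime(2016, 6, 1, 10, 12), "action": "abandon"},
]

def bucket(events, width, action=None):
    """Count events per time bucket of arbitrary width, decided at query time."""
    counts = Counter()
    for e in events:
        if action and e["action"] != action:
            continue
        start = datetime.min + ((e["ts"] - datetime.min) // width) * width
        counts[start] += 1
    return counts

# The same raw events answer hourly and 15-minute questions equally well.
print(bucket(events, timedelta(hours=1)))
print(bucket(events, timedelta(minutes=15), action="abandon"))
```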

Real time, all of the time, requires a new way of collecting, storing and interrogating data, enabling marketers both to monitor campaigns and to make ‘live’ adjustments to meet demand. Ultimately, real time is relative to the user’s context. While blistering real-time speeds are bewitching, don’t fall foul of the ‘need for speed’. Yes, there’s an allure to instantaneous information, but real-time data is only ever beneficial if you can actually do something with it, regardless of whether it took milliseconds or minutes to arrive.

Not being ‘genuinely’ real time could mean missing out on some of the action, but having ‘appropriate’ real time is likely to win you more of the business, and for much less.

John Fleming

Contributor


Marketing Director EMEA & APAC - Webtrends.