It is no secret that the world has grown impatient: from same-day delivery services to smartphone apps that eliminate the wait for a cab, a date, or a table at a restaurant. This need for instant responsiveness comes with its corollary: relevance.
“As buyers move through their journey, what is relevant to them changes. In fact, it changes at each step of the buyers’ journey. Relevance is defined by the situation, pressures and feelings the buyer has at each step; in other words, it’s contextual.” — Forbes.
What does this imply in terms of technology?
Contextual Real-time triggering
The use cases unleashed by this technology are endless. To mention just a few:
Contextual Marketing: John upgrades from 3G to a brand-new 4G phone. He is now at home, where 4G coverage is good. No marketing material has been sent to him in the past 15 days: it is the right time to offer him a Try & Buy 4G upgrade plan.
Customer Care: Javier is at the airport watching the last episode of Game of Thrones streamed in HD. Unfortunately, his monthly fair-use allowance will not cover the remaining 500 MB. An alert suggesting that he reduce the quality or buy an instant 3 GB extension would surely be appreciated...
Policy Enforcement: Nabila is visiting Europe this summer. Every time she lands in a different country, she receives a customized message with the local tariff plan and her remaining credit for voice and data. This would definitely boost her usage abroad...
Fraud: Susan has just landed in Manila and is now paying her local travel agent. Because Susan’s mobile was located at JFK a few hours ago and is now in the Philippines, her service provider can instantly confirm with MasterCard that the transaction is not fraudulent. No bad surprise.
IoT: a truck containing an urgent shipment is now less than 10 miles away from its warehouse in Alaska. The weather is good enough for offloading, but the team on site is shorthanded. When this situation is detected, a notification is sent to the warehouse, allowing the standby team to get ready to help. Anticipation at its best!
Security: Peter’s password for an online portal has been locked after too many attempts to change it. This is now the second time someone has called the call center to reset Peter’s password; the first attempt failed for lack of proper credentials. The call is routed to a specialist team, and an SMS is sent to Peter asking whether he originated these requests and providing direct access to the fraud prevention team. Reassuring.
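As a minimal illustration of how such a contextual trigger can be evaluated, the “Try & Buy 4G” scenario above can be sketched as a rule that fires only when every contextual condition holds. All field names and thresholds here are illustrative assumptions, not Intersec’s actual schema or API:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical per-customer context; field names are illustrative only.
@dataclass
class Context:
    handset: str = "3G"
    at_home_on_4g: bool = False
    last_marketing_contact: datetime = field(
        default_factory=lambda: datetime(2000, 1, 1))

def try_and_buy_4g(event: dict, ctx: Context, now: datetime) -> bool:
    """Fire the 'Try & Buy 4G' offer only when all contextual conditions hold."""
    # The incoming event updates the context...
    if event["type"] == "handset_upgrade" and event["to"] == "4G":
        ctx.handset = "4G"
    # ...and the rule is evaluated against the full context, not the event alone.
    quiet_period = now - ctx.last_marketing_contact >= timedelta(days=15)
    return ctx.handset == "4G" and ctx.at_home_on_4g and quiet_period

now = datetime(2024, 6, 1)
ctx = Context(at_home_on_4g=True)
fired = try_and_buy_4g({"type": "handset_upgrade", "to": "4G"}, ctx, now)
print(fired)  # True: handset is 4G, home is covered, no recent marketing contact
```

The point of the sketch is that the event alone (a handset upgrade) is not enough: the decision depends on state accumulated around the customer.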
Location-based tariffication to boost usage abroad
Nowadays, any customer-facing Service Provider solution must be kept constantly aware of each customer’s individual context and must spot Real-time Triggers.
The challenges raised by Real-time Triggers have long been addressed (for example, being notified before your roaming pass is exhausted), but we are far from having explored all the use cases for Context awareness.
Why not? Because evaluating Context is far more complex than detecting a Real-time Trigger. For a modern mobile operator, identifying a real-time trigger is as easy as it is for a clothes retailer to notice that someone has entered the shop.
Context awareness, however, also requires answering questions like: Is this person a recurring customer? What is their average spend in store? What mood are they in? What are their needs? Etc.
For each of these questions, many triggers have to be monitored.
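To make the distinction concrete, here is a toy sketch (with invented event names) of how context is accumulated from many low-level triggers over time, whereas a single trigger such as “customer entered the shop” is just one event:

```python
from collections import defaultdict

# Per-customer profile built up from a stream of low-level trigger events.
profiles = defaultdict(lambda: {"visits": 0, "spend": 0.0})

def on_event(event):
    """Each event updates the customer's accumulated context."""
    p = profiles[event["customer"]]
    if event["type"] == "store_entry":        # a single real-time trigger
        p["visits"] += 1
    elif event["type"] == "purchase":         # another trigger feeding context
        p["spend"] += event["amount"]

for e in [{"customer": "ana", "type": "store_entry"},
          {"customer": "ana", "type": "purchase", "amount": 40.0},
          {"customer": "ana", "type": "store_entry"}]:
    on_event(e)

p = profiles["ana"]
# Answering "is this a recurring customer, and what do they spend per visit?"
# requires the whole accumulated context, not any single trigger.
print(p["visits"], p["spend"] / p["visits"])  # 2 20.0
```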
State of the Art
Current Rules Engines run on top of Databases. This was fine with low volumes and small streams of data. As traffic increased in speed and volume, issues were solved by connecting faster Databases. But when coping with millions of events per second, this architecture reaches its limits in terms of scalability and responsiveness.
Traditional Rules Engine
Whatever the speed of the underlying DB, a processing Rule always applies to an Object (e.g. a customer), taking into account its Context. This Context is linked to both the Object and the Rule; e.g. 100 MB of data used since the Rule was activated. Two classical approaches coexist in traditional rules engines:
- Load in-memory copies of all Contexts and Objects. This approach raises scalability issues at large volumes: keeping contexts up to date generates huge flows of transfers to the database, leading to high latency and potential inconsistency between the database and its in-memory copy.
Split of objects in nodes
- Load in-memory only those Objects and Contexts required by the rule being executed. This choice can be beneficial in some cases, but it generally induces even more latency and a higher request load on the database. In both cases, splitting Objects and Contexts across different nodes directly affects performance and creates bottlenecks.
When the Solution is inside the box: In-Memory Event Driven Architecture
The Intersec platform executes the rules engine directly inside the In-Memory database, eliminating database request latencies and bottlenecks. Each incoming event is directly handled in the Context of the attached Object (a customer, a device, a data record…) resulting in a very high event processing capability. To process Rules related to an Object, two pieces of information are typically required: the current state of the Object and the local execution Context of the Rule. Horizontal scalability can be achieved by splitting the Object database across multiple nodes, each handling only a portion of the database (called a shard).
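The sharding idea described above can be sketched in a few lines: each Object key is hashed to one of N nodes, so every node owns, stores, and processes events for only its own portion of the database. The node count and record layout below are illustrative assumptions:

```python
import hashlib

N_NODES = 4  # illustrative; matches the 4-node example later in the text

def shard_of(key: str) -> int:
    """Deterministically map an Object key to one of N_NODES shards."""
    digest = hashlib.sha256(key.encode()).digest()
    return int.from_bytes(digest[:8], "big") % N_NODES

# One in-memory store per node; each holds only its shard of the Objects.
nodes = [dict() for _ in range(N_NODES)]

def route_event(key, event):
    node = nodes[shard_of(key)]               # only this node touches the record
    record = node.setdefault(key, {"events": 0})
    record["events"] += 1

for customer in ("ana", "bob", "carla", "dmitri", "eve"):
    route_event(customer, {"type": "location_update"})

print(sum(len(n) for n in nodes))  # 5: each Object lives on exactly one shard
```

Because routing is deterministic, nodes never need to consult each other to find an Object, which is what makes the scaling horizontal.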
In-RAM DB Event Driven Architecture
However, such a distribution can hurt performance if it increases the volume of information transferred back and forth between the nodes. To avoid this risk and maximize event throughput, Object data and Rule Contexts must be stored on the same node. This is achieved by embedding the execution Context of every Rule inside each Object data record. An Object record may therefore store Contexts for Rules that do not apply to that Object; to alleviate the resulting RAM waste and avoid scaling issues, Intersec’s In-Memory database supports dynamic data compression and eliminates null records. As a result, for an Object database containing 100 million Objects with thousands of provisioned Rules, a typical platform with 4 processing nodes is able to handle millions of events per second.
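The co-location and null-elimination ideas can be sketched as a data structure: each Object record lazily holds per-Rule Contexts, and empty ones are dropped to save RAM. The rule names, context fields, and compaction policy here are illustrative assumptions, not Intersec’s internals:

```python
# Each Object record embeds the execution Contexts of the Rules that touch it.
def new_object(obj_id):
    return {"id": obj_id, "contexts": {}}   # contexts created lazily, kept sparse

def get_context(obj, rule):
    """Fetch (or lazily create) this Object's Context for a given Rule."""
    return obj["contexts"].setdefault(rule, {"counter": 0})

def compact(obj):
    """Eliminate 'null' Contexts that carry no information (all-zero state)."""
    obj["contexts"] = {r: c for r, c in obj["contexts"].items()
                       if c["counter"] != 0}

alice = new_object("alice")
get_context(alice, "4g_upsell")["counter"] += 3   # the only rule that ever fires
get_context(alice, "fraud_check")                 # touched, but stays empty
compact(alice)

# Only the informative Context survives, so thousands of provisioned Rules
# do not each cost RAM in every one of 100 million Object records.
print(list(alice["contexts"]))  # ['4g_upsell']
```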
Operators’ & Customers’ Benefits
Thanks to these evolutions of Event Driven Architectures, combined with In-Memory Databases, operators can now cope with the huge amounts of data they generate and transform them into valuable insights. Beyond the variety of applications, the advantage of a flexible rules engine is that any use case can be customized without intensive IT resources. With such powerful tools, Service Providers can boost their responsiveness to the level of their OTT competitors, despite the complexity of their legacy IT. Intersec is poised to help operators with these challenges and to provide a modular approach to the rules engine.

In addition to the benefits the operator realizes, end users will be able to enjoy all of the services they have opted into, with the associated permissions, and will feel that the operator is being attentive by ensuring that they are not inundated with alerts and messages.