The world has grown impatient: from same-day delivery services to smartphone apps eliminating the wait for a cab, a date, or a table at a restaurant. This need for instant responsiveness comes with its corollary: relevance.
“As buyers move through their journey, what is relevant to them changes. In fact, it changes at each step of the buyers’ journey. Relevance is defined by the situation, pressures and feelings the buyer has at each step; in other words it’s contextual.” — Forbes
What does this imply in terms of technology?
Well, the basic fact is that current Rules Engines run on top of Databases. This was absolutely fine with low volumes and small streams of data. But as traffic increased in speed and volume, technical teams started to connect faster Databases. However, when coping with millions of events per second, this architecture reaches its limits in terms of scalability and responsiveness.
Two classical approaches coexist in traditional rules engines:
1. Load in-memory copies of all Contexts and Objects.
2. Load in-memory only those Objects and Contexts required by the rule being executed.
In both cases, splitting Objects and Contexts across different nodes directly affects performance and creates bottlenecks.
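The two loading strategies above can be sketched in a few lines. This is a minimal illustration, not Intersec's implementation: the names (Store, EagerEngine, LazyEngine) and the idea of a per-rule fetch are assumptions made for the example, standing in for a real database-backed rules engine.

```python
class Store:
    """Stands in for the backing database (hypothetical)."""
    def __init__(self, objects):
        self.objects = objects  # id -> dict of attributes

    def fetch(self, keys):
        # Simulates a database round trip for the requested keys.
        return {k: self.objects[k] for k in keys}


class EagerEngine:
    """Approach 1: copy every Object and Context into memory up front."""
    def __init__(self, store):
        # Full in-memory copy; fast rule evaluation, heavy memory footprint.
        self.cache = store.fetch(store.objects.keys())

    def evaluate(self, rule):
        return [k for k, obj in self.cache.items() if rule(obj)]


class LazyEngine:
    """Approach 2: load only what the rule being executed needs."""
    def __init__(self, store):
        self.store = store

    def evaluate(self, rule, needed_keys):
        # One fetch per rule execution; small footprint, extra latency.
        subset = self.store.fetch(needed_keys)
        return [k for k, obj in subset.items() if rule(obj)]


store = Store({"a": {"score": 10}, "b": {"score": 3}})
rule = lambda obj: obj["score"] > 5
print(EagerEngine(store).evaluate(rule))        # evaluates against all objects
print(LazyEngine(store).evaluate(rule, ["a"]))  # evaluates against a subset
```

At millions of events per second, the eager variant strains memory and cache-invalidation traffic, while the lazy variant multiplies round trips to the store: the bottleneck the text describes.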
Learn how Intersec overcame this issue by downloading our whitepaper.