In-memory analytics company Kognitio will base the pricing of its Analytical Platform solely on the amount of memory employed by the servers hosting the platform. The Kognitio Analytical Platform exploits the speed of data held in memory to answer queries quickly. Data that must be instantly available to systems stays in memory, while the rest can reside on cost-effective standard mechanical hard disks.
The Massively Parallel Processing (MPP) scale-out architecture allows platforms to be built with memory sizes ranging from half a terabyte to hundreds of terabytes. Customers are free to attach as much disk as they like to these systems, insulating them from rising software licence costs as data volumes grow.
Regardless of the amount of data put on a Kognitio Analytical Platform, users will be charged only for what they actually put into memory. Only 10 per cent of a data set the size of the US Library of Congress – around 200 terabytes – is regularly needed and would therefore reside in memory. Under the new pricing structure, the Library would pay only for the 20TB of data it uses regularly.
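The pricing arithmetic above can be sketched as a short calculation. This is purely an illustration of the model described in the article: charge only for the in-memory fraction of the data. The per-terabyte price used here is a hypothetical placeholder, not a published Kognitio figure.

```python
# Illustrative sketch of memory-based pricing: only data held in RAM
# is chargeable; disk-resident data incurs no licence cost.

PRICE_PER_TB_IN_MEMORY = 1000  # hypothetical placeholder price per TB of RAM


def licence_cost(total_data_tb: float, in_memory_fraction: float) -> float:
    """Return the licence cost, charging only for the in-memory data."""
    in_memory_tb = total_data_tb * in_memory_fraction
    return in_memory_tb * PRICE_PER_TB_IN_MEMORY


# Library of Congress example from the article: 200 TB total, 10% "hot".
# The customer is charged for 20 TB, not the full 200 TB.
cost = licence_cost(200, 0.10)
```

With the placeholder price, the 200TB data set is billed as 20TB of in-memory data, however much disk the customer attaches.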