Redis grabs AI and edge trends by the horns with new offerings

Database management system provider Redis Labs used its user conference RedisConf to introduce the community to RedisAI, RedisGears, and RedisEdge.

RedisGears is a serverless engine for building operation pipelines. It is meant for multi-model and cluster operations that can run either event-driven or in batch mode. The new addition to the company’s enterprise stack can be accessed through a C API and comprises components for cluster management, execution management, and map/reduce.

The first creates an abstraction for accessing data from the other components, while execution management registers events on keys and schedules script execution. Parts of a script that need to run on different shards are handled by map/reduce.
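
For a sense of what such a pipeline looks like in practice, here is a minimal sketch of a RedisGears script, as it might be submitted to the engine with RG.PYEXECUTE. The GearsBuilder calls, the record handling, and the "user:*" key pattern are assumptions based on the preview's Python API and may not match the final module exactly.

```python
# Minimal RedisGears sketch (submitted via RG.PYEXECUTE); API details assumed
# from the preview and subject to change.

# Batch mode: count all keys matching "user:*" across the cluster.
# Each shard maps its keys to 1 and sums locally; collect() then gathers the
# partial sums on the initiating shard, where they are reduced to a total.
GearsBuilder('KeysReader') \
    .map(lambda record: 1) \
    .accumulate(lambda total, x: (total or 0) + x) \
    .collect() \
    .accumulate(lambda total, x: (total or 0) + x) \
    .run('user:*')

# Event-driven mode: register the pipeline instead of running it once, so it
# fires whenever a matching key changes.
GearsBuilder('KeysReader') \
    .foreach(lambda record: execute('INCR', 'user:changes')) \
    .register('user:*')
```

The collect() step is where the map/reduce component comes in: partial results computed on each shard are moved to the shard that initiated the execution before the final aggregation runs.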

Since the proliferation of machine learning brings its own set of challenges for tasks like serving models, Redis Labs has developed a module for serving tensors and executing deep learning graphs where the data is. RedisAI is a collaboration with [tensor]werk, who apparently work on infrastructure for data-defined software, though their internet presence is rather sparse at this point, so we’ll have to take their word for it for now.

Apart from addressing the serving issue, RedisAI is meant to make moving data to a separate host for model execution unnecessary, which could speed up the process and smooth the user experience should there be a need for interaction. In the coming months the team wants to make the module generally available with support for TensorFlow, PyTorch, and ONNXRuntime.
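
To illustrate the "execute the graph where the data is" idea, here is a hedged sketch of driving RedisAI from Python with redis-py. The command names (AI.MODELSET, AI.TENSORSET, AI.MODELRUN, AI.TENSORGET) follow the preview's documented interface, while the model file, node names, tensor shape, and values are placeholders for illustration.

```python
import redis

r = redis.Redis(host='localhost', port=6379)

# Load a pre-trained TensorFlow graph into Redis. "model.pb" and the node
# names "in"/"out" are hypothetical and depend on the exported graph.
with open('model.pb', 'rb') as f:
    r.execute_command('AI.MODELSET', 'mymodel', 'TF', 'CPU',
                      'INPUTS', 'in', 'OUTPUTS', 'out', f.read())

# Store the input as a tensor where the data already lives...
r.execute_command('AI.TENSORSET', 'in_tensor', 'FLOAT', 1, 2,
                  'VALUES', '0.5', '0.7')

# ...run the graph inside Redis, with no round trip to a separate model
# server...
r.execute_command('AI.MODELRUN', 'mymodel',
                  'INPUTS', 'in_tensor', 'OUTPUTS', 'out_tensor')

# ...and read the result back.
print(r.execute_command('AI.TENSORGET', 'out_tensor', 'VALUES'))
```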

Afterwards, the focus will be on loading those machine learning backends dynamically, so that users can choose the one that makes the most sense for the device at hand, and on implementing autobatching, a process that consolidates calls to the same model for better efficiency.

Though RedisAI is still in preview, it is already used in another new product: RedisEdge, a multi-model database for internet of things use cases. With more and more companies looking into edge computing lately, Rancher Labs and HashiCorp for example, it seems like a fairly safe bet to make at this point.

RedisEdge bundles the new AI module with the Redis data structure store, the stream data type, the RedisTimeSeries module, and the newly introduced RedisGears for communication between modules. The database promises to be able to “ingest millions of writes per second with <1ms latency and a very small footprint (<5MB), so it easily resides in constrained compute environments. It can run on a variety of edge devices and sensors ranging from ARM32 to x64-based hardware.”
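
As a rough illustration of that ingest path, the sketch below appends a hypothetical sensor reading to a Redis stream and to a RedisTimeSeries key using plain redis-py. The key names, labels, and values are made up for the example; only the Streams commands and the RedisTimeSeries TS.* commands are taken from the respective components.

```python
import time
import redis

r = redis.Redis(host='localhost', port=6379)

# Create a time series for a hypothetical temperature sensor (errors if the
# key already exists, which is fine for a one-off sketch).
r.execute_command('TS.CREATE', 'sensor:1:temp', 'LABELS', 'sensor', '1')

# Append each reading to a stream (for downstream consumers such as a
# RedisGears pipeline or a RedisAI model) and to the time series.
reading = {'sensor': '1', 'temp': '21.7'}
r.xadd('sensor:readings', reading)
r.execute_command('TS.ADD', 'sensor:1:temp',
                  int(time.time() * 1000), reading['temp'])

# Query the last hour of samples back out of the time series.
now = int(time.time() * 1000)
print(r.execute_command('TS.RANGE', 'sensor:1:temp', now - 3600 * 1000, now))
```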

RedisEdge is already available for the EdgeX Foundry platform and Microsoft’s Azure IoT Edge.

Other announcements from RedisConf include the acquisition of the data analysis tool RDBTools from HashedIn, which the company plans to keep available as a free tool and to offer as a fully managed service in the future.
Lately, Redis has mostly been getting attention for the change to its licensing policy, which spurred discussions about the state of open source licenses and the use of open source projects in commercial offerings.