After 14 months of work, the developers behind the multi-dimensional tensor library Nx have decided to cut their first official release and share v0.1 with the wider Elixir ecosystem. Nx developer and Elixir creator José Valim also took the opportunity to offer a quick look at what the future holds for those looking to use the programming language for machine learning.
Before putting his efforts into Elixir, Valim was mostly known as a member of the core team behind the web app framework Ruby on Rails. There he encountered growing demand for concurrent applications, which sparked his interest in multi-core software. In 2011 Valim decided to tackle the problem by starting work on Elixir, which he described as a “modern approach to programming for the Erlang VM” and a tool for building scalable, fault-tolerant and maintainable applications. The first official release followed in 2012.
Although Elixir can’t be found in the top ranks of programming language lists after its first decade, it has been picked up for various web applications banking on its delivery guarantees and concurrency features. Prominent examples are the core services of communication platform Discord and PagerDuty’s service for scheduling notifications.
Nx signifies a first step towards what Valim calls Numerical Elixir, an initiative meant to satisfy the growing interest in machine learning and data analytics by offering corresponding tools. Besides Nx, the effort has already produced the Elixir neural network library Axon, the dataframes project Explorer, and the science dataset normalisation tool scidata.
While other language teams have opted to add Python bindings to support machine learning use cases, the Elixir developers went the route of a foundational project, leveraging the potential of the Erlang VM Elixir runs on and offering an alternative platform for new developments.
At its core, Nx provides a collection of functions and data types for numerical computing, as well as compilers that allow them to run efficiently on CPUs and GPUs. Its main feature is numerical definitions (defn), a subset of Elixir tailored to numerical computing that compiles to “highly optimized code to run on the CPU, the GPU, or even Cloud TPUs”. Numerical definitions also serve as the basis for Axon, helping that project realise convolutional, generative, structured, and vision-related neural networks.
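To give a flavour of what a numerical definition looks like, the sketch below shows a softmax function written with defn. It assumes Nx is installed as a dependency; the module name is illustrative, while the softmax example itself mirrors the one used in the Nx documentation:

```elixir
defmodule MyMath do
  import Nx.Defn

  # A numerical definition: ordinary-looking Elixir that a pluggable
  # compiler can turn into optimized CPU/GPU code.
  defn softmax(t) do
    Nx.exp(t) / Nx.sum(Nx.exp(t))
  end
end

MyMath.softmax(Nx.tensor([1, 2, 3]))
```

Without a compiler configured, the definition falls back to Nx's pure-Elixir implementation; plugging in a compiler such as EXLA is what unlocks the CPU/GPU/TPU speedups quoted above.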
In the lead-up to the 0.1 release, Nx learned to work with while loops, in order to support recurrent models like the ones used in speech recognition in the future, and gained hooks into numerical definitions as well as data streaming capabilities. The latter is useful, amongst other things, when implementing inference and distributed learning — the latter being something Valim and company plan to look into somewhere along the line.
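As a sketch of the new construct, a while loop inside a numerical definition carries an accumulator through each iteration, as in this factorial example modelled on the one in the Nx docs (assuming Nx as a dependency):

```elixir
defmodule Loop do
  import Nx.Defn

  # while iterates as long as the predicate tensor is truthy;
  # the accumulator's shape must stay constant across iterations.
  defn factorial(x) do
    {factorial, _} =
      while {factorial = 1, x}, Nx.greater(x, 1) do
        {factorial * x, x - 1}
      end

    factorial
  end
end
```

Because the loop is expressed as data flowing through a fixed-shape accumulator, compilers can translate it for accelerators — the property recurrent models depend on.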
Nx 0.1 also comes fitted with an Nx.LinAlg module, which contains an initial series of linear algebra functions to support models relying on matrix factorisation. The module will surely see more additions in the coming months; for now, however, Valim’s plans for Nx mainly focus on bigger things. Amongst other work items, he expects an implementation of checkpointing for the library’s automatic differentiation system to land soon, reducing memory usage when training large models.
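A minimal sketch of the new module in use, assuming Nx as a dependency (the specific factorisation functions shown are taken from the Nx documentation and may differ slightly in the 0.1 surface):

```elixir
a = Nx.tensor([[1.0, 2.0], [3.0, 4.0]])

# QR factorisation: returns {q, r} such that q * r reconstructs a
{q, r} = Nx.LinAlg.qr(a)

# Cholesky factorisation of a symmetric positive-definite matrix
l = Nx.LinAlg.cholesky(Nx.tensor([[4.0, 2.0], [2.0, 3.0]]))
```

Factorisations like these are the building blocks for recommendation-style models that decompose a large matrix into smaller factors.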
To help users switch to Elixir for their machine learning projects, his team is also looking into integration between the model exchange format ONNX and Axon, which would make it easier to use already trained models, and into offering precompiled Explorer bindings. A desktop version of the Elixir code notebook Livebook should, meanwhile, give developers who want to try Elixir an easier way to get started.