Today, we are announcing the Early Adopter release of version 2.0 of the Aerospike Node.js client. Version 2.0 represents a major update to the Node.js client library with some user-visible changes, as well as lots of refactoring under the hood. Please find below a list of all major changes, including some backward incompatible API changes.
This Early Adopter version of the Aerospike Node.js client is not yet considered production-ready. It is intended to preview the set of significant changes that are coming in the final v2.0 release and to solicit feedback from the community. If you have an existing application that uses the Aerospike Node.js client v1.x, we highly recommend that you download the Early Adopter release from npmjs.com and try it out in a non-production environment. Please report any issues or problems you encounter, and provide any other feedback about the changes described below.
Depending on the feedback we receive, we intend to release the final version of the v2.0 client in a few weeks’ time.
Aerospike Node.js client v2.0 builds on the foundation of the recently released Aerospike C/C++ client version 4.0 and its support for asynchronous event-based processing, as introduced in this blog post. All single record commands (put, get, operate, apply, …) as well as the new batch read command (see below) are now implemented using libuv’s asynchronous I/O model and run on Node.js’s main event loop.
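Because commands now run asynchronously on the event loop, several single-record requests can be in flight at once instead of each one blocking a worker thread. The sketch below illustrates the pattern with a stand-in function (`simulatedGet` is hypothetical; the real client performs network I/O via libuv):

```javascript
// simulatedGet stands in for a client.get call: it completes
// asynchronously on the event loop, like the v2.0 client commands.
function simulatedGet (key, callback) {
  setImmediate(function () {
    // Error-first callback: null error, then the "record".
    callback(null, { key: key, bins: { value: key.length } })
  })
}

var pending = 3
var results = []

// All three requests are issued immediately; their responses arrive
// later, interleaved on the event loop, without blocking each other.
;['a', 'bb', 'ccc'].forEach(function (key) {
  simulatedGet(key, function (error, record) {
    if (error) throw error
    results.push(record.bins.value)
    if (--pending === 0) {
      console.log('all responses received:', results)
    }
  })
})
```

The same pattern applies to the client's put, get, operate, and apply commands: issue them, and handle each response in its callback as it arrives.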
Switching to the async client commands has resulted in significant performance improvements. Using the Aerospike Node.js client benchmark, we measured improvements in the number of transactions per second (TPS) of 35% to 45%, depending on the size of the libuv thread pool (the sync client benefits significantly from more worker threads, while the thread pool size has no impact on the async client). A future blog post will cover the performance gains in detail.
Supported Node.js Versions
With the release of the Aerospike Node.js client v2.0, we are dropping support for Node.js v0.10. This 3-year-old version of Node.js is built on libuv v0.10, which is incompatible with the libuv v1.x releases used in all later versions of Node.js. The Aerospike client continues to support Node.js v0.12, as well as all subsequent releases (including io.js).
Error-First Callback Semantics
One of the most frequent pieces of feedback we have received from the Node.js community is that the callback semantics used by the Aerospike Node.js client v1.x did not follow the established Node.js convention of “error-first callbacks”. We have taken this feedback to heart: in v2.0, the callback semantics of the client have been updated to follow these conventions.
Note that this is a backward incompatible change: existing applications may need to be updated to work correctly with the new callback semantics. Please refer to the detailed list of backward incompatible API changes for more details.
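Under the error-first convention, the first callback argument is an Error instance on failure and null on success, so there is no separate status code to check. The following sketch shows the pattern independent of the client itself (`fetchRecord` is a hypothetical stand-in, not part of the client API):

```javascript
// v1.x style (deprecated): the first callback argument was always an
// object whose status code had to be compared against AEROSPIKE_OK.
//
// v2.0 style: the standard Node.js error-first callback — the error
// argument is null on success and an Error on failure.
function fetchRecord (key, callback) {
  if (!key) {
    return callback(new Error('key is required'))
  }
  // A real client would perform network I/O here.
  callback(null, { bins: { greeting: 'hello' } })
}

fetchRecord('demo', function (error, record) {
  if (error) throw error
  console.log(record.bins.greeting)
})
```

Existing v1.x code that checks a status code on the first callback argument will need to be updated to test for a null error instead.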
The changes to the callback semantics have been implemented as part of a major cleanup/rewrite of the JS portion of the Aerospike Node.js client that has brought many improvements to maintainability and ongoing feature development for the client. We wish to especially thank Robert Lindstädt of Exozet Berlin GmbH for his major contributions in designing and implementing the new API.
New Batch Read Command
The new batch read client command allows a single request to retrieve multiple records using any combination of keys, namespaces, bin name filters, and read types (read bins, read header, exists). It replaces the existing batch get/select/exists commands, which are now deprecated. The batch read command uses the batch index protocol and requires Aerospike server version 3.6.0 or later.
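As a sketch, a batch read request is a list of entries, each pairing a key with the type of read to perform: a bin name filter, a read-all flag, or neither (a header/exists check). The exact field names below are assumptions and may differ in the final v2.0 API:

```javascript
// Hypothetical batch read request — each entry combines a key with the
// kind of read to perform (field names are assumptions, not final API).
var batchRecords = [
  // Read only the listed bins:
  { key: { ns: 'test', set: 'demo', key: 'key1' }, bins: ['a', 'b'] },
  // Read all bins of the record:
  { key: { ns: 'test', set: 'demo', key: 'key2' }, read_all_bins: true },
  // Read the record header only (an exists check):
  { key: { ns: 'test', set: 'demo', key: 'key3' } }
]

// A single round trip then retrieves all entries
// (requires Aerospike server 3.6.0 or later):
//
// client.batchRead(batchRecords, function (error, results) {
//   if (error) throw error
//   results.forEach(function (result) {
//     console.log(result.status, result.record)
//   })
// })
console.log(batchRecords.length + ' batch read entries prepared')
```

Mixing read types in one request is what distinguishes the new command from the deprecated batch get/select/exists commands, each of which supported only a single read type per request.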
Where to Get More Information
For more information about Aerospike’s Node.js client, please refer to the following:
- Aerospike’s Node.js client is described in our documentation
- The source code for the Node.js client is on GitHub
Tell us about your awesome project that uses Aerospike’s Node.js client by:
- Tweeting us at @aerospikedb
- Sharing it on our LinkedIn user group
- Posting your story in our user forum