Multiple client objects cause memory usage to keep growing

I have a simple healthcheck program that runs every 15s on aerospike instances. We primarily use this for service discovery and other internal stuff. The code is straightforward and looks like this:

func aspkHealth() error {
	policy := aerospike.NewClientPolicy()

	client, err := aerospike.NewClientWithPolicy(policy, "", 3000)
	defer func(client *aerospike.Client) {
		if client != nil {
			client.Close()
		}
	}(client)
	if err != nil {
		log.Error("error: %v", err)
		return err
	}

	return nil
}


Even though we close the client, the memory usage keeps growing. This is part of a long-running process, by the way. After doing a bit of memory profiling, I found this:

I know it is recommended to create a single client and reuse it. I just want to know why this particular design pattern causes the memory usage to grow?

The memory growth is a concern; I'll have to check why it is happening. Did you check via ps/top/htop how much resident memory is being used? Virtual memory is not really allocated, so if all of this is virtual, I wouldn't be too worried.

Clients manage a lot of connections and buffers behind the scenes. Limiting ClientPolicy.ConnectionQueueSize to 1 (or 2 if you use the data API) will help considerably with memory usage.
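As a sketch, the policy tweak looks like this (host and port are placeholders for your setup):

```go
policy := aerospike.NewClientPolicy()
// Cap the per-node connection pool; use 2 if you also use the data API.
policy.ConnectionQueueSize = 1

client, err := aerospike.NewClientWithPolicy(policy, "127.0.0.1", 3000)
if err != nil {
	log.Fatal(err)
}
defer client.Close()
```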

I’m very curious about the Resident memory consumption of the process. I’d appreciate if you could report back your observations.

Sure, the resident memory consumption keeps growing as well. The process with PID 21694 is the one that has the above health check function running on it.

[Screenshot: Screen Shot 2020-02-14 at 3.25.42 PM — top output showing resident memory for PID 21694]

Also, the Aerospike Go client version in use is v2.7.2, and the Aerospike server version is E-

Did setting the ClientPolicy.ConnectionQueueSize to 1 help?

No, setting ConnectionQueueSize to 1 did not help. I changed my code to avoid creating and closing multiple client instances for now.

Thank you for your persistence. I think the latest release should resolve the issue (v2.8.1).

Changing your code to avoid using multiple instances is the superior solution though.
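For anyone landing here later, one way to share a single client across health-check runs is to guard its creation with sync.Once; this is a sketch, assuming the hostname is a placeholder and that the health check only needs to verify connectivity:

```go
package main

import (
	"fmt"
	"sync"

	aerospike "github.com/aerospike/aerospike-client-go"
)

var (
	clientOnce sync.Once
	client     *aerospike.Client
	clientErr  error
)

// getClient creates the client exactly once and reuses it afterwards,
// so each 15s health check doesn't allocate a fresh connection pool.
func getClient() (*aerospike.Client, error) {
	clientOnce.Do(func() {
		policy := aerospike.NewClientPolicy()
		client, clientErr = aerospike.NewClientWithPolicy(policy, "127.0.0.1", 3000)
	})
	return client, clientErr
}

func aspkHealth() error {
	c, err := getClient()
	if err != nil {
		return err
	}
	if !c.IsConnected() {
		return fmt.Errorf("aerospike client is not connected")
	}
	return nil
}
```

The client is never closed here; it lives for the lifetime of the process, which is the usage the client library is designed around.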