I have a composite key, {a:b:c}, that identifies each row in a set.
(a) currently has 20K values (and keeps growing)
(b) can range from 1 to 100 (at most)
(c) has 6 values
So in the worst case I could end up with 20,000 x 100 x 6 = 12M keys.
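For illustration, a minimal Python sketch of that key-space math and key construction (the cardinalities are the ones quoted above; the `a:b:c` string format is an assumption about how the composite key is built):

```python
from itertools import product

# Cardinalities from the post: ~20K values of a, up to 100 of b, 6 of c.
A_COUNT, B_COUNT, C_COUNT = 20_000, 100, 6

def total_keys(a_count: int, b_count: int, c_count: int) -> int:
    """Worst-case number of distinct composite keys {a:b:c}."""
    return a_count * b_count * c_count

def make_keys(a_vals, b_vals, c_vals):
    """Build 'a:b:c' composite key strings for every combination."""
    return [f"{a}:{b}:{c}" for a, b, c in product(a_vals, b_vals, c_vals)]

print(total_keys(A_COUNT, B_COUNT, C_COUNT))  # 12000000 in the worst case
```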
I'm constructing the keys in loops; once I have 6K keys ready, I do a getMany,
save the response, and keep looping.
But once I get close to 5M keys processed (just over 800 getMany calls),
I get a segfault:
[2] 35182 segmentation fault (core dumped)
Just wanted to share how we resolved the issue.
We created a secondary index over the set through the AMC console.
In code, we use ->query() with Predicates::EQUALS
and loop over the index results instead of building every key up front.
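A server-free Python sketch of that pattern, in case it helps: stream only the records whose bin matches, rather than materializing all 12M keys. The class, bin names, and rows here are all made up; a real client's equality query would slot in where `query_equals` does:

```python
from collections import defaultdict
from typing import Any, Dict, Iterator, List

Record = Dict[str, Any]

class EqualityIndex:
    """Toy stand-in for a secondary index: maps a bin value to its records."""

    def __init__(self, records: List[Record], bin_name: str):
        self._idx: Dict[Any, List[Record]] = defaultdict(list)
        for rec in records:
            self._idx[rec[bin_name]].append(rec)

    def query_equals(self, value: Any) -> Iterator[Record]:
        """Analogue of query(...) + an EQUALS predicate: stream the matches."""
        yield from self._idx.get(value, [])

rows = [{"a": 1, "b": 7}, {"a": 2, "b": 7}, {"a": 3, "b": 9}]
idx = EqualityIndex(rows, "b")
matches = list(idx.query_equals(7))  # only the rows where b == 7
```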
That's a very big batch; it would make sense to iterate more times over smaller batches. Pulling a batch that large back in one call is a performance problem and can saturate your network, and, as you pointed out, there's also the memory needed just to hold those keys. I would keep each batch call under 2K keys.
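A sketch of that batching pattern in Python. The 2K ceiling comes from the suggestion above; `client.get_many` and `handle` are hypothetical stand-ins for whatever batch-read and processing code you actually use:

```python
from typing import Iterable, Iterator, List

BATCH_LIMIT = 2_000  # suggested ceiling per batch call

def chunked(keys: Iterable[str], size: int = BATCH_LIMIT) -> Iterator[List[str]]:
    """Yield keys in lists of at most `size`, so no single call holds millions."""
    batch: List[str] = []
    for key in keys:
        batch.append(key)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch

# Usage (hypothetical client): process each small batch as it is built,
# instead of accumulating 6K+ keys before every call.
# for batch in chunked(all_keys):
#     records = client.get_many(batch)  # stand-in for the real batch read
#     handle(records)
```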