Lua record.key function returns nil

lua
udf

#1

Hello,

I have a problem with the function record.key(rec), which is described at https://www.aerospike.com/docs/udf/api/record.html

This function returns nil when I call record.key(rec) in my Lua script.

I checked other functions from the link mentioned above and they work correctly, for example record.bin_names(rec) and record.last_update_time(rec).

PS: I’m writing a Lua script that will be applied to multiple records. In this script I want to get the record’s PK and check whether the PK contains some substring or not.
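For what it’s worth, the intended check could be sketched like this, assuming record.key(rec) returns the string primary key (the function name pk_contains and the plain-text match are my own illustration, not from the docs):

```lua
-- Hypothetical sketch: does this record's PK contain the given substring?
-- Note: record.key(rec) may return nil (see below), so guard against that.
function pk_contains(rec, substr)
  local pk = record.key(rec)
  -- string.find with plain=true does a literal (non-pattern) substring search
  return pk ~= nil and string.find(pk, substr, 1, true) ~= nil
end
```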

Is it a bug? Or maybe I don’t understand how this function works.

Could anybody help me with this problem?

Please explain what I’m doing wrong, or whether there are other ways to get a record’s primary key.

The problem is reproduced on server version 4.3.0.10.

How to reproduce the issue:

1. Create a record

aql> INSERT INTO test-namespace.test-set (PK, 'key1', 'key2') VALUES ('myPK', 'value1', 'value2')

2. Save the script to the file 'example.lua'

function get_record_key(rec)
  -- other functions work correctly
  local names = record.bin_names(rec)
  for i, name in ipairs(names) do
    info("bin %d name = %s", i, tostring(name))
  end

  local recKey = record.key(rec)

  if recKey == nil then
    info("Record key is Nil")
  else
    info("Record: key=%s", tostring(recKey))
  end
end

3. Register the module

aql> register module 'example.lua'

4. Watch the server log

$ tail -f /var/log/aerospike/aerospike.log

5. Execute the script

aql> execute example.get_record_key() on test-namespace.test-set

6. You will see INFO output with the two bin names/values and the “Record key is Nil” message.


I am writing records with as_policy_write.AS_POLICY_KEY_SEND, but when I later try to select the key, it is missing from the record. It is a multi-node cluster.
#2

UPD: When I set "send_key" to true

Example: aql> set key_send true

After that, record.key(rec) started returning the PK value for new records and updated records.

But what about records that have not been updated?


#3

You will have to use the digest in that case.
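Putting the two answers together, a UDF could try the stored key first and fall back to the digest. This is a hedged sketch (the function name get_record_id is my own); note that record.digest(rec) returns the record’s digest as bytes, not the original string PK, so a substring check against the PK is not possible for records written without send_key:

```lua
-- Sketch: identify a record by its stored key if available, else its digest.
function get_record_id(rec)
  -- nil unless the record was written with send_key / AS_POLICY_KEY_SEND
  local id = record.key(rec)
  if id == nil then
    -- always available, but a bytes digest rather than the original PK
    id = record.digest(rec)
  end
  info("record id = %s", tostring(id))
end
```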