Returns the text from its tokenized representation.
decode(tokens, options?)
tokens
An array of integer tokens.
options optional
An object containing the following optional properties:
model optional
The name of the model to use. Defaults to the current codebook model.
For example, "llama2-7b-chat" or "gpt-3.5-turbo".
specials optional
An object that maps additional special tokens to their integer values.
For example: { "<|endoftext|>": 50256 }. The additional tokens are
merged with the default special tokens for the model.
(Note that not all models support additional special tokens.)
The decoded string.
ai.decode([1, 29871, 22172], { model: 'llama2-7b-chat' })
"hello"
ai.decode([100264, 15339, 1917, 0], { model: "gpt-4", specials: { "<|im_start|>": 100264 } })
"<|im_start|>hello world!"
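To illustrate how the specials option interacts with decoding, here is a minimal sketch of a decoder that merges user-supplied special tokens over a model's defaults and inverts the map for lookup by id. The names `defaultSpecials` and the `vocab` option are illustrative assumptions, not part of the real API, and the toy vocabulary stands in for a model's actual codebook:

```javascript
// Hypothetical defaults; a real model ships its own special-token map.
const defaultSpecials = { "<|endoftext|>": 50256 };

function decode(tokens, options = {}) {
  // User-supplied specials are merged over the defaults,
  // so a caller can add or override entries.
  const specials = { ...defaultSpecials, ...(options.specials ?? {}) };
  // Invert the map so integer ids look up their string form.
  const byId = new Map(
    Object.entries(specials).map(([text, id]) => [id, text])
  );
  // Toy vocabulary standing in for the model's real codebook.
  const vocab = options.vocab ?? {};
  return tokens.map((t) => byId.get(t) ?? vocab[t] ?? "").join("");
}

// Usage with a toy vocabulary:
const text = decode([100264, 1, 2], {
  vocab: { 1: "hello", 2: " world" },
  specials: { "<|im_start|>": 100264 },
});
// text === "<|im_start|>hello world"
```

Note that unknown ids fall back to the empty string here; a real decoder would instead consult the model's full vocabulary (and handle byte-level or BPE merges).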