This endpoint converts a string into tokens. It acts as a simple proxy, forwarding your request to the desired model; every LightOn model is deployed on a vLLM-based image. Supported input:
- `prompt`: simple text string to tokenize
- `messages`: array of chat messages to tokenize (alternative to `prompt`)

Authentication: Bearer token.
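A minimal request sketch, using only the Python standard library. The endpoint URL shown is hypothetical (substitute your deployment's actual URL), and `build_payload` is a helper introduced here to enforce that exactly one of `prompt` or `messages` is supplied:

```python
import json
import urllib.request

# Hypothetical endpoint path -- replace with your deployment's tokenize URL.
API_URL = "https://api.lighton.ai/v1/tokenize"

def build_payload(model, prompt=None, messages=None):
    """Build the request body; exactly one of `prompt` or `messages` is allowed."""
    if (prompt is None) == (messages is None):
        raise ValueError("Provide exactly one of `prompt` or `messages`")
    body = {"model": model}
    if prompt is not None:
        body["prompt"] = prompt
    else:
        body["messages"] = messages
    return body

def tokenize(api_key, model, prompt=None, messages=None):
    """POST the payload with Bearer authentication and return the parsed JSON response."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(model, prompt, messages)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

For example, `tokenize("YOUR_API_KEY", "your-model", prompt="Hello world")` would return the JSON body described under "Successful response" below.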
Successful response

The response serializer for tokenize endpoint results returns the following fields:

- `total_tokens`: total number of tokens in the input text
- `model`: the model used for tokenization
- `tokens`: list of token IDs from the tokenization
- `prompt_tokens`: number of tokens in the prompt (alias for `total_tokens`)
- `object`: object type, always `'tokenization'`
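A sketch of consuming the response fields listed above. The token IDs and model name below are illustrative placeholders, not real output, and `summarize` is a helper invented here to show the `prompt_tokens`/`total_tokens` alias relationship:

```python
def summarize(resp):
    """Validate a tokenize response and return a one-line summary."""
    assert resp["object"] == "tokenization"          # fixed object type
    assert resp["prompt_tokens"] == resp["total_tokens"]  # alias fields match
    assert len(resp["tokens"]) == resp["total_tokens"]    # one ID per token
    return f"{resp['total_tokens']} tokens from {resp['model']}"

# Illustrative response body (values are made up for this example).
example = {
    "object": "tokenization",
    "model": "my-model",
    "tokens": [101, 2023, 102],
    "total_tokens": 3,
    "prompt_tokens": 3,
}
```

With the example above, `summarize(example)` returns `"3 tokens from my-model"`.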