
Large Model Output Probabilities: logprobs

Beck Moulton
5 min read · Jun 24, 2024


When the logprobs parameter is enabled in the ChatGPT API, the API returns the log probability of each output token, as well as a limited number of the most likely alternative tokens and their log probabilities at each token position.

https://cookbook.openai.com/examples/using_logprobs

Relevant request parameters include:

  • logprobs: Whether to return the log probabilities of the output tokens. If set to true, the API returns the log probability of each output token in the message content. Not currently available in the gpt-4-vision-preview model.
  • top_logprobs: An integer between 0 and 5 specifying the number of most likely tokens to return at each token position, each with its associated log probability. If this parameter is used, logprobs must be set to true. (A request using both parameters is sketched after this list.)
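
As a concrete illustration, here is a minimal sketch of a request that enables both parameters, assuming the openai Python SDK (v1.x); the model name and prompt are placeholders chosen only for the example:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model; any chat model that supports logprobs
    messages=[{"role": "user", "content": "Say hello in one word."}],
    logprobs=True,        # return the log probability of each output token
    top_logprobs=2,       # also return the 2 most likely tokens at each position
)

# Each entry carries the chosen token, its log probability,
# and the top alternative tokens with their log probabilities.
for token_info in response.choices[0].logprobs.content:
    alternatives = [(alt.token, alt.logprob) for alt in token_info.top_logprobs]
    print(token_info.token, token_info.logprob, alternatives)
```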

The log probability of an output token represents how likely that token is to appear in the sequence given the context. Simply put, the log probability is log(p), where p is the probability of the token conditioned on the preceding tokens in the context. Some key points about logprobs:

  • A higher log probability indicates a higher likelihood of the token appearing in that context. This lets users assess the model's confidence in its output or explore alternative responses the model considered.
  • A log probability can be any negative number or 0.0, where 0.0 corresponds to a 100% probability.
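
Since the API reports log(p), converting back to a linear probability is a simple exponentiation. The short sketch below illustrates the mapping (the logprob values are chosen only for illustration):

```python
import math

# p = exp(logprob); a logprob of 0.0 corresponds to exp(0.0) = 1.0, i.e. 100%.
for logprob in (0.0, -0.01, -0.5, -2.3):
    print(f"logprob={logprob}: probability ≈ {math.exp(logprob):.4f}")
# logprob=0.0: probability ≈ 1.0000
# logprob=-0.01: probability ≈ 0.9900
# logprob=-0.5: probability ≈ 0.6065
# logprob=-2.3: probability ≈ 0.1003
```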


Written by Beck Moulton

Focused on back-end development, sharing hands-on technical experience. Buy me a Coffee if you appreciate my hard work: https://www.buymeacoffee.com/BeckMoulton
