Estimate token usage + dry run option #318

@taylorjdawson

Description

Confirm this is a feature request for the Node library and not the underlying OpenAI API.

  • This is a feature request for the Node library

Describe the feature or improvement you're requesting

There are a couple of Node.js tokenizers lying around npm land. However, they have to be kept maintained and up to date with the official tiktoken. It feels very natural for such a tool to be a part of this repository, updated regularly by the OpenAI team. I feel that pre-request token estimation is vital to ensuring hobbyist developers don't unknowingly succumb to financial demise.

Why a part of this library?
Take function calling for example:

    const response = await openai.chat.completions.create({
        model: "gpt-3.5-turbo",
        messages: messages,
        functions: functions,
        function_call: "auto",  // auto is default, but we'll be explicit
    });

I could intercept the call above to see what data is sent to the API, then pipe the messages + functions into gpt-tokenizer to figure out how many tokens the request takes. But if this were built into the library itself, the Node.js API methods could offer a dry-run option that estimates cost, and the developer could then decide in their own logic whether to proceed with the actual run.
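A dry run along these lines could be sketched roughly as follows. This is a minimal illustration only: `estimateTokens` uses a crude ~4-characters-per-token heuristic as a stand-in for a real tokenizer (tiktoken / gpt-tokenizer), and none of these function names exist in the SDK today.

```javascript
// Hypothetical dry-run estimate for a chat.completions request.
// The ~4 chars/token ratio is a rough average for English text, not tiktoken.
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

// Sum a rough estimate over the messages and the serialized function schemas,
// since both count toward the prompt tokens.
function estimateRequestTokens(params) {
  const messageText = params.messages.map((m) => m.content).join("\n");
  const functionText = JSON.stringify(params.functions ?? []);
  return estimateTokens(messageText) + estimateTokens(functionText);
}

// Same shape of request as the snippet above, estimated before any call is made.
const params = {
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "What's the weather like in Boston?" }],
  functions: [{ name: "get_current_weather", parameters: { type: "object" } }],
};
const estimated = estimateRequestTokens(params);
```

A built-in version would of course use the real tokenizer for the chosen model rather than a heuristic, which is exactly why it belongs next to the SDK.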

It would also be very helpful to include a parameter, either in the configuration or per API call, such that if the token count and/or USD cost exceeds a certain limit, the call automatically fails and is never actually executed.
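That auto-fail limit could look something like the sketch below. `guardCost`, `pricePerTokenUsd`, and `maxCostUsd` are illustrative names invented for this example, not real SDK options, and the price value is a placeholder rather than current pricing.

```javascript
// Hypothetical pre-flight guard: compute the estimated USD cost from an
// estimated token count and abort before the request is sent if it would
// exceed a caller-supplied budget.
function guardCost(estimatedTokens, pricePerTokenUsd, maxCostUsd) {
  const estimatedCostUsd = estimatedTokens * pricePerTokenUsd;
  if (estimatedCostUsd > maxCostUsd) {
    // A real SDK could throw a dedicated error type here instead.
    throw new Error(
      `Estimated cost $${estimatedCostUsd.toFixed(6)} exceeds budget $${maxCostUsd}`
    );
  }
  return estimatedCostUsd;
}

// A call under budget proceeds; an oversized one fails before any tokens are spent.
guardCost(1500, 0.000002, 0.01); // within budget
// guardCost(10_000_000, 0.000002, 0.01) would throw instead of calling the API.
```

The key property is that the check happens before the request goes out, so a runaway loop or an unexpectedly large function schema can never silently burn through a budget.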

Additional context

No response

Metadata

    Labels

    enhancement (New feature or request), openai api (Related to underlying OpenAI API), sdk
