How much does the ChatGPT API cost (GPT-3.5 & GPT-4)?

Is the ChatGPT API free?

The OpenAI API is not free. The pricing varies depending on the model you choose.

How much does the ChatGPT API cost?

The cost of using the ChatGPT API is based on the number of tokens processed. Tokens are the units of text the model reads and writes, with each word or piece of a word counting as a token.

GPT-3.5 pricing

For the GPT-3.5 model that powers ChatGPT conversations, pricing is tiered based on context length:

  • For the 4k token context, it costs $0.0015 per 1,000 input tokens and $0.002 per 1,000 output tokens generated.

  • For the 16k token context, pricing increases to $0.003 per 1,000 input tokens and $0.004 per 1,000 output tokens.
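The GPT-3.5 rates above translate into a simple per-request estimate. Here is a minimal sketch (the function name and passing rates as defaults are my own choices, not part of any official SDK):

```python
def gpt35_cost(input_tokens, output_tokens,
               input_rate=0.0015, output_rate=0.002):
    """Estimate the cost in USD of one GPT-3.5 (4k context) request.

    Rates are per 1,000 tokens, taken from the pricing tiers above.
    """
    return (input_tokens / 1000) * input_rate + (output_tokens / 1000) * output_rate

# A request with a 1,000-token prompt and a 500-token reply
# costs roughly $0.0025 at the 4k-context rates.
print(gpt35_cost(1000, 500))
```

Swap in the 16k-context rates ($0.003 and $0.004) for requests that need the longer context window.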

GPT-4 pricing

The newly launched GPT-4 has two pricing options depending on context length:

For models with an 8k token context (like GPT-4 and GPT-4-0314), pricing is:

  • $0.03 per 1,000 prompt tokens
  • $0.06 per 1,000 sampled output tokens

For models with a 32k token context (like GPT-4-32k and GPT-4-32k-0314), pricing is:

  • $0.06 per 1,000 prompt tokens
  • $0.12 per 1,000 sampled output tokens
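Since each model tier has its own input and output rate, a small lookup table makes it easy to compare what the same workload costs across tiers. A sketch, assuming the per-1,000-token rates listed above (the dictionary keys and helper function are illustrative, not an official API):

```python
# Per-1,000-token rates in USD, as (input_rate, output_rate).
PRICES = {
    "gpt-3.5-4k":  (0.0015, 0.002),
    "gpt-3.5-16k": (0.003,  0.004),
    "gpt-4-8k":    (0.03,   0.06),
    "gpt-4-32k":   (0.06,   0.12),
}

def estimate_cost(model, prompt_tokens, output_tokens):
    """Estimate a single request's cost in USD for the given tier."""
    in_rate, out_rate = PRICES[model]
    return (prompt_tokens * in_rate + output_tokens * out_rate) / 1000

# The same 2,000-token prompt and 1,000-token reply on each tier:
for model in PRICES:
    print(f"{model}: ${estimate_cost(model, 2000, 1000):.4f}")
```

Running this shows the spread clearly: the request that costs half a cent on GPT-3.5 (4k) costs $0.12 on GPT-4 (8k) and $0.24 on GPT-4-32k.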

Importantly, the above API pricing is separate from the ChatGPT Plus subscription that runs $20 per month. The Plus subscription only covers usage on chat.openai.com.

What is a token?

To put the token amounts into perspective, 1,000 tokens equates to approximately 750 words.
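That 750-words-per-1,000-tokens rule of thumb gives a quick way to estimate token counts (and therefore cost) from a word count before calling the API. This is only a rough heuristic for English text; for exact counts you would tokenize the actual text, e.g. with OpenAI's tiktoken library:

```python
def estimate_tokens(word_count):
    """Rough token estimate: ~750 words per 1,000 tokens."""
    return round(word_count * 1000 / 750)

# A 1,500-word document is roughly 2,000 tokens.
print(estimate_tokens(1500))  # 2000
```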

In summary, ChatGPT API costs scale based on the power of the model used and number of tokens required, with more advanced models and longer conversations carrying higher fees. Users should select the pricing plan that aligns with their unique use case and projected volume.

Visit AI.LS to get a ChatGPT API key.