I have a simple LangGraph chain in place, and I noticed that the token counts reported in LangSmith are oddly off compared to the OpenAI online tokenizer and a Python tokenizer:
Python program:
import tiktoken

def num_tokens_from_string(string: str) -> int:
    encoding = tiktoken.get_encoding("cl100k_base")
    num_tokens = len(encoding.encode(string))
    return num_tokens

test_string = """ Ctrl+c ctrl+v from langsmith trace """
print(num_tokens_from_string(test_string))

Output: 11185
Question:
- Why are the token counts different between LangSmith and OpenAI?
- How do I configure the correct token-counting method in LangSmith for this Python code?
from langchain_openai import ChatOpenAI

response = ChatOpenAI().invoke("Hello!")

