chatgpt-tokenizer.com splits long texts into several parts that fit within ChatGPT's maximum input length of 2048 tokens, and attaches extra context to each part so ChatGPT can process them more coherently. https://chatgpt-tokenizer.com/
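The underlying chunking idea can be sketched in a few lines of Python. Note this is only an illustration, not the tool's actual implementation: a real tokenizer (ChatGPT uses BPE tokens) counts differently, so whitespace-separated words stand in here as a rough token approximation, with the 2048 limit taken from the description above.

```python
def chunk_text(text: str, max_tokens: int = 2048) -> list[str]:
    """Split text into chunks of at most max_tokens "tokens".

    Words split on whitespace stand in for real BPE tokens,
    so counts only approximate what ChatGPT actually sees.
    """
    words = text.split()
    return [
        " ".join(words[i:i + max_tokens])
        for i in range(0, len(words), max_tokens)
    ]

# Example: 5000 words under a 2048-token cap yields 3 chunks
parts = chunk_text(" ".join(["word"] * 5000))
```

For an accurate count one would swap the whitespace split for a proper tokenizer library, keeping the same sliding-window loop.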