Description
I assume that the LLM App API client wrapper for the OpenAI API does not currently support the streaming completions feature.
It would be nice to have it, so that ChatGPT responses can be streamed incrementally into Pathway's output connectors such as Kafka, Redpanda, or Debezium.
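A minimal sketch of what consuming a streamed completion could look like. This assumes the chunk shape produced by the OpenAI Python SDK with `stream=True` (see the cookbook link below); the simulated chunks here stand in for a real streamed response, since an actual call needs an API key. The `on_delta` callback is a hypothetical hook where each fragment could be forwarded to an output connector:

```python
# Sketch: accumulate streamed delta fragments from an OpenAI-style
# chat-completion stream and hand each fragment to a sink callback.
# The chunk dicts below simulate events from a stream=True response;
# a real call would be made with the openai SDK and an API key.

def collect_stream(chunks, on_delta=None):
    """Join streamed content deltas into the final response text."""
    parts = []
    for chunk in chunks:
        delta = chunk["choices"][0].get("delta", {})
        content = delta.get("content")
        if content:
            parts.append(content)
            if on_delta:
                on_delta(content)  # e.g. produce to Kafka/Redpanda here

    return "".join(parts)

# Simulated stream mirroring the shape of real streamed chunks.
simulated = [
    {"choices": [{"delta": {"role": "assistant"}}]},
    {"choices": [{"delta": {"content": "Hello"}}]},
    {"choices": [{"delta": {"content": ", world!"}}]},
    {"choices": [{"delta": {}, "finish_reason": "stop"}]},
]

print(collect_stream(simulated))
```

With this pattern, each delta could be emitted to a connector as it arrives rather than waiting for the full completion.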
References:
- https://github.com/openai/openai-cookbook/blob/main/examples/How_to_stream_completions.ipynb
- https://platform.openai.com/docs/api-reference/completions/create#completions/create-stream