I'm using FastAPI with WebSockets, and I want to stream responses from ChatGPT to the client as they are generated. This is my websocket endpoint:
```python
@srt_router.websocket("/ws")
async def websocket_endpoint(websocket: WebSocket):
    await websocket.accept()
    try:
        file_name = await websocket.receive_text()
        file_path = os.path.join(folder_path, file_name)
        await websocket.send_text("File received, processing will start shortly...")
        await process_text(file_path, websocket)
        await websocket.send_text("Done")
    except WebSocketDisconnect as e:
        print(f"Problem: {e}")
        return
    except Exception as e:
        return
```
and the process_text code is basically
```python
async def process_text(file: str, websocket_connection):
    with open(out_file, "w") as outfile:
        segments = load_segments(file)
        print('start processing file {}'.format(file))
        print('__________________________________________________________________\n')
        text_gen, seg_count = zip(*TextFromSegments(segments))
        for text in text_gen:
            output_text = send_to_gpt(text)
            print(output_text)
            await websocket_connection.send_text(output_text + "</br>")
            outfile.write(output_text + '\n\n')
    return outfile
```
What happens is that instead of each segment being processed, printed to the console, and sent over the websocket one at a time, nothing appears on the frontend side, and only after the entire file has been processed do all of the websocket messages arrive at once.
Even this line:

```python
await websocket.send_text("File received, processing will start shortly...")
```

which should be sent before process_text even starts, only shows up after everything has finished.
What can I do to make sure the messages are streamed to the client while the processing is still running?
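My guess is that send_to_gpt is a blocking synchronous call, so the event loop never gets a chance to flush the websocket messages while the segments are being processed. Would offloading that call to a worker thread, roughly like the sketch below, be the right approach? (This is just an idea I'm considering, not something I've confirmed works; asyncio.to_thread requires Python 3.9+, and send_to_gpt here is my existing blocking function.)

```python
import asyncio

async def process_text(file: str, websocket_connection):
    with open(out_file, "w") as outfile:
        segments = load_segments(file)
        text_gen, seg_count = zip(*TextFromSegments(segments))
        for text in text_gen:
            # Run the blocking ChatGPT call in a worker thread so the
            # event loop stays free to push each websocket message out
            # as soon as it is ready.
            output_text = await asyncio.to_thread(send_to_gpt, text)
            await websocket_connection.send_text(output_text + "</br>")
            outfile.write(output_text + '\n\n')
    return outfile
```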