I'm trying to make the image/video streaming from here work over a socket so I can send that stream to a server in the cloud. For testing, I host the webpage with Live Server and use a websocket as the transfer mechanism. I was able to make something show on the webpage, but it only seems to show the first frame the camera captured after I ran my code. The webpage does keep receiving frames, but they all look the same. Then I stumbled upon cv2.imshow(), and it does seem that VideoStream isn't giving a new frame/image; yet if I run the code from the source repo, everything works fine.
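For reference, the tutorial's route wraps each JPEG in multipart boundaries for an MJPEG-over-HTTP response, whereas over the websocket I'm sending the raw JPEG bytes as one binary message per frame. The framing difference, roughly (illustrative bytes below, not a real image; in the real code the buffer comes from cv2.imencode):

```python
# Stand-in for the JPEG buffer produced by cv2.imencode (illustrative only).
encoded = b"\xff\xd8...jpeg bytes...\xff\xd9"

# Framing the original Flask route uses (one part of an MJPEG response):
mjpeg_part = (b"--frame\r\n"
              b"Content-Type: image/jpeg\r\n\r\n"
              + encoded + b"\r\n")

# Framing I'm using now: the raw JPEG as a single binary websocket message.
ws_message = encoded

print(mjpeg_part.startswith(b"--frame"))  # True
print(ws_message == encoded)              # True
```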
Here's what I have so far; I'll only include the parts I changed or added:
```python
async def generate():
    ...
    # yield (b'--frame\r\n' b'Content-Type: image/jpeg\r\n\r\n' +
    #        bytearray(encodedImage) + b'\r\n')
    yield encodedImage.tobytes()
```

I made a `main.py` with:
```python
import argparse
import asyncio
import threading

import aioschedule as schedule

from webstreaming import detect_motion
from socket_helper import socket_serve

if __name__ == '__main__':
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)

    # construct the argument parser and parse command line arguments
    ap = argparse.ArgumentParser()
    ap.add_argument("-i", "--ip", type=str, required=True,
                    help="ip address of the device")
    ap.add_argument("-o", "--port", type=int, required=True,
                    help="ephemeral port number of the server (1024 to 65535)")
    ap.add_argument("-f", "--frame-count", type=int, default=32,
                    help="# of frames used to construct the background model")
    args = vars(ap.parse_args())

    # start a thread that will perform motion detection
    t = threading.Thread(target=detect_motion, args=(args["frame_count"],))
    t.daemon = True
    t.start()

    asyncio.get_event_loop().run_until_complete(socket_serve())
    asyncio.get_event_loop().run_forever()
```

I made a socket helper as well:

```python
import time

import websockets
from websockets.sync.client import connect

from webstreaming import generate

HOST_NAME = "localhost"
SOCKET_PORT = 15000

wsConn = any


async def stream_cam_handler(websocket):
    print(f"stream_cam_handler")
    async for frame in generate():
        time.sleep(0.2)
        await websocket.send(frame)


async def handler(websocket, path):
    global wsConn
    wsConn = websocket
    data = await websocket.recv()
    reply = f"Data received as: {data}!"
    await websocket.send(reply)
    await stream_cam_handler(websocket)


def socket_serve():
    start_server = websockets.serve(handler, "localhost", 15000)
    return start_server
```

Then here's what I got for the webpage:
```html
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Document</title>
</head>
<body>
    <h1>Pi Video Surveillance</h1>
    <img id="img_vid_feed" src="/" alt="video_feed"><br>
    <script>
        let image = document.getElementById("img_vid_feed");
        const ws = new WebSocket('ws://localhost:15000');
        ws.binaryType = "blob";
        ws.addEventListener('open', function (event) {
            ws.send('Connection Established');
        });
        ws.onmessage = function (event) {
            image.src = URL.createObjectURL(event.data);
        };
    </script>
</body>
</html>
```