I create two shared-memory NumPy arrays with different sizes, as shown below. The first one works, but the second one raises:

```
TypeError: buffer is too small for requested array
```
If I change the size from 224x224 to 300x300, it works fine. Why?
```python
import numpy as np
from multiprocessing.shared_memory import SharedMemory

### first shm - ok ###
fullFrame_bytes = 1280 * 720 * 3  # assuming RGB
self.fullFrame_buffer_shm = SharedMemory(name="cap_buffer", create=True, size=fullFrame_bytes)
# create an array view for getting results from the shm
fullFrame_shape = [720, 1280, 3]  # needs to be rows, cols as per openCV
self.fullFrame_buffer = np.ndarray(fullFrame_shape, buffer=self.fullFrame_buffer_shm.buf, dtype=np.uint8)

### second shm - not ok ###
segFrame_bytes = 224 * 224 * 4  # RGBA
self.segFrame_buffer_shm = SharedMemory(name="seg_buffer", create=True, size=segFrame_bytes)
# create an array view for getting results from the shm
segFrame_shape = [224, 224, 4]
self.segFrame_buffer = np.ndarray(segFrame_shape, buffer=self.segFrame_buffer_shm.buf, dtype=np.uint8)
```
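For reference, the arithmetic in the second allocation looks self-consistent to me: a `(224, 224, 4)` `uint8` array needs exactly `224 * 224 * 4 = 200704` bytes, and `np.ndarray` only requires that the buffer hold at least `prod(shape) * dtype.itemsize` bytes. Here is a minimal standalone sketch of that check, extracted from the failing case (the segment name `seg_buffer_demo` is a placeholder to avoid clashing with the real one):

```python
import numpy as np
from multiprocessing.shared_memory import SharedMemory

shape = (224, 224, 4)                # RGBA, uint8
nbytes = int(np.prod(shape))         # 224 * 224 * 4 = 200704 bytes (itemsize is 1)

shm = SharedMemory(name="seg_buffer_demo", create=True, size=nbytes)
try:
    # np.ndarray raises "buffer is too small" only if len(buffer) < nbytes
    arr = np.ndarray(shape, dtype=np.uint8, buffer=shm.buf)
    arr_nbytes = arr.nbytes          # bytes the array requires
    buf_len = len(shm.buf)           # bytes the segment actually provides
finally:
    shm.close()
    shm.unlink()
```

Note that `len(shm.buf)` can be larger than the requested `size` (the OS may round the segment up to a page multiple), so in isolation this allocation has enough room for the array.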