Imagine you want to broadcast data produced by a generator. There are many consumers that receive and consume this data. The generator produces data regardless of whether any consumer is present, and every consumer that joins receives the data from the moment it joined. The obvious solution is the observer pattern, but what if the number of consumers is very high and the rate at which data is generated is also very high? In that case, while data 1 is available and you are still broadcasting it to all observers, data 2 becomes ready before the broadcast of data 1 has finished. Is there a solution to this problem?

One thing I am wondering about: can several threads execute the same function in a shared space? In other words, is it possible to run the function once and have all the threads receive the result together?
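To make the bottleneck concrete, a plain observer-pattern broadcast would look roughly like this (just a sketch; the `Subject` and `PrintObserver` names are made up for illustration):

```python
import time


class Subject:
    def __init__(self):
        self.observers = []

    def subscribe(self, observer):
        self.observers.append(observer)

    def broadcast(self, data):
        # Observers are notified one after another, so with many (or slow)
        # observers this loop may still be running when the next piece of
        # data is already available.
        for observer in self.observers:
            observer.update(data)


class PrintObserver:
    def __init__(self, name):
        self.name = name

    def update(self, data):
        time.sleep(0.01)  # pretend the consumer does some work
        print(self.name, data)


subject = Subject()
for i in range(100):
    subject.subscribe(PrintObserver(f'consumer-{i}'))

for count in range(1, 4):
    subject.broadcast({'Name': 'Data', 'Count': count})
```

With 100 observers, `broadcast()` for data 1 can easily still be running when data 2 arrives.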
Here is my current attempt:
```python
import threading
from threading import Event
from time import sleep


class data_provider():
    def __init__(self) -> None:
        self.cnt = 0
        self.stop = False
        self.data_received = Event()
        self.data_received.clear()
        data_generator_thread = threading.Thread(target=self.data_generator)
        data_generator_thread.start()

    def data_sender(self):
        # Each consumer calling this generator waits for the event,
        # yields the current count, then clears the event and waits again.
        self.data_received.wait()
        while not self.stop:
            yield {'Name': 'Data', 'Count': self.cnt}
            self.data_received.clear()
            self.data_received.wait()

    def data_generator(self):
        # Produces a new value every second, whether or not anyone is listening.
        while self.cnt < 20:
            self.cnt += 1
            self.data_received.set()
            sleep(1)
        self.stop = True
        self.data_received.set()
```
and main.py:
```python
from threading import Thread

# data_provider is the class shown above (import it from wherever it is defined)
obj = data_provider()


def printing(id):
    for data in obj.data_sender():
        print(str(id) + ' : ' + str(data))


our_threads = []
for i in range(100):
    our_threads.append(Thread(target=printing, args=(i,)))
    our_threads[i].start()

for i in range(100):
    our_threads[i].join()

print('End')
```
What I would like is for all the threads to effectively run data_sender() once and all receive the same result together, as soon as the generator thread produces it.
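For example, something along these lines is roughly what I have in mind. This is only a rough sketch: the `Broadcaster` class, the `generation` counter and the use of `threading.Condition.notify_all()` are my own guesses, not a tested solution:

```python
import threading
from time import sleep


class Broadcaster:
    def __init__(self):
        self.cond = threading.Condition()
        self.generation = 0      # bumped once per published item
        self.latest = None
        self.stopped = False
        threading.Thread(target=self._generate, daemon=True).start()

    def _generate(self):
        for cnt in range(1, 21):
            with self.cond:
                self.latest = {'Name': 'Data', 'Count': cnt}
                self.generation += 1
                self.cond.notify_all()   # wake every waiting consumer at once
            sleep(1)
        with self.cond:
            self.stopped = True
            self.cond.notify_all()

    def data_sender(self):
        seen = 0
        while True:
            with self.cond:
                # block until a newer item exists or the producer has stopped
                self.cond.wait_for(lambda: self.generation > seen or self.stopped)
                if self.generation == seen and self.stopped:
                    return
                seen = self.generation
                item = self.latest
            yield item
```

Here every waiting thread wakes up on the same published item instead of pulling it through its own call, although a slow consumer can skip items because only the latest value is kept.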
Any ideas?