I have a class which is a set of words:
```python
import math
import re
from typing import Self

FREQUENCIES: dict[str, float] = {}
FREQUENCIES['trial'] = 0.023
FREQUENCIES['raise'] = 0.124


def entropy(probability: float) -> float:
    if probability == 0.0:
        return 0.0
    return -probability * math.log(probability) / math.log(2.0)


class Word(str):
    def __new__(cls, value: str = 'aaaaa') -> Self:
        value = value.strip().lower()
        if not re.fullmatch(r'[a-z]{5}', value):
            raise ValueError(f'"{value}" does not follow the pattern [a-z]{{5}}!')
        return super().__new__(cls, value)

    def __init__(self, value: str = 'aaaaa') -> None:
        # .get() so words missing from FREQUENCIES (e.g. the default 'aaaaa')
        # still construct instead of raising KeyError
        self.frequency = FREQUENCIES.get(value, 0.0)


class Universe(set[Word]):
    def __init__(self) -> None:
        super().__init__()
        self.frequency = 0.0
        self.statistics: dict[int, dict[str, float]] = {}
        self.updated = True

    def __iadd__(self, word: Word) -> Self:
        if word in self:
            return self
        self.updated = False
        self.add(word)
        return self

    @property
    def uncertainty(self) -> float:
        if not self.updated:
            self.frequency = sum(word.frequency for word in self)
            self.statistics = {
                i: {
                    chr(letter): sum(
                        wordle.frequency
                        for wordle in self
                        if wordle[i] == chr(letter)
                    )
                    for letter in range(ord('a'), ord('z') + 1)
                }
                for i in range(len(Word()))
            }
            self.updated = True
        return sum(
            sum(
                entropy(letterFrequency / self.frequency)
                for letterFrequency in self.statistics[position].values()
            )
            for position in self.statistics
        )
```
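As a quick sanity check of the `entropy` helper (my own example, not part of the original code): a probability of 0.5 should contribute exactly half a bit.

```python
import math


def entropy(probability: float) -> float:
    # same helper as above: -p * log2(p), with a guard for zero probability
    if probability == 0.0:
        return 0.0
    return -probability * math.log(probability) / math.log(2.0)


print(entropy(0.5))  # → 0.5 (a fair-coin outcome carries half a bit)
print(entropy(0.0))  # → 0.0 (impossible outcomes contribute nothing)
```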
When I add an element (with `+=`), I make sure to mark `self.updated` as dirty.
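The dirty-flag mechanism boils down to this pattern (a minimal standalone sketch with hypothetical names, not the actual `Universe` class):

```python
class Bag(set):
    """A set that flags itself as stale whenever a new element arrives."""

    def __init__(self) -> None:
        super().__init__()
        self.updated = True  # nothing cached yet, but nothing stale either

    def __iadd__(self, item):
        if item not in self:
            self.updated = False  # cached statistics are now out of date
            self.add(item)
        return self


bag = Bag()
bag += 'trial'
print(bag.updated)  # → False, until some consumer recomputes and resets it
```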
This is the tricky part: originally, I created the `@property` to calculate the uncertainty of the `Universe`.
I would like to use it something like this:

```python
universe = Universe()
universe += Word('trial')
universe += Word('raise')
print(f'My universe has {len(universe)} words and an uncertainty of {universe.uncertainty} bits.')
```
The odd side effect I've noticed is that `self.updated` is ALWAYS `True` (I can see it in the debugger window)! If I remove the `@property`, the code runs exactly as I would expect: `self.updated` only becomes `True` after I explicitly call `self.uncertainty()`. What am I missing here?
SOLUTION

I am using Visual Studio Code, and when the variable `universe` appears in the debugger window, its `@property` `uncertainty` shows up as a variable (displaying its value). That means the Visual Studio Code debugger calls the `@property` at every step of my debugging session, and therefore it keeps setting `self.updated` to `True`.

That also explains why the code "works" when I remove the `@property`: the debugger does NOT call `self.uncertainty()` on its own (which is what sets `self.updated` to `True`).
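One way to make the cache immune to this debugger behavior (my own workaround sketch, using a hypothetical `total` property rather than the real `uncertainty`) is `functools.cached_property` with explicit invalidation: the debugger may still evaluate the property, but because the cache is dropped on every mutation, the value it computes can never be stale.

```python
from functools import cached_property


class Numbers(set):
    """Sketch: derive a cached value from a set, invalidating on mutation."""

    def __iadd__(self, item):
        if item not in self:
            self.add(item)
            # drop the cached value; the next access recomputes from current state
            self.__dict__.pop('total', None)
        return self

    @cached_property
    def total(self) -> float:
        return sum(self)


numbers = Numbers()
numbers += 3
numbers += 4
print(numbers.total)  # → 7
```

Deleting the cached attribute is the documented way to clear a `cached_property`, so the property itself stays free of state-mutating side effects.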