Description
The HyperNEAT class has a bug, I think. The setup() method reads as follows (excerpt):

```python
class HyperNEAT(BaseAlgorithm):
    def __init__(self, ...):
        ...
        self.pop_size = neat.pop_size

    def setup(self, state=State()):
        state = self.neat.setup(state)
        state = self.substrate.setup(state)
        return self.hyper_genome.setup(state)
```
The signature `def setup(self, state=State()):` should instead be `def setup(self, state: State):`. That way the state is not automatically replaced by a new `State()` object.
Thoughts?
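For context, here is a minimal, generic Python sketch (not TensorNEAT code) of why a default like `state=State()` can surprise callers: the default expression is evaluated once, at function definition time, so every no-argument call shares the same object.

```python
# Stand-in State class for illustration only; TensorNEAT's real State
# is more elaborate than this.
class State:
    def __init__(self):
        self.data = {}

def setup(state=State()):   # default evaluated ONCE, at definition time
    # Mutating the default mutates the single shared instance.
    state.data["count"] = state.data.get("count", 0) + 1
    return state

a = setup()
b = setup()          # no argument: the same default State is reused
print(a is b)        # True: both calls returned one shared instance
```

This sharing is harmless if `setup()` treats the state functionally and returns a fresh object, which is the design the maintainer describes below; it only bites when the default object is mutated in place.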
WLS2002 commented on May 11, 2025
Thanks for the suggestion!
The `HyperNEAT` class (like many other classes in TensorNEAT) inherits from `StatefulBaseClass`, which defines the `setup(self, state=State())` method. The use of `state=State()` in the `setup()` method is intentional. The goal is to support both use cases:

- When called without an explicit `state`, `StatefulBaseClass` creates a new `State()` object. The algorithm then stores its internal setup data in this state and returns a new state object with the updates.
- When called with an existing `state`, the algorithm stores its setup information in that state, again returning a new (modified) state object.

So I believe this design is reasonable, and I'm happy to continue the discussion if needed.
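The two call patterns can be sketched as follows. The names `State`, `StatefulBaseClass`, and `setup` come from the thread, but the internals here (the `register` helper, the dict-backed store) are assumptions for illustration, not TensorNEAT's actual implementation.

```python
class State:
    def __init__(self, **kwargs):
        self._store = dict(kwargs)

    def register(self, **kwargs):
        # Functional-style update: return a NEW State carrying the
        # old entries plus the updates, leaving `self` untouched.
        return State(**{**self._store, **kwargs})

class StatefulBaseClass:
    def setup(self, state=State()):
        # Case 1: no state passed -> the default State() is used.
        # Case 2: an existing state is passed -> setup data is stored
        # in a new state derived from it.
        # Either way, a new state object is returned.
        return state.register(initialized=True)

algo = StatefulBaseClass()
fresh = algo.setup()              # case 1: default state
existing = State(seed=42)
updated = algo.setup(existing)    # case 2: caller-provided state
print(updated is existing)        # False: a new state object is returned
```

Because `setup()` never mutates its argument and always returns a new object, the shared default instance does not leak state between calls.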
mg10011 commented on May 11, 2025
That makes sense. My issue was that I kept getting an error when HyperNEAT.setup() called NEAT.setup() with a RecurrentGenome. For some reason, the state became reinitialized, so my monkey patch was to rewrite NEAT.setup(). However, I see your reasons for the design choice; it seems sensible.
My monkey patch: