Reputation: 23
I am working on a cache simulator using SimPy on Python 3. I create a Request object like this:
req = Request(req_id, mapper_id, task, source, destination, path, req_offset, req_size, task.job.iotype)
and pass it to the following function:
import copy

def generate_event(req_old, dc, env, event_type):
    req = copy.deepcopy(req_old)
    del req_old
    req.rtype = event_type
    req.set_startTime(env.now)
    event = env.process(readReqEvent(req, dc, env))
If I do not make a deepcopy of the object in generate_event, then whenever I change a field in one object, the other objects I create after the first one see the change. However, deepcopy is expensive and slows the whole simulator down.
My question is: shouldn't the objects be independent rather than sharing the same reference? Am I missing something here?
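Here is a stripped-down illustration of what I mean (a stand-in for my real Request class, which has many more fields):
class Request:
    # stand-in for the real Request class, which has many more fields
    def __init__(self, req_id, rtype=None):
        self.req_id = req_id
        self.rtype = rtype

req = Request(1)
other = req               # no copy: both names point at the same object
other.rtype = 'write'
print(req.rtype)          # prints 'write' -- the change is visible through both names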
Upvotes: 0
Views: 666
Reputation: 45750
"... whenever I change a field in one object, the other objects I create after the first one see the change"
That means that they're all literally the same object. You have multiple references pointing to the same object, not multiple references to multiple objects. Deepcopying fixes that by making sure that you have multiple objects that can operate independently.
If you're dealing with mutable objects, you need to either make copies at the critical steps, so that every consumer receives its own object to manipulate as needed, or make the objects read-only and force the consumers to copy them themselves (which may be better if only some consumers need to make modifications).
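For example, you can confirm the aliasing with an identity check; and if the fields you mutate are immutable (ints, strings), a shallow copy at the critical step is far cheaper than a deepcopy. A sketch using a SimpleNamespace as a stand-in for your Request class:
import copy
from types import SimpleNamespace

req_a = SimpleNamespace(req_id=1, rtype=None)  # stand-in for a Request
req_b = req_a                 # aliasing: no new object is created
print(req_a is req_b)         # True: one object, two names

own_req = copy.copy(req_a)    # shallow copy: cheap, and sufficient when the
own_req.rtype = 'write'       # mutated fields are immutable (ints, strings)
print(req_a.rtype)            # None: the original is untouched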
Upvotes: 1
Reputation: 11229
You don't seem to need the deepcopy. Just rename req_old to req:
def generate_event(req, dc, env, event_type):
    req.rtype = event_type
    req.set_startTime(env.now)
    event = env.process(readReqEvent(req, dc, env))
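Note that this version mutates the caller's object in place, which is fine as long as each event gets a freshly constructed Request at the call site, as in your question, and the caller doesn't reuse it afterwards. A sketch of such a call site, reusing your variable names (num_requests and the 'read' event type are placeholders):
# each iteration builds its own Request, so events never share state
for req_id in range(num_requests):
    req = Request(req_id, mapper_id, task, source, destination,
                  path, req_offset, req_size, task.job.iotype)
    generate_event(req, dc, env, 'read')  # 'read' is a placeholder event type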
Upvotes: 1