Nello

Reputation: 769

Is this expected behaviour for a Set of arrays in Ruby?

We're doing a bit of work in Ruby 1.8.7 that requires traversing and partitioning an undirected graph, and it has been failing weirdly in production. When I distil the failing code down to its barest components, I get this strangely failing test:

require 'set'

it 'should be able to clear a ruby set of arrays' do
  a = ["2", "b", "d"]
  b = ["1", "a", "c", "e", "f"]
  set = Set.new([a, b])
  a.concat(b)

  p "before clear: #{set.inspect}"
  set.clear
  p "after clear: #{set.inspect}"
  set.size.should == 0
end

The test fails with this output:

"before clear: #<Set: {[\"1\", \"a\", \"c\", \"e\", \"f\"], [\"2\", \"b\", \"d\", \"1\", \"a\", \"c\", \"e\", \"f\"]}>"
"after clear: #<Set: {[\"2\", \"b\", \"d\", \"1\", \"a\", \"c\", \"e\", \"f\"]}>"

expected: 0
     got: 1 (using ==)

Attempts to delete from the set also behave in strange ways. I'm guessing that Ruby is getting hung up because the hash values of the set's elements change under concat(), but surely I should still be able to clear the Set. Right?
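For what it's worth, a plain Hash reproduces the lookup failure on its own, since a Set stores its elements as the keys of an internal Hash (a minimal sketch; the extra concat value is just for illustration):

key = ["2", "b", "d"]
h = { key => true }

key.concat(["x"])   # mutating the key in place changes its hash value
h[key]              # => nil -- the entry is still filed under the old hash value
h.rehash            # re-index the table using the keys' current hash values
h[key]              # => true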

Upvotes: 4

Views: 186

Answers (2)

Nello

Reputation: 769

The .dup approach was indeed my first work-around, and it worked as advertised.

I ended up adding the following monkey-patch to Set:

class Set
  # Re-index the underlying Hash so that elements whose hash values
  # have changed since insertion can be found, deleted, and cleared.
  def rehash
    @hash.rehash
  end
end

which allows me to rehash the set's keys after any operation that changes their hash values.
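With the patch loaded, the distilled test case from the question passes once rehash is called after the mutation (a minimal sketch):

require 'set'

a = ["2", "b", "d"]
b = ["1", "a", "c", "e", "f"]
set = Set.new([a, b])
a.concat(b)    # changes a's hash value while it is inside the set

set.rehash     # re-index the set's internal Hash
set.clear
set.size       # => 0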

This all appears to be fixed in Ruby 1.9.

Upvotes: 1

philosodad

Reputation: 1808

There is a workaround for this: if you duplicate the set after you modify the keys, the new set will index the elements under their updated hash values and clear properly. So setting set = set.dup will fix that problem.
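Applied to the failing test from the question, it looks something like this (a sketch; dup rebuilds the set's internal Hash, so the copy is indexed under the elements' current hash values):

require 'set'

a = ["2", "b", "d"]
b = ["1", "a", "c", "e", "f"]
set = Set.new([a, b])
a.concat(b)

set = set.dup   # the copy re-indexes the elements under their current hash values
set.clear
set.size        # => 0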

Upvotes: 1
