Isaac

Reputation: 147

Sampling from weighted constraint solving in MiniZinc?

Slightly unusual constraint solving problem I'm trying to implement in MiniZinc: I have a CSP that has some hard constraints and a soft constraint that the relations in the solution should have a statistically similar profile to an example solution. That is, if the input has 5 'red' and 12 'green', the found solution doesn't have to have exactly 5 and 12, but should be statistically likely to have a similar distribution...but also allow rare solutions that, for example, have no red.

While I could add a hard constraint that the distribution must match exactly, or enumerate all of the possible solutions and sample from that set, I'd prefer a search strategy that returns one solution that is statistically likely to have a similar distribution (but can vary), or a performance-equivalent way to do it.

Using indomain_random for the value-choice annotation seems like it might work, but from my understanding the only way to apply weighting with it is to pad the domain with repeated values in the desired proportions.

I could, alternately, frame it as an optimization problem and look for a solution that maximizes the similarity to the desired distribution, but I'd prefer something that meets the hard constraints and does a weighted sample from the entire possible solution space for the soft constraints.

Upvotes: 3

Views: 204

Answers (1)

Dekker1

Reputation: 5786

Handling soft constraints can be quite a challenge for most optimisation solvers, and the best way to handle them generally depends on the application. I think there are two approaches for your problem:

  • Given the problem, it seems that you could do the processing of the soft constraints yourself. Since the model is relatively easy to solve, you could run it with only the hard constraints using the -a flag to get all solutions, and then rank the solutions with a custom script. (Maybe have a look at the IPython extension to MiniZinc, which makes processing the solutions easy.) This is probably the most flexible option and leaves it entirely up to you how you want to use the soft constraints. I think this approach works well for your problem, since your soft constraint seems to be more of a ranking mechanism for the solutions.
  • Model the soft constraints directly and formulate an objective function. Soft constraints can be modelled with slack variables or boolean expressions. Recently a specialised language for soft constraints even appeared: MiniBrass. When modelling with soft constraints it is often hard to formulate an objective function: it is not straightforward to decide which combination of satisfied soft constraints is better than another. This technique is still often used, though, as it can be much more efficient than the first approach.
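As a sketch of the first approach: once you have collected all solutions (e.g. from a minizinc -a run) and parsed out the colour counts, a custom script can score each solution against the example profile and draw a weighted sample, which gives the "statistically likely to be similar, but can vary" behaviour the question asks for. Everything here is illustrative, not part of any MiniZinc API; the Gaussian-style similarity score is just one possible choice of weighting.

```python
import math
import random

def similarity(counts, target):
    """Score a solution's colour counts against the target profile.

    Uses a Gaussian-style kernel on the per-colour difference, so an
    exact match scores 1.0 and distant profiles score near 0 (but
    never exactly 0, so rare solutions can still be drawn).
    """
    score = 1.0
    for colour, want in target.items():
        got = counts.get(colour, 0)
        score *= math.exp(-((got - want) ** 2) / (2.0 * max(want, 1)))
    return score

def weighted_sample(solutions, target, rng=random):
    """Pick one solution, weighted by its similarity to the target."""
    weights = [similarity(s, target) for s in solutions]
    return rng.choices(solutions, weights=weights, k=1)[0]

# Hypothetical parsed output of a `minizinc -a` run:
solutions = [
    {"red": 5, "green": 12},   # matches the example profile exactly
    {"red": 4, "green": 11},   # close to the profile
    {"red": 0, "green": 3},    # rare, but still has nonzero weight
]
target = {"red": 5, "green": 12}
picked = weighted_sample(solutions, target)
```

Because every solution keeps a nonzero weight, the no-red solution can still occasionally be sampled, matching the question's requirement.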

In your question you refer to a search strategy using random selection. I believe that this by itself wouldn't be an answer to your problem: for a satisfiability model, searching with random selection only reports the first solution it finds, which may not satisfy any soft constraints; otherwise, you will have to formulate an objective value anyway.

Just as general advice, you might want to try running the model with the similarity constrained to a minimum value and seeing whether that gives the answers you are looking for. Getting these boundaries right might be a great help in the search for solutions. (Setting this minimum value might also greatly improve the performance of the first approach.)
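This advice can also be applied on the script side: before sampling, discard any enumerated solution whose profile falls below a minimum similarity. A minimal sketch, assuming the same kind of illustrative Gaussian-style similarity score (the threshold value is arbitrary; the same bound could instead be expressed as a hard constraint on a similarity expression inside the MiniZinc model, pruning the search itself):

```python
import math

def similarity(counts, target):
    """Gaussian-style similarity between a solution's counts and the target."""
    score = 1.0
    for colour, want in target.items():
        got = counts.get(colour, 0)
        score *= math.exp(-((got - want) ** 2) / (2.0 * max(want, 1)))
    return score

def filter_by_min_similarity(solutions, target, threshold):
    """Keep only solutions whose profile is at least `threshold` similar."""
    return [s for s in solutions if similarity(s, target) >= threshold]

# Hypothetical parsed solutions and example profile:
solutions = [
    {"red": 5, "green": 12},
    {"red": 4, "green": 11},
    {"red": 0, "green": 3},
]
target = {"red": 5, "green": 12}
close = filter_by_min_similarity(solutions, target, threshold=0.5)
```

Note that a hard threshold like this trades away the rare outlier solutions, so it suits the performance-tuning use described above rather than the pure weighted-sampling goal.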

Upvotes: 3
