infinite value while using cumulative constraint - minizinc

I am stuck with a cumulative constraint that I don't seem to be using properly, and I'm looking for help! :)
I have tasks with precedence and, for some of them, required & forbidden resources.
I need to decide when to start a task & who to assign to it.
To do so, I'm using an array of decision variables, resource_allocation:
array[Tasks, Resources] of var 0..1: resource_allocation; %Selection of resources per task.
To manage the required/forbidden resources, I used the following:
% Mandatory & Forbidden constraint allocation
constraint forall(t in Tasks, r in resource_required[t])(resource_allocation[t,r]=1);
constraint forall(t in Tasks, r in resource_forbidden[t])(resource_allocation[t,r]=0);
resource_required and resource_forbidden are arrays of sets of int storing the resource numbers that are required/forbidden for each task.
Each resource represents one worker, and each worker can only perform one task at a time, so I am trying to state that, for every resource, the cumulative usage can be at most 1.
It might be important to note that start is also a decision variable.
% Constraint allocated to only one task at a time
constraint forall(t in Resources)(
cumulative(start, duration, [resource_allocation[t, r] | t in Tasks], 1)
);
Doing so, I always end up with the following error
JC:70.12-82
in call 'cumulative'
cumulative:21.3-49.7
in binary '/\' operator expression
cumulative:25.3-49.7
in if-then-else expression
cumulative:26.5-48.9
in binary '/\' operator expression
cumulative:30.5-48.9
in if-then-else expression
cumulative:47.7-32
in call 'fzn_cumulative'
fzn_cumulative:4.9-20.17
in let expression
fzn_cumulative:8.13-20.17
in if-then-else expression
fzn_cumulative:10.17-19.17
in let expression
fzn_cumulative:12.21-74
in variable declaration for 'late'
in call 'max'
with i = <expression>
MiniZinc: evaluation error: arithmetic operation on infinite value
Process finished with non-zero exit code 1.
I need a little guidance; I looked at the source code of fzn_cumulative, but I don't get what is going on.
Thanks !

You might consider limiting the domains of your decision variables.
int is unlimited (disregarding the limited number of bits per int) and may lead to overflow situations or complaints about infinite values.
include "globals.mzn";
set of int: Tasks = 1..3;
set of int: Resources = 1..3;
set of int: Durations = 1..10;
set of int: Times = 1..1000;
% Use this editor as a MiniZinc scratch book
array[Tasks, Resources] of var 0..1: resource_allocation; %Selection of resources per task.
array [Tasks] of var Times: start;
array [Tasks] of var Durations: duration;
array [Tasks] of set of Resources: resource_required = [{1,2},{2,3},{3}];
array [Tasks] of set of Resources: resource_forbidden = [{},{1},{1}];
% Mandatory & Forbidden constraint allocation
constraint forall(t in Tasks, r in resource_required[t])(resource_allocation[t,r]=1);
constraint forall(t in Tasks, r in resource_forbidden[t])(resource_allocation[t,r]=0);
% Constraint allocated to only one task at a time
% Changed "t in Resources" to "r in Resources"
constraint forall(r in Resources)(
cumulative(start, duration, [resource_allocation[t, r] | t in Tasks], 1)
);
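If you later want an objective (for example minimizing the makespan, which is not part of your question but is common in this kind of scheduling model), the same advice applies: give the objective variable a bounded domain as well. A sketch, replacing the solve satisfy above (the name makespan is just a suggestion):
% hypothetical objective with a bounded domain
var 2..max(Times) + max(Durations): makespan = max(t in Tasks)(start[t] + duration[t]);
solve minimize makespan;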

Related

Few minizinc questions on constraints

A little bit of background. I'm trying to make a model for clustering a Design Structure Matrix (DSM). I made a draft model and have a couple of questions. Most of them are not directly related to DSM per se.
include "globals.mzn";
int: dsmSize = 7;
int: maxClusterSize = 7;
int: maxClusters = 4;
int: powcc = 2;
enum dsmElements = {A, B, C, D, E, F,G};
array[dsmElements, dsmElements] of int: dsm =
[|1,1,0,0,1,1,0
|0,1,0,1,0,0,1
|0,1,1,1,0,0,1
|0,1,1,1,1,0,1
|0,0,0,1,1,1,0
|1,0,0,0,1,1,0
|0,1,1,1,0,0,1|];
array[1..maxClusters] of var set of dsmElements: clusters;
array[1..maxClusters] of var int: clusterCard;
constraint forall(i in 1..maxClusters)(
clusterCard[i] = pow(card(clusters[i]), powcc)
);
% #1
% constraint forall(i, j in clusters where i != j)(card(i intersect j) == 0);
% #2
constraint forall(i, j in 1..maxClusters where i != j)(
card(clusters[i] intersect clusters[j]) == 0
);
% #3
% constraint all_different([i | i in clusters]);
constraint (clusters[1] union clusters[2] union clusters[3] union clusters[4]) = dsmElements;
var int: intraCost = sum(i in 1..maxClusters, j, k in clusters[i] where k != j)(
(dsm[j,k] + dsm[k,j]) * clusterCard[i]
) ;
var int: extraCost = sum(el in dsmElements,
c in clusters where card(c intersect {el}) = 0,
k,j in c)(
(dsm[j,k] + dsm[k,j]) * pow(card(dsmElements), powcc)
);
var int: TCC = trace("\(intraCost), \(extraCost)\n", intraCost+extraCost);
solve maximize TCC;
Question 1
I was under the impression that constraints #1 and #2 are the same. However, it seems they are not. The question here is: why? What is the difference?
Question 2
How can I replace constraint #2 with all_different? Does it make sense?
Question 3
Why does the trace("\(intraCost), \(extraCost)\n", intraCost+extraCost); show nothing in the output? The output I see using Gecode is:
Running dsm.mzn
intraCost, extraCost
clusters = array1d(1..4, [{A, B, C, D, E, F, G}, {}, {}, {}]);
clusterCard = array1d(1..4, [49, 0, 0, 0]);
----------
<snipped to save space>
----------
clusters = array1d(1..4, [{B, C, D, G}, {A, E, F}, {}, {}]);
clusterCard = array1d(1..4, [16, 9, 0, 0]);
----------
==========
Finished in 5s 419msec
Question 4
With the expression constraint (clusters[1] union clusters[2] union clusters[3] union clusters[4]) = dsmElements; I wanted to say that the union of all clusters should match the set of all nodes. Unfortunately, I did not find a way to make this big union more dynamic, so for now I just list all clusters manually. Is there a way to make this expression return the union of all sets from the array of sets?
Question 5
Basically, if I understand it correctly (for example from here), the intra-cluster cost is the sum of all interactions within a cluster multiplied by the size of the cluster to some power, basically the cardinality of the set of nodes that represents the cluster.
The extra-cluster cost is the sum of interactions between some element that does not belong to a cluster and all elements of that cluster, multiplied by the cardinality of the whole space of nodes to some power.
The main question here is: are the intraCost and extraCost in the model correct (they seem to be, but still), and is there a better way to express these sums?
Thanks!
(Perhaps you would get more answers if you separate this into multiple questions.)
Question 3:
Here's an answer on the trace question:
When running the model, the trace actually shows this:
intraCost, extraCost
which is not what you expect, of course. trace takes effect when the model is created (flattened), but at that stage these two decision variables have no values yet, so MiniZinc shows only the variable names. They only get values once the (first) solution is reached, and can then be shown in the output section.
trace is mostly used to see what's happening in loops where one can trace the (fixed) loop variables etc.
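For example (a small sketch reusing the maxClusters parameter from your model), the following prints one line per loop iteration while the model is being flattened:
constraint forall(i in 1..maxClusters)(
  trace("building constraints for cluster \(i)\n", true)
);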
If you trace an array of decision variables, it will be represented in a different fashion: the array x will be shown as X_INTRODUCED_0_ etc.
And you can also use trace for domain reflection, e.g. using lb and ub to get the lower/upper value of the domain of a variable ("safe approximation of the bounds" as the documentation states it: https://www.minizinc.org/doc-2.5.5/en/predicates.html?highlight=ub_array). Here's an example which shows the domain of the intraCost variable:
constraint
trace("intraCost: \(lb(intraCost))..\(ub(intraCost))\n")
;
which shows
intraCost: -infinity..infinity
You can read a little more about trace here https://www.minizinc.org/doc-2.5.5/en/efficient.html?highlight=trace .
Update: answers to questions 1, 2 and 4.
Constraints #1 and #2 mean the same thing, i.e. that the clusters should be disjoint. Constraint #1 is a little different in that it loops over decision variables, while constraint #2 uses plain indices. One can guess that #2 is faster, since #1 uses where i != j on decision variables, which must be translated to some extra constraints. (And using i < j instead should be a little faster.)
The all_different constraint states about the same thing, and depending on the underlying solver it might be faster if it is translated to an efficient algorithm in the solver.
In the model there is also the following constraint which states that all elements must be used:
constraint (clusters[1] union clusters[2] union clusters[3] union clusters[4]) = dsmElements;
Apart from efficiency, all the constraints above can be replaced with one single constraint, partition_set, which ensures that the clusters are disjoint and that all elements in dsmElements are used in the clusters.
constraint partition_set(clusters,dsmElements);
It might be faster to also combine with the all_different constraint, but that has to be tested.
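A sketch of that combination (untested; note that all_different on set variables also forbids two identical clusters, so it would rule out solutions with more than one empty cluster):
constraint partition_set(clusters, dsmElements);
constraint all_different(clusters);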

Why my array is of type var int instead of var set of int?

I have the following problem: I want to call the global constraint at_most but I got an error related to the signature
constraint forall(i in 0..w-1)(at_most(l_max, [board[i,j] | j in 0..l_max-1], 0..n));
The second argument does not match because it turns out to be var int instead of var set of int, even though I have previously defined board in this way:
set of int: VALUES = 0..n;
array[0..w-1,0..l_max-1] of var VALUES: board;
Just as a general message: at_most is among the list of deprecated constraints: https://www.minizinc.org/doc-2.5.5/en/lib-globals.html#deprecated-constraints.
Instead, you should use a count constraint. These constraints are more flexible and better supported by the solvers.
In this case there seems to be a misconception about what at_most does. at_most only restricts the number of times a single value occurs, whereas you are giving it a full set of values.
If you are counting all the different values, then you can instead use global_cardinality_low_up. (You might also want to look at the closed version.)
I think you meant to write the following constraint.
constraint forall(i in 0..w-1)(
global_cardinality_low_up([board[i,j] | j in 0..l_max-1], 0..n, [0 | i in 0..n], [l_max | i in 0..n])
);
This constraint ensures that, in each comprehension, every value in 0..n occurs at most l_max times.
Note that if you are using the comprehension to select a full row, then it would be better to use slice notation: board[i,..].
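For completeness, a sketch of the slice-based variant (here the cover and the bounds are written as comprehensions; whether the plain range 0..n is accepted for the cover may depend on the MiniZinc version):
constraint forall(i in 0..w-1)(
  global_cardinality_low_up(board[i,..],
    [v | v in 0..n],        % the values to count
    [0 | v in 0..n],        % each value occurs at least 0 times ...
    [l_max | v in 0..n])    % ... and at most l_max times
);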

Find the smallest alldifferent array whose sum is n

This seems like such a simple problem, but I can't find a simple way to represent this in MiniZinc.
include "globals.mzn";
int: target;
int: max_length;
var 1..max_length: length;
array[1..length] of int: t;
constraint sum(t) = target;
constraint alldifferent(t);
solve minimize length;
This program errors with:
MiniZinc: type error: type-inst must be par set but is ``var set of int'
Is there a clean/simple way to represent this problem in MiniZinc?
Arrays in MiniZinc have a fixed size. The compiler is therefore saying that array[1..length] of int: t is not allowed, because length is a variable.
The alternative that MiniZinc offers is arrays with optional types; these are values that might or might not exist. This means that when you write something like [t | t in 1..length], it will actually give you an array over 1..max_length, but some elements can be marked as absent (<>).
For this particular problem you are also overlooking the fact that t should itself be an array of variables; the values of t are not yet known at compile time. A better way to formulate this problem is thus to allow the values of t to be 0 when they are beyond the chosen length:
include "globals.mzn";
int: target;
int: max_length;
var 1..max_length: length;
array[1..max_length] of var int: t;
constraint sum(t) = target;
constraint alldifferent_except_0(t);
constraint forall(i in length+1..max_length) (t[i] = 0);
solve minimize length;
The next step to improve the model would be to ensure that the initial domain of t makes sense, and to force an ordering instead of all-different: this is equivalent, but eliminates some symmetry from the possible solutions.
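A sketch of that next step (this exact formulation is just one possibility): bound the values of t by the target and force a strictly increasing prefix, which implies that the used values are all different and removes the symmetry of permuting them:
int: target;
int: max_length;

var 1..max_length: length;
% no element can exceed the target, so bound the domain accordingly
array[1..max_length] of var 0..target: t;

constraint sum(t) = target;
% exactly the first `length` positions are used (positive), the rest are 0
constraint forall(i in 1..max_length)(t[i] > 0 <-> i <= length);
% the used prefix is strictly increasing, which implies distinct values
constraint forall(i in 1..max_length-1)(i < length -> t[i] < t[i+1]);

solve minimize length;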

Minizinc: optimal ordering on table feature

I have a table with features = {A,B}. B is a column of integers. Scanning the table, whenever the value in column B changes, I increment a variable changes by 1:
if data[i,B]!=data[i-1,B]
then changes=changes+1
I want to find an ordering that minimizes changes and at the same time keeps the number of repetitions of a value in B within [0, upper_bound].
I'm thinking of using an array of decision variables that stores the position j for element i:
order[i]=j means that element i in data is the j-th element in the ordering.
How can I model this with constraints? This is what I have so far:
array[1..n, Features] of int: data;
int: changes=0;
constraint
forall(i in 1..n) (
if data[i,B] != data[i-1,B] then
changes=changes+1
endif
)
;
minimize changes;
I think I'm wrong using changes as a constant variable, right? Thank you in advance.
In MiniZinc (and in constraint programming in general) you cannot increment a variable as in changes = changes+1.
If changes is a variable used only for the total count of changes you can use sum instead, something like:
% ...
var 0..n: changes;
constraint
changes = sum([data[i,B] != data[i-1,B] | i in 2..n])
;
% ...
However, if you want to use the number of accumulated changes for each i then you have to create a changes array to collect the values for each step, e.g.
array[1..n] of var 0..n-1: changes;
% the total number of changes (to minimize)
var 0..n-1: total_changes = changes[n];
constraint
  changes[1] = 0 /\
  forall(i in 2..n) (
    if data[i,B] != data[i-1,B] then
      changes[i] = changes[i-1]+1
    else
      changes[i] = changes[i-1]
    endif
  )
;
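The ordering itself (the order[i] = j variable you describe) is not modelled in the snippets above. A hypothetical sketch of that part (the names order and row_at are just suggestions), using the inverse global to relate positions and rows:
include "globals.mzn";

% order[i] = j means row i of data is placed at position j
array[1..n] of var 1..n: order;
% row_at is the inverse permutation: row_at[p] is the original row at position p
array[1..n] of var 1..n: row_at;
constraint inverse(order, row_at);

% count a change whenever two consecutive positions hold different B values
var 0..n-1: changes = sum(p in 2..n)(bool2int(data[row_at[p],B] != data[row_at[p-1],B]));

solve minimize changes;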

Assigning times to events

include "globals.mzn";
%Data
time_ID = [11,12,13,14,15];
eventId = [0011, 0012, 0013, 0021, 0022, 0031, 0041, 0051, 0061, 0071];
int:ntime = 5;
int:nevent = 10;
set of int: events =1..nevent;
set of int: time = 1..ntime;
array[1..nevent] of int:eventId;
array[1..nevent] of var time:event_time;
array[1..ntime] of int:time_ID;
solve satisfy;
constraint
forall(event in eventId)(
exists(t in time_ID)(
event_time[event] = t ));
output[ show(event_time) ];
I'm trying to assign times to events using the code above.
But rather than randomly assigning times to the events, it returns the error "array access out of bounds".
How can I make it select randomly from the time array?
Thank you
The error occurs because you used the value 11 (the first element of the eventId array) as an index into the event_time array, which is only indexed by 1..nevent.
An assignment of just 1's is correct, since you haven't put any other constraints on the event_time array. If you set the number of solutions to, say, 3 you will see other solutions. And, in fact, the constraint as it stands is not really meaningful, since it just ensures that there is some assignment to the elements of event_time, and that is already handled by the domain of event_time (i.e. that all values are in the range 1..ntime).
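A minimal sketch of the fix: loop over the event positions (1..nevent) rather than over the eventId values, which avoids the out-of-bounds access (though, as noted, the exists part is then already implied by the domain of event_time):
constraint
  forall(e in events)(
    exists(t in time)(event_time[e] = t)
  );
% if the actual time ID is needed, it can be looked up by position, e.g. time_ID[event_time[e]]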