As is usually the case with finite domain constraint solvers, this solver is not complete. That is, it does not ensure that the set of posted constraints is satisfiable. One must resort to search (enumeration) to check satisfiability and get particular solutions.
The following predicates provide several variants of search:
indomain(?X)
    where X is a domain variable with a bounded domain. Assigns, in increasing order via backtracking, a feasible value to X.
labeling(:Options, +Variables)
    where Variables is a list of domain variables or integers, and Options is a list of search options. The domain variables must all have bounded domains. True if an assignment of the variables can be found that satisfies the posted constraints.
first_bound(+BB0, -BB)
later_bound(+BB0, -BB)
    Provide an auxiliary service for the value(Enum) option (see below).
minimize(:Goal, ?X)
minimize(:Goal, ?X, +Options)   since release 4.3
maximize(:Goal, ?X)
maximize(:Goal, ?X, +Options)   since release 4.3
    Uses a minimization (maximization) algorithm to find an assignment that minimizes (maximizes) the domain variable X. Goal should be a Prolog goal that constrains X to become assigned, and could be a labeling/2 goal. The algorithm calls Goal repeatedly with a progressively tighter upper (lower) bound on X until a proof of optimality is obtained.
    Whether to enumerate every solution that improves the objective function, or only the optimal one after optimality has been proved, is controlled by Options. If given, it should be a list containing a single atomic value, one of:
    best   since release 4.3
        Only the optimal solution is reported, after its optimality has been proved.
    all   since release 4.3
        Every solution that improves the objective function is reported.
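For illustration, here is a minimal sketch of minimize/2 on a made-up toy model (small_example/3 is not part of the library):

:- use_module(library(clpfd)).

% Made-up toy model: Cost is a linear objective over X and Y.
small_example(X, Y, Cost) :-
    domain([X,Y], 1, 10),
    X + Y #>= 8,
    Cost #= 3*X + 2*Y.

The optimum can then be computed with:

| ?- small_example(X, Y, Cost),
     minimize(labeling([], [X,Y]), Cost).

which yields X=1, Y=7, Cost=17.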
The Options argument of labeling/2 controls the order in which variables are selected for assignment (the variable choice heuristic), the way in which choices are made for the selected variable (the value choice heuristic), whether the problem is a satisfaction problem or an optimization problem, and whether all solutions or only the optimal one should be returned. The options are divided into six groups. One option may be selected per group. In addition, the number of assumptions (choices) made during the search can be counted, and limits on the execution time and on the discrepancy of the search can be imposed.
The following options control the order in which the next variable is selected for assignment:
leftmost
input_order
    The leftmost variable is selected. This is the default.
min
smallest
    The leftmost variable with the smallest lower bound is selected.
max
largest
    The leftmost variable with the greatest upper bound is selected.
ff
first_fail
    The leftmost variable with the smallest domain is selected.
anti_first_fail   since release 4.3
    The leftmost variable with the largest domain is selected.
occurrence   since release 4.3
    The leftmost variable with the largest number of suspended constraints is selected.
ffc
most_constrained
    The leftmost variable with the smallest domain is selected, breaking ties by selecting the variable with the most suspended constraints.
max_regret   since release 4.3
    The leftmost variable with the largest difference between its two smallest domain values is selected.
variable(Sel)
    Sel is a predicate that selects the next variable, given Vars, the list of variables that remain to be labeled. Sel is expected to succeed determinately, unifying Selected and Rest with the selected variable and the remaining list, respectively.
    Sel should be a callable term, optionally with a module prefix, and the arguments Vars, Selected, Rest will be appended to it. For example, if Sel is mod:sel(Param), it will be called as mod:sel(Param,Vars,Selected,Rest).
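For illustration, here is a hypothetical selection predicate (smallest_domain_first/3 is not part of the library). It re-implements the built-in ff heuristic, selecting a variable with the smallest domain, purely to show the calling convention:

:- use_module(library(clpfd)).

% Hypothetical selector for the variable(Sel) option: pick a variable
% with the smallest domain, breaking ties by list position (keysort/2
% is stable).
smallest_domain_first(Vars, Selected, Rest) :-
    size_keyed(Vars, Keyed),
    keysort(Keyed, [_-Selected|_]),
    eq_delete(Vars, Selected, Rest).

size_keyed([], []).
size_keyed([V|Vs], [Size-V|Ks]) :-
    (   integer(V) -> Size = 1
    ;   fd_size(V, Size)
    ),
    size_keyed(Vs, Ks).

% Remove the first element identical (==) to X.  Plain unification must
% be avoided here, because the list elements are unbound domain variables.
eq_delete([V|Vs], X, Rest) :-
    (   V == X -> Rest = Vs
    ;   Rest = [V|Rest1],
        eq_delete(Vs, X, Rest1)
    ).

It could then be used as, e.g., labeling([variable(smallest_domain_first)], Variables).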
The following options control the way in which choices are made for the selected variable X:
step
    Makes a binary choice between X #= B and X #\= B, where B is the lower or upper bound of X. This is the default.
enum
    Makes a multiple choice for X, corresponding to the values in its domain.
bisect
    Makes a binary choice between X #=< M and X #> M, where M is the middle of the domain of X, i.e. the mean of min(X) and max(X) rounded down to the nearest integer. This strategy is also known as domain splitting.
median   since release 4.3
    Makes a binary choice between X #= M and X #\= M, where M is the median of the domain of X. If the domain has an even number of elements, the smaller middle value is used.
middle   since release 4.3
    Makes a binary choice between X #= M and X #\= M, where M is the middle of the domain of X, i.e. the mean of min(X) and max(X) rounded down to the nearest integer.
value(Enum)
    Enum is expected to succeed nondeterminately, pruning the domain of X, and to backtrack one or more times, providing alternative prunings. To ensure that branch-and-bound search works correctly, it must call the auxiliary predicate first_bound(BB0,BB) in its first solution. Similarly, it must call the auxiliary predicate later_bound(BB0,BB) in any alternative solution.
    Enum should be a callable term, optionally with a module prefix, and the arguments X, Rest, BB0, BB will be appended to it. For example, if Enum is mod:enum(Param), it will be called as mod:enum(Param,X,Rest,BB0,BB).
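For illustration, here is a hypothetical value choice predicate (enum_step_down/4 is not part of the library). It mimics the built-in step strategy with descending order: it first tries binding X to its upper bound, then excludes that value on backtracking, calling first_bound/2 and later_bound/2 as required:

:- use_module(library(clpfd)).

% Hypothetical value choice predicate for the value(Enum) option.
enum_step_down(X, _Rest, BB0, BB) :-
    fd_max(X, Max),
    (   first_bound(BB0, BB), X #= Max
    ;   later_bound(BB0, BB), X #\= Max
    ).

It could then be used as, e.g., labeling([ff,value(enum_step_down)], Variables).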
The following options control the order in which the choices are made for the selected variable X. They are not useful with the value(Enum) option:
up
    The domain of X is explored in ascending order. This is the default.
down
    The domain of X is explored in descending order.
The following options tell the solver whether the given problem is a satisfaction problem or an optimization problem. In a satisfaction problem, we wish to find values for the domain variables, but we do not care which ones. In an optimization problem, we wish to find values that minimize or maximize some objective function, reflected in a domain variable:
satisfy   since release 4.3
    The problem is a satisfaction problem; solutions are enumerated by backtracking. This is the default.
minimize(X)
maximize(X)
    The problem is an optimization problem: the search should find an assignment that minimizes (maximizes) the domain variable X. These options can be combined with the time_out/2, best, and all options (see below). If these options occur more than once, the last occurrence overrides previous ones.
The following options are only meaningful for optimization problems. They tell the solver whether to enumerate every solution that improves the objective function, or only the optimal one after optimality has been proved:
best   since release 4.3
    Only the optimal solution is reported, after its optimality has been proved.
all   since release 4.3
    Every solution that improves the objective function is reported.
The following options are only meaningful for optimization problems. They tell the solver what search scheme to use:
bab   since release 4.3
    The search uses a branch-and-bound scheme: each time a solution is found, the bound on the objective is tightened and the search continues from the current search state.
restart   since release 4.3
    Each time a solution is found, the search is restarted from scratch with a tightened bound on the objective.
The following option counts the number of assumptions (choices) made during the search:
assumptions(K)
    When a solution is found, K is unified with the number of choices (assumptions) made during the search.
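For example, using the schematic constraints/2 goal of the examples below, a branch-and-bound search that reports every improving solution and counts the choices made can be set up as:

| ?- constraints(Variables, Cost),
     labeling([ff,bab,all,minimize(Cost),assumptions(K)], Variables).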
Finally, limits on the discrepancy of the search and on the execution time can be imposed:
discrepancy(D)
    On the path leading to each solution, at most D choices may go against the value choice heuristic (limited discrepancy search).
time_out(Time, Flag)
    If combined with the best, bab, and minimize(V) or maximize(V) options, and the time limit Time in milliseconds is reached, then: if at least one solution was found, Flag, Variables and V are unified with the atom time_out and the values of the best solution found, respectively; otherwise, the search merely fails. If used otherwise, this option is equivalent to a goal time_out(labeling(...),Time,Flag), which gives a time budget of Time milliseconds per solution to labeling(...); see library(timeout).
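With the same schematic constraints/2 goal, a best-effort optimization with a ten-second limit can thus be written as:

| ?- constraints(Variables, Cost),
     labeling([ff,bab,best,minimize(Cost),time_out(10000,Flag)], Variables).

If the limit is reached after at least one solution has been found, Flag is unified with time_out, and Variables and Cost are bound to the best solution found.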
For example, to enumerate solutions using a static variable ordering, use:
| ?- constraints(Variables),
     labeling([], Variables).    % same as [leftmost,step,up,satisfy]
To minimize a cost function using branch-and-bound search, computing the best solution only, with a dynamic variable ordering using the first-fail principle, and domain splitting exploring the upper part of domains first, use:
| ?- constraints(Variables, Cost),
     labeling([ff,bisect,down,minimize(Cost)], Variables).
If you want to give a time budget and collect the solutions generated until the time has expired, there is no exported predicate that captures this task, but you can express it, for example, as follows:
| ?- constraints(Variables),
     retractall(soln(_)),
     time_out(( labeling(Options,Variables),
                assertz(soln(Variables)),
                fail
              ; true
              ), Budget, Flag),
     findall(Soln, retract(soln(Soln)), Solutions).
where Flag=success
will hold if all solutions were found, and
Flag=time_out
will hold if the time expired.
The file library('clpfd/examples/tsp.pl')
contains an example of
user-defined variable and value choice heuristics.
Note that, when used for optimization, labeling/2
has a
limitation compared to minimize/[2,3]
and maximize/[2,3]
:
the variable and value choice heuristics specified by labeling/2
must apply to the whole set of variables, with no provision for
different heuristics for different subsets. As of release 4.3, this
limitation has been lifted by the following predicate:
solve(:Options, :Searches)   since release 4.3
    where Options is a list of options, as for labeling/2, and Searches is a list of labeling/2 and indomain/1 goals, or a single such goal. The domain variables of Searches must all have bounded domains. True if the conjunction of Searches is true.
The main purpose of this predicate is optimization: it allows different heuristics to be used in the different Searches. For satisfaction problems, a simple sequence of labeling/2 and indomain/1 goals suffices.
The treatment of the Options, as well as of the suboption lists given in the labeling/2 goals of Searches, is a bit special. Some options are global for the whole search, and are ignored if they occur in the suboption lists. Others are local to the given labeling/2 goal, but provide a default value for the whole search if they occur in Options. The following table defines the role of each option as global or local:
Option           | Role
-----------------|-------
all              | global
anti_first_fail  | local
assumptions/1    | global
bab              | global
best             | global
bisect           | local
discrepancy/1    | local
down             | local
enum             | local
ffc              | local
ff               | local
first_fail       | local
input_order      | local
largest          | local
leftmost         | local
maximize/1       | global
max              | local
max_regret       | local
median           | local
middle           | local
minimize/1       | global
min              | local
most_constrained | local
occurrence       | local
restart          | global
satisfy          | global
smallest         | local
step             | local
time_out/2       | global
up               | local
value/1          | local
variable/1       | local
For example, suppose that you want to minimize a cost function using branch-and-bound search, enumerating every improving solution, using left-to-right search on some variables followed by first-fail domain splitting search on some other variables. This can be expressed as:
| ?- constraints([X1,X2,X3,Y1,Y2,Y3], Cost),
     solve([minimize(Cost),all],
           [labeling([leftmost],[X1,X2,X3]),
            labeling([ff,bisect],[Y1,Y2,Y3])]).