Pattern Reduction in Paper Cutting

A large part of the paper industry involves supplying customers with reels of specified width in specified quantities. These ‘customer reels’ must be cut from a set of wider ‘jumbo reels’ as economically as possible. The first priority is to satisfy the customer demands using as few jumbo reels as possible. This is an example of the well-known one-dimensional cutting stock problem, which can be solved by creating a series of patterns, each corresponding to a different set of reels that can be cut from a single jumbo. A secondary problem is to minimize the number of patterns needed, in order to reduce the frequency of cutting-knife resets. Existing methods account for 90% of cases; the problem posed was to provide methods for the remaining 10%. This is shown to be an NP-hard problem, and so efficient heuristics are proposed and tested for reducing further the frequency of resets.


Background

A large part of the paper industry involves supplying customers with reels of specified width in specified quantities. These 'customer reels' must be cut from a set of wider 'jumbo reels', in as economical a way as possible. The first priority is to minimize the waste, i.e. to satisfy the customer demands using as few jumbo reels as possible.
This is an example of the one-dimensional cutting stock problem, which has an extensive literature [1,2,3,4]. Greycon have developed cutting stock algorithms which they include in their software packages. Mathematically, the problem may be stated as

CUTTING STOCK
Input: a positive integer J (the jumbo width), distinct positive integers r1, ..., rs (the various customer reel widths), and positive integers d1, ..., ds (the quantity of each customer width that must be produced).
Task: use as few jumbos of width J as possible to satisfy the demand for dα customer reels of width rα (for each α = 1, ..., s).
A solution to CUTTING STOCK consists of a series of 'patterns', each corresponding to a different (unordered) set of customer reels that may be cut from a single jumbo. The solution also specifies how many jumbos must be cut according to each pattern to satisfy all demands. Typically, the number of patterns in a solution is in the range 5-30, with occasional large problems of up to 80 patterns. Each pattern consists of 2-10 customer reels and is used for 1-15 jumbos.
Within the class of solutions having minimum waste, a secondary problem is to minimize the number of different patterns needed. The motivation here is to reduce the number of times that the knives have to be reset on the paper-cutting machine. The specification now becomes

PATTERN
Input: a positive integer J, distinct positive integers r1, ..., rs and positive integers d1, ..., ds.
Task: find a minimum-waste solution to the corresponding instance of CUTTING STOCK which further minimizes the number of patterns used.

The problem studied here is the second part of PATTERN, i.e. we assume that a solution of minimum waste (or very nearly so) is given, and the task is to refine it by reducing the number of patterns without increasing the waste.
It is convenient to express a collection of patterns in the form of an array P. Each row (indexed by i, taking values 1, ..., n) represents a single pattern, which we shall denote Pi; each column (indexed by α, taking values 1, ..., s) corresponds to a customer reel width. The entry Piα specifies how many times customer width rα appears in pattern Pi. This array P is supplemented with a list of 'multiplicities', ci, giving the number of times each pattern is used.
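As a concrete illustration of this representation, the following Python sketch builds a small array P with multiplicities and checks that it meets every demand exactly and that every pattern fits on a jumbo. The widths, demands and function name are invented for illustration, not data from the report.

```python
def check_solution(P, c, r, d, J):
    """Verify that patterns P with multiplicities c meet every demand
    exactly and that no pattern exceeds the jumbo width J."""
    s = len(r)
    for a in range(s):
        produced = sum(c[i] * P[i][a] for i in range(len(P)))
        if produced != d[a]:
            return False
    # Feasibility: the total width of each pattern must not exceed J.
    return all(sum(r[a] * Pi[a] for a in range(s)) <= J for Pi in P)

r = [620, 400, 350]      # customer reel widths (invented)
d = [4, 3, 2]            # demand for each width
J = 2000                 # jumbo width
P = [[2, 1, 0],          # pattern 1: two 620s and one 400
     [0, 1, 2]]          # pattern 2: one 400 and two 350s
c = [2, 1]               # pattern 1 cut twice, pattern 2 once
```

Here the column sums weighted by c reproduce the demands d, and the row sums weighted by r stay below J.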
There are two types of reduction that Greycon already implement in their software. First, there are 2 → 1 reductions, which replace two patterns Pi and Pj with a single pattern Pk, used ci + cj times. This is possible whenever Pk can be found such that ci Piα + cj Pjα = (ci + cj) Pkα for all α. A necessary and sufficient condition is that ci Piα + cj Pjα be divisible by ci + cj for all α.
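The divisibility test can be sketched as follows (an illustrative Python fragment; the function name is ours). No separate feasibility check is needed for the merged pattern, since its total width is a weighted average of two feasible widths.

```python
def two_to_one(Pi, ci, Pj, cj):
    """Return the merged pattern Pk of a 2 -> 1 reduction, or None if the
    divisibility condition fails for some width."""
    ck = ci + cj
    Pk = []
    for pia, pja in zip(Pi, Pj):
        mixed = ci * pia + cj * pja   # reels of this width from both patterns
        if mixed % ck != 0:
            return None               # cannot be spread evenly over ck jumbos
        Pk.append(mixed // ck)
    return Pk
```

For example, (3,1,0) used once and (1,3,0) used once merge into (2,2,0) used twice, whereas (2,1) and (1,1) with equal multiplicities admit no merge.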
Secondly, there are the so-called 3 → 2 'staircase' reductions, which take three patterns of a generic form in which the customer reel widths appear in the top row of a small table and the pattern multiplicities in the rightmost column. The condition for three patterns Pi, Pj, Pk to be the subject of a 3 → 2 staircase is that one of them should be completely 'covered' by the other two, and these two must have equal multiplicity, i.e. Pjα ≤ Piα + Pkα for all α, and ci = ck. It must also in general be checked that the patterns reached by the reduction are feasible, in the sense that Σα rα Piα ≤ J for all i.

The remainder of the report presents the further contributions made by participants in the Study Group. Having shown that the pattern reduction problem is NP-hard (even when the cutting stock problem is easy), we go on to put forward proposals for designing efficient heuristics, for which the computational budget imposed by Greycon was 1 minute on a Pentium-based PC. Initial application to several test problems is also described.
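One reading of the staircase reduction, consistent with Remark (6) later in the report, is that the covered pattern Pj absorbs the multiplicity of Pi, while Pk is replaced by Pi + Pk − Pj; the feasibility check then applies to this new pattern. The Python sketch below encodes that reading. The replacement rule and function name are our assumptions, not stated explicitly in the surviving text.

```python
def staircase_3_to_2(Pi, ci, Pj, cj, Pk, ck, r, J):
    """Apply a 3 -> 2 staircase reduction if the condition holds
    (ci = ck and Pj covered by Pi + Pk).  Returns the two replacement
    patterns with their multiplicities, or None."""
    if ci != ck:
        return None
    if any(pj > pi + pk for pi, pj, pk in zip(Pi, Pj, Pk)):
        return None                   # Pj is not covered by Pi + Pk
    Q = [pi + pk - pj for pi, pj, pk in zip(Pi, Pj, Pk)]
    if sum(w * q for w, q in zip(r, Q)) > J:
        return None                   # the new pattern would not fit on a jumbo
    return (Pj, ci + cj), (Q, ck)
```

With invented widths (700, 650, 600) and J = 2100, the patterns (2,0,1)×1, (1,1,1)×2, (1,2,0)×1 reduce to (1,1,1)×3 together with (2,1,0)×1.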

Computational Complexity
Before looking at the issue of pattern reduction, we first remark that the problem CUTTING STOCK is NP-hard. To see this, consider the problem

3-PARTITION
Input: a positive integer B and positive integers a1, ..., a3m, with B/4 < ai < B/2 for each i and with total sum mB.
Question: can the input be partitioned into m disjoint sets (which must necessarily be triples) such that the sum of the ai in each set is equal to B?
If the ai are the various customer reel widths (appearing with multiplicities equal to the corresponding demands) and B is the jumbo width, then 3-PARTITION asks whether there exists a solution of zero waste, and is therefore a special case of CUTTING STOCK. It is known to be strongly NP-hard (see, for example, Garey and Johnson [6], pp. 96-100). Hence CUTTING STOCK is also NP-hard, and remains so even if J is bounded by a polynomial in the number of customer reels demanded.
However, there are instances of CUTTING STOCK that are easy to solve. In particular, suppose that all the customer reels satisfy rα > J/3, so that no more than two customer reels can be cut from a single jumbo. A minimum-waste solution is then generated by the 'First-Fit-Decreasing' rule: the first jumbo supplies a reel of the largest width, together with a second reel, if possible, of the largest width that will fit; subsequent jumbos are cut in the same way, looking at the remaining demand.
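The rule can be sketched in a few lines of Python; the widths and jumbo size in the example are invented for illustration.

```python
def first_fit_decreasing(reels, J):
    """Cut a list of reel widths (each assumed > J/3, so at most two fit
    per jumbo) by the First-Fit-Decreasing rule.  Returns one pattern
    (a tuple of widths) per jumbo."""
    remaining = sorted(reels, reverse=True)
    patterns = []
    while remaining:
        first = remaining.pop(0)   # largest remaining width
        # largest remaining width that still fits alongside `first`
        partner = next((w for w in remaining if first + w <= J), None)
        if partner is None:
            patterns.append((first,))
        else:
            remaining.remove(partner)
            patterns.append((first, partner))
    return patterns

pats = first_fit_decreasing([2400, 2200, 2000, 1900, 1800, 1500], 4340)
# six reels, all wider than J/3, are paired onto three jumbos
```

Here 2400 is paired with 1900 (the largest width that fits), then 2200 with 2000, then 1800 with 1500.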
Turning now to pattern reduction, each instance of PATTERN is clearly at least as hard as the corresponding CUTTING STOCK instance, and so is in general NP-hard. What is not so obvious is that PATTERN may remain hard even when CUTTING STOCK is easy. Let 2-PATTERN be the same as PATTERN, but with the restriction rα > J/3 for all α, so that the waste-minimization is easy. In the instance of 2-PATTERN constructed from a given instance of 3-PARTITION, the minimum number of jumbos is 3m, achieved if and only if each pattern contains a large width and a small width. The number of patterns in any such solution is at least 3m, since there must be at least one pattern for each small width. A minimum-waste solution with 3m patterns exists if and only if the corresponding instance of 3-PARTITION has a solution, and we have already seen that this question is strongly NP-hard.

Systematic Searching for Reductions
The 2-PATTERN problem discussed above shows that even when all the patterns that can appear in minimum-waste solutions are known, minimizing the number of patterns can be nontrivial. (Note that the theorem shows that the problem is NP-hard even when small integers are involved, in contrast to, say, the knapsack problem.) However, it is possible to search systematically for particular types of reduction, of which the 2 → 1 and 3 → 2 staircase are examples.
With a little more effort, one can pick out all 3 → 2 reductions. Suppose that there is a set of p 'available' patterns. (Ideally, this set will include all patterns that appear in some minimum-waste solution, but in any case it should represent the most economical ways of cutting a single jumbo.) Provided the customer reel widths rα are all bounded below by a fixed fraction of the jumbo width J, p is only polynomially large in the number of reel widths s. (None of the examples supplied by Greycon had any rα less than J/16.)

Now suppose that one has a (minimum-waste) solution using n patterns P1, ..., Pn. Run through all triples (Pi, Pj, Pk), where i, j, k = 1, ..., n with i < j < k, and calculate the contribution fα that they make to the total demand for customer width rα, namely

fα = ci Piα + cj Pjα + ck Pkα.                            (1)

If any triple can be the subject of a 3 → 2 reduction, there are patterns Pa and Pb, picked from the original set of p patterns, and an integer x satisfying 0 < x < t, where t = ci + cj + ck, such that

x Paα + (t − x) Pbα = fα for all α.                       (2)

There are O(n^3) triples in the starting solution and O(p^2) possibilities for (Pa, Pb). However, the 3 → 2 reductions can be identified by examining fewer than O(n^3 p^2) cases: rewrite (2) as

x Paα = fα − (t − x) Pbα,                                 (3)

and then, for each choice of Pb and x, calculate the right-hand side of (3), checking whether it corresponds to a valid pattern. In other words, one has to examine only O(n^3 p t) cases to compile all 3 → 2 reductions. Typically p ~ 10^3, while n, t ~ 10, and so this difference is significant.
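The search scheme can be sketched as follows, looping over triples, candidate patterns Pb and multiplicity splits x as described above. This is an illustrative Python fragment with invented data; patterns in the available set are assumed to be feasible, so no width check is repeated here.

```python
from itertools import combinations

def find_3_to_2(P, c, available, r):
    """Search all triples of current patterns for a 3 -> 2 reduction,
    replacing them by x copies of Pa and t - x copies of Pb drawn from
    the available set."""
    avail = {tuple(p) for p in available}
    for i, j, k in combinations(range(len(P)), 3):
        t = c[i] + c[j] + c[k]
        # contribution of the triple to the demand for each width
        f = [c[i] * P[i][a] + c[j] * P[j][a] + c[k] * P[k][a]
             for a in range(len(r))]
        for Pb in available:
            for x in range(1, t):
                # Pa must satisfy x * Pa = f - (t - x) * Pb componentwise
                rhs = [fa - (t - x) * pb for fa, pb in zip(f, Pb)]
                if all(v >= 0 and v % x == 0 for v in rhs):
                    Pa = tuple(v // x for v in rhs)
                    if Pa in avail and Pa != tuple(Pb):
                        return (i, j, k), (list(Pa), x), (list(Pb), t - x)
    return None

P = [[2, 0, 1], [1, 1, 1], [1, 2, 0]]      # current solution (invented)
c = [1, 2, 1]
available = [[2, 0, 1], [1, 1, 1], [1, 2, 0], [2, 1, 0]]
res = find_3_to_2(P, c, available, [700, 650, 600])
```

For these data the four jumbos of the triple can be re-cut as three copies of (1,1,1) and one copy of (2,1,0).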
The same idea can be applied to higher reductions. All m1 → m2 reductions could be found by examining O(n^m1 p^(m2−1) t^(m2−1)) cases. However, for m2 > 2, exhaustive searches of this sort are probably impractical in real problems.

A Possible Algorithm
Having studied small examples of the various types of pattern reduction that might be important, it is possible to set about designing an algorithm that can be applied to much larger problems. It should be capable of catching 2 → 1 and 3 → 2 staircase reductions, but also more general possibilities. In addition, it should respect the computing budget of 1 minute of Pentium time for typical real problems. This section describes an initial implementation of one such algorithm, and also identifies ways in which it might be further refined.

Overview
The operation of the algorithm is summarized in the following remarks:

Remarks
(1) The algorithm preserves all row and column sums in the table of patterns, and so Pi and Pj must be chosen to have the same row sum, i.e. the same number of customer reels; otherwise there is no possibility of making them the same.
(2) The patterns involved in each interchange of reels should have the same multiplicity, to preserve the quantity of each customer width produced. Hence if ci ≠ cj then only type II moves are available.
(3) Type II moves are made only when no type I move is available.
(4) Checks are needed to ensure that interchanges do not produce infeasible patterns, by substituting a large reel for a small one in a jumbo that is already full or nearly full.
(5) If patterns Pi and Pj (of the same multiplicity) can take part in a 2 → 1 reduction, this can be achieved by a sequence of type I moves.
(6) A 3 → 2 staircase reduction corresponds to a series of type II moves, in which the two patterns with repeated multiplicity take part in the interchange, one of them playing the catalyst to make the other two match.
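The type I move itself is not defined in this excerpt; a plausible reading of Remarks (1), (2) and (4) is a swap of one reel between two patterns of equal multiplicity, which preserves row and column sums and rejects swaps that would overfill a jumbo. The following Python sketch, under that assumption (all names and data are ours), picks the swap that makes the two patterns most alike:

```python
def type_one_move(Pi, Pj, r, J):
    """Find a swap of one reel of width r[u] in Pi for one reel of width
    r[v] in Pj (patterns assumed to have equal multiplicity) that reduces
    the mismatch between the patterns.  Returns (u, v) or None."""
    def mismatch(A, B):
        return sum(abs(a - b) for a, b in zip(A, B))
    best = None
    for u in range(len(r)):
        for v in range(len(r)):
            if u == v or Pi[u] == 0 or Pj[v] == 0:
                continue
            Qi, Qj = Pi[:], Pj[:]
            Qi[u] -= 1; Qi[v] += 1        # Pi gives up r[u], receives r[v]
            Qj[v] -= 1; Qj[u] += 1        # Pj gives up r[v], receives r[u]
            # Remark (4): neither new pattern may exceed the jumbo width
            if sum(w * q for w, q in zip(r, Qi)) > J:
                continue
            if sum(w * q for w, q in zip(r, Qj)) > J:
                continue
            gain = mismatch(Pi, Pj) - mismatch(Qi, Qj)
            if gain > 0 and (best is None or gain > best[0]):
                best = (gain, u, v)
    return None if best is None else (best[1], best[2])
```

For instance, with widths (5, 4, 3) and J = 14, swapping a width-5 reel out of (2,1,0) for a width-3 reel from (0,1,2) makes the two patterns identical.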

A small example
The following example shows how type I and type II moves can work together to effect a pattern reduction that would not be found by a 2 → 1 or 3 → 2 staircase procedure. It is also not difficult to check that no further reduction is possible. It is assumed in this example that the customer widths are sufficiently close that no infeasible patterns will be encountered (for example, taking rα ≤ J/5 for all α will do). Initially, all pairs of patterns are equally unmatched, so (i,j) = (1,2) is chosen arbitrarily. The first move is type I, swapping reels of widths r1 and r3 between the first and second patterns. No further type I moves are then available. The second move is type II, with pattern P4 used as a catalyst. The interchange of reels occurs between patterns P2 and P4, involving widths r1 and r2.

Choices of implementation
Within the general framework outlined above there are various ways in which the details may be implemented. Some of the choices to be made are the following:

(1) Subject to the constraint that Pi and Pj have the same row sums in each round, their choice is arbitrary. However, it seems reasonable to choose them to be already as closely matched as possible, since, intuitively at least, fewer moves will then be required to reach a reduction. The initial implementation

Possible refinements
The main shortcoming of the algorithm appears to be that it does not allow moves between patterns of different multiplicity. As a result, it would miss some 2 → 1 reductions (although these could easily be detected separately) and also examples such as the non-staircase 3 → 2 reduction given in Section 1. On the positive side, it will catch all 3 → 2 staircases, and also many n → n−1 reductions with n > 3.

However, there is no reason why the method should not be extended to allow more moves. For example, the non-staircase reduction in Section 1 can be effected by a sequence of two moves. The first move is performed by the current algorithm, but the second involves an interchange (using widths 1200 and 1000) between patterns P2 and P3, which have different multiplicities. When making such moves, the relative multiplicities must be taken into account, in order to preserve the level of production of each customer width.

Finally, initial experimentation indicates that steps are needed to prevent the algorithm from entering a repeating cycle of moves that does not lead to a reduction. The algorithm can be terminated by putting a very small limit on the maximum length of the tabu list, but this is not satisfactory, since it sometimes leads to reductions being missed.

Test Problems
Greycon provided several real problems, which have been used to make a preliminary evaluation of the effectiveness of the proposed algorithm. The input to each problem was a set of patterns generated by a waste-minimization algorithm accompanied by preliminary pattern reduction, although we found that not all 2 → 1 and 3 → 2 staircase reductions had been made.

Some samples are given below, giving just the input and final output, together with the number of moves used. Some of the lengths have been rescaled to reduce the numerical values of the rα and J. Execution times were well within the allocated computing budget, with typical problems requiring only a few seconds on a modest PC.
The scope for pattern reduction seems sensitive to the waste percentage of the input. If the waste percentage is small, say below 1%, then there are generally few feasible moves available, but if the percentage approaches 5% then the possible pattern reductions can be quite dramatic.

Problem 4
This was the largest problem, involving 91 customer widths, ranging from 1500 to 2490. The jumbo width was 4340 (although it could be taken to be 4300 without changing anything that follows), and so it is an example in which the first-fit-decreasing rule generates a minimum-waste solution. The initial solution supplied by Greycon is given below by listing the two widths in each pattern, in preference to the large, sparse matrix P. Overall, the waste is 4.1%.

Summary
Greycon's initial presentation to the Study Group posed several questions, which are listed below, along with (partial) answers arising from the work described above.
(1) Given a minimum-waste solution, what is the minimum number of patterns required?
It has been shown in Section 2 that even when all the patterns appearing in minimum-waste solutions are known, determining the minimum number of patterns may be hard. It seems unlikely that one can guarantee to find the minimum number of patterns for large classes of realistic problems with only a few seconds on a PC available.
(2) Given an n → n−1 algorithm, will it find an optimal solution to the minimum-pattern problem?
There are problems for which n -+ n -1 reductions are not possible although a more drama.tic reduction is. For example, suppose there are three customer widths, all slightly smaller than J/3. Then may be reduced to but there is no set of two distinct patterns satisfying the same demands.
(3) Is there an efficient n → n−1 algorithm?
In the light of Question 2, Question 3 should perhaps be rephrased as 'Is there an efficient algorithm to reduce n patterns?' However, if an algorithm were guaranteed to find some reduction whenever one existed, then it could be applied iteratively to minimize the number of patterns, and we have seen that this cannot be done easily.
(5) Is it worthwhile seeking alternatives to greedy heuristics?
In response to Questions 4 and 5, we point to the algorithm described earlier, or variants of it. Such approaches seem capable of catching many higher reductions.
(6) Is there a way to find solutions with the smallest possible number of single patterns?
The Study Group did not investigate methods tailored specifically to this task, but the algorithm proposed here seems to do reasonably well. It will not increase the number of singleton patterns under any circumstances, and when the number of singletons is high there will be many possible moves that tend to eliminate them.
(7) Can a solution be found which reduces the number of knife changes?
The algorithm will help to reduce the number of necessary knife changes because it works by bringing patterns closer together, even if this does not proceed fully to a pattern reduction. If two patterns are equal across some of the customer widths, the knives for these reels need not be changed when moving from one to the other.
The theorem in Section 2 has also been extended to the case J/3 < rα ≤ J/2 for each α, so that no three customer reels fit on a single jumbo, but any two do. The proof is considerably more complicated.