**About the problem**

The cutting optimization problem belongs to the class of Nondeterministic Polynomial Complete (NP-complete) problems [1]. Other problems in this class include the Hamiltonian path, Travelling Salesman, Subset sum, Clique, Independent set, Graph colouring, etc. All these problems have been deeply analyzed by a huge number of researchers, but no polynomial-time algorithm has been discovered for any of them. This has a direct consequence on the running time and the quality of the optimization.

A polynomial-time algorithm is one whose running time is bounded by a polynomial function of its input size. For instance, if we have *n* = 1000 pieces to cut and the cutting algorithm has complexity O(*n*^{2}), then the running time is directly proportional to 1000^{2} = 10^{6} units of time. Assuming that our computers can perform 10^{9} operations per second, such an algorithm would run in a fraction of a second. Sadly, this is not the case for the cutting optimization problem: no such fast algorithm is known for solving it.
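The arithmetic above can be checked in a few lines (Python is used here only as a calculator; the figures assume, as in the text, a machine performing 10^{9} operations per second):

```python
# Back-of-the-envelope running-time estimate for a polynomial algorithm,
# using the assumptions from the text: n = 1000 pieces, O(n^2) algorithm,
# and a machine doing 10^9 operations per second.
n = 1000                       # number of pieces to cut
operations = n ** 2            # O(n^2) -> 1,000,000 operations
seconds = operations / 1e9     # at 10^9 operations per second

print(operations)  # 1000000
print(seconds)     # 0.001 -> about a millisecond
```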

The only exact algorithm for solving the cutting optimization problem is an exponential one. An exponential algorithm runs in an exponential amount of time (2^{*n*}, 3^{*n*}, or even *n*!). For *n* = 1000 pieces:

- if the complexity is 2^{*n*}, then the total number of operations is 2^{1000}, which is about 10^{301};
- if the complexity is 3^{*n*}, then the total number of operations is 3^{1000}, which is about 10^{477};
- if the complexity is *n*!, then the total number of operations is 1000!, which can be approximated by 10^{2568}.

Such algorithms would run for an astronomical number of years. Even if we put all the computers in the world to work on the problem in parallel, we would still not get a significant improvement in speed.
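The magnitudes quoted above can be verified by computing the number of decimal digits of each quantity (a quick numerical check, not part of the optimization itself):

```python
import math

n = 1000
# Number of decimal digits (order of magnitude) for each complexity class:
digits_2n = n * math.log10(2)                    # 2^1000  is about 10^301
digits_3n = n * math.log10(3)                    # 3^1000  is about 10^477
digits_fact = math.lgamma(n + 1) / math.log(10)  # 1000!   is about 10^2568

print(round(digits_2n))    # 301
print(round(digits_3n))    # 477
print(round(digits_fact))  # 2568
```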

This is why another possibility (which is employed by our software too) is to use *heuristics*, or approximation algorithms. A heuristic is an algorithm which is fast and returns a good solution (often the best one) to the problem. However, **there is no guarantee** that the obtained solution is the optimal one.
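As a concrete illustration of such a heuristic (this is not the algorithm used by our software), here is the classic first-fit decreasing rule applied to one-dimensional stock cutting. It runs in polynomial time and often produces the best solution, but it carries no optimality guarantee:

```python
# First-fit decreasing heuristic for one-dimensional stock cutting:
# sort pieces from largest to smallest and place each one on the first
# stock bar that still has enough free length, opening a new bar if none
# fits. Fast and often optimal, but not guaranteed to be optimal.
def first_fit_decreasing(pieces, stock_length):
    bars = []  # remaining free length on each opened stock bar
    for piece in sorted(pieces, reverse=True):
        for i, free in enumerate(bars):
            if piece <= free:
                bars[i] = free - piece
                break
        else:
            bars.append(stock_length - piece)  # open a new bar
    return len(bars)

# Six pieces on bars of length 10: the heuristic uses 3 bars here, which
# happens to be optimal (the pieces sum to 27, so 3 bars is a lower bound).
print(first_fit_decreasing([7, 6, 4, 4, 3, 3], 10))  # 3
```

On other inputs, however, first-fit decreasing can open more bars than an exact exponential search would, which is exactly the price paid for its speed.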

An important parameter of the component is the Optimization Quality (the OptimizationLevel). It basically tells how many configurations are explored before the best solution found is output. If you set the OptimizationLevel to a very low value, you will obtain a solution very fast, but its quality might not be so good. If you set the OptimizationLevel to a very high value, you will obtain a good solution, but not so fast. Thus, one must accept a trade-off between the quality of the solution and the running time.
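The trade-off can be pictured with a toy sketch; all names here (`optimize`, `evaluate_waste`, the 10^level sampling rule) are invented for illustration and do not reflect the component's actual implementation:

```python
import random

# Hypothetical sketch of the quality/time trade-off: a higher optimization
# level lets the search examine more candidate configurations before it
# outputs the best solution found so far.
def evaluate_waste(order, stock_length):
    """Total leftover material when cutting pieces, in order, first-fit."""
    bars = []  # remaining free length on each opened stock bar
    for piece in order:
        for i, free in enumerate(bars):
            if piece <= free:
                bars[i] = free - piece
                break
        else:
            bars.append(stock_length - piece)
    return sum(bars)

def optimize(pieces, stock_length, optimization_level):
    configs_to_try = 10 ** optimization_level  # higher level -> more work
    best_waste = None
    for _ in range(configs_to_try):
        order = pieces[:]
        random.shuffle(order)  # one randomly chosen configuration
        waste = evaluate_waste(order, stock_length)
        if best_waste is None or waste < best_waste:
            best_waste = waste
    return best_waste

# Level 3 tries 1,000 configurations and finishes quickly; level 6 would
# try 1,000,000 and usually finds an equal or better solution, more slowly.
print(optimize([7, 6, 4, 4, 3, 3], 10, 3))
```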

**References**

[1]. Garey, M.R., Johnson, D.S., *Computers and Intractability: A Guide to the Theory of NP-Completeness*, W.H. Freeman & Co., San Francisco, USA, 1979.