Solving a static optimization problem using modified Kuhn-Tucker conditions
Today we are going to solve an optimization problem step by step using the Kuhn-Tucker (KT) conditions. Given a function to be minimized and a set of constraints, we can formulate a system of equations and solve it. In this way, we will find an optimal solution to our problem.
We start by defining the function to be minimized. We will try a simple 2D problem:
Even without differentiating, we can conclude that the minimum of the function lies at (0, 0).
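As a concrete stand-in (the exact objective here is my assumption for illustration, not necessarily the original one), consider f(x, y) = x² + y², which shares the key property described above: its unconstrained minimum sits at (0, 0), where the gradient vanishes.

```python
# Assumed example objective (an illustration, not necessarily the author's
# exact function): f(x, y) = x^2 + y^2, minimized at (0, 0).
def f(x, y):
    return x**2 + y**2

# Gradient of f; it vanishes at (0, 0), confirming the stationary point.
def grad_f(x, y):
    return (2 * x, 2 * y)

print(f(0, 0))       # value at the candidate minimum
print(grad_f(0, 0))  # gradient at the candidate minimum
```

Any neighbouring point gives a strictly larger value, so (0, 0) is indeed the unconstrained minimum of this stand-in.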
However, we will impose a couple of constraints that our solution needs to satisfy:
We see that the solution (0, 0) would not satisfy the second constraint, so we need to look for another one. We can find the minimum using the KT conditions, but first we need to transform our constraints into the required form:
where h denotes our constraints. Applying simple algebraic operations to the linear inequalities, we obtain:
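As a sketch, assume the constraints are x ≥ 0, y − x ≥ 2, and x + y ≤ 6 (assumed values for illustration; they reproduce the geometry described in the text, since (0, 0) violates the second constraint). Rewritten in the standard h ≤ 0 form, they become:

```latex
h_1(x, y) = -x \le 0, \qquad
h_2(x, y) = 2 + x - y \le 0, \qquad
h_3(x, y) = x + y - 6 \le 0
```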
The first constraint is quite handy. When we have an inequality that merely requires one of our parameters to be non-negative (as in this case), we can simplify our life by using the modified Kuhn-Tucker conditions. I will point out the difference in a moment.
The next step is to construct a Lagrange function:
The function is simply our objective function with the constraints added to it, except for the first constraint, which will be handled in a moment. We can now formulate the set of equations to be solved in order to obtain a solution:
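To make this concrete, here is a sketch under assumed example data (f(x, y) = x² + y² with constraints x ≥ 0, y − x ≥ 2 and x + y ≤ 6; these values are my illustration, not necessarily the original ones). The non-negativity constraint on x gets no multiplier, so the Lagrange function and the resulting system read:

```latex
L(x, y, \mu_1, \mu_2) = x^2 + y^2 + \mu_1 (2 + x - y) + \mu_2 (x + y - 6)

x \, \frac{\partial L}{\partial x} = x \, (2x + \mu_1 + \mu_2) = 0
\frac{\partial L}{\partial y} = 2y - \mu_1 + \mu_2 = 0
\mu_1 (2 + x - y) = 0
\mu_2 (x + y - 6) = 0
```

In this illustrative setup, checking which constraints are active yields x = 0, y = 2, μ₁ = 4, μ₂ = 0.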
The first condition is additionally multiplied by x. This is exactly the modified KT condition: the parameter x (because x is the parameter that needs to be non-negative due to the first constraint) multiplies the partial derivative of the Lagrange function with respect to x. The additional condition to take into account in the modified KT case is that this partial derivative must also be non-negative (condition 6 in the next figure).
Anyway: four equations, four unknowns, so we can find exactly the points that minimize our objective. Additionally, these points need to satisfy the following conditions:
The conditions in the first row come from our initial constraints. Conditions 4) and 5) come from the KT problem definition, which requires each inequality multiplier to be non-negative. The last condition comes from the modified KT assumption, which says that the partial derivative with respect to the non-negative variable also needs to be non-negative.
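Under the same assumed example problem (f = x² + y² subject to x ≥ 0, y − x ≥ 2, x + y ≤ 6; illustrative values only), we can verify in a few lines that the candidate point (0, 2) with multipliers μ₁ = 4, μ₂ = 0 satisfies the four equations and all the side conditions:

```python
# Assumed example problem (illustration, not necessarily the author's numbers):
# minimize f(x, y) = x^2 + y^2  subject to  x >= 0, y - x >= 2, x + y <= 6.
# Candidate KT point: (x, y) = (0, 2) with multipliers mu1 = 4, mu2 = 0.
x, y, mu1, mu2 = 0.0, 2.0, 4.0, 0.0

# Modified KT system: four equations, four unknowns.
eq1 = x * (2 * x + mu1 + mu2)  # x * dL/dx = 0 (modified condition)
eq2 = 2 * y - mu1 + mu2        # dL/dy = 0
eq3 = mu1 * (2 + x - y)        # complementary slackness for h2
eq4 = mu2 * (x + y - 6)        # complementary slackness for h3
print(eq1, eq2, eq3, eq4)      # all four residuals are zero

# Side conditions: feasibility, non-negative multipliers, and dL/dx >= 0.
feasible = (x >= 0) and (y - x >= 2) and (x + y <= 6)
print(feasible, mu1 >= 0, mu2 >= 0, 2 * x + mu1 + mu2 >= 0)
```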
We can find a solution using Wolfram Alpha:
Our solution lies at (0, 2). To verify this, we could substitute a few values into our objective function and check whether (0, 2) indeed produces the smallest value. However, the best way is to take advantage of the fact that our problem is 2D, so we can visualize it. First, let us plot the constraints:
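The substitution check can be automated. Sticking with the assumed example problem (f = x² + y², feasible region x ≥ 0, y − x ≥ 2, x + y ≤ 6; my illustrative values), a coarse grid search over the feasible region finds its smallest objective value exactly at the candidate point:

```python
# Assumed example problem (illustration only): compare f over a coarse grid
# of feasible points against the candidate solution (0, 2).
def f(x, y):
    return x**2 + y**2

def feasible(x, y):
    return x >= 0 and y - x >= 2 and x + y <= 6

# Grid with step 0.1 covering the feasible region.
best = min(
    ((i * 0.1, j * 0.1) for i in range(0, 61) for j in range(-10, 61)
     if feasible(i * 0.1, j * 0.1)),
    key=lambda p: f(*p),
)
print(best, f(*best))  # the grid minimum coincides with the candidate (0, 2)
```

This is only a sanity check, of course; the grid search confirms the KT solution rather than replacing it.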
We can overlay the contours of our objective function on that plot to check where a contour first touches our feasible region:
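A plot along these lines can be reproduced with matplotlib, again for the assumed example problem (f = x² + y², region x ≥ 0, y − x ≥ 2, x + y ≤ 6; illustrative values, and the styling is a sketch rather than the original figure):

```python
# Sketch of the visualization for the assumed example problem:
# objective contours over the shaded feasible region.
import io

import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt
import numpy as np

xs, ys = np.meshgrid(np.linspace(-1, 7, 400), np.linspace(-1, 7, 400))
objective = xs**2 + ys**2
region = ((xs >= 0) & (ys - xs >= 2) & (xs + ys <= 6)).astype(float)

fig, ax = plt.subplots()
ax.contour(xs, ys, objective, levels=[1, 4, 9, 16, 25])  # objective contours
ax.contourf(xs, ys, region, levels=[0.5, 1], alpha=0.3)  # feasible region
ax.plot(0, 2, "ro")  # candidate optimum (0, 2)

buf = io.BytesIO()
fig.savefig(buf, format="png")
```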
We see that the first intersection of the contours with our feasible region indeed lies at (0, 2).
In this way, we have solved a static optimization problem using the Kuhn-Tucker conditions. Moreover, we used a slight modification of the method that reduced the number of equations to solve from 5 to 4. With the standard approach, we could have solved the problem without the modified KT conditions, and the result would have been the same.