Solving a static optimization problem using modified Kuhn-Tucker conditions

Wojciech Prazuch
4 min read · Sep 1, 2020

Today we are going to solve an optimization problem step by step using a method called the Kuhn-Tucker conditions. Given a function to be minimized and a set of constraints, we can formulate a set of equations to solve, and thus find an optimal solution to our problem.

We start by defining the function to be minimized. We will try a simple 2D problem:

f(x, y) = 3x^2 + 2y^2

Even without differentiating, we can conclude that the minimum of the function lies at (0, 0).

Our objective function plotted, generated by https://academo.org/
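If you would like to check this in code rather than by inspection, here is a minimal SymPy sketch (SymPy is my addition; the article itself does not use it) confirming that the gradient of f vanishes only at the origin:

```python
import sympy as sp

x, y = sp.symbols("x y", real=True)
f = 3 * x**2 + 2 * y**2

# Solve grad f = 0; since f is a sum of positive quadratic terms,
# the unique critical point is the global minimum.
critical_points = sp.solve([sp.diff(f, x), sp.diff(f, y)], [x, y], dict=True)
print(critical_points)  # [{x: 0, y: 0}]
```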

However, we will impose a few constraints that our solution needs to satisfy:

x ≥ 0
y ≥ 2
y + 2x ≤ 8

We see that the point (0, 0) would not satisfy the second condition, so we need to look for another solution. We can find the minimum by using the KT conditions, but first we need to transform our constraints into the necessary form:

h(x, y) ≤ 0

where h denotes a constraint. By applying simple algebraic operations to the linear inequalities, we obtain:

x ≥ 0
h1(x, y) = 2 - y ≤ 0
h2(x, y) = y + 2x - 8 ≤ 0

The first constraint is quite handy. When we have an inequality such that one of our parameters needs to be just non-negative (as in this case), we can simplify our life by using modified Kuhn-Tucker conditions. I will point out the difference in a second.
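As a quick sanity check of the transformed form, here is a small SymPy sketch (my addition, with two arbitrarily chosen test points) confirming that a point satisfying the original inequalities also satisfies h1 ≤ 0 and h2 ≤ 0, while (0, 0) does not:

```python
import sympy as sp

x, y = sp.symbols("x y", real=True)

h1 = 2 - y          # y >= 2       becomes  2 - y <= 0
h2 = y + 2 * x - 8  # y + 2x <= 8  becomes  y + 2x - 8 <= 0

# (1, 3) satisfies the original constraints; (0, 0) violates y >= 2.
for px, py in [(1, 3), (0, 0)]:
    vals = {x: px, y: py}
    print((px, py), h1.subs(vals) <= 0, h2.subs(vals) <= 0)
# (1, 3) True True
# (0, 0) False True
```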

The next step is to construct the Lagrange function:

L(x, y, u1, u2) = 3x^2 + 2y^2 + u1(2 - y) + u2(y + 2x - 8)

The function is simply our objective function with the constraints added to it, except for the first constraint, which will be taken into account in a second. We can now formulate the set of equations to be solved in order to obtain a solution:

1) x · ∂L/∂x = x(6x + 2u2) = 0
2) ∂L/∂y = 4y - u1 + u2 = 0
3) u1(2 - y) = 0
4) u2(y + 2x - 8) = 0

The first equation is additionally multiplied by x. This is exactly the modified KT condition: the parameter x (because x is the parameter that needs to be non-negative due to the first constraint) multiplies the partial derivative of the Lagrange function taken with respect to x. The additional condition to take into account in the modified KT case is that this partial derivative must also be non-negative (condition 6 in the list below).
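The whole system can also be derived mechanically. Below is a SymPy sketch (my own tooling, not part of the original article) that builds the Lagrange function and prints the four equations:

```python
import sympy as sp

x, y, u1, u2 = sp.symbols("x y u1 u2", real=True)

f = 3 * x**2 + 2 * y**2
h1 = 2 - y
h2 = y + 2 * x - 8
L = f + u1 * h1 + u2 * h2

eqs = [
    x * sp.diff(L, x),  # modified KT: x * dL/dx = x*(6x + 2*u2) = 0
    sp.diff(L, y),      # dL/dy = 4y - u1 + u2 = 0
    u1 * h1,            # complementary slackness: u1*(2 - y) = 0
    u2 * h2,            # complementary slackness: u2*(y + 2x - 8) = 0
]
for e in eqs:
    print(e, "= 0")
```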

Anyway: four equations, four unknowns, so we can find exactly the points which minimize our objective. Additionally, these points need to satisfy the following conditions:

1) x ≥ 0
2) 2 - y ≤ 0
3) y + 2x - 8 ≤ 0
4) u1 ≥ 0
5) u2 ≥ 0
6) 6x + 2u2 ≥ 0

Conditions 1)-3) come from our initial constraints. Conditions 4) and 5) come from the KT problem definition, where each inequality multiplier must be non-negative. The last condition comes from the modified KT assumption, which says that the partial derivative with respect to the non-negative variable also needs to be non-negative.
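If you prefer code to Wolfram Alpha, the same system can be solved programmatically. The following SymPy sketch (my substitute for the Wolfram Alpha step below) solves the four equations and filters the candidates by conditions 1)-6):

```python
import sympy as sp

x, y, u1, u2 = sp.symbols("x y u1 u2", real=True)

eqs = [
    x * (6 * x + 2 * u2),  # x * dL/dx = 0 (modified KT condition)
    4 * y - u1 + u2,       # dL/dy = 0
    u1 * (2 - y),          # complementary slackness for h1
    u2 * (y + 2 * x - 8),  # complementary slackness for h2
]

def feasible(s):
    """Conditions 1)-6) from the list above."""
    return (s[x] >= 0 and 2 - s[y] <= 0 and s[y] + 2 * s[x] - 8 <= 0
            and s[u1] >= 0 and s[u2] >= 0 and 6 * s[x] + 2 * s[u2] >= 0)

candidates = sp.solve(eqs, [x, y, u1, u2], dict=True)
print([s for s in candidates if feasible(s)])
# [{x: 0, y: 2, u1: 8, u2: 0}]
```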

We can find a solution by using Wolfram Alpha software:

You may find the exact query here: https://www.wolframalpha.com/input/?i=x*%286*x%2B2*u2%29%3D0%3B4*y-u1%2Bu2%3D0%3Bu1*%282-y%29%3D0%3Bu2*%282-y%29%3D0%3Bu2*%28y%2B2*x-8%29%3D0%3Bx%3E%3D0%3B2-y%3C%3D0%3By%2B2*x-8%3C%3D0%3Bu1%3E%3D0%3Bu2%3E%3D0%3B6*x%2B2*u2%3E%3D0

Our solution lies at (0, 2). To verify this, we could substitute a few values into our objective function and check that (0, 2) produces the smallest one.
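As an additional check, we can run a numerical optimizer. Here is a short SciPy sketch (my addition; the article itself verifies the result graphically instead):

```python
import numpy as np
from scipy.optimize import minimize

def f(v):
    return 3 * v[0] ** 2 + 2 * v[1] ** 2

constraints = [
    {"type": "ineq", "fun": lambda v: v[1] - 2},             # y >= 2
    {"type": "ineq", "fun": lambda v: 8 - v[1] - 2 * v[0]},  # y + 2x <= 8
]

# x >= 0 is expressed as a bound; the starting point (1, 3) is arbitrary.
result = minimize(f, x0=np.array([1.0, 3.0]),
                  bounds=[(0, None), (None, None)],
                  constraints=constraints)
print(result.x)  # close to [0, 2]
```

However, the best way is to take advantage of the fact that our problem is 2D, so we can visualize it. First, let us plot the constraints: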

The set of feasible solutions is contained in a triangle formed by our 3 inequality constraints. Generated by https://www.desmos.com/

We can plot the contours of our objective function on the same plot to check where they first intersect our feasible region:

The ellipses denote the contours of our objective function. Generated by https://www.desmos.com/

We see that the first contact of the contours with our feasible region indeed occurs at (0, 2).
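If you want to reproduce this picture in code, here is a matplotlib sketch (my own; the article uses Desmos) that shades the feasible triangle and overlays the contours:

```python
import numpy as np
import matplotlib.pyplot as plt

xs, ys = np.meshgrid(np.linspace(-1, 5, 400), np.linspace(0, 9, 400))
f = 3 * xs**2 + 2 * ys**2

# Boolean mask of the feasible triangle defined by the three constraints.
feasible = (xs >= 0) & (ys >= 2) & (ys + 2 * xs <= 8)

plt.imshow(feasible, extent=(-1, 5, 0, 9), origin="lower",
           cmap="Greys", alpha=0.3, aspect="auto")
plt.contour(xs, ys, f, levels=[8, 20, 40, 80], colors="steelblue")
plt.plot(0, 2, "ro")  # the KT solution found above
plt.xlabel("x")
plt.ylabel("y")
plt.show()
```

The level f = 8 is the first contour to touch the region, exactly at (0, 2).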

In this way, we have solved a static optimization problem using the Kuhn-Tucker conditions. Moreover, we used a slight modification of the method to reduce the number of equations to solve from 5 to 4. With the standard approach, we could have solved the problem without the modified KT conditions, and the result would be the same.

