# Karush-Kuhn-Tucker conditions


> "…for linearized constraints with the set of descent directions (see Fletcher, 1987; McCormick, 1983). This rather technical regularity assumption holds for all support vector machines, since the constraints are always linear. Furthermore, the problem for SVMs is convex (a convex objective function, with constraints which give a convex feasible region), and for convex problems (if the regularity condition holds), the KKT conditions are *necessary and sufficient* for **w**, *b*, α to be a solution (Fletcher, 1987). Thus solving the SVM problem is equivalent to finding a solution to the KKT conditions. This fact results in several approaches to finding the solution (for example, the primal-dual path following method mentioned in Section 5).
>
> As an immediate application, note that, while **w** is explicitly determined by the training procedure, the threshold *b* is not, although it is implicitly determined. However, *b* is easily found by using the KKT "complementarity" condition, Eq. (21), by choosing any *i* for which αᵢ ≠ 0 and computing *b* (note that it is numerically safer to take the mean value of *b* resulting from all such equations).
>
> Notice that all we've done so far is to cast the problem into an optimization problem where the constraints are rather more manageable than those in Eqs. (10), (11). Finding the solution for real world problems will usually require numerical methods. We will have more to say on this later. However, let's first work out a rare case where the problem is nontrivial (the number of dimensions is arbitrary, and the solution certainly not obvious), but where the solution can be found analytically." (Burges, 1998)
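The recovery of *b* described above can be sketched in code. This is a minimal NumPy illustration for the hard-margin (linearly separable) case, assuming the dual has already been solved: the complementarity condition αᵢ (yᵢ(**w**·**x**ᵢ + *b*) − 1) = 0 implies *b* = yᵢ − **w**·**x**ᵢ for any support vector (αᵢ > 0), and we average over all of them as the passage recommends. The function name, `tol` threshold, and input layout are illustrative choices, not part of the original text.

```python
import numpy as np

def recover_w_and_b(alpha, X, y, tol=1e-8):
    """Recover w and b from a solved hard-margin SVM dual.

    alpha: (n,) dual variables; X: (n, d) training points;
    y: (n,) labels in {-1, +1}.
    """
    # w is determined explicitly by the dual solution:
    # w = sum_i alpha_i * y_i * x_i
    w = (alpha * y) @ X
    # Complementarity: alpha_i * (y_i * (w . x_i + b) - 1) = 0,
    # so for any support vector (alpha_i > 0): b = y_i - w . x_i.
    sv = alpha > tol
    # Numerically safer: average b over all support vectors
    # rather than trusting a single equation.
    b = np.mean(y[sv] - X[sv] @ w)
    return w, b
```

For a two-point toy problem with x₁ = (1, 0), y₁ = +1 and x₂ = (−1, 0), y₂ = −1, the optimal dual variables are α₁ = α₂ = 0.5, and the function returns **w** = (1, 0) and *b* = 0, which is the expected maximum-margin separator.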