Relative Extreme Values

Let $f(x,y)$ be a function of two variables. A point $(x_0,y_0)$ is called a relative extreme point if and only if $f(x_0,y_0)\ge (\le)\, f(x,y)$ for all $(x,y)$ in some neighborhood of $(x_0,y_0)$.

When $(x_0,y_0)$ is a relative extreme point of $f(x,y)$, we have $f_x(x_0,y_0)=f_y(x_0,y_0)=0$. A point at which both partial derivatives are $0$ is called a critical point.
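As a quick numerical check of this condition, here is a sketch on a toy function of my own choosing: $f(x,y)=x^2+y^2-2x$ has $f_x=2x-2$ and $f_y=2y$, so its only critical point is $(1,0)$.

```python
def partials(f, x, y, eps=1e-6):
    """Central-difference approximations of f_x and f_y at (x, y)."""
    fx = (f(x + eps, y) - f(x - eps, y)) / (2 * eps)
    fy = (f(x, y + eps) - f(x, y - eps)) / (2 * eps)
    return fx, fy

# Toy example (not from the notes): f(x, y) = x^2 + y^2 - 2x.
f = lambda x, y: x * x + y * y - 2 * x

fx, fy = partials(f, 1.0, 0.0)
print(fx, fy)  # both are ≈ 0 at the critical point (1, 0)
```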

Saddle Point, Extreme Point

Let $P(x_0,y_0)$ be a critical point of $f(x,y)$, and let $A=f_{xx}(x_0,y_0)$, $B=f_{xy}(x_0,y_0)$, $C=f_{yy}(x_0,y_0)$, and $H=AC-B^2$. Then:

  • If $H>0$ and $A<0$, then $P$ is a relative maximum.
  • If $H>0$ and $A>0$, then $P$ is a relative minimum.
  • If $H<0$, then $P$ is a saddle point.
  • If $H=0$, the test is inconclusive.
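The four cases above translate directly into code. A minimal sketch, using toy functions of my own choosing as examples: $f(x,y)=x^2-y^2$ has a critical point at the origin with $A=2$, $B=0$, $C=-2$, so $H=-4<0$, a saddle point.

```python
def classify(A, B, C):
    """Classify a critical point from A = f_xx, B = f_xy, C = f_yy."""
    H = A * C - B * B
    if H > 0 and A < 0:
        return "relative maximum"
    if H > 0 and A > 0:
        return "relative minimum"
    if H < 0:
        return "saddle point"
    return "inconclusive"

print(classify(2, 0, -2))   # f = x^2 - y^2 at (0,0): saddle point
print(classify(2, 0, 2))    # f = x^2 + y^2 at (0,0): relative minimum
print(classify(-2, 0, -2))  # f = -x^2 - y^2 at (0,0): relative maximum
```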

Finding Extreme Points with Lagrange Multipliers

Whoa, isn't this the same technique covered in optimization theory?

Given a constraint $g(x,y,z)=0$, we just need to find the stationary points of the following function, so that the constraint is satisfied:

$$f(x,y,z)-\lambda g(x,y,z)$$

Taking partial derivatives with respect to $x,y,z,\lambda$ and setting them to $0$ yields the conditions for the critical points:

$$\begin{aligned} f_x &=\lambda g_x \\ f_y &=\lambda g_y \\ f_z &=\lambda g_z \\ g &= 0 \end{aligned}$$
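For a quadratic objective and a linear constraint, these conditions form a linear system that can be solved directly. A sketch on a toy two-variable problem of my own choosing: minimize $f(x,y)=x^2+y^2$ subject to $g(x,y)=x+y-1=0$, which gives $2x=\lambda$, $2y=\lambda$, $x+y=1$.

```python
def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def solve3(A, rhs):
    """Solve a 3x3 linear system by Cramer's rule."""
    D = det3(A)
    sol = []
    for j in range(3):
        Aj = [row[:] for row in A]
        for i in range(3):
            Aj[i][j] = rhs[i]
        sol.append(det3(Aj) / D)
    return sol

# Stationarity conditions for f = x^2 + y^2, g = x + y - 1, in unknowns
# (x, y, λ):  2x - λ = 0,  2y - λ = 0,  x + y = 1.
A = [[2, 0, -1],
     [0, 2, -1],
     [1, 1,  0]]
rhs = [0, 0, 1]

x, y, lam = solve3(A, rhs)
print(x, y, lam)  # constrained minimum at x = y = 0.5, with λ = 1
```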

Newton-Raphson Method to Solve Equations

Whoa, an iterative method!

The classic Newton-Raphson method iterates on a single variable as follows:

$$x_{n+1}=x_n-\frac{f(x_n)}{f'(x_n)}$$
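A minimal sketch of this iteration, on an example of my own choosing: solving $f(x)=x^2-2=0$ (i.e. computing $\sqrt{2}$) starting from $x_0=1$.

```python
def newton(f, fprime, x0, iters=10):
    """Apply x_{n+1} = x_n - f(x_n) / f'(x_n) a fixed number of times."""
    x = x0
    for _ in range(iters):
        x = x - f(x) / fprime(x)
    return x

# f(x) = x^2 - 2, f'(x) = 2x; the positive root is sqrt(2).
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
print(root)  # ≈ 1.4142135623730951
```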

This method can be extended to the multi-variable case. Consider the following equations:

$$\begin{cases} f(x,y)=0 \\ g(x,y)=0 \end{cases}$$

Suppose the true solution is $(a,b)$ and our initial estimate is $(x_0,y_0)$. Then we may write

$$\begin{aligned} x_0+h&=a\\ y_0+k&=b \end{aligned}$$

Now expand the left-hand side of both equations in a first-order Taylor series at $(x_0,y_0)$:

$$\begin{cases} 0=f(x_0,y_0)+h f_x(x_0,y_0)+k f_y(x_0,y_0)\\ 0=g(x_0,y_0)+h g_x(x_0,y_0)+k g_y(x_0,y_0) \end{cases}$$

Solving this linear system gives concrete values $h_0,k_0$; substituting $x_1=x_0+h_0$, $y_1=y_0+k_0$ and repeating yields successive iterations.
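The whole two-variable iteration can be sketched as follows, on a toy system of my own choosing: $f(x,y)=x^2+y^2-4=0$ and $g(x,y)=x-y=0$, whose positive solution is $x=y=\sqrt{2}$. Each step solves the linearized $2\times 2$ system for $(h,k)$ in closed form.

```python
def newton2(f, g, fx, fy, gx, gy, x, y, iters=20):
    """Two-variable Newton-Raphson: at each step solve
        0 = f + h*fx + k*fy
        0 = g + h*gx + k*gy
    for (h, k) by Cramer's rule, then update (x, y)."""
    for _ in range(iters):
        J = fx(x, y) * gy(x, y) - fy(x, y) * gx(x, y)  # Jacobian determinant
        h = (-f(x, y) * gy(x, y) + g(x, y) * fy(x, y)) / J
        k = (-g(x, y) * fx(x, y) + f(x, y) * gx(x, y)) / J
        x, y = x + h, y + k
    return x, y

# Toy system (not from the notes): x^2 + y^2 = 4 intersected with y = x.
f  = lambda x, y: x * x + y * y - 4
g  = lambda x, y: x - y
fx = lambda x, y: 2 * x
fy = lambda x, y: 2 * y
gx = lambda x, y: 1.0
gy = lambda x, y: -1.0

x, y = newton2(f, g, fx, fy, gx, gy, 1.0, 0.5)
print(x, y)  # both converge to sqrt(2) ≈ 1.41421356
```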