Gradient Derivation for a Single-Layer Perceptron

Illustration:
(figure: gradient derivation for a single-layer perceptron)

Solving for the gradient of the sigmoid function:
$$f(x) = \frac{1}{1 + e^{-x}}$$

$$f'(x) = \frac{e^{-x}}{(1 + e^{-x})^2}$$

$$f'(x) = \frac{(1 + e^{-x}) - 1}{(1 + e^{-x})^2}$$

$$f'(x) = \frac{1}{1 + e^{-x}} - \frac{1}{(1 + e^{-x})^2}$$

Since $\sigma(x) = \frac{1}{1 + e^{-x}}$, this is

$$\sigma'(x) = \sigma(x)\,(1 - \sigma(x))$$
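The identity $\sigma'(x) = \sigma(x)(1-\sigma(x))$ can be checked numerically against a central-difference estimate; a minimal sketch (function names are illustrative, not from the original post):

```python
import math

def sigmoid(x):
    # sigma(x) = 1 / (1 + e^{-x})
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # Closed form derived above: sigma'(x) = sigma(x) * (1 - sigma(x))
    s = sigmoid(x)
    return s * (1.0 - s)

# Compare against a central-difference approximation of the derivative
x, h = 0.7, 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2.0 * h)
print(abs(sigmoid_prime(x) - numeric) < 1e-8)
```

The central difference has error on the order of $h^2$, so the two values agree to well below the $10^{-8}$ tolerance.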
$$E = \frac{1}{2}\,(O_0^1 - t)^2$$

(Here $t$ is the label and $O_0^1$ is the output after the sigmoid function; the factor $\frac{1}{2}$ cancels the constant produced by differentiation. $w_{j0}$ denotes the $j$-th weight.)

$$\frac{\partial E}{\partial w_{j0}} = (O_0^1 - t)\,\frac{\partial O}{\partial w_{j0}}$$

(For brevity, $O$ stands for $O_0^1$ from here on.)

$$O = \sigma(x_0^1), \qquad \frac{\partial E}{\partial w_{j0}} = (O - t)\,\frac{\partial \sigma(x_0^1)}{\partial w_{j0}}$$

Applying the chain rule:

$$\frac{\partial E}{\partial w_{j0}} = (O - t)\,\frac{\partial \sigma(x_0^1)}{\partial x_0^1}\,\frac{\partial x_0^1}{\partial w_{j0}}$$

$$\frac{\partial E}{\partial w_{j0}} = (O - t)\,\sigma(x_0^1)\,(1 - \sigma(x_0^1))\,\frac{\partial x_0^1}{\partial w_{j0}}$$

By the linear relation $x_0^1 = \sum_j w_{j0}\,x_j^0$:

$$\frac{\partial E}{\partial w_{j0}} = (O - t)\,\sigma(x_0^1)\,(1 - \sigma(x_0^1))\,x_j^0$$

(All quantities on the right are known, so the gradient can be computed directly.)
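The full derivation can be put into code directly; a minimal sketch, with illustrative names and an arbitrary example input (the inputs, weights, and label below are made up for the check, not from the original post). The analytic gradient is verified against a central-difference estimate of the loss:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def loss(x_in, w, t):
    # E = 1/2 (O - t)^2, with O = sigmoid(sum_j w_{j0} x_j^0)
    z = sum(wj * xj for wj, xj in zip(w, x_in))
    return 0.5 * (sigmoid(z) - t) ** 2

def perceptron_grad(x_in, w, t):
    # dE/dw_{j0} = (O - t) * sigma(z) * (1 - sigma(z)) * x_j^0
    z = sum(wj * xj for wj, xj in zip(w, x_in))   # x_0^1 = sum_j w_{j0} x_j^0
    O = sigmoid(z)                                # O_0^1
    common = (O - t) * O * (1.0 - O)              # (O - t) * sigma'(z)
    return [common * xj for xj in x_in]

# Example data (hypothetical values, only for the numerical check)
x_in, w, t = [0.5, -1.2, 2.0], [0.3, 0.8, -0.5], 1.0
g = perceptron_grad(x_in, w, t)

# Central-difference check on the first weight
h = 1e-6
w_plus  = [w[0] + h, w[1], w[2]]
w_minus = [w[0] - h, w[1], w[2]]
numeric = (loss(x_in, w_plus, t) - loss(x_in, w_minus, t)) / (2.0 * h)
print(abs(g[0] - numeric) < 1e-8)
```

Note that the factor $(O - t)\,\sigma(z)(1-\sigma(z))$ is shared across all weights, so it is computed once and multiplied by each input $x_j^0$.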
