On Artificial Intelligence: 2. Softmax Regression


Softmax Regression

Logistic regression (LR) handles binary classification; Softmax regression generalizes it to multi-class problems. Suppose the training set is
$\{(x_1, y_1), (x_2, y_2), \dots, (x_m, y_m)\}$, where $y_i \in \{1, 2, \dots, K\}$ ranges over $K$ classes. The probability that $y_i$ takes class $j$ is:

$$
p(y_i = j \mid x_i; \theta) = \frac{e^{\theta_j^T x_i}}{\sum_{k=1}^{K} e^{\theta_k^T x_i}}
$$

The predicted probability of each class is:

$$
h_\theta(x_i) = \begin{bmatrix}
p(y_i = 1 \mid x_i; \theta) \\
p(y_i = 2 \mid x_i; \theta) \\
\vdots \\
p(y_i = K \mid x_i; \theta)
\end{bmatrix}
= \frac{1}{\sum_{k=1}^{K} e^{\theta_k^T x_i}}
\begin{bmatrix}
e^{\theta_1^T x_i} \\
e^{\theta_2^T x_i} \\
\vdots \\
e^{\theta_K^T x_i}
\end{bmatrix}
$$
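
To make the hypothesis concrete, here is a minimal NumPy sketch that evaluates $h_\theta(x_i)$. The names `Theta` and `softmax_probs` are illustrative rather than from the original post, and the max-shift is a standard numerical-stability trick, not part of the formula itself:

```python
import numpy as np

def softmax_probs(Theta, x):
    """h_theta(x): length-K vector of class probabilities.

    Theta : (K, d) array, row k holding the class parameters theta_k.
    x     : (d,) feature vector.
    """
    logits = Theta @ x                    # theta_k^T x for each class k
    logits -= logits.max()                # shift logits; softmax is shift-invariant
    exp_logits = np.exp(logits)
    return exp_logits / exp_logits.sum()  # normalize so the entries sum to 1

# With all-zero parameters every class gets probability 1/K:
print(softmax_probs(np.zeros((3, 4)), np.ones(4)))  # [1/3, 1/3, 1/3]
```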

We ultimately need to learn the parameters $\theta_1, \theta_2, \dots, \theta_K$. The cost function is:

$$
L(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \sum_{j=1}^{K} I(y_i = j) \log \frac{e^{\theta_j^T x_i}}{\sum_{k=1}^{K} e^{\theta_k^T x_i}},
\qquad
I(y_i = j) = \begin{cases}
1, & y_i = j \\
0, & y_i \ne j
\end{cases}
$$
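
As a sketch of how this cost is evaluated (assuming 0-indexed labels, unlike the 1-indexed classes above; the function name is illustrative), note that the indicator $I(y_i = j)$ simply selects the log-probability of each sample's true class:

```python
import numpy as np

def softmax_cost(Theta, X, y):
    """L(theta): average negative log-likelihood over m samples.

    Theta : (K, d) parameters; X : (m, d) samples; y : (m,) labels in 0..K-1.
    """
    m = X.shape[0]
    logits = X @ Theta.T                         # (m, K) scores theta_j^T x_i
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # I(y_i = j) picks out log p(y_i | x_i; theta); average and negate.
    return -log_probs[np.arange(m), y].mean()
```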

As before, the parameters are found by gradient descent, using the gradient:

$$
\frac{\partial L(\theta)}{\partial \theta_j} = -\frac{1}{m} \sum_{i=1}^{m} x_i \bigl( I(y_i = j) - p(y_i = j \mid x_i; \theta) \bigr)
$$
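
A vectorized sketch of this gradient, under the same assumptions as the cost function above; a one-hot matrix `Y` plays the role of the indicator, and the learning rate in the update line is an arbitrary illustrative choice:

```python
import numpy as np

def softmax_grad(Theta, X, y):
    """Gradient of L(theta): a (K, d) array whose row j equals
    -(1/m) * sum_i x_i * (I(y_i = j) - p(y_i = j | x_i; theta))."""
    m, K = X.shape[0], Theta.shape[0]
    logits = X @ Theta.T
    logits -= logits.max(axis=1, keepdims=True)
    P = np.exp(logits)
    P /= P.sum(axis=1, keepdims=True)  # (m, K) predicted probabilities
    Y = np.zeros((m, K))
    Y[np.arange(m), y] = 1.0           # one-hot indicator I(y_i = j)
    return -(Y - P).T @ X / m

# One gradient-descent step (learning rate 0.1 chosen arbitrarily):
# Theta -= 0.1 * softmax_grad(Theta, X, y)
```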

The softmax parameterization is redundant: subtracting the same vector $\psi$ from every $\theta_j$ leaves all predicted probabilities unchanged, because the factor $e^{-\psi^T x_i}$ cancels between numerator and denominator. In practice this is usually handled by adding a weight-decay (regularization) term or by fixing one $\theta_j$ to zero.
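
This redundancy is easy to verify numerically. The sketch below, using arbitrary random data, shows that shifting every parameter row by the same vector $\psi$ leaves the probabilities unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)
Theta = rng.normal(size=(3, 4))  # K = 3 classes, d = 4 features
psi = rng.normal(size=4)         # arbitrary shift vector
x = rng.normal(size=4)

def probs(T, x):
    e = np.exp(T @ x - (T @ x).max())
    return e / e.sum()

# e^{-psi^T x} cancels between numerator and denominator:
print(np.allclose(probs(Theta, x), probs(Theta - psi, x)))  # True
```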
