There will no doubt be mistakes. I would appreciate it if you could point them out in the comments (please note that I frequently rewrite parts without notice as I spot issues).

Covariance

Covariance

$$ \begin{eqnarray} \mathrm{Cov}\left[X,Y\right]&=&\mathrm{E}\left[\left(X-\mathrm{E}\left[X\right]\right)\left(Y-\mathrm{E}\left[Y\right]\right)\right]\;\cdots\;共分散(covariance) \\ \mathrm{Cov}\left[c_0X_i, c_1X_j\right] &=&\mathrm{E}\left[\left( c_0X_i-\mathrm{E}\left[c_0X_i\right] \right)\left( c_1X_j-\mathrm{E}\left[c_1X_j\right] \right) \right] \;\cdots\;\mathrm{Cov}\left[X,Y\right]=\mathrm{E}\left[\left(X-\mathrm{E}\left[X\right]\right)\left(Y-\mathrm{E}\left[Y\right]\right)\right] \\&=&\mathrm{E}\left[\left( c_0X_i-c_0\mathrm{E}\left[\bar{X}\right] \right)\left( c_1X_j-c_1\mathrm{E}\left[\bar{X}\right] \right) \right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}\left[cX\right]=c\mathrm{E}\left[X\right]} \\&=&\mathrm{E}\left[ c_0\left(X_i-\mathrm{E}\left[\bar{X}\right]\right) c_1\left(X_j-\mathrm{E}\left[\bar{X}\right]\right) \right] \\&=&c_0c_1\mathrm{E}\left[ \left(X_i-\mathrm{E}\left[\bar{X}\right]\right) \left(X_j-\mathrm{E}\left[\bar{X}\right]\right) \right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}\left[cX\right]=c\mathrm{E}\left[X\right]} \\&=&c_0c_1\mathrm{Cov}\left[X_i,X_j\right] \\ \mathrm{V}\left[\sum_{i=1}^{n}X_i\right] &=&\mathrm{V}\left[X_1+\cdots+X_i+\cdots+X_n\right] \\&=&\mathrm{E}\left[\left\{\left(X_1+\cdots+X_i+\cdots+X_n\right)-\mathrm{E}\left[X_1+\cdots+X_i+\cdots+X_n\right]\right\}^2\right] \\&=&\mathrm{E}\left[\left\{X_1+\cdots+X_i+\cdots+X_n-\mathrm{E}\left[X_1\right]-\cdots-\mathrm{E}\left[X_i\right]-\cdots-\mathrm{E}\left[X_n\right]\right\}^2\right] \\&=&\mathrm{E}\left[\left\{\left(X_1-\mathrm{E}\left[X_1\right]\right)+\cdots+\left(X_i-\mathrm{E}\left[X_i\right]\right)+\cdots+\left(X_n-\mathrm{E}\left[X_n\right]\right)\right\}^2\right] \\&=&\mathrm{E}\left[ \left(X_1-\mathrm{E}\left[X_1\right]\right)^2+\cdots+\left(X_1-\mathrm{E}\left[X_1\right]\right)\left(X_j-\mathrm{E}\left[X_j\right]\right)+\cdots+\left(X_1-\mathrm{E}\left[X_1\right]\right)\left(X_n-\mathrm{E}\left[X_n\right]\right) \\+\cdots+\left(X_i-\mathrm{E}\left[X_i\right]\right)\left(X_1-\mathrm{E}\left[X_1\right]\right)+\cdots+\left(X_i-\mathrm{E}\left[X_i\right]\right)\left(X_j-\mathrm{E}\left[X_j\right]\right)+\cdots+\left(X_i-\mathrm{E}\left[X_i\right]\right)\left(X_n-\mathrm{E}\left[X_n\right]\right) \\+\cdots+\left(X_n-\mathrm{E}\left[X_n\right]\right)\left(X_1-\mathrm{E}\left[X_1\right]\right)+\cdots+\left(X_n-\mathrm{E}\left[X_n\right]\right)\left(X_j-\mathrm{E}\left[X_j\right]\right)+\cdots+\left(X_n-\mathrm{E}\left[X_n\right]\right)^2 \right] \\&=&\mathrm{E}\left[\left(X_1-\mathrm{E}\left[X_1\right]\right)^2\right] +\cdots+\mathrm{E}\left[\left(X_1-\mathrm{E}\left[X_1\right]\right)\left(X_j-\mathrm{E}\left[X_j\right]\right)\right] +\cdots+\mathrm{E}\left[\left(X_1-\mathrm{E}\left[X_1\right]\right)\left(X_n-\mathrm{E}\left[X_n\right]\right)\right] \\&&+\cdots+\mathrm{E}\left[\left(X_i-\mathrm{E}\left[X_i\right]\right)\left(X_1-\mathrm{E}\left[X_1\right]\right)\right] +\cdots+\mathrm{E}\left[\left(X_i-\mathrm{E}\left[X_i\right]\right)\left(X_j-\mathrm{E}\left[X_j\right]\right)\right] +\cdots+\mathrm{E}\left[\left(X_i-\mathrm{E}\left[X_i\right]\right)\left(X_n-\mathrm{E}\left[X_n\right]\right)\right] \\&&+\cdots+\mathrm{E}\left[\left(X_n-\mathrm{E}\left[X_n\right]\right)\left(X_1-\mathrm{E}\left[X_1\right]\right)\right] +\cdots+\mathrm{E}\left[\left(X_n-\mathrm{E}\left[X_n\right]\right)\left(X_j-\mathrm{E}\left[X_j\right]\right)\right] 
+\cdots+\mathrm{E}\left[\left(X_n-\mathrm{E}\left[X_n\right]\right)^2\right] \\&&\;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}\left[X+Y\right]=\mathrm{E}\left[X\right]+\mathrm{E}\left[Y\right]} \\&=&\mathrm{Cov}\left[X_1,X_1\right] +\cdots+\mathrm{Cov}\left[X_1,X_j\right] +\cdots+\mathrm{Cov}\left[X_1,X_n\right] \\&&+\cdots+\mathrm{Cov}\left[X_i,X_1\right] +\cdots+\mathrm{Cov}\left[X_i,X_j\right] +\cdots+\mathrm{Cov}\left[X_i,X_n\right] \\&&+\cdots+\mathrm{Cov}\left[X_n,X_1\right] +\cdots+\mathrm{Cov}\left[X_n,X_j\right] +\cdots+\mathrm{Cov}\left[X_n,X_n\right] \\&=&\sum_{i=1}^{n}\sum_{j=1}^{n}\mathrm{Cov}\left[X_i,X_j\right] \\&&\;\cdots\;\mathrm{E}\left[\left(X_i-\mathrm{E}\left[X_i\right]\right)\left(X_j-\mathrm{E}\left[X_j\right]\right)\right]=\mathrm{Cov}\left[X_i,X_j\right] \\&=&\sum_{i=1}^{n}\mathrm{Cov}\left[X_i,X_i\right]+2\sum_{i\lt j}\mathrm{Cov}\left[X_i, X_j\right] \\&&\;\cdots\;\mathrm{Cov}の項は\mathrm{Cov}\left[X_i, X_j\right]=\mathrm{Cov}\left[X_j, X_i\right]でi\lt jとi\gt jで2回あるので2倍 \\&=&\sum_{i=1}^{n}\mathrm{V}\left[X_i\right]+2\sum_{i\lt j}\mathrm{Cov}\left[X_i, X_j\right] \\&&\;\cdots\;\mathrm{Cov}\left[X_i,X_i\right]=\mathrm{V}\left[X_i\right] \end{eqnarray} $$
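As a quick numerical sanity check of the two identities above, \(\mathrm{Cov}\left[c_0X_i, c_1X_j\right]=c_0c_1\mathrm{Cov}\left[X_i,X_j\right]\) and \(\mathrm{V}\left[\sum_{i}X_i\right]=\sum_{i}\mathrm{V}\left[X_i\right]+2\sum_{i\lt j}\mathrm{Cov}\left[X_i,X_j\right]\), here is a minimal Monte Carlo sketch in Python with NumPy. The three-variable Gaussian, its covariance matrix, the constants \(c_0,c_1\), the sample size, and the seed are arbitrary choices of mine and are not part of the derivation above.

import numpy as np

rng = np.random.default_rng(0)

# Population covariance matrix of (X1, X2, X3), deliberately non-diagonal.
cov = np.array([[2.0, 0.8, -0.5],
                [0.8, 1.5,  0.3],
                [-0.5, 0.3, 1.0]])
X = rng.multivariate_normal(mean=[0.0, 1.0, -2.0], cov=cov, size=200_000)

# Cov[c0*X1, c1*X3] = c0 * c1 * Cov[X1, X3]
c0, c1 = 3.0, -2.0
print("bilinearity, simulated:", np.cov(c0 * X[:, 0], c1 * X[:, 2])[0, 1],
      " theory:", c0 * c1 * cov[0, 2])

# V[X1 + X2 + X3] = sum of variances + 2 * sum of pairwise covariances
print("variance of sum, simulated:", np.var(X.sum(axis=1), ddof=1),
      " theory:", cov.trace() + 2 * (cov[0, 1] + cov[0, 2] + cov[1, 2]))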

Variance and covariance of the least squares estimators in simple regression

Variance and covariance of the least squares estimators \(\hat{\alpha},\;\hat{\beta}\) in simple regression

Variance and covariance of the observations \(y_i\) in simple regression

$$ \begin{eqnarray} y_i&=&\alpha+\beta x_i+\epsilon_i\;(i=1,\cdots,n) \\\left\{\epsilon_i|i=1,\cdots,n\right\}&:&\epsilon_i \overset{iid}{\sim} N(0,\sigma^2) \\&&\;\cdots\;独立同一分布(independent\;and\;identically\;distributed;\;IID,\;i.i.d.,\;iid) \\&&\;\cdots\;\mathrm{E}\left[\epsilon_i\right]=0,\;\mathrm{V}\left[\epsilon_i\right]=\sigma^2,互いに独立\left(\mathrm{Cov}[\epsilon_i, \epsilon_j]=\left\{\begin{array}\;\mathrm{V}\left[\epsilon_i\right]=\sigma^2&(i=j)\\0&(i \neq j)\end{array}\right.\right) \\\mathrm{V}\left[y_i\right] &=&\mathrm{V}\left[\alpha+\beta x_i+\epsilon_i\right] \\&=&\mathrm{V}\left[\epsilon_i\right] \;\cdots\;\mathrm{V}\left[X\pm t\right]=\mathrm{V}\left[X\right]\;(t:分散をとることについて定数) \\&=&\sigma^2 \\\mathrm{Cov}\left[y_i, y_j\right] &=&\mathrm{E}\left[\left(y_i-\mathrm{E}\left[y_i\right]\right)\left(y_j-\mathrm{E}\left[y_j\right]\right)\right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/covariance.html}{\mathrm{Cov}\left[X,Y\right]=\mathrm{E}\left[\left(X-\mathrm{E}\left[X\right]\right)\left(Y-\mathrm{E}\left[Y\right]\right)\right]} \\&=&\mathrm{E}\left[\left(\alpha+\beta x_i+\epsilon_i-\mathrm{E}\left[\alpha+\beta x_i+\epsilon_i\right]\right)\left(\alpha+\beta x_j+\epsilon_j-\mathrm{E}\left[\alpha+\beta x_j+\epsilon_j\right]\right)\right] \;\cdots\;y_i=\alpha+\beta x_i+\epsilon_i \\&=&\mathrm{E}\left[\left(\alpha+\beta x_i+\epsilon_i-\alpha-\beta x_i-\mathrm{E}\left[\epsilon_i\right]\right)\left(\alpha+\beta x_j+\epsilon_j-\alpha-\beta x_j-\mathrm{E}\left[\epsilon_j\right]\right)\right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}\left[X\pm t\right]=\mathrm{E}\left[X\right]\pm t} \\&=&\mathrm{E}\left[\left(\epsilon_i-\mathrm{E}\left[\epsilon_i\right]\right)\left(\epsilon_j-\mathrm{E}\left[\epsilon_j\right]\right)\right] \\&=&\mathrm{Cov}\left[\epsilon_i, \epsilon_j\right] \end{eqnarray} $$ 上記を踏まえて\((x_i-\bar{x})\)を加えた分散・共分散について $$ \begin{eqnarray} \mathrm{Cov}\left[(x_i-\bar{x})(y_i-\bar{y}), (x_j-\bar{x})(y_j-\bar{y})\right] &=&(x_i-\bar{x})(x_j-\bar{x})\mathrm{Cov}\left[(y_i-\bar{y}), (y_j-\bar{y})\right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/covariance.html}{\mathrm{Cov}\left[c_0X_i, c_1X_j\right]=c_0c_1\mathrm{Cov}\left[X_i,X_j\right]} \\&=&(x_i-\bar{x})(x_j-\bar{x})\mathrm{E}\left[\left\{(y_i-\bar{y})-\mathrm{E}\left[y_i-\bar{y}\right]\right\}\left\{(y_j-\bar{y})-\mathrm{E}\left[y_j-\bar{y}\right]\right\}\right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/covariance.html}{\mathrm{Cov}\left[X,Y\right]=\mathrm{E}\left[\left(X-\mathrm{E}\left[X\right]\right)\left(Y-\mathrm{E}\left[Y\right]\right)\right]} \\&=&(x_i-\bar{x})(x_j-\bar{x})\mathrm{E}\left[\left\{y_i-\bar{y}-\mathrm{E}\left[y_i\right]+\bar{y}\right\}\left\{y_j-\bar{y}-\mathrm{E}\left[y_j\right]+\bar{y}\right\}\right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}\left[X\pm t\right]=\mathrm{E}\left[X\right]\pm t} \\&=&(x_i-\bar{x})(x_j-\bar{x})\mathrm{E}\left[\left(y_i-\mathrm{E}\left[\bar{y}\right]\right)\left(y_j-\mathrm{E}\left[\bar{y}\right]\right)\right] \\&=&(x_i-\bar{x})(x_j-\bar{x})\mathrm{Cov}\left[y_i, y_j\right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/covariance.html}{\mathrm{Cov}\left[X,Y\right]=\mathrm{E}\left[\left(X-\mathrm{E}\left[X\right]\right)\left(Y-\mathrm{E}\left[Y\right]\right)\right]} \\&=&\left\{\begin{array} \;(x_i-\bar{x})^2\sigma^2&(i=j) \\(x_i-\bar{x})(x_j-\bar{x})0&(i \neq j) 
\end{array}\right. \;\cdots\;\mathrm{Cov}\left[y_i,y_j\right]=\mathrm{Cov}\left[\epsilon_i,\epsilon_j\right]=\left\{\begin{array}\;\mathrm{V}\left[\epsilon_i\right]=\sigma^2&(i=j)\\0&(i \neq j)\end{array}\right. \\ \mathrm{V}\left[(x_i-\bar{x})y_i\right] &=&(x_i-\bar{x})^2\mathrm{V}\left[y_i\right]\;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-variance.html}{\mathrm{V}\left[cX\right]=c^2\mathrm{V}\left[X\right]} \\&=&(x_i-\bar{x})^2\sigma^2\;\cdots\;上記i=jのケース \end{eqnarray} $$

Variance of \(S_{xy}\)

$$ \begin{eqnarray} \mathrm{V}\left[S_{xy}\right] &=&\mathrm{V}\left[\sum_{i=1}^{n}\left(x_i-\bar{x}\right)\left(y_i-\bar{y}\right)\right] \;\cdots\;S_{xy}=\sum_{i=1}^{n}\left(x_i-\bar{x}\right)\left(y_i-\bar{y}\right) \\&=& \sum_{i=1}^{n}\mathrm{V}\left[\left(x_i-\bar{x}\right)\left(y_i-\bar{y}\right)\right] +2\sum_{i\lt j}\mathrm{Cov}\left[\left(x_i-\bar{x}\right)\left(y_i-\bar{y}\right), \left(x_j-\bar{x}\right)\left(y_j-\bar{y}\right)\right] \\&&\;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/covariance.html}{\mathrm{V}\left[\sum_{i=1}^{n}X_i\right] =\sum_{i=1}^{n}\mathrm{V}\left[X_i\right]+2\sum_{i\lt j}\mathrm{Cov}\left[X_i, X_j\right]} \\&=&\sum_{i=1}^{n}\left(x_i-\bar{x}\right)^2\sigma^2+2\sum_{i\lt j}0 \\&&\;\cdots\;\mathrm{V}\left[\left(x_i-\bar{x}\right)\left(y_i-\bar{y}\right)\right]=\left(x_i-\bar{x}\right)^2\sigma^2 \\&&\;\cdots\;\mathrm{Cov}\left[\left(x_i-\bar{x}\right)\left(y_i-\bar{y}\right), \left(x_j-\bar{x}\right)\left(y_j-\bar{y}\right)\right]=0\;(i\neq j) \\&=&\sigma^2\sum_{i=1}^{n}\left(x_i-\bar{x}\right)^2 \;\cdots\;\sum_{i=0}^n cX_i=c\sum_{i=0}^n X_i \\&=&\sigma^2S_{xx} \;\cdots\;S_{xx}=\sum_{i=1}^{n}\left(x_i-\bar{x}\right)^2 \end{eqnarray} $$
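The result \(\mathrm{V}\left[S_{xy}\right]=\sigma^2S_{xx}\) can be checked by holding the \(x_i\) fixed and simulating \(y_i=\alpha+\beta x_i+\epsilon_i\) many times. This is a rough sketch only; the design points, the parameter values \(\alpha,\beta,\sigma\), the replication count, and the seed are illustrative assumptions of mine.

import numpy as np

rng = np.random.default_rng(1)

# Fixed design points and illustrative true parameters.
x = np.linspace(0.0, 10.0, 25)
alpha, beta, sigma = 2.0, 0.7, 1.5
sxx = np.sum((x - x.mean()) ** 2)

# Repeatedly generate y = alpha + beta*x + eps and compute S_xy for each replication.
n_rep = 100_000
y = alpha + beta * x + rng.normal(0.0, sigma, size=(n_rep, x.size))
sxy = np.sum((x - x.mean()) * (y - y.mean(axis=1, keepdims=True)), axis=1)

print("simulated V[S_xy]:", sxy.var(ddof=1))
print("sigma^2 * S_xx   :", sigma**2 * sxx)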

Variance of the least squares estimator \(\hat{\beta}\)

$$ \begin{eqnarray} \mathrm{V}\left[\hat{\beta}\right] &=&\mathrm{V}\left[\frac{S_{xy}}{S_{xx}}\right] \;\cdots\;\hat{\beta}=\frac{S_{xy}}{S_{xx}},\;S_{xx}=\sum_{i=1}^{n}\left(x_i-\bar{x}\right)^2,\;S_{xy}=\sum_{i=1}^{n}\left(x_i-\bar{x}\right)\left(y_i-\bar{y}\right) \\&=&\frac{1}{S_{xx}^2}\mathrm{V}\left[S_{xy}\right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-variance.html}{\mathrm{V}\left[cX\right]=c^2\mathrm{V}\left[X\right]} \\&=&\frac{1}{S_{xx}^2}\sigma^2S_{xx} \;\cdots\;\mathrm{V}\left[S_{xy}\right]=\sigma^2S_{xx} \\&=&\frac{1}{S_{xx}}\sigma^2 \end{eqnarray} $$

Variance of the least squares estimator \(\hat{\alpha}\)

$$ \begin{eqnarray} \mathrm{V}\left[\hat{\alpha}\right] &=&\mathrm{V}\left[\bar{y}-\hat{\beta}\bar{x}\right] \;\cdots\;\hat{\alpha}=\bar{y}-\hat{\beta}\bar{x} \\&=&\mathrm{V}\left[\bar{y}-\frac{S_{xy}}{S_{xx}}\bar{x}\right] \;\cdots\;\hat{\beta}=\frac{S_{xy}}{S_{xx}},\;S_{xx}=\sum_{i=1}^{n}\left(x_i-\bar{x}\right)^2,\;S_{xy}=\sum_{i=1}^{n}\left(x_i-\bar{x}\right)\left(y_i-\bar{y}\right) \\&=&\mathrm{V}\left[\bar{y}\right]+\mathrm{V}\left[\frac{S_{xy}}{S_{xx}}\bar{x}\right] -2\mathrm{Cov}\left[\bar{y}, \frac{S_{xy}}{S_{xx}}\bar{x}\right] \\&&\;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-variance.html}{\mathrm{V}\left[X\pm Y\right]=\mathrm{V}\left[X\right]\pm 2\mathrm{Cov}\left[X,Y\right]+\mathrm{V}\left[Y\right]} \\&=&\mathrm{V}\left[\bar{y}\right]+\mathrm{V}\left[\frac{S_{xy}}{S_{xx}}\bar{x}\right]-2\cdot0 \;\cdots\;\mathrm{Cov}\left[\bar{y}, \frac{S_{xy}}{S_{xx}}\bar{x}\right]=0\;(後述) \\&=&\mathrm{V}\left[\bar{y}\right] +\frac{\bar{x}^2}{S_{xx}^2}\mathrm{V}\left[S_{xy}\right] \\&=&\mathrm{V}\left[\frac{1}{n}\sum_{i=1}^{n}y_i\right] +\frac{\bar{x}^2}{S_{xx}^2}\sigma^2S_{xx} \;\cdots\;\mathrm{V}\left[S_{xy}\right]=\sigma^2S_{xx} \\&=&\frac{1}{n^2}\mathrm{V}\left[\sum_{i=1}^{n}y_i\right] +\frac{\bar{x}^2}{S_{xx}}\sigma^2 \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-variance.html}{\mathrm{V}\left[cX\right]=c^2\mathrm{V}\left[X\right]} \\&=&\frac{1}{n^2} \left\{ \sum_{i=1}^{n}\mathrm{V}\left[y_i\right] +2\sum_{i\lt j}\mathrm{Cov}\left[y_i, y_j\right] \right\} +\frac{\bar{x}^2}{S_{xx}}\sigma^2 \\&&\;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/covariance.html}{\mathrm{V}\left[\sum_{i=1}^{n}X_i\right] =\sum_{i=1}^{n}\mathrm{V}\left[X_i\right]+2\sum_{i\lt j}\mathrm{Cov}\left[X_i, X_j\right]} \\&=&\frac{1}{n^2}\left\{ \sum_{i=1}^{n}\sigma^2 +2\sum_{i\lt j}0 \right\} +\frac{\bar{x}^2}{S_{xx}}\sigma^2 \;\cdots\;\mathrm{V}\left[y_i\right]=\sigma^2,\;\mathrm{Cov}\left[y_i, y_j\right]=0 \\&=&\frac{1}{n^2}n\sigma^2 +\frac{\bar{x}^2}{S_{xx}}\sigma^2 \;\cdots\;\sum_{i=0}^n c=nc \\&=&\left(\frac{1}{n}+\frac{\bar{x}^2}{S_{xx}}\right)\sigma^2 \end{eqnarray} $$

Covariance of the least squares estimators \(\hat{\alpha}\) and \(\hat{\beta}\)

$$ \begin{eqnarray} \mathrm{Cov}\left[\hat{\alpha},\hat{\beta}\right] &=&\mathrm{E}\left[\left(\hat{\alpha}-\mathrm{E}\left[\hat{\alpha}\right]\right)\left(\hat{\beta}-\mathrm{E}\left[\hat{\beta}\right]\right)\right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/covariance.html}{\mathrm{Cov}\left[X,Y\right]=\mathrm{E}\left[\left(X-\mathrm{E}\left[X\right]\right)\left(Y-\mathrm{E}\left[Y\right]\right)\right]} \\&=&\mathrm{E}\left[ \left( \left(\bar{y}-\hat{\beta}\bar{x}\right) -\mathrm{E}\left[\bar{y}-\hat{\beta}\bar{x}\right] \right) \left( \hat{\beta}-\mathrm{E}\left[\hat{\beta}\right] \right) \right] \;\cdots\;\alpha=\bar{y}-\hat{\beta}\bar{x} \\&=&\mathrm{E}\left[ \left( \bar{y}-\frac{S_{xy}}{S_{xx}}\bar{x} -\mathrm{E}\left[ \bar{y}-\frac{S_{xy}}{S_{xx}}\bar{x} \right] \right) \left( \frac{S_{xy}}{S_{xx}} -\mathrm{E}\left[ \frac{S_{xy}}{S_{xx}} \right] \right) \right] \;\cdots\;\hat{\beta}=\frac{S_{xy}}{S_{xx}},\;S_{xx}=\sum_{i=1}^{n}\left(x_i-\bar{x}\right)^2,\;S_{xy}=\sum_{i=1}^{n}\left(x_i-\bar{x}\right)\left(y_i-\bar{y}\right) \\&=&\mathrm{E}\left[ \left( \bar{y}-\frac{S_{xy}}{S_{xx}}\bar{x} -\mathrm{E}\left[ \bar{y} \right] +\mathrm{E}\left[ \frac{S_{xy}}{S_{xx}}\bar{x} \right] \right) \left( \frac{S_{xy}}{S_{xx}} -\mathrm{E}\left[ \frac{S_{xy}}{S_{xx}} \right] \right) \right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}\left[X\pm Y\right]=\mathrm{E}\left[X\right]\pm\mathrm{E}\left[Y\right]} \\&=&\mathrm{E}\left[ \left( \bar{y}-\frac{S_{xy}}{S_{xx}}\bar{x} -\mathrm{E}\left[ \bar{y} \right] +\frac{\bar{x}}{S_{xx}}\mathrm{E}\left[ S_{xy} \right] \right) \left( \frac{S_{xy}}{S_{xx}} -\frac{1}{S_{xx}}\mathrm{E}\left[ S_{xy} \right] \right) \right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}\left[cX\right]=c\mathrm{E}\left[X\right]} \\&=&\mathrm{E}\left[ \left\{ \bar{y}-\mathrm{E}\left[\bar{y}\right] -\frac{\bar{x}}{S_{xx}}\left(S_{xy}-\mathrm{E}\left[S_{xy}\right]\right) \right\} \left\{ \frac{1}{S_{xx}}\left(S_{xy}-\mathrm{E}\left[S_{xy}\right]\right) \right\} \right] \\&=&\mathrm{E}\left[ -\frac{\bar{x}}{S_{xx}}\left(S_{xy}-\mathrm{E}\left[S_{xy}\right]\right) \frac{1}{S_{xx}}\left(S_{xy}-\mathrm{E}\left[S_{xy}\right]\right) \right] \;\cdots\;\bar{y}-\mathrm{E}\left[\bar{y}\right]=\bar{y}-\bar{y}=0 \\&=&\mathrm{E}\left[ -\frac{\bar{x}}{S_{xx}^2}\left(S_{xy}-\mathrm{E}\left[S_{xy}\right]\right)^2 \right] \\&=&-\frac{\bar{x}}{S_{xx}^2}\mathrm{E}\left[ \left(S_{xy}-\mathrm{E}\left[S_{xy}\right]\right)^2 \right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}\left[cX\right]=c\mathrm{E}\left[X\right]} \\&=&-\frac{\bar{x}}{S_{xx}^2}\sigma^2S_{xx} \;\cdots\;\mathrm{E}\left[\left(S_{xy}-\mathrm{E}\left[S_{xy}\right]\right)^2\right]=\mathrm{V}\left[S_{xy}\right]=\sigma^2S_{xx} \\&=&-\frac{\bar{x}}{S_{xx}}\sigma^2 \end{eqnarray} $$
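The three results above, \(\mathrm{V}\left[\hat{\beta}\right]=\frac{\sigma^2}{S_{xx}}\), \(\mathrm{V}\left[\hat{\alpha}\right]=\left(\frac{1}{n}+\frac{\bar{x}^2}{S_{xx}}\right)\sigma^2\) and \(\mathrm{Cov}\left[\hat{\alpha},\hat{\beta}\right]=-\frac{\bar{x}}{S_{xx}}\sigma^2\), can be checked together by refitting the regression on many simulated data sets with a fixed design. A minimal sketch; the design, the true parameters, the replication count, and the seed are my own choices.

import numpy as np

rng = np.random.default_rng(2)

x = np.linspace(-3.0, 5.0, 30)          # fixed design points
alpha, beta, sigma = 1.0, -0.4, 2.0     # illustrative true values
xbar = x.mean()
sxx = np.sum((x - xbar) ** 2)

n_rep = 100_000
y = alpha + beta * x + rng.normal(0.0, sigma, size=(n_rep, x.size))

# Least squares estimates for every replication.
sxy = np.sum((x - xbar) * (y - y.mean(axis=1, keepdims=True)), axis=1)
beta_hat = sxy / sxx
alpha_hat = y.mean(axis=1) - beta_hat * xbar

print("V[beta_hat]  simulated:", beta_hat.var(ddof=1),
      " theory:", sigma**2 / sxx)
print("V[alpha_hat] simulated:", alpha_hat.var(ddof=1),
      " theory:", (1 / x.size + xbar**2 / sxx) * sigma**2)
print("Cov[a, b]    simulated:", np.cov(alpha_hat, beta_hat)[0, 1],
      " theory:", -xbar * sigma**2 / sxx)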

On \(\mathrm{Cov}\left[\bar{y}, \frac{S_{xy}}{S_{xx}}\bar{x}\right]=0\)

$$ \begin{eqnarray} \mathrm{Cov}\left[\bar{y}, \frac{S_{xy}}{S_{xx}}\bar{x}\right] &=&\mathrm{E}\left[\left(\bar{y}-\mathrm{E}\left[\bar{y}\right]\right)\left(\frac{S_{xy}}{S_{xx}}\bar{x}-\mathrm{E}\left[\frac{S_{xy}}{S_{xx}}\bar{x}\right]\right)\right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/covariance.html}{\mathrm{Cov}\left[X,Y\right]=\mathrm{E}\left[\left(X-\mathrm{E}\left[X\right]\right)\left(Y-\mathrm{E}\left[Y\right]\right)\right]} \\&=&\mathrm{E}\left[\left(\bar{y}-\bar{y}\right)\left(\frac{S_{xy}}{S_{xx}}\bar{x}-\frac{\bar{x}}{S_{xx}}\mathrm{E}\left[S_{xy}\right]\right)\right] \;\cdots\;\mathrm{E}\left[\bar{y}\right]=\bar{y},\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}\left[cX\right]=c\mathrm{E}\left[X\right]} \\&=&\mathrm{E}\left[0\cdot\frac{\bar{x}}{S_{xx}}\left(S_{xy}-\mathrm{E}\left[S_{xy}\right]\right)\right] \\&=&\mathrm{E}\left[0\right] \\&=&0 \end{eqnarray} $$

Proof that the least squares estimators in simple regression are unbiased

Linear estimators

The equations used earlier to obtain the least squares estimators (least squares estimator; LSE) in simple regression were the following. $$ \begin{eqnarray} \left\{ \begin{array}{rcl} \hat{\alpha}+\bar{x}\hat{\beta}-\bar{y} &=&0\\ \displaystyle n\bar{x}\hat{\alpha}+\left(\sum_{i=1}^n x_i^2\right)\hat{\beta}-\left(\sum_{i=1}^n x_i y_i\right) &=&0\\ \end{array} \right.\;\cdots\; \href{https://shikitenkai.blogspot.com/2020/03/blog-post.html}{線形単回帰の回帰直線 (\hat{\alpha},\hat{\beta}を求める)} \end{eqnarray} $$ These can be rearranged into the following form, known as the normal equations. $$ \begin{eqnarray} \left\{ \begin{array}{rclcl} \hat{\alpha}+\bar{x}\hat{\beta} &=&\bar{y}&=&\displaystyle \left(\sum_{i=1}^n \frac{1}{n} y_i\right) \\\displaystyle n\bar{x}\hat{\alpha}+\left(\sum_{i=1}^n x_i^2\right)\hat{\beta} &=&\displaystyle \left(\sum_{i=1}^n x_i y_i\right)\\ \end{array} \right.\\ \end{eqnarray} $$ An estimator that can be written as a linear combination \(\sum_{i=1}^{n}c_iy_i\) of the observations \(y_i\) with constant coefficients \(c_i\) is called a linear estimator.
Hence \(\hat{\alpha}\) and \(\hat{\beta}\) are linear estimators in the observations \(y_i\).

Expected value of the observation \(y_i\)

$$ \begin{eqnarray} y_i&=&\alpha+\beta x_i+\epsilon_i\;(i=1,\cdots,n) \\\left\{\epsilon_i|i=1,\cdots,n\right\}&:&\epsilon_i \overset{iid}{\sim} N(0,\sigma^2) \\&&\;\cdots\;独立同一分布(independent\;and\;identically\;distributed;\;IID,\;i.i.d.,\;iid) \\&&\;\cdots\;\mathrm{E}\left[\epsilon_i\right]=0,\;\mathrm{V}\left[\epsilon_i\right]=\sigma^2,互いに独立 \\\mathrm{E}\left[y_i\right] &=&\mathrm{E}\left[\alpha+\beta x_i+\epsilon_i\right] \\&=&\mathrm{E}\left[\alpha\right]+\mathrm{E}\left[\beta x_i\right]+\mathrm{E}\left[\epsilon_i\right] \\&=&\alpha+\beta x_i+0\;\cdots\;\mathrm{E}\left[C\right]=C\;(C:期待値をとることについて定数),\;\mathrm{E}\left[\epsilon_i\right]=0 \\&=&\alpha+\beta x_i \end{eqnarray} $$

Proof that the least squares estimator \(\hat{\beta}\) is unbiased

$$ \begin{eqnarray} \mathrm{E}\left[\hat{\beta}\right] &=&\mathrm{E}\left[\frac{S_{xy}}{S_{xx}}\right] \;\cdots\;\hat{\beta}=\frac{S_{xy}}{S_{xx}} \\&=&\mathrm{E}\left[\frac{\sum_{i=1}^{n}\left(x_i-\bar{x}\right)\left(y_i-\bar{y}\right)}{S_{xx}}\right] \;\cdots\;S_{xy}=\sum_{i=1}^{n}\left(x_i-\bar{x}\right)\left(y_i-\bar{y}\right) \\&=&\frac{1}{S_{xx}}\mathrm{E}\left[\sum_{i=1}^{n}\left(x_i-\bar{x}\right)\left(y_i-\bar{y}\right)\right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}\left[cX\right]=c\mathrm{E}\left[X\right]} \\&=&\frac{1}{S_{xx}}\sum_{i=1}^{n}\mathrm{E}\left[\left(x_i-\bar{x}\right)\left(y_i-\bar{y}\right)\right] \\&&\;\cdots\;\mathrm{E}\left[\sum_{i=1}^{n}A_i\right]=\mathrm{E}\left[A_i+\cdots+A_i+\cdots+A_n\right]=\mathrm{E}\left[A_i\right]+\cdots+\mathrm{E}\left[A_i\right]+\cdots+\mathrm{E}\left[A_n\right]=\sum_{i=1}^{n}\mathrm{E}\left[A_i\right] \\&&\;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}\left[X\pm Y\right]=\mathrm{E}\left[X\right]\pm\mathrm{E}\left[Y\right]} \\&=&\frac{1}{S_{xx}}\sum_{i=1}^{n}\left(x_i-\bar{x}\right)\mathrm{E}\left[y_i-\bar{y}\right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}\left[cX\right]=c\mathrm{E}\left[X\right]} \\&=&\frac{1}{S_{xx}}\sum_{i=1}^{n}\left(x_i-\bar{x}\right)\left(\mathrm{E}\left[y_i\right]-\mathrm{E}\left[\bar{y}\right]\right) \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}\left[X\pm Y\right]=\mathrm{E}\left[X\right]\pm\mathrm{E}\left[Y\right]} \\&=&\frac{1}{S_{xx}}\sum_{i=1}^{n}\left(x_i-\bar{x}\right)\left\{\left(\alpha+\beta x_i\right)-\left(\alpha+\beta \bar{x}\right)\right\} \;\cdots\;\mathrm{E}\left[y_i\right]=\alpha+\beta x_i,\;\mathrm{E}\left[\bar{y}\right]=\alpha+\beta \bar{x} \\&=&\frac{1}{S_{xx}}\sum_{i=1}^{n}\left(x_i-\bar{x}\right)\left(\alpha + \beta x_i -\alpha-\beta \bar{x}\right) \\&=&\frac{1}{S_{xx}}\sum_{i=1}^{n}\left(x_i-\bar{x}\right)\beta\left(x_i-\bar{x}\right) \\&=&\frac{1}{S_{xx}}\sum_{i=1}^{n}\left(x_i-\bar{x}\right)^2\beta \\&=&\frac{1}{S_{xx}}\beta\sum_{i=1}^{n}\left(x_i-\bar{x}\right)^2 \\&&\;\cdots\;\sum_{i=1}^{n} cX_i=cX_1+\cdots+cX_i+\cdots+cX_n=c(X_1+\cdots+X_i+\cdots+X_n)=c\sum_{i=1}^{n} X_i \\&=&\frac{1}{S_{xx}}\beta S_{xx} \;\cdots\;S_{xx}=\sum_{i=1}^{n}\left(x_i-\bar{x}\right)^2 \\&=&\beta \end{eqnarray} $$

Proof that the least squares estimator \(\hat{\alpha}\) is unbiased

$$ \begin{eqnarray} \\\mathrm{E}\left[\hat{\alpha}\right] &=&\mathrm{E}\left[\bar{y}-\hat{\beta}\bar{x}\right] \;\cdots\;\hat{\alpha}=\bar{y}-\hat{\beta}\bar{x} \\&=&\mathrm{E}\left[\bar{y}\right]-\mathrm{E}\left[\hat{\beta}\bar{x}\right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}\left[X\pm Y\right]=\mathrm{E}\left[X\right]\pm\mathrm{E}\left[Y\right]} \\&=&\mathrm{E}\left[\bar{y}\right]-\bar{x}\mathrm{E}\left[\hat{\beta}\right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}\left[cX\right]=c\mathrm{E}\left[X\right]} \\&=&\alpha+\beta\bar{x}-\beta\bar{x} \;\cdots\;\mathrm{E}\left[\bar{y}\right]=\alpha+\beta \bar{x},\;\mathrm{E}\left[\hat{\beta}\right]=\beta \\&=&\alpha \end{eqnarray} $$
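One compact way to see both unbiasedness results at once (an equivalent reformulation, not the wording of the proofs above) is to write \(\hat{\beta}=\sum_{i=1}^{n}w_iy_i\) with \(w_i=\frac{x_i-\bar{x}}{S_{xx}}\); since \(\sum_i w_i=0\) and \(\sum_i w_ix_i=1\), it follows that \(\mathrm{E}\left[\hat{\beta}\right]=\sum_i w_i\left(\alpha+\beta x_i\right)=\beta\), and \(\mathrm{E}\left[\hat{\alpha}\right]=\alpha\) then follows as above. A tiny numerical illustration of the two weight identities, with arbitrary design points:

import numpy as np

x = np.array([1.0, 2.0, 4.0, 4.5, 7.0, 9.0])   # arbitrary design points
w = (x - x.mean()) / np.sum((x - x.mean()) ** 2)

# The two identities behind E[beta_hat] = beta:
print("sum of w_i     :", w.sum())        # approximately 0
print("sum of w_i*x_i :", np.dot(w, x))   # approximately 1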

Fourth moment of the sample mean about the population mean (fourth central moment of the sample mean)

Fourth moment of the sample mean \(\overline{X}\) about the population mean \(\mu\) (= the fourth central moment of the sample mean \(\overline{X}\))

$$ \begin{eqnarray} \mathrm{E}\left[(\overline{X}-\mu)^4\right] &=&\mathrm{E}\left[\left\{\left(\frac{1}{n}\sum_{k=1}^{n}X_k\right) - \mu\right\}^4\right] \;\cdots\;\overline{X}=\frac{1}{n}\sum_{i=1}^{n}X_i \\&=&\mathrm{E}\left[\left\{\left(\frac{1}{n}\sum_{k=1}^{n}X_k\right) - \left(\frac{1}{n}\sum_{k=1}^{n}\mu\right)\right\}^4\right] \;\cdots\;C=\frac{n}{n}C=\frac{1}{n}C\sum_{i=1}^{n}1=\frac{1}{n}\sum_{i=1}^{n}C\;(C:iによらない数,\sumにとって定数) \\&=&\mathrm{E}\left[\left[\frac{1}{n}\left\{\left(\sum_{k=1}^{n}X_k\right)-\left(\sum_{k=1}^{n}\mu\right)\right\}\right]^4\right] \\&=&\mathrm{E}\left[\frac{1}{n^4}\left\{\left(\sum_{k=1}^{n}X_k\right)-\left(\sum_{k=1}^{n}\mu\right)\right\}^4\right] \;\cdots\;(AB)^C=A^CB^C \\&=&\mathrm{E}\left[\frac{1}{n^4}\left\{\sum_{k=1}^{n}\left(X_k-\mu\right)\right\}^4\right] \;\cdots\;\sum_{i=1}^{n}X_i-\sum_{i=1}^{n}Y_i=\sum_{i=1}^{n}\left(X_i-Y_i\right) \end{eqnarray} $$ 総和の指数計算において掛け合わせる添え字の組合せについて考える. $$ \begin{eqnarray} \left(\sum_{k=1}^{n}A_k\right)^4 &=&\left(\sum_{k=1}^{n} A_k \right)\left(\sum_{l=1}^{n}A_l\right)\left(\sum_{m=1}^{n}A_m\right)\left(\sum_{s=1}^{n}A_s\right) \\&=&(A_1+A_2+\cdots+A_k+\cdots+A_n)(A_1+A_2+\cdots+A_l+\cdots+A_n)(A_1+A_2+\cdots+A_m+\cdots+A_n)(A_1+A_2+\cdots+A_s+\cdots+A_n) \\&=&_4\mathrm{P}_0\times\left(\sum_{k=1}^{n} A_k^4 \right)\;\cdots\;4つとも同じ添え字(どれか0個の添え字が異なるケース) \\&&+_4\mathrm{P}_1\times\left(\sum_{k \neq l} A_k^3 A_l\right)\;\cdots\;いずれか3つが同じ添え字(どれか1個の添え字が異なるのケース) \\&&+_4\mathrm{C}_2\times\left(\sum_{k \lt l} A_k^2 A_l^2\right)\;\cdots\;いずれか2つの添え字が同じで残りの2つの添え字同士も同じケース(どれか2個の添え字が異なるのケース(異なった添え字同士も同じ)) \\&&+_4\mathrm{P}_2\times\left(\sum_{k \neq l,m かつ l\lt m} A_k^2 A_l A_m\right)\;\cdots\;いずれか2つの添え字が同じで残りの2つの添え字同士は異なるケース(どれか2個の添え字が異なるのケース(異なった添え字同士は異なる)) \\&&+_4\mathrm{P}_3\times\left(\sum_{k \lt l \lt m \lt s} A_k A_l A_m A_s\right)\;\cdots\;すべての添え字が異なるケース(どれか3個の添え字が異なるのケース) \\&&\;\cdots\;各ケースでの(重複する数 \times 組合せで総和)の和 \\&=&\frac{4!}{(4-0)!}\left(\sum_{k=1}^{n} A_k^4 \right) +\frac{4!}{(4-1)!}\left(\sum_{k \neq l} A_k^3 A_l\right) +\frac{4!}{(4-2)!2!}\left(\sum_{k \lt l} A_k^2 A_l^2\right) +\frac{4!}{(4-2)!}\left(\sum_{k \neq l,m かつ l\lt m} A_k^2 A_l A_m\right) +\frac{4!}{(4-3)!}\left(\sum_{k \lt l \lt m \lt s} A_k A_l A_m A_s\right) \\&=&\frac{4\times3\times2\times1}{4\times3\times2\times1}\left(\sum_{k=1}^{n} A_k^4 \right) +\frac{4\times3\times2\times1}{3\times2\times1}\left(\sum_{k \neq l} A_k^3 A_l\right) +\frac{4\times3\times2\times1}{2\times1\cdot2\times1}\left(\sum_{k \lt l} A_k^2 A_l^2\right) +\frac{4\times3\times2\times1}{2\times1}\left(\sum_{k \neq l,m かつ l\lt m} A_k^2 A_l A_m\right) +\frac{4\times3\times2\times1}{1}\left(\sum_{k \lt l \lt m \lt s} A_k A_l A_m A_s\right) \\&=&1\cdot\left(\sum_{k=1}^{n} A_k^4 \right) +4\cdot\left(\sum_{k \neq l} A_k^3 A_l\right) +6\cdot\left(\sum_{k \lt l} A_k^2 A_l^2\right) +12\cdot\left(\sum_{k \neq l,m かつ l\lt m} A_k^2 A_l A_m\right) +24\cdot\left(\sum_{k \lt l \lt m \lt s} A_k A_l A_m A_s\right) \end{eqnarray} $$ よって, $$ \begin{eqnarray} \mathrm{E}\left[(\overline{X}-\mu)^4\right] &=&\mathrm{E}\left[\frac{1}{n^4}\left\{\sum_{k=1}^{n}\left(X_k-\mu\right)\right\}^4\right] \\&=&\mathrm{E}\left[\frac{1}{n^4}\left\{ \sum_{k=1}^{n} \left(X_k-\mu\right)^4 +4\sum_{k \neq l} \left(X_k-\mu\right)^3\left(X_l-\mu\right) +6\sum_{k \lt l} \left(X_k-\mu\right)^2\left(X_l-\mu\right)^2 +12\sum_{k \neq l,m かつ l\lt m} \left(X_k-\mu\right)^2\left(X_l-\mu\right)\left(X_m-\mu\right) +24\sum_{k \lt l \lt m \lt s} \left(X_k-\mu\right)\left(X_l-\mu\right)\left(X_m-\mu\right)\left(X_s-\mu\right) 
\right\}\right] \\&=&\frac{1}{n^4}\mathrm{E}\left[ \sum_{k=1}^{n} \left(X_k-\mu\right)^4 +4\sum_{k \neq l} \left(X_k-\mu\right)^3\left(X_l-\mu\right) +6\sum_{k \lt l} \left(X_k-\mu\right)^2\left(X_l-\mu\right)^2 +12\sum_{k \neq l,m かつ l\lt m} \left(X_k-\mu\right)^2\left(X_l-\mu\right)\left(X_m-\mu\right) +24\sum_{k \lt l \lt m \lt s} \left(X_k-\mu\right)\left(X_l-\mu\right)\left(X_m-\mu\right)\left(X_s-\mu\right) \right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}[cX]=c\mathrm{E}[X]} \\&=&\frac{1}{n^4}\left[ \mathrm{E}\left[ \sum_{k=1}^{n} \left(X_k-\mu\right)^4 \right] +\mathrm{E}\left[ 4\sum_{k \neq l} \left(X_k-\mu\right)^3\left(X_l-\mu\right) \right] +\mathrm{E}\left[ 6\sum_{k \lt l} \left(X_k-\mu\right)^2\left(X_l-\mu\right)^2 \right] +\mathrm{E}\left[ 12\sum_{k \neq l,m かつ l\lt m} \left(X_k-\mu\right)^2\left(X_l-\mu\right)\left(X_m-\mu\right) \right] +\mathrm{E}\left[ 24\sum_{k \lt l \lt m \lt s} \left(X_k-\mu\right)\left(X_l-\mu\right)\left(X_m-\mu\right)\left(X_s-\mu\right) \right] \right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}[X+Y]=\mathrm{E}[X]+\mathrm{E}[Y]} \\&=&\frac{1}{n^4}\left[ \mathrm{E}\left[ \sum_{k=1}^{n} \left(X_k-\mu\right)^4 \right] +4\mathrm{E}\left[ \sum_{k \neq l} \left(X_k-\mu\right)^3\left(X_l-\mu\right) \right] +6\mathrm{E}\left[ \sum_{k \lt l} \left(X_k-\mu\right)^2\left(X_l-\mu\right)^2 \right] +12\mathrm{E}\left[ \sum_{k \neq l,m かつ l\lt m} \left(X_k-\mu\right)^2\left(X_l-\mu\right)\left(X_m-\mu\right) \right] +24\mathrm{E}\left[ \sum_{k \lt l \lt m \lt s} \left(X_k-\mu\right)\left(X_l-\mu\right)\left(X_m-\mu\right)\left(X_s-\mu\right) \right] \right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}[cX]=c\mathrm{E}[X]} \\&=&\frac{1}{n^4}\left[ \sum_{k=1}^{n} \mathrm{E}\left[\left(X_k-\mu\right)^4 \right] +4\sum_{k \neq l} \mathrm{E}\left[\left(X_k-\mu\right)^3\left(X_l-\mu\right) \right] +6\sum_{k \lt l} \mathrm{E}\left[\left(X_k-\mu\right)^2\left(X_l-\mu\right)^2 \right] +12\sum_{k \neq l,m かつ l\lt m} \mathrm{E}\left[\left(X_k-\mu\right)^2\left(X_l-\mu\right)\left(X_m-\mu\right) \right] +24\sum_{k \lt l \lt m \lt s} \mathrm{E}\left[\left(X_k-\mu\right)\left(X_l-\mu\right)\left(X_m-\mu\right)\left(X_s-\mu\right) \right] \right] \\&&\;\cdots\;\mathrm{E}\left[\sum_{i=1}^n A_i\right]=\mathrm{E}\left[A_1+A_2+\cdots+A_i+\cdots+A_n\right]=\mathrm{E}\left[A_1\right]+\mathrm{E}\left[A_2\right]+\cdots+\mathrm{E}\left[A_i\right]+\cdots+\mathrm{E}\left[A_n\right]=\sum_{i=1}^n \mathrm{E}\left[A_i\right] \\&=&\frac{1}{n^4}\left[ \sum_{k=1}^{n} \mathrm{E}\left[\left(X_k-\mu\right)^4 \right] +4\sum_{k \neq l} \mathrm{E}\left[\left(X_k-\mu\right)^3\right]\mathrm{E}\left[\left(X_l-\mu\right) \right] +6\sum_{k \lt l} \mathrm{E}\left[\left(X_k-\mu\right)^2\right]\mathrm{E}\left[\left(X_l-\mu\right)^2 \right] +12\sum_{k \neq l,m かつ l\lt m} \mathrm{E}\left[\left(X_k-\mu\right)^2\right]\mathrm{E}\left[\left(X_l-\mu\right)\right]\mathrm{E}\left[\left(X_m-\mu\right) \right] +24\sum_{k \lt l \lt m \lt s} \mathrm{E}\left[\left(X_k-\mu\right)\right]\mathrm{E}\left[\left(X_l-\mu\right)\right]\mathrm{E}\left[\left(X_m-\mu\right)\right]\mathrm{E}\left[\left(X_s-\mu\right) \right] \right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{X,Yが独立の場合\,\,\mathrm{E}[XY]=\mathrm{E}[X]\mathrm{E}[Y]} \end{eqnarray} $$ 
1次の中心(化)モーメントについて考える. $$ \begin{eqnarray} \mathrm{E}\left[X_i-\mu\right] &=& \mathrm{E}\left[X_i\right]-\mathrm{E}\left[\mu\right] \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}[X-Y]=\mathrm{E}[X]-\mathrm{E}[Y]} \\&=& \mu-\mu \;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/specimen-random-variable.html}{\mathrm{E}[X_i]=\mathrm{E}[X]=\mu},\;\href{https://shikitenkai.blogspot.com/2019/06/discrete-random-variable-expected-value.html}{\mathrm{E}[C]=C\;(C定数)} \\&=& 0 \end{eqnarray} $$ これを用いて $$ \begin{eqnarray} \mathrm{E}\left[(\overline{X}-\mu)^4\right] \\&=&\frac{1}{n^4}\left[ \sum_{k=1}^{n} \mathrm{E}\left[\left(X_k-\mu\right)^4 \right] +4\sum_{k \neq l} \mathrm{E}\left[\left(X_k-\mu\right)^3\right]\mathrm{E}\left[\left(X_l-\mu\right) \right] +6\sum_{k \lt l} \mathrm{E}\left[\left(X_k-\mu\right)^2\right]\mathrm{E}\left[\left(X_l-\mu\right)^2 \right] +12\sum_{k \neq l,m かつ l\lt m} \mathrm{E}\left[\left(X_k-\mu\right)^2\right]\mathrm{E}\left[\left(X_l-\mu\right)\right]\mathrm{E}\left[\left(X_m-\mu\right) \right] +24\sum_{k \lt l \lt m \lt s} \mathrm{E}\left[\left(X_k-\mu\right)\right]\mathrm{E}\left[\left(X_l-\mu\right)\right]\mathrm{E}\left[\left(X_m-\mu\right)\right]\mathrm{E}\left[\left(X_s-\mu\right) \right] \right] \\&=&\frac{1}{n^4}\left[ \sum_{k=1}^{n} \mathrm{E}\left[\left(X_k-\mu\right)^4 \right] +4\sum_{k \neq l} \left( \mathrm{E}\left[\left(X_k-\mu\right)^3\right] \cdot 0 \right) +6\sum_{k \lt l} \mathrm{E}\left[\left(X_k-\mu\right)^2\right]\mathrm{E}\left[\left(X_l-\mu\right)^2 \right] +12\sum_{k \neq l,m かつ l\lt m} \left( \mathrm{E}\left[\left(X_k-\mu\right)^2\right] \cdot 0 \cdot 0 \right) +24\sum_{k \lt l \lt m \lt s} \left( 0 \cdot 0 \cdot 0 \cdot 0 \right) \right] \;\cdots\;\mathrm{E}\left[X_i-\mu\right]=0 \\&=&\frac{1}{n^4}\left[ \sum_{k=1}^{n} \mathrm{E}\left[\left(X_k-\mu\right)^4\right] +0 +6\sum_{k \lt l}\mathrm{E}\left[\left(X_k-\mu\right)^2\right]\mathrm{E}\left[\left(X_l-\mu\right)^2 \right] +0 +0 \right] \\&=&\frac{1}{n^4}\left\{ \sum_{k=1}^{n} \mathrm{E}\left[\left(X_k-\mu\right)^4\right] +6\sum_{k \lt l}\mathrm{E}\left[\left(X_k-\mu\right)^2\right]\mathrm{E}\left[\left(X_l-\mu\right)^2 \right] \right\} \\&=&\frac{1}{n^4}\left\{ \sum_{k=1}^{n} \mu_4 +6\sum_{k \lt l} \left(\sigma^2 \cdot \sigma^2\right) \right\} \;\cdots\;\mathrm{E}\left[\left(X_i-\mu\right)^4\right]=\mu_4\;:4次の中心(化)モーメント,\;\mathrm{E}\left[\left(X_i-\mu\right)^2\right]=\sigma^2\;:2次の中心(化)モーメント \\&=&\frac{1}{n^4}\left\{ \sum_{k=1}^{n} \mu_4 +6\sum_{k \lt l} \sigma^4 \right\} \\&=&\frac{1}{n^4}\left\{ \sum_{k=1}^{n} \mu_4 +6\sigma^4\sum_{k \lt l} 1 \right\} \\&=&\frac{1}{n^4}\left\{n\mu_4+6\frac{n(n-1)}{2}\sigma^4\right\} \;\cdots\;\sum_{k \lt l} 1=\frac{n(n-1)}{2}\;(各k=1〜n-1に対してl=k+1からl=nまでの和) \\&=&\frac{1}{n^3}\left(\mu_4+3(n-1)\sigma^4\right) \\&=&\mu_4\left(\overline{X}\right)\;\cdots\;\mu_4\left(\overline{X}\right):標本平均\overline{X}の母平均\muまわりの4次モーメント(4次の中心(化)モーメント) \end{eqnarray} $$
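A Monte Carlo check of \(\mathrm{E}\left[(\overline{X}-\mu)^4\right]=\frac{1}{n^3}\left(\mu_4+3(n-1)\sigma^4\right)\). The Exponential(1) population (for which \(\mu=1\), \(\sigma^2=1\), \(\mu_4=9\)), the sample size \(n\), the replication count, and the seed are assumptions of mine for illustration only.

import numpy as np

rng = np.random.default_rng(3)

# Exponential(1) population: mu = 1, sigma^2 = 1, mu_4 = 9.
mu, sigma2, mu4 = 1.0, 1.0, 9.0
n = 10                      # size of each sample whose mean is taken
n_rep = 500_000             # number of simulated sample means

xbar = rng.exponential(1.0, size=(n_rep, n)).mean(axis=1)

print("simulated E[(Xbar-mu)^4]  :", np.mean((xbar - mu) ** 4))
print("(mu4 + 3(n-1)sigma^4)/n^3 :", (mu4 + 3 * (n - 1) * sigma2**2) / n**3)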

Expressing the fourth moment of the sample mean \(\overline{X}\) about the population mean \(\mu\) (the fourth central moment of \(\overline{X}\)) in terms of the kurtosis \(\beta_2\)

$$ \begin{eqnarray} \mathrm{E}\left[(\overline{X}-\mu)^4\right] &=&\mu_4\left(\overline{X}\right) \\&=&\frac{1}{n^3}\left(\mu_4+3(n-1)\sigma^4\right) \\&=&\frac{1}{n^3}\left\{\left(\beta_2+3\right)\sigma^4+3(n-1)\sigma^4\right\} \;\cdots\;\mu_4=\left(\beta_2+3\right)\sigma^4\;:4次の中心(化)モーメント \\&=&\frac{\sigma^4}{n^3}\left(\beta_2+3+3(n-1)\right) \\&=&\frac{\sigma^4}{n^3}\left(\beta_2+3+3n-3\right) \\&=&\frac{\sigma^4}{n^3}\left(\beta_2+3n\right) \end{eqnarray} $$

Kurtosis \(\beta_2\left(\overline{X}\right)\) of the sample mean \(\overline{X}\)

$$ \begin{eqnarray} \href{https://shikitenkai.blogspot.com/2020/08/blog-post_39.html}{\beta_2\left(\overline{X}\right)=\frac{\beta_2}{n}} \end{eqnarray} $$

Kurtosis of the distribution of the sample mean

Kurtosis \(\beta_2\left(\overline{X}\right)\) of the sample mean \(\overline{X}\)

$$ \begin{eqnarray} \beta_2\left(\overline{X}\right) &=&\frac{\mathrm{E}\left[(\overline{X}-\mu)^4\right]}{\mathrm{V}\left[\overline{X}\right]^{\frac{4}{2}}}-3 \;\cdots\; \href{https://shikitenkai.blogspot.com/2020/08/blog-post_27.html}{\beta_2=\beta_2\left(X\right)=\frac{\mathrm{E}\left[(X-\mu)^4\right]}{\mathrm{V}\left[X\right]^{\frac{4}{2}}}-3=\frac{\mu_4}{\sigma^4}-3\;:尖度} \\&=&\frac{\frac{1}{n^3}\left(\mu_4+3(n-1)\sigma^4\right)}{\left(\frac{\sigma^2}{n}\right)^2}-3 \\&&\;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/4-4.html}{\mathrm{E}\left[(\overline{X}-\mu)^4\right]=\frac{1}{n^3}\left(\mu_4+3(n-1)\sigma^4\right) \;:標本平均の母平均まわりの4次モーメント (標本平均の4次の中心(化)モーメント) } \\&&\;\cdots\;\href{https://shikitenkai.blogspot.com/2019/06/specimen-random-variable_3.html}{\mathrm{V}\left[\overline{X}\right]=\frac{\sigma^2}{n}\;:標本平均の分散} \\&=&\frac{n^2}{n^3}\frac{\mu_4+3(n-1)\sigma^4}{\sigma^4}-3 \\&=&\frac{1}{n}\left(\frac{\mu_4}{\sigma^4}+3(n-1)\right)-3 \\&=&\frac{1}{n}\left(\frac{\mu_4}{\sigma^4}+3n-3\right)-3 \\&=&\frac{\frac{\mu_4}{\sigma^4}-3}{n}+3-3 \\&=&\frac{\frac{\mu_4}{\sigma^4}-3}{n} \\&=&\frac{\beta_2}{n} \;\cdots\;\beta_2=\frac{\mu_4}{\sigma^4}-3 \end{eqnarray} $$ よって標本数\(n\)を増やすことで\(\beta_2\left(\overline{X}\right)\)は\(0\)に近づいていく.
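A matching simulation of \(\beta_2\left(\overline{X}\right)=\frac{\beta_2}{n}\), again using an Exponential(1) population, whose excess kurtosis is \(\beta_2=6\); the sample size, the replication count, and the seed are arbitrary choices of mine.

import numpy as np

rng = np.random.default_rng(4)

beta2 = 6.0                 # excess kurtosis of the Exponential(1) population
n = 12
n_rep = 500_000

xbar = rng.exponential(1.0, size=(n_rep, n)).mean(axis=1)

# Sample excess kurtosis of the simulated sample means.
d = xbar - xbar.mean()
print("simulated kurtosis of Xbar:", np.mean(d**4) / np.mean(d**2) ** 2 - 3)
print("beta_2 / n                :", beta2 / n)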

Kurtosis of the population

Kurtosis \(\beta_2\) of a population with mean \(\mu\) and variance \(\sigma^2\)

$$ \begin{array}{rcl} \beta_2 &=& \frac{\mathrm{E}\left[\left(X-\mu\right)^4\right]}{ \mathrm{V}\left[ X \right]^{\frac{4}{2}} }-3\;\cdots\;尖度(せんど,\;kurtosis)\\ &=& \frac{\mu_4}{\sigma^4}-3\\ && \;\cdots\; \mu_4 = \mathrm{E}\left[\left(X-\mu\right)^4\right] \,\,\, \left(確率変数Xの母平均\muまわりの4次モーメント=確率変数Xの4次の中心(化)モーメント\right)\\ \end{array} $$

Expressing the fourth moment \(\mu_4\) of the random variable \(X\) about the population mean \(\mu\) in terms of the kurtosis \(\beta_2\)

$$ \begin{array}{rcl} \mu_4 &=& (\beta_2+3)\sigma^4\\ \end{array} $$

Expected value and variance of the chi-squared distribution

Expected value and variance of the chi-squared distribution

Expected value of the chi-squared distribution (first moment)

$$ \begin{eqnarray} \mathrm{E}\left[x\right]&=&\int_0^\infty x \chi^2(x) \mathrm{d}x \\&=&\int_0^\infty x \frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)}e^{-\frac{x}{2}}x^{\frac{n}{2}-1} \mathrm{d}x \\&=&\frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)}\int_0^\infty x e^{-\frac{x}{2}}x^{\frac{n}{2}-1} \mathrm{d}x \\&=&\frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)}\int_0^\infty e^{-\frac{x}{2}}x^{\frac{n}{2}} \mathrm{d}x \\&=&\frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)}\int_0^\infty e^{-\frac{x}{2}}x^{\frac{n}{2}} \color{red}{2^{\frac{n}{2}}\left(\frac{1}{2}\right)^{\frac{n}{2}}} \color{black}{\mathrm{d}x} \\&=&\frac{1}{\color{red}{2^{\frac{n}{2}}}\color{black}{\Gamma\left(\frac{n}{2}\right)}}\color{red}{2^{\frac{n}{2}}}\color{black}{\int_0^\infty e^{-\frac{x}{2}}x^{\frac{n}{2}} \left(\frac{1}{2}\right)^{\frac{n}{2}}\mathrm{d}x} \\&=&\frac{1}{\Gamma\left(\frac{n}{2}\right)}\int_0^\infty e^{-\frac{x}{2}}\left(\frac{x}{2}\right)^{\frac{n}{2}}\mathrm{d}x \\&=&\frac{1}{\Gamma\left(\frac{n}{2}\right)}\int_0^\infty e^{-t}t^{\frac{n}{2}}\;2\mathrm{d}t \;\cdots\;t=\frac{x}{2},\frac{\mathrm{d}t}{\mathrm{d}x}=\frac{1}{2},\mathrm{d}x=2\mathrm{d}t \\&=&\frac{1}{\Gamma\left(\frac{n}{2}\right)}2\int_0^\infty e^{-t}t^{\frac{n}{2}}\;\mathrm{d}t \;\cdots\;\int cf(x) \mathrm{d}x=c\int f(x) \mathrm{d}x \\&=&\frac{1}{\Gamma\left(\frac{n}{2}\right)}2\int_0^\infty e^{-t}t^{\frac{n}{2}\color{red}{+1-1}}\;\mathrm{d}t \\&=&\frac{1}{\Gamma\left(\frac{n}{2}\right)}2\Gamma\left(\frac{n}{2}+1\right) \;\cdots\;\Gamma\left(s\right)=\int_0^\infty e^{-t}t^{s-1}\mathrm{d}t \\&=&\frac{1}{\Gamma\left(\frac{n}{2}\right)}2\frac{n}{2}\Gamma\left(\frac{n}{2}\right) \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/s1ss.html}{\Gamma(s+1)=\int_0^\infty e^{-t}t^{s}\;\mathrm{d}t=s\Gamma(s)} \\&=&n \end{eqnarray} $$

Second moment of the chi-squared distribution

$$ \begin{eqnarray} \mathrm{E}\left[x^2\right]&=&\int_0^\infty x^2 \chi^2(x) \mathrm{d}x \\&=&\int_0^\infty x^2 \frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)}e^{-\frac{x}{2}}x^{\frac{n}{2}-1} \mathrm{d}x \\&=&\frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)}\int_0^\infty x^2 e^{-\frac{x}{2}}x^{\frac{n}{2}-1} \mathrm{d}x \\&=&\frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)}\int_0^\infty e^{-\frac{x}{2}}x^{\frac{n}{2}+1} \mathrm{d}x \\&=&\frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)}\int_0^\infty e^{-\frac{x}{2}}x^{\frac{n}{2}+1} \color{red}{2^{\frac{n}{2}+1} \left(\frac{1}{2}\right)^{\frac{n}{2}+1} } \color{black}{\mathrm{d}x} \\&=&\frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)}2^{\frac{n}{2}+1}\int_0^\infty e^{-\frac{x}{2}}x^{\frac{n}{2}+1} \left(\frac{1}{2}\right)^{\frac{n}{2}+1}\mathrm{d}x \\&=&\frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)}2^{\frac{n}{2}}2\int_0^\infty e^{-\frac{x}{2}}x^{\frac{n}{2}+1} \left(\frac{1}{2}\right)^{\frac{n}{2}+1}\mathrm{d}x \\&=&\frac{1}{\color{red}{2^{\frac{n}{2}}}\color{black}{\Gamma\left(\frac{n}{2}\right)}}\color{red}{2^{\frac{n}{2}}}\color{black}{2\int_0^\infty e^{-\frac{x}{2}}x^{\frac{n}{2}+1} \left(\frac{1}{2}\right)^{\frac{n}{2}+1}\mathrm{d}x} \\&=&\frac{2}{\Gamma\left(\frac{n}{2}\right)}\int_0^\infty e^{-\frac{x}{2}}\left(\frac{x}{2}\right)^{\frac{n}{2}+1}\mathrm{d}x \\&=&\frac{2}{\Gamma\left(\frac{n}{2}\right)}\int_0^\infty e^{-t}t^{\frac{n}{2}+1}\;2\mathrm{d}t \;\cdots\;t=\frac{x}{2},\frac{\mathrm{d}t}{\mathrm{d}x}=\frac{1}{2},\mathrm{d}x=2\mathrm{d}t \\&=&\frac{2}{\Gamma\left(\frac{n}{2}\right)}2\int_0^\infty e^{-t}t^{\frac{n}{2}+1}\;\mathrm{d}t \;\cdots\;\int cf(x) \mathrm{d}x=c\int f(x) \mathrm{d}x \\&=&\frac{2}{\Gamma\left(\frac{n}{2}\right)}2\int_0^\infty e^{-t}t^{\frac{n}{2}+1\color{red}{+1-1}}\;\mathrm{d}t \\&=&\frac{2}{\Gamma\left(\frac{n}{2}\right)}2\int_0^\infty e^{-t}t^{\frac{n}{2}+2-1}\;\mathrm{d}t \\&=&\frac{2}{\Gamma\left(\frac{n}{2}\right)}2\Gamma\left(\frac{n}{2}+2\right) \;\cdots\;\Gamma\left(s\right)=\int_0^\infty e^{-t}t^{s-1}\mathrm{d}t \\&=&\frac{2}{\Gamma\left(\frac{n}{2}\right)}2\left(\frac{n}{2}+1\right)\frac{n}{2}\Gamma\left(\frac{n}{2}\right) \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/s1ss.html}{\Gamma\left(s+2\right)=(s+1)\Gamma(s+1)=(s+1)s\Gamma(s)} \\&=&2n\left(\frac{n}{2}+1\right) \\&=&n(n+2) \end{eqnarray} $$

Variance of the chi-squared distribution (second central moment)

$$ \begin{eqnarray} \mathrm{V}\left[x\right]&=&\mathrm{E}\left[(x-\mathrm{E}\left[x\right])^2\right] \\&=&\mathrm{E}\left[x^2\right]-\mathrm{E}\left[x\right]^2 \\&=&n(n+2)-n^2 \\&=&n^2+2n-n^2 \\&=&2n \end{eqnarray} $$
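A simulation check of \(\mathrm{E}\left[x\right]=n\), \(\mathrm{E}\left[x^2\right]=n(n+2)\) and \(\mathrm{V}\left[x\right]=2n\), generating the \(\chi^2\) variable directly as a sum of \(n\) squared standard normal draws (the same definition used in the derivation of the \(\chi^2\) density later in these notes). The degrees of freedom, the replication count, and the seed are my choices.

import numpy as np

rng = np.random.default_rng(5)

n = 7                       # degrees of freedom
n_rep = 500_000

# Chi-squared with n degrees of freedom: sum of n squared N(0,1) draws.
x = np.sum(rng.standard_normal(size=(n_rep, n)) ** 2, axis=1)

print("E[x]   simulated:", x.mean(),      " theory:", n)
print("E[x^2] simulated:", np.mean(x**2), " theory:", n * (n + 2))
print("V[x]   simulated:", x.var(ddof=1), " theory:", 2 * n)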

Γ(s+1)=sΓ(s)

\(\Gamma(s+1)=s\Gamma(s)\)

$$ \begin{eqnarray} \Gamma(s+1)&=&\int_{0}^{\infty}e^{-t}t^{s\color{red}{+1-1}}\mathrm{d}t \;\cdots\;\Gamma(s)=\int_{0}^{\infty}e^{-t}t^{s-1}\mathrm{d}t \\&=&\int_{0}^{\infty}e^{-t}t^{s}\mathrm{d}t \\&=&\int_{0}^{\infty}\left\{-e^{-t}\right\}^\prime t^{s}\mathrm{d}t \;\cdots\;\frac{\mathrm{d}}{\mathrm{d}t}\left(-e^{-t}\right)=\left\{-e^{-t}\right\}^\prime=e^{-t} \\&=&\left[-e^{-t} t^{s}\right]_0^\infty-\int_{0}^{\infty}\left\{-e^{-t}\right\}\left\{ st^{s-1} \right\}\mathrm{d}t \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/02/blog-post_7.html}{\int f^\prime g\;\mathrm{d}x= fg-\int f g^\prime\;\mathrm{d}x} \\&=&\left[\left(\lim_{t\rightarrow \infty} -\frac{t^{s}}{e^{t}}\right)-\left(-\frac{0^{s}}{e^{0}}\right)\right]+s\int_{0}^{\infty}e^{-t}t^{s-1}\mathrm{d}t \;\cdots\;\int cf(x)\mathrm{d}x=c\int f(x)\mathrm{d}x \\&=&\left[0-0\right]+s\int_{0}^{\infty}e^{-t}t^{s-1}\mathrm{d}t \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/xnex.html}{\lim_{t\rightarrow \infty} \frac{t^{s}}{e^{t}}=0} \\&=&s\Gamma(s) \;\cdots\;\Gamma(s)=\int_{0}^{\infty}e^{-t}t^{s-1}\mathrm{d}t \end{eqnarray} $$
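A small numerical check of \(\Gamma(s+1)=s\Gamma(s)\), comparing math.gamma with a crude midpoint-rule evaluation of the defining integral; the truncation point of the integral, the step count, and the test values of \(s\) are arbitrary choices of mine.

import math
import numpy as np

def gamma_integral(s, upper=60.0, steps=200_000):
    # Midpoint-rule approximation of the defining integral of Gamma(s),
    # truncated at `upper` (e^{-t} is negligible beyond that).
    dt = upper / steps
    t = (np.arange(steps) + 0.5) * dt
    return np.sum(np.exp(-t) * t ** (s - 1)) * dt

for s in (0.5, 1.7, 3.0, 4.2):
    print(s, math.gamma(s + 1), s * math.gamma(s), gamma_integral(s + 1))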

\(\Gamma(s+2)=(s+1)s\Gamma(s)\)

$$ \begin{eqnarray} \Gamma(s+2)&=&(s+1)\Gamma(s+1) \\&=&(s+1)s\Gamma(s) \end{eqnarray} $$

The limit of (x^n)/(e^x)

The limit of (x^n)/(e^x)

We prove the following. $$ \begin{eqnarray} \lim_{x\rightarrow\infty}\frac{x^n}{e^x}&=&0 \end{eqnarray} $$ First, the Maclaurin series of \(e^x\) gives a quantity smaller than \(e^x\): for \(x\gt0\) every term of the series is positive, so \(e^x\) is larger than any single term, for example the \(k=n+1\) term. $$ \begin{eqnarray} \\e^x &=&\href{https://shikitenkai.blogspot.com/2019/07/blog-post.html}{\sum_{k=0}^{\infty}\frac{x^k}{k!}} \\&=&\sum_{k=0}^{\infty}f_k(x) \;\cdots\;f_k(x)=\frac{x^k}{k!} \\&\gt&\frac{x^{n+1}}{(n+1)!} \end{eqnarray} $$ Taking reciprocals reverses the inequality. $$ \begin{eqnarray} \\\frac{1}{e^x}&\lt&\frac{(n+1)!}{x^{n+1}} \end{eqnarray} $$ Multiplying both sides by \(x^n\) puts this into the form we want to bound. $$ \begin{eqnarray} \\x^n\frac{1}{e^x}&\lt&x^n\frac{(n+1)!}{x^{n+1}}=\frac{(n+1)!}{x} \;\cdots\;\frac{x^n}{x^{n+1}}=\frac{1}{x} \end{eqnarray} $$ Moreover, for \(x\gt0\) we have \(x^n\geq0\) and \(e^x\gt0\), so \(\frac{x^n}{e^x}\geq0\).
Combining these gives the following inequalities for all \(x\gt0\). $$ \begin{eqnarray} 0&\leq&\frac{x^n}{e^x}\lt\frac{(n+1)!}{x} \end{eqnarray} $$ Since \(\lim_{x\rightarrow\infty}\frac{(n+1)!}{x}=0\), the squeeze theorem shows that the limit is 0. $$ \begin{eqnarray} \\\lim_{x\rightarrow\infty}\frac{x^n}{e^x}&=&0 \end{eqnarray} $$
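A numerical illustration of the bound used above: for fixed \(n\), \(\frac{x^n}{e^x}\) stays below \(\frac{(n+1)!}{x}\) and both tend to 0 as \(x\) grows. The value \(n=5\) and the sample points are arbitrary.

import math

n = 5
for x in (10.0, 50.0, 100.0, 500.0):
    ratio = x**n / math.exp(x)
    bound = math.factorial(n + 1) / x
    print(f"x={x:6.0f}  x^n/e^x={ratio:.3e}  (n+1)!/x={bound:.3e}")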

Integrands of functions defined by definite integrals

Integrands of functions defined by definite integrals

Consider functions defined by definite integrals, $$ \begin{eqnarray} F(x)&=&\int_a^x f(t)\mathrm{d}t \\G(x)&=&\int_a^x g(t)\mathrm{d}t \\&&(a\leq x \leq b) \end{eqnarray} $$ with \(f\) and \(g\) continuous. Now suppose that for every \(x\) in this range (note that this is stronger than the two definite integrals agreeing for one particular integration range) $$ \begin{eqnarray} F(x)&=&G(x) \end{eqnarray} $$ holds. Differentiating both sides, and using the fundamental theorem of calculus \(F^\prime(x)=f(x)\), \(G^\prime(x)=g(x)\), gives $$ \begin{eqnarray} F^\prime(x)&=&G^\prime(x) \end{eqnarray} $$ that is, $$ \begin{eqnarray} f(x)&=&g(x) \end{eqnarray} $$
Therefore, if $$ \begin{eqnarray} \int_a^x f(t)\mathrm{d}t &=&\int_a^x g(t)\mathrm{d}t \end{eqnarray} $$ holds for every \(x\), then $$ \begin{eqnarray} \\f(x)&=&g(x) \end{eqnarray} $$ and the integrands themselves are equal.

Derivation of the chi-squared distribution

Derivation of the chi-squared \(\chi^2\) distribution

Probability density function of \(X\) for \(k=1\)

$$ \begin{eqnarray} X&=&Z_1^2\;\cdots\;Z_1:標準正規分布に従う確率変数 \\F_1(X=t)&=&\int_0^t f_1(x) \mathrm{d}x \;\cdots\;k=1のXの累積分布凾数F_1,確率密度凾数f_1 \\F_1(X=t)&=&\int_{-\sqrt{t}}^{+\sqrt{t}}g(z_1)\mathrm{d}z_1 \;\cdots\;Z_1=\pm\sqrt{X},x:0\rightarrow t,z_1:0\rightarrow\pm\sqrt{t}\;標準正規分布の確率密度凾数g \\&=&2\int_{0}^{\sqrt{t}}g(z_1)\mathrm{d}z_1 \;\cdots\;標準正規分布は偶凾数(y軸(x=0)で左右対称),\;Z_1\geq0のみ考えればよくする \\&=&2\int_{0}^{t}g(\sqrt{x})\frac{1}{2}x^{-\frac{1}{2}}\mathrm{d}x \\&&\;\cdots\;Z_1=\sqrt{X}(Z_1\geq0)より\frac{\mathrm{d}z_1}{\mathrm{d}x}=\frac{\mathrm{d}}{\mathrm{d}x}\sqrt{x}=\frac{\mathrm{d}}{\mathrm{d}x}x^{\frac{1}{2}}=\frac{1}{2}x^{-\frac{1}{2}},\mathrm{d}z_1=\frac{1}{2}x^{-\frac{1}{2}}\mathrm{d}x \\&&\;\cdots\;z_1:0\rightarrow\sqrt{t},\;x:0\rightarrow t \\&=&\int_{0}^{t}g(\sqrt{x})x^{-\frac{1}{2}}\mathrm{d}x \\\int_0^t f_1(x) \mathrm{d}x &=&\int_{0}^{t}g(\sqrt{x})x^{-\frac{1}{2}}\mathrm{d}x \;\cdots\;累積分布凾数同士なので等しい(0から任意のt(\gt0)までの定積分が等しい) \\f_1(x)&=&g(\sqrt{x})x^{-\frac{1}{2}} \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/blog-post_25.html}{0から任意のt(\gt0)までの定積分が等しい\rightarrow0からtの範囲で等しい\rightarrow各tでの微分が等しい\rightarrow被積分凾数同士も等しい} \\&=&\frac{1}{\sqrt{2\pi}}e^{-\frac{\sqrt{x}^2}{2}}\;x^{-\frac{1}{2}} \;\cdots\;標準正規分布の確率密度凾数:g(x)=\frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}} \\&=&\frac{1}{\sqrt{2\pi}}e^{-\frac{x}{2}}\;x^{-\frac{1}{2}} \end{eqnarray} $$

Probability density function of \(X\) for \(k=2\)

\(k=2\) $$ \begin{eqnarray} X&=&Z_1^2+Z_2^2\;\cdots\;Z_1,Z_2:互いに独立に標準正規分布に従う確率変数 \\Y&=&Z_1^2 \\X-Y&=&Z_2^2 \\f_2(x) &=&\int_{0}^{\infty}f_1(y)f_1(x-y)\mathrm{d}y \;\cdots\;k=2のXの確率密度凾数f_2,\;Z_1とZ_2は独立なので同時確率はf_1(Z_1)とf_1(Z_2)の積 \\&=&\int_{0}^{x} \left\{ \frac{1}{\sqrt{2\pi}}e^{-\frac{y}{2}}\;y^{-\frac{1}{2}} \right\} \;\cdot\; \left\{ \frac{1}{\sqrt{2\pi}}e^{-\frac{(x-y)}{2}}\;(x-y)^{-\frac{1}{2}} \right\} \mathrm{d}y \\&=&\frac{1}{\sqrt{2\pi}}\frac{1}{\sqrt{2\pi}} \int_{0}^{x} e^{-\frac{y}{2}}e^{-\frac{(x-y)}{2}}\;y^{-\frac{1}{2}}(x-y)^{-\frac{1}{2}} \mathrm{d}y \\&=&\frac{1}{2\pi} \int_{0}^{x} e^{-\frac{y}{2}-\frac{(x-y)}{2}}\;y^{-\frac{1}{2}}(x-y)^{-\frac{1}{2}} \mathrm{d}y \\&=&\frac{1}{2\pi}e^{-\frac{x}{2}} \int_{0}^{x} y^{-\frac{1}{2}}(x-y)^{-\frac{1}{2}} \mathrm{d}y \\&=&\frac{1}{2\pi}e^{-\frac{x}{2}}\pi \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/1xa-x.html}{\int_{0}^{a}\frac{1}{\sqrt{x}\sqrt{a-x}}\mathrm{d}x=\pi} \\&=&\frac{1}{2}e^{-\frac{x}{2}} \end{eqnarray} $$
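A quick check of \(f_2(x)=\frac{1}{2}e^{-\frac{x}{2}}\): sample \(Z_1^2+Z_2^2\) with independent standard normal draws and compare a histogram density estimate with the formula. The bin layout, the sample size, and the seed are my choices.

import numpy as np

rng = np.random.default_rng(6)

# Z1^2 + Z2^2 with independent standard normals.
samples = np.sum(rng.standard_normal(size=(1_000_000, 2)) ** 2, axis=1)

hist, edges = np.histogram(samples, bins=60, range=(0.0, 12.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

for c, h in list(zip(centers, hist))[::12]:          # print every 12th bin
    print(f"x={c:5.2f}  histogram={h:.4f}  f_2(x)={0.5 * np.exp(-c / 2):.4f}")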

Probability density function of \(X\) for \(k=3\)

$$ \begin{eqnarray} X&=&Z_1^2+Z_2^2+Z_3^2\;\cdots\;Z_1,Z_2,Z_3:互いに独立に標準正規分布に従う確率変数 \\Y&=&Z_1^2+Z_2^2 \\X-Y&=&Z_3^2 \\f_3(x) &=&\int_{0}^{\infty}f_2(y)f_1(x-y)\mathrm{d}y \;\cdots\;k=3のXの確率密度凾数f_3,\;YとZ_3は独立なので同時確率はf_2(Y)とf_1(Z_3)の積 \\&=&\int_{0}^{x} \left\{ \frac{1}{2}e^{-\frac{y}{2}} \right\} \;\cdot\; \left\{ \frac{1}{\sqrt{2\pi}}e^{-\frac{(x-y)}{2}}\;(x-y)^{-\frac{1}{2}} \right\} \mathrm{d}y \\&=&\frac{1}{2}\frac{1}{\sqrt{2\pi}} \int_{0}^{x} e^{-\frac{y}{2}}e^{-\frac{(x-y)}{2}}\;(x-y)^{-\frac{1}{2}} \mathrm{d}y \\&=&\frac{1}{2\sqrt{2\pi}} \int_{0}^{x} e^{-\frac{y}{2}-\frac{(x-y)}{2}}\;(x-y)^{-\frac{1}{2}} \mathrm{d}y \\&=&\frac{1}{2\sqrt{2\pi}}e^{-\frac{x}{2}} \int_{0}^{x} (x-y)^{-\frac{1}{2}} \mathrm{d}y \\&=&\frac{1}{2\sqrt{2\pi}}e^{-\frac{x}{2}} 2x^{\frac{1}{2}} \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/1a-x.html}{\int_{0}^{a}\frac{1}{\sqrt{a-x}}\mathrm{d}x=2a^{\frac{1}{2}}} \\&=&\frac{1}{\sqrt{2\pi}}e^{-\frac{x}{2}} x^{\frac{1}{2}} \end{eqnarray} $$

Probability density function of \(X\) for \(k=4\)

$$ \begin{eqnarray} X&=&Z_1^2+Z_2^2+Z_3^2+Z_4^2\;\cdots\;Z_1,Z_2,Z_3,Z_4:互いに独立に標準正規分布に従う確率変数 \\Y&=&Z_1^2+Z_2^2+Z_3^2 \\X-Y&=&Z_4^2 \\f_4(x) &=&\int_{0}^{\infty}f_3(y)f_1(x-y)\mathrm{d}y \;\cdots\;k=4のXの確率密度凾数f_4,\;YとZ_4は独立なので同時確率はf_3(Y)とf_1(Z_4)の積 \\&=&\int_{0}^{x} \left\{ \frac{1}{\sqrt{2\pi}}e^{-\frac{y}{2}}y^{\frac{1}{2}} \right\} \;\cdot\; \left\{ \frac{1}{\sqrt{2\pi}}e^{-\frac{(x-y)}{2}}\;(x-y)^{-\frac{1}{2}} \right\} \mathrm{d}y \\&=&\frac{1}{\sqrt{2\pi}}\frac{1}{\sqrt{2\pi}} \int_{0}^{x} e^{-\frac{y}{2}}e^{-\frac{(x-y)}{2}}\;y^{\frac{1}{2}}(x-y)^{-\frac{1}{2}} \mathrm{d}y \\&=&\frac{1}{2\pi} \int_{0}^{x} e^{-\frac{y}{2}-\frac{(x-y)}{2}}\;y^{\frac{1}{2}}(x-y)^{-\frac{1}{2}} \mathrm{d}y \\&=&\frac{1}{2\pi}e^{-\frac{x}{2}} \int_{0}^{x} y^{\frac{1}{2}}(x-y)^{-\frac{1}{2}} \mathrm{d}y \\&=&\frac{1}{2\pi}e^{-\frac{x}{2}} \frac{x\pi}{2} \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/xa-x.html}{\int_{0}^{a}\frac{\sqrt{x}}{\sqrt{a-x}}\mathrm{d}x=\frac{a\pi}{2}} \\&=&\frac{1}{4}e^{-\frac{x}{2}} x \end{eqnarray} $$

Probability density function of \(X\) for \(k=5\)

$$ \begin{eqnarray} X&=&Z_1^2+Z_2^2+Z_3^2+Z_4^2+Z_5^2\;\cdots\;Z_1,Z_2,Z_3,Z_4,Z_5:互いに独立に標準正規分布に従う確率変数 \\Y&=&Z_1^2+Z_2^2+Z_3^2+Z_4^2 \\X-Y&=&Z_5^2 \\f_5(x) &=&\int_{0}^{\infty}f_4(y)f_1(x-y)\mathrm{d}y \;\cdots\;k=5のXの確率密度凾数f_5,\;YとZ_4は独立なので同時確率はf_4(Y)とf_1(Z_5)の積 \\&=&\int_{0}^{x} \left\{ \frac{1}{4}e^{-\frac{y}{2}} y \right\} \;\cdot\; \left\{ \frac{1}{\sqrt{2\pi}}e^{-\frac{(x-y)}{2}}\;(x-y)^{-\frac{1}{2}} \right\} \mathrm{d}y \\&=&\frac{1}{4}\frac{1}{\sqrt{2\pi}} \int_{0}^{x} e^{-\frac{y}{2}}e^{-\frac{(x-y)}{2}}\;y(x-y)^{-\frac{1}{2}} \mathrm{d}y \\&=&\frac{1}{4\sqrt{2\pi}} \int_{0}^{x} e^{-\frac{y}{2}-\frac{(x-y)}{2}}\;y(x-y)^{-\frac{1}{2}} \mathrm{d}y \\&=&\frac{1}{4\sqrt{2\pi}}e^{-\frac{x}{2}} \int_{0}^{x} y(x-y)^{-\frac{1}{2}} \mathrm{d}y \\&=&\frac{1}{4\sqrt{2\pi}}e^{-\frac{x}{2}}\frac{4}{3}x^{\frac{3}{2}} \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/xa-x_22.html}{\int_{0}^{a}\frac{x}{\sqrt{a-x}}\mathrm{d}x=\frac{4}{3}a^{\frac{3}{2}}} \\&=&\frac{1}{3\sqrt{2\pi}}e^{-\frac{x}{2}} x^{\frac{3}{2}} \end{eqnarray} $$

Inferring the probability density function of \(X\) for \(k=n\)

$$ \begin{array}{crcccc} \\k=1&:&f_1(x)=&\frac{1}{\sqrt{2\pi}}&e^{-\frac{x}{2}}&x^{-\frac{1}{2}} \\k=2&:&f_2(x)=&\frac{1}{2}&e^{-\frac{x}{2}}&x^0 \\k=3&:&f_3(x)=&\frac{1}{\sqrt{2\pi}}&e^{-\frac{x}{2}}&x^{\frac{1}{2}} \\k=4&:&f_4(x)=&\frac{1}{4}&e^{-\frac{x}{2}}&x^1 \\k=5&:&f_5(x)=&\frac{1}{3\sqrt{2\pi}}&e^{-\frac{x}{2}}&x^{\frac{3}{2}} \\k=n&:&f_n(x)=&\alpha_n&e^{-\frac{x}{2}}&x^{\frac{n}{2}-1} \end{array} $$

Determining the coefficient \(\alpha_n\)

$$ \begin{eqnarray} F_n(t) &=&\int^{t}_0 f_n(x) \mathrm{d}x \;\cdots\;任意のnにおけるXの累積分布凾数F_n,確率密度凾数f_n \\F_n(t)&=&\int^{t}_0 \alpha_ne^{-\frac{x}{2}}x^{\frac{n}{2}-1} \mathrm{d}x \;\cdots\;前述の類推より \\&=&\alpha_n\int^{t}_0e^{-\frac{x}{2}}x^{\frac{n}{2}-1} \mathrm{d}x\;\cdots\;\alpha_nはxによらない定数,\;\int cf(x)\mathrm{d}x=c\int f(x)\mathrm{d}x \\&=&\alpha_n\int^{\frac{t}{2}}_0e^{-u}(2u)^{\frac{n}{2}-1} 2\mathrm{d}u \;\cdots\;u=\frac{x}{2},x=2u,\frac{\mathrm{d}x}{\mathrm{d}u}=2,x:0\rightarrow t,u:0\rightarrow\frac{t}{2} \\&=&\alpha_n\int^{\frac{t}{2}}_0e^{-u}2^{\frac{n}{2}-1}u^{\frac{n}{2}-1} 2\mathrm{d}u \\&=&\alpha_n\int^{\frac{t}{2}}_0e^{-u}2^{\frac{n}{2}}u^{\frac{n}{2}-1}\mathrm{d}u \\&=&\alpha_n2^{\frac{n}{2}}\int^{\frac{t}{2}}_0e^{-u}u^{\frac{n}{2}-1}\mathrm{d}u \\F_n(t=\infty)&=&\int^{\infty}_0 f_n(x) \mathrm{d}x \\&=&\alpha_n2^{\frac{n}{2}}\int^{\infty}_0e^{-u}u^{\frac{n}{2}-1}\mathrm{d}u \\&=&\alpha_n2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)\;\cdots\;\Gamma\left(s\right)=\int^{\infty}_0e^{-t}t^{s-1}\mathrm{d}t \\&=&1\;\cdots\;全事象は1 \\\alpha_n &=&\frac{F_n(t=\infty)}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)} \\&=&\frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)} \end{eqnarray} $$

Checking the coefficients \(\alpha_k\)

$$ \begin{array}{crlllll} \\\alpha_1&=&\frac{1}{2^{\frac{1}{2}}\Gamma\left(\frac{1}{2}\right)}&=&\frac{1}{2^{\frac{1}{2}}\;\sqrt{\pi}} &=&\frac{1}{\sqrt{2\pi}} \\\alpha_2&=&\frac{1}{2^{\frac{2}{2}}\Gamma\left(\frac{2}{2}\right)}&=&\frac{1}{2^1 \; 1} &=&\frac{1}{2} \\\alpha_3&=&\frac{1}{2^{\frac{3}{2}}\Gamma\left(\frac{3}{2}\right)}&=&\frac{1}{2^{\frac{3}{2}}\;\frac{\sqrt{\pi}}{2}} &=&\frac{1}{\sqrt{2\pi}} \\\alpha_4&=&\frac{1}{2^{\frac{4}{2}}\Gamma\left(\frac{4}{2}\right)}&=&\frac{1}{2^2 \; 1} &=&\frac{1}{4} \\\alpha_5&=&\frac{1}{2^{\frac{5}{2}}\Gamma\left(\frac{5}{2}\right)}&=&\frac{1}{2^{\frac{5}{2}}\;\frac{3\sqrt{\pi}}{4}}&=&\frac{1}{3\sqrt{2\pi}} \end{array} $$
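The same table can be reproduced numerically from \(\alpha_k=\frac{1}{2^{\frac{k}{2}}\Gamma\left(\frac{k}{2}\right)}\) with math.gamma; a minimal sketch (the closed forms on the right are the ones listed above):

import math

# Closed forms listed above for k = 1..5.
listed = {
    1: 1 / math.sqrt(2 * math.pi),
    2: 1 / 2,
    3: 1 / math.sqrt(2 * math.pi),
    4: 1 / 4,
    5: 1 / (3 * math.sqrt(2 * math.pi)),
}
for k, value in listed.items():
    alpha_k = 1 / (2 ** (k / 2) * math.gamma(k / 2))
    print(k, alpha_k, value)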

Obtaining the probability density function of \(X\) for \(k=n\) from the equality of the cumulative distribution functions

$$ \begin{eqnarray} \\\int^{t}_0 f_n(x) \mathrm{d}x &=&\int^{t}_0 \frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)}e^{-\frac{x}{2}}x^{\frac{n}{2}-1} \mathrm{d}x \;\cdots\;累積分布凾数同士なので等しい(0から任意のt(\gt0)までの定積分が等しい) \\f_n(x) &=& \frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)}e^{-\frac{x}{2}}x^{\frac{n}{2}-1} \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/blog-post_25.html}{0から任意のt(\gt0)までの定積分が等しい\rightarrow0からtの範囲で等しい\rightarrow各tでの微分が等しい\rightarrow被積分凾数同士も等しい} \end{eqnarray} $$

Proof by mathematical induction

\(f_1(x),f_n(x)\)を認めた上で,\(f_{n+1}(x)\)が\(f_n(x)\)の\(n\)を\(n+1\)とした式になるかを確認する. $$ \begin{eqnarray} f_1(x) &=& \frac{1}{\sqrt{2\pi}}e^{-\frac{x}{2}}\;x^{-\frac{1}{2}} \\f_{n}(x) &=& \frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)}e^{-\frac{x}{2}}x^{\frac{n}{2}-1} \\f_{n+1}(x) &=& \frac{1}{2^{\frac{\color{red}{n+1}}{2}}\Gamma\left(\frac{\color{red}{n+1}}{2}\right)}e^{-\frac{x}{2}}x^{\frac{\color{red}{n+1}}{2}-1}\;\cdots\;f_n(x)のnをn+1とした式,上2式を認めた上でこの式が導出できるか. \end{eqnarray} $$ $$ \begin{eqnarray} X&=&Z_1^2+\cdots+Z_n^2+Z_{n+1}^2\;\cdots\;Z_1,\cdots,Z_n,Z_{n+1}:互いに独立に標準正規分布に従う確率変数 \\Y&=&Z_1^2+\cdots+Z_{n} \\X-Y&=&Z_{n+1}^2 \\f_{n+1}(x) &=& \int_0^\infty f_n(y)f_1(x-y) \mathrm{d}y \\&=& \int_0^\infty \left\{ \frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)}e^{-\frac{y}{2}}y^{\frac{n}{2}-1} \right\}\left\{ \frac{1}{\sqrt{2\pi}}e^{-\frac{x-y}{2}}\;(x-y)^{-\frac{1}{2}} \right\} \mathrm{d}y \\&=&\frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)}\frac{1}{\sqrt{2\pi}} \int_0^\infty e^{-\frac{y}{2}}e^{-\frac{x-y}{2}} y^{\frac{n}{2}-1}(x-y)^{-\frac{1}{2}} \mathrm{d}y \;\cdots\;\int cf(x)\mathrm{d}x=c\int f(x)\mathrm{d}x \\&=&\frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)}\frac{1}{\sqrt{2}\sqrt{\pi}} \int_0^\infty e^{-\frac{y}{2}-\frac{x-y}{2}} y^{\frac{n}{2}-1}(x-y)^{-\frac{1}{2}} \mathrm{d}y \;\cdots\;A^BA^C=A^{B+C} \\&=&\frac{1}{2^{\frac{n}{2}}\Gamma\left(\frac{n}{2}\right)}\frac{1}{2^{\frac{1}{2}}\sqrt{\pi}} \int_0^\infty e^{-\frac{x}{2}} y^{\frac{n}{2}-1}(x-y)^{-\frac{1}{2}} \mathrm{d}y \\&=&\frac{1}{2^{\frac{n}{2}}2^{\frac{1}{2}}\Gamma\left(\frac{n}{2}\right)}\frac{1}{\sqrt{\pi}} e^{-\frac{x}{2}} \int_0^\infty y^{\frac{n}{2}-1}(x-y)^{-\frac{1}{2}} \mathrm{d}y \;\cdots\;\int cf(x)\mathrm{d}x=c\int f(x)\mathrm{d}x \\&=&\frac{1}{2^{\frac{n+1}{2}}\Gamma\left(\frac{n}{2}\right)}\frac{1}{\sqrt{\pi}} e^{-\frac{x}{2}} \int_0^1 (xu)^{\frac{n}{2}-1}(x-xu)^{-\frac{1}{2}} x \mathrm{d}u \;\cdots\; y=xu, \frac{\mathrm{d}y}{\mathrm{d}u}=x, x:0\rightarrow\infty, u:0\rightarrow1 \\&=&\frac{1}{2^{\frac{n+1}{2}}\Gamma\left(\frac{n}{2}\right)}\frac{1}{\sqrt{\pi}} e^{-\frac{x}{2}} \int_0^1 (xu)^{\frac{n}{2}-1}\left\{x(1-u)\right\}^{-\frac{1}{2}} x \mathrm{d}u \\&=&\frac{1}{2^{\frac{n+1}{2}}\Gamma\left(\frac{n}{2}\right)}\frac{1}{\sqrt{\pi}} e^{-\frac{x}{2}} \int_0^1 x^{\frac{n}{2}-1}u^{\frac{n}{2}-1}x^{-\frac{1}{2}}(1-u)^{-\frac{1}{2}} x \mathrm{d}u \;\cdots\;(AB)^C=A^CB^C \\&=&\frac{1}{2^{\frac{n+1}{2}}\Gamma\left(\frac{n}{2}\right)}\frac{1}{\sqrt{\pi}} e^{-\frac{x}{2}} x^{\frac{n}{2}-1} x^{-\frac{1}{2}} x \int_0^1 u^{\frac{n}{2}-1}(1-u)^{-\frac{1}{2}} \mathrm{d}u \;\cdots\;\int cf(x)\mathrm{d}x=c\int f(x)\mathrm{d}x \\&=&\frac{1}{2^{\frac{n+1}{2}}\Gamma\left(\frac{n}{2}\right)}\frac{1}{\sqrt{\pi}} e^{-\frac{x}{2}} x^{ \frac{n}{2}-1-\frac{1}{2}+1} \int_0^1 u^{\frac{n}{2}-1}(1-u)^{-\frac{1}{2}} \mathrm{d}u \;\cdots\;A^BA^C=A^{B+C} \\&=&\frac{1}{2^{\frac{n+1}{2}}\Gamma\left(\frac{n}{2}\right)}\frac{1}{\sqrt{\pi}} e^{-\frac{x}{2}} x^{ \frac{n+1}{2}-1} \int_0^1 u^{\frac{n}{2}-1}(1-u)^{-\frac{1}{2}} \mathrm{d}u \\&=&\frac{1}{2^{\frac{n+1}{2}}\Gamma\left(\frac{n}{2}\right)}\frac{1}{\sqrt{\pi}} e^{-\frac{x}{2}} x^{ \frac{n+1}{2}-1} B\left(\frac{n}{2}-1, -\frac{1}{2}\right) \;\cdots\;B(p,q)=\int_0^1 x^{p}(1-x)^{q} \mathrm{d}x \\&=&\frac{1}{2^{\frac{n+1}{2}}\Gamma\left(\frac{n}{2}\right)}\frac{1}{\sqrt{\pi}} e^{-\frac{x}{2}} x^{ \frac{n+1}{2}-1} \frac{\left(\frac{n}{2}-1\right)!\left(-\frac{1}{2}\right)!}{\left(\frac{n}{2}-1-\frac{1}{2}+1\right)!} \;\cdots\;B(p,q)=\frac{p!q!}{\left(p+q+1\right)!} 
\\&=&\frac{1}{2^{\frac{n+1}{2}}\Gamma\left(\frac{n}{2}\right)}\frac{1}{\sqrt{\pi}} e^{-\frac{x}{2}} x^{ \frac{n+1}{2}-1} \frac{\left(\frac{n}{2}-1\right)!\left(-\frac{1}{2}\color{red}{+1-1}\right)!} {\left(\frac{n}{2}+\frac{1}{2}-1\right)!} \\&=&\frac{1}{2^{\frac{n+1}{2}}\Gamma\left(\frac{n}{2}\right)}\frac{1}{\sqrt{\pi}} e^{-\frac{x}{2}} x^{ \frac{n+1}{2}-1} \frac{\left(\frac{n}{2}-1\right)!\left(\frac{1}{2}-1\right)!} {\left(\frac{n+1}{2}-1\right)!} \\&=&\frac{1}{2^{\frac{n+1}{2}}\Gamma\left(\frac{n}{2}\right)}\frac{1}{\sqrt{\pi}} e^{-\frac{x}{2}} x^{ \frac{n+1}{2}-1} \frac{\Gamma\left(\frac{n}{2}\right)\Gamma\left(\frac{1}{2}\right)} {\Gamma\left(\frac{n+1}{2}\right)} \;\cdots\;\Gamma(n)=\left(n-1\right)! \\&=&\frac{1}{2^{\frac{n+1}{2}}\color{red}{\Gamma\left(\frac{n}{2}\right)}} \frac{1}{\sqrt{\pi}} e^{-\frac{x}{2}} x^{ \frac{n+1}{2}-1} \frac{\color{red}{\Gamma\left(\frac{n}{2}\right)}\color{black}{\Gamma\left(\frac{1}{2}\right)}} {\Gamma\left(\frac{n+1}{2}\right)} \\&=&\frac{1}{2^{\frac{n+1}{2}}}\frac{1}{\sqrt{\pi}} e^{-\frac{x}{2}} x^{ \frac{n+1}{2}-1} \frac{\Gamma\left(\frac{1}{2}\right)} {\Gamma\left(\frac{n+1}{2}\right)} \\&=&\frac{1}{2^{\frac{n+1}{2}}} \color{red}{ \frac{1}{\sqrt{\pi}} } \color{black}{e^{-\frac{x}{2}}} x^{ \frac{n+1}{2}-1} \frac{\color{red}{\sqrt{\pi}}} {\Gamma\left(\frac{n+1}{2}\right)} \;\cdots\;\Gamma\left(\frac{1}{2}\right)=\sqrt{\pi} \\&=&\frac{1}{2^{\frac{n+1}{2}}\Gamma\left(\frac{n+1}{2}\right)} e^{-\frac{x}{2}} x^{ \frac{n+1}{2}-1} \end{eqnarray} $$ 任意の\(n(n\geq1)\)において上記式が成り立つことを証明できた.
This completes the derivation of the probability density function of the \(\chi^2\) distribution.

The integral of x/√(a-x)

The integral of x/√(a-x)

Indefinite integral

$$ \begin{eqnarray} \int \frac{x}{\sqrt{a-x}}\mathrm{d}x &=&\int x(a-x)^{-\frac{1}{2}}\mathrm{d}x \\&=&\int x\left\{-2(a-x)^{\frac{1}{2}}\right\}^\prime \mathrm{d}x \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/1a-x.html}{\int (a-x)^{-\frac{1}{2}} \mathrm{d}x=-2(a-x)^{\frac{1}{2}}+C} \\&=&-2x(a-x)^{\frac{1}{2}}-\int -2(a-x)^{\frac{1}{2}} \mathrm{d}x \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/02/blog-post_7.html}{\int f^\prime(x)g(x) \mathrm{d}x= fg-\int fg^\prime \mathrm{d}x} \\&=&-2x(a-x)^{\frac{1}{2}}+2\int (a-x)^{\frac{1}{2}} \mathrm{d}x \\&=&-2x(a-x)^{\frac{1}{2}}+2\left\{\frac{1}{\frac{1}{2}+1}(a-x)^{\frac{1}{2}+1}(-1)\right\} \\&=&-2x(a-x)^{\frac{1}{2}}+2\left\{\frac{-1}{\frac{3}{2}}(a-x)^{\frac{3}{2}}\right\} \\&=&-2x(a-x)^{\frac{1}{2}}+2\left\{\frac{-2}{3}(a-x)^{\frac{3}{2}}\right\} \\&=&-2x(a-x)^{\frac{1}{2}}-\frac{4}{3}(a-x)^{\frac{3}{2}} \\&=&-2x(a-x)^{\frac{1}{2}}-\frac{4}{3}(a-x)(a-x)^{\frac{1}{2}} \\&=&(a-x)^{\frac{1}{2}}\left\{-2x-\frac{4}{3}(a-x)\right\} \\&=&(a-x)^{\frac{1}{2}}\left(-2x-\frac{4}{3}a+\frac{4}{3}x\right) \\&=&(a-x)^{\frac{1}{2}}\left(-\frac{6}{3}x-\frac{4}{3}a+\frac{4}{3}x\right) \\&=&(a-x)^{\frac{1}{2}}\left(-\frac{2}{3}x-\frac{4}{3}a\right) \\&=&-\frac{2}{3}(a-x)^{\frac{1}{2}}\left(x+2a\right)+C\;\cdots\;C:積分定数 \\&=&-\frac{2}{3}\sqrt{a-x}\left(x+2a\right)+C \end{eqnarray} $$

Definite integral

$$ \begin{eqnarray} \int_{0}^{a} \frac{x}{\sqrt{a-x}}\mathrm{d}x &=&\left[-\frac{2}{3}\sqrt{a-x}\left(x+2a\right)\right]_{0}^{a} \\&=&(-\frac{2}{3}\sqrt{a-a}\left(a+2a\right))-(-\frac{2}{3}\sqrt{a-0}\left(0+2a\right)) \\&=&0-(-\frac{2}{3}\sqrt{a}\left(2a\right)) \\&=&0-(-\frac{4}{3}\sqrt{a}^3) \\&=&\frac{4}{3}a^{\frac{3}{2}} \end{eqnarray} $$
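
As a quick numerical sanity check of this closed form (my addition; NumPy and SciPy assumed):

```python
# Check ∫_0^a x/√(a-x) dx = (4/3)·a^(3/2) for an arbitrary a > 0.
import numpy as np
from scipy.integrate import quad

a = 2.7  # arbitrary positive constant
numeric, _ = quad(lambda x: x / np.sqrt(a - x), 0.0, a)
closed_form = 4.0 / 3.0 * a ** 1.5
print(numeric, closed_form)  # both ≈ 5.915
```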

Integral of √(x)/√(a-x)

Integral of √(x)/√(a-x)

Indefinite integral

$$ \begin{eqnarray} \int \frac{\sqrt{x}}{\sqrt{a-x}}\mathrm{d}x &=&\int x^{\frac{1}{2}}(a-x)^{-\frac{1}{2}}\mathrm{d}x \\&=&\int u(a-u^2)^{-\frac{1}{2}}2u\mathrm{d}u \;\cdots\;u=\sqrt{x},\frac{\mathrm{d}u}{\mathrm{d}x}=\frac{1}{2\sqrt{x}},\mathrm{d}x=2\sqrt{x}\mathrm{d}u=2u\mathrm{d}u \\&=&2\int u^2(a-u^2)^{-\frac{1}{2}}\mathrm{d}u \;\cdots\;\int cf(x)\mathrm{d}x=c\int f(x)\mathrm{d}x \\&=&2\int \left\{\sqrt{a}\sin{\left(\theta\right)}\right\}^2\left[a-\left\{\sqrt{a}\sin{\left(\theta\right)}\right\}^2\right]^{-\frac{1}{2}}\sqrt{a}\cos{\left(\theta\right)}\mathrm{d}\theta \;\cdots\;u=\sqrt{a}\sin{\left(\theta\right)},\frac{\mathrm{d}u}{\mathrm{d}\theta}=\sqrt{a}\cos{\left(\theta\right)},\mathrm{d}u=\sqrt{a}\cos{\left(\theta\right)}\mathrm{d}\theta \\&=&2\int a\sin^2{\left(\theta\right)}\left\{a-a\sin^2{\left(\theta\right)}\right\}^{-\frac{1}{2}}\sqrt{a}\cos{\left(\theta\right)}\mathrm{d}\theta \\&=&2a\sqrt{a}\int \sin^2{\left(\theta\right)}\left[a\left\{1-\sin^2{\left(\theta\right)}\right\}\right]^{-\frac{1}{2}}\cos{\left(\theta\right)}\mathrm{d}\theta \\&=&2a\sqrt{a}\int \sin^2{\left(\theta\right)}a^{-\frac{1}{2}}\left\{1-\sin^2{\left(\theta\right)}\right\}^{-\frac{1}{2}}\cos{\left(\theta\right)}\mathrm{d}\theta \;\cdots\;(AB)^C=A^CB^C \\&=&2a\sqrt{a}a^{-\frac{1}{2}}\int \sin^2{\left(\theta\right)}\left\{1-\sin^2{\left(\theta\right)}\right\}^{-\frac{1}{2}}\cos{\left(\theta\right)}\mathrm{d}\theta \\&=&2a\int \sin^2{\left(\theta\right)}\left\{1-\sin^2{\left(\theta\right)}\right\}^{-\frac{1}{2}}\cos{\left(\theta\right)}\mathrm{d}\theta \\&=&2a\int \sin^2{\left(\theta\right)}\left\{\cos^2{\left(\theta\right)}\right\}^{-\frac{1}{2}}\cos{\left(\theta\right)}\mathrm{d}\theta \\&=&2a\int \sin^2{\left(\theta\right)}\left\{\cos{\left(\theta\right)}\right\}^{-1}\cos{\left(\theta\right)}\mathrm{d}\theta \\&=&2a\int \sin^2{\left(\theta\right)}\mathrm{d}\theta \\&=&2a\left\{\frac{1}{2}\theta-\frac{1}{2}\sin{\left(\theta\right)}\cos{\left(\theta\right)}\right\} \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/sin.html}{\int \sin^2{\left(\theta\right)}\mathrm{d}\theta=\frac{1}{2}\left\{\theta-\sin{\left(\theta\right)}\cos{\left(\theta\right)}\right\}+C\;(C:\text{constant of integration})} \\&=&2a\left\{\frac{1}{2}\sin^{-1}{\left(\frac{u}{\sqrt{a}}\right)}-\frac{1}{2}\sin{\left(\sin^{-1}{\left(\frac{u}{\sqrt{a}}\right)} \right)}\cos{\left(\sin^{-1}{\left(\frac{u}{\sqrt{a}}\right)} \right)}\right\} \;\cdots\;u=\sqrt{a}\sin{\left(\theta\right)},\;\frac{u}{\sqrt{a}}=\sin{\left(\theta\right)},\;\theta=\sin^{-1}{\left(\frac{u}{\sqrt{a}}\right)} \\&=&2a\left\{\frac{1}{2}\sin^{-1}{\left(\frac{u}{\sqrt{a}}\right)}-\frac{1}{2}\left(\frac{u}{\sqrt{a}}\right)\sqrt{ 1-\left(\frac{u}{\sqrt{a}}\right)^2 }\right\} \;\cdots\;\sin{\left(\sin^{-1}{\left(x\right)}\right)}=x,\;\cos{\left(\sin^{-1}{\left(x\right)}\right)}=\sqrt{1-x^2} \\&=&2a\frac{1}{2}\left\{\sin^{-1}{\left(\frac{u}{\sqrt{a}}\right)}-\left(\frac{u}{\sqrt{a}}\right)\sqrt{ 1-\left(\frac{u}{\sqrt{a}}\right)^2 }\right\} \\&=&a\left\{\sin^{-1}{\left(\frac{u}{\sqrt{a}}\right)}-\left(\frac{u}{\sqrt{a}}\right)\sqrt{ 1-\left(\frac{u}{\sqrt{a}}\right)^2 }\right\} \\&=&a\left\{\sin^{-1}{\left(\frac{\sqrt{x}}{\sqrt{a}}\right)}-\left(\frac{\sqrt{x}}{\sqrt{a}}\right)\sqrt{ 1-\left(\frac{\sqrt{x}}{\sqrt{a}}\right)^2 }\right\} \;\cdots\;u=\sqrt{x} \\&=&a\left\{\sin^{-1}{\left(\sqrt{\frac{x}{a}}\right)}-\frac{\sqrt{x}}{\sqrt{a}}\sqrt{ 1-\frac{x}{a} }\right\} \\&=&a\left\{\sin^{-1}{\left(\sqrt{\frac{x}{a}}\right)}-\frac{\sqrt{x}}{\sqrt{a}}\sqrt{ \frac{a-x}{a} }\right\}
\\&=&a\left\{\sin^{-1}{\left(\sqrt{\frac{x}{a}}\right)}-\frac{\sqrt{x}}{\sqrt{a}}\frac{\sqrt{a-x}}{\sqrt{a}}\right\} \;\cdots\;\sqrt{\frac{A}{B}}=\left(\frac{A}{B}\right)^{\frac{1}{2}}=\left(A\frac{1}{B}\right)^{\frac{1}{2}}=A^{\frac{1}{2}}\left(\frac{1}{B}\right)^{\frac{1}{2}}=A^{\frac{1}{2}}\left(B^{-1}\right)^{\frac{1}{2}}=A^{\frac{1}{2}}B^{-\frac{1}{2}}=\sqrt{A}\frac{1}{\sqrt{B}}=\frac{\sqrt{A}}{\sqrt{B}} \\&=&a\left\{\sin^{-1}{\left(\sqrt{\frac{x}{a}}\right)}-\frac{\sqrt{x}\sqrt{a-x}}{a }\right\} \\&=&a\sin^{-1}{\left(\sqrt{\frac{x}{a}}\right)}-a\frac{\sqrt{x}\sqrt{a-x}}{a} \\&=&a\sin^{-1}{\left(\sqrt{\frac{x}{a}}\right)}-\sqrt{x}\sqrt{a-x}+C\;\cdots\;C:\text{constant of integration} \\&=&a\sin^{-1}{\left(\left(\frac{x}{a}\right)^{\frac{1}{2}}\right)}-x^{\frac{1}{2}}\left(a-x\right)^{\frac{1}{2}}+C \end{eqnarray} $$

Definite integral

$$ \begin{eqnarray} \int_{0}^{a} \frac{\sqrt{x}}{\sqrt{a-x}}\mathrm{d}x &=&\left[a\sin^{-1}{\left(\left(\frac{x}{a}\right)^{\frac{1}{2}}\right)}-x^{\frac{1}{2}}\left(a-x\right)^{\frac{1}{2}}\right]_{0}^{a} \\&=&(a\sin^{-1}{\left(\left(\frac{a}{a}\right)^{\frac{1}{2}}\right)}-a^{\frac{1}{2}}\left(a-a\right)^{\frac{1}{2}})-(a\sin^{-1}{\left(\left(\frac{0}{a}\right)^{\frac{1}{2}}\right)}-0^{\frac{1}{2}}\left(a-0\right)^{\frac{1}{2}}) \\&=&(a\sin^{-1}{\left(1\right)}-0)-(0-0) \;\cdots\;\sin^{-1}{\left(0\right)}=0,\;0^A=0 \\&=&a \frac{\pi}{2} \;\cdots\;\sin^{-1}{\left(1\right)}=\frac{\pi}{2} \\&=&\frac{a\pi}{2} \end{eqnarray} $$
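
Likewise, a short numerical confirmation of the value \(a\pi/2\) (my addition; NumPy and SciPy assumed):

```python
# Check ∫_0^a √x/√(a-x) dx = aπ/2 for an arbitrary a > 0.
import numpy as np
from scipy.integrate import quad

a = 3.2  # arbitrary positive constant
numeric, _ = quad(lambda x: np.sqrt(x) / np.sqrt(a - x), 0.0, a)
print(numeric, a * np.pi / 2.0)  # both ≈ 5.0265
```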

Integral of 1/√(a-x)

Integral of 1/√(a-x)

Indefinite integral

$$ \begin{eqnarray} \int \frac{1}{\sqrt{a-x}}\mathrm{d}x &=&\int (a-x)^{-\frac{1}{2}}\mathrm{d}x \\&=&\int u^{-\frac{1}{2}}(-1)\mathrm{d}u \;\cdots\;u=a-x,\frac{\mathrm{d}u}{\mathrm{d}x}=-1,\mathrm{d}x=-\mathrm{d}u \\&=&-\int u^{-\frac{1}{2}}\mathrm{d}u \;\cdots\;\int cf(x)\mathrm{d}x=c\int f(x)\mathrm{d}x \\&=&-\frac{1}{-\frac{1}{2}+1}u^{-\frac{1}{2}+1} \;\cdots\;\int x^a \mathrm{d}x=\frac{1}{a+1}x^{a+1}+C\;(C:\text{constant of integration}) \\&=&-\frac{1}{\frac{1}{2}}u^{\frac{1}{2}} \\&=&-2u^{\frac{1}{2}} \\&=&-2(a-x)^{\frac{1}{2}}\;\cdots\;u=a-x \\&=&-2(a-x)^{\frac{1}{2}}+C\;\cdots\;C:\text{constant of integration} \\&=&-2\sqrt{a-x}+C \end{eqnarray} $$

Definite integral

$$ \begin{eqnarray} \int_{0}^{a} \frac{1}{\sqrt{a-x}}\mathrm{d}x &=&\left[-2\sqrt{a-x}\right]_{0}^{a} \\&=&(-2\sqrt{a-a})-(-2\sqrt{a-0}) \\&=&0-(-2\sqrt{a}) \\&=&2\sqrt{a} \\&=&2a^{\frac{1}{2}} \end{eqnarray} $$
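
For this simpler integral, the sketch below (my addition; NumPy and SciPy assumed) checks the antiderivative \(-2\sqrt{a-x}\) by numerical differentiation as well as the definite integral \(2\sqrt{a}\).

```python
# Check d/dx[-2√(a-x)] = 1/√(a-x) and ∫_0^a 1/√(a-x) dx = 2√a.
import numpy as np
from scipy.integrate import quad

a, x, h = 4.0, 1.3, 1e-6                      # arbitrary test values
F = lambda t: -2.0 * np.sqrt(a - t)           # antiderivative found above
derivative = (F(x + h) - F(x - h)) / (2 * h)  # central difference
print(derivative, 1.0 / np.sqrt(a - x))       # should agree

numeric, _ = quad(lambda t: 1.0 / np.sqrt(a - t), 0.0, a)
print(numeric, 2.0 * np.sqrt(a))              # both = 4.0 here
```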

Integral of 1/√(x(a-x))

Integral of 1/√(x(a-x))

Indefinite integral

$$ \begin{eqnarray} \int \frac{1}{\sqrt{x(a-x)}}\mathrm{d}x &=&\int \left\{x(a-x)\right\}^{-\frac{1}{2}}\mathrm{d}x \\&=&\int x^{-\frac{1}{2}}\left(a-x\right)^{-\frac{1}{2}}\mathrm{d}x \\&=&\int u^{-1}(a-u^2)^{-\frac{1}{2}}\;2u\mathrm{d}u \;\cdots\;u=\sqrt{x},\frac{\mathrm{d}u}{\mathrm{d}x}=\frac{1}{2\sqrt{x}},\mathrm{d}x=2\sqrt{x}\mathrm{d}u=2u\mathrm{d}u \\&=&2\int (a-u^2)^{-\frac{1}{2}}\mathrm{d}u \;\cdots\;u^{-1}u=1,\;\int cf(x)\mathrm{d}x=c\int f(x)\mathrm{d}x \\&=&2\int \left[a-\left\{\sqrt{a}\sin{\left(\theta\right)}\right\}^2\right]^{-\frac{1}{2}}\;\sqrt{a}\cos{\left(\theta\right)}\mathrm{d}\theta \;\cdots\;u=\sqrt{a}\sin{\left(\theta\right)},\frac{\mathrm{d}u}{\mathrm{d}\theta}=\sqrt{a}\cos{\left(\theta\right)},\mathrm{d}u=\sqrt{a}\cos{\left(\theta\right)}\mathrm{d}\theta \\&=&2\sqrt{a}\int \left[a-\left\{\sqrt{a}^2\sin^2{\left(\theta\right)}\right\}\right]^{-\frac{1}{2}}\;\cos{\left(\theta\right)}\mathrm{d}\theta \;\cdots\;(AB)^C=A^CB^C \\&=&2\sqrt{a}\int \left\{a-a\sin^2{\left(\theta\right)}\right\}^{-\frac{1}{2}}\;\cos{\left(\theta\right)}\mathrm{d}\theta \\&=&2\sqrt{a}\int \left[a\left\{1-\sin^2{\left(\theta\right)}\right\}\right]^{-\frac{1}{2}}\;\cos{\left(\theta\right)}\mathrm{d}\theta \\&=&2\sqrt{a}\int a^{-\frac{1}{2}}\left\{1-\sin^2{\left(\theta\right)}\right\}^{-\frac{1}{2}}\;\cos{\left(\theta\right)}\mathrm{d}\theta \\&=&2\sqrt{a}a^{-\frac{1}{2}}\int \left\{1-\sin^2{\left(\theta\right)}\right\}^{-\frac{1}{2}}\;\cos{\left(\theta\right)}\mathrm{d}\theta \\&=&2\int \left\{1-\sin^2{\left(\theta\right)}\right\}^{-\frac{1}{2}}\;\cos{\left(\theta\right)}\mathrm{d}\theta \;\cdots\;\sqrt{A}A^{-\frac{1}{2}}=A^{\frac{1}{2}}A^{-\frac{1}{2}}=A^{-\frac{1}{2}+\frac{1}{2}}=A^0=1 \\&=&2\int \left\{\cos^2{\left(\theta\right)}\right\}^{-\frac{1}{2}}\;\cos{\left(\theta\right)}\mathrm{d}\theta \;\cdots\;\cos^2{\left(\theta\right)}+\sin^2{\left(\theta\right)}=1,\cos^2{\left(\theta\right)}=1-\sin^2{\left(\theta\right)} \\&=&2\int \left\{\cos{\left(\theta\right)}\right\}^{2(-\frac{1}{2})}\;\cos{\left(\theta\right)}\mathrm{d}\theta \;\cdots\;\left(A^B\right)^C=A^{BC} \\&=&2\int \left\{\cos{\left(\theta\right)}\right\}^{-1}\;\cos{\left(\theta\right)}\mathrm{d}\theta \\&=&2\int \mathrm{d}\theta \;\cdots\;A^{-1}A=1 \\&=&2\theta \;\cdots\;\int \mathrm{d}x=x+C\;(C:\text{constant of integration}) \\&=&2\sin^{-1}{\left(\frac{u}{\sqrt{a}}\right)} \;\cdots\;\theta=\sin^{-1}{\left(\frac{u}{\sqrt{a}}\right)} \\&=&2\sin^{-1}{\left(\frac{\sqrt{x}}{\sqrt{a}}\right)} \;\cdots\;u=\sqrt{x} \\&=&2\sin^{-1}{\left(\sqrt{\frac{x}{a}}\right)}+C\;\cdots\;C:\text{constant of integration} \end{eqnarray} $$

Definite integral

$$ \begin{eqnarray} \int_0^a \frac{1}{\sqrt{x(a-x)}}\mathrm{d}x &=&\left[2\sin^{-1}{\left(\sqrt{\frac{x}{a}}\right)}\right]_0^a \\&=&\left\{2\sin^{-1}{\left(\sqrt{\frac{a}{a}}\right)}\right\}-\left\{2\sin^{-1}{\left(\sqrt{\frac{0}{a}}\right)}\right\} \\&=&\left\{2\sin^{-1}{\left(1\right)}\right\}-\left\{2\sin^{-1}{\left(0\right)}\right\} \\&=&\left(2\frac{\pi}{2}\right)-\left(2\cdot0\right) \\&=&\pi-0 \\&=&\pi \end{eqnarray} $$
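
The value \(\pi\), independent of \(a\), is easy to confirm numerically (my addition; NumPy and SciPy assumed):

```python
# Check ∫_0^a 1/√(x(a-x)) dx = π for several values of a.
import numpy as np
from scipy.integrate import quad

for a in (0.5, 1.0, 10.0):
    numeric, _ = quad(lambda x: 1.0 / np.sqrt(x * (a - x)), 0.0, a)
    print(a, numeric, np.pi)
```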

Indefinite integral of sin squared

Indefinite integral of sin squared

$$ \begin{eqnarray} \int \sin^2{\left(\theta\right)}\mathrm{d}\theta &=&\int \frac{1}{2}-\frac{1}{2}\cos{\left(2\theta\right)}\mathrm{d}\theta \\&&\;\cdots\;\cos{\left(2\theta\right)}=\cos^2{\left(\theta\right)}-\sin^2{\left(\theta\right)}=\left(1-\sin^2{\left(\theta\right)}\right)-\sin^2{\left(\theta\right)}=1-2\sin^2{\left(\theta\right)} \\&&\;\cdots\;\sin^2{\left(\theta\right)}=\frac{1}{2}-\frac{1}{2}\cos{\left(2\theta\right)} \\&=&\frac{1}{2}\int \mathrm{d}\theta-\frac{1}{2}\int\cos{\left(2\theta\right)}\mathrm{d}\theta \\&=&\frac{1}{2}\theta-\frac{1}{2}\int\cos{\left(2\theta\right)}\mathrm{d}\theta \\&=&\frac{1}{2}\theta-\frac{1}{2}\int\cos{\left(\phi\right)}\frac{1}{2}\mathrm{d}\phi \;\cdots\;\phi=2\theta, \frac{\mathrm{d}\phi}{\mathrm{d}\theta}=2,\mathrm{d}\theta=\frac{1}{2}\mathrm{d}\phi \\&=&\frac{1}{2}\theta-\frac{1}{4}\int\cos{\left(\phi\right)}\mathrm{d}\phi\;\cdots\;\int cf(x)\mathrm{d}x=c\int f(x)\mathrm{d}x \\&=&\frac{1}{2}\theta-\frac{1}{4}\sin{\left(\phi\right)} \;\cdots\;\int \cos{\left(x\right)}\mathrm{d}x= \sin{\left(x\right)}+C\;(C:\text{constant of integration}) \\&=&\frac{1}{2}\theta-\frac{1}{4}\sin{\left(2\theta\right)} \;\cdots\;\phi=2\theta \\&=&\frac{1}{2}\theta-\frac{1}{2}\sin{\left(\theta\right)}\cos{\left(\theta\right)} \;\cdots\;\sin{\left(2\theta\right)}=2\sin{\left(\theta\right)}\cos{\left(\theta\right)} \\&=&\frac{1}{2}\left\{\theta-\sin{\left(\theta\right)}\cos{\left(\theta\right)}\right\}+C\;\cdots\;C:\text{constant of integration} \end{eqnarray} $$
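
As a check of this antiderivative (my addition; NumPy and SciPy assumed), compare \(\int_0^t\sin^2{\theta}\,\mathrm{d}\theta\) computed numerically with \(\frac{1}{2}\left\{t-\sin{t}\cos{t}\right\}\):

```python
# Check ∫_0^t sin²θ dθ = (t - sin(t)cos(t))/2 for a few endpoints t.
import numpy as np
from scipy.integrate import quad

for t in (0.3, 1.0, np.pi, 5.0):
    numeric, _ = quad(lambda th: np.sin(th) ** 2, 0.0, t)
    closed_form = 0.5 * (t - np.sin(t) * np.cos(t))
    print(t, numeric, closed_form)
```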

Expected value (mean) and variance of the beta distribution

$$ \begin{eqnarray} f(x;m,n)&=&\href{https://shikitenkai.blogspot.com/2020/05/blog-post_22.html}{\frac{x^{(m-1)}(1-x)^{(n-1)}}{B(m,n)}\;\cdots\;\text{beta distribution}} \\B(m,n)&=&\int_0^1x^{(m-1)}(1-x)^{(n-1)} \mathrm{d}x\;\cdots\;\text{beta function} \\&=&\frac{(m-1)!\;(n-1)!}{\left\{(m-1)+(n-1)+1\right\}!} \;\cdots\;\href{https://shikitenkai.blogspot.com/2020/05/blog-post_22.html}{\int_\alpha^\beta(x-\alpha)^p(\beta-x)^q \mathrm{d}x=\frac{p!\;q!}{(p+q+1)!}(\beta-\alpha)^{(p+q+1)}\;\text{(Euler integral of the first kind)}} \\&=&\frac{(m-1)!\;(n-1)!}{(m+n-1)!} \end{eqnarray} $$

Expected value of the beta distribution (first moment, mean)

$$ \begin{eqnarray} \mathbf{E}[X]&=&\int_0^1xf(x)\mathrm{d}x \\&=&\int_0^1x\; \frac{x^{(m-1)}(1-x)^{(n-1)}}{B(m,n)} \;\mathrm{d}x \\&=&\frac{1}{B(m,n)} \int_0^1x^{m}(1-x)^{(n-1)}\;\mathrm{d}x \\&=&\frac{(m+n-1)!}{(m-1)!\;(n-1)!}\frac{m!\;(n-1)!}{(m+n)!} \;\cdots\;\int_0^1x^{m}(1-x)^{(n-1)}\;\mathrm{d}x=\frac{m!\;(n-1)!}{\left\{m+(n-1)+1\right\}!}=\frac{m!\;(n-1)!}{(m+n)!}\;\text{(Euler integral of the first kind)} \\&=&\frac{(m+n-1)!}{(m+n)!}\frac{m!}{(m-1)!} \\&=&\frac{1}{m+n}\frac{m}{1} \\&=&\frac{m}{m+n}\;\cdots\;\text{mean of the beta distribution} \end{eqnarray} $$

Second moment of the beta distribution

$$ \begin{eqnarray} \mathbf{E}[X^2]&=&\int_0^1x^2f(x)\mathrm{d}x \\&=&\int_0^1x^2\; \frac{x^{(m-1)}(1-x)^{(n-1)}}{B(m,n)} \;\mathrm{d}x \\&=&\frac{1}{B(m,n)} \int_0^1x^{(m+1)}(1-x)^{(n-1)}\;\mathrm{d}x \\&=&\frac{(m+n-1)!}{(m-1)!\;(n-1)!}\frac{(m+1)!\;(n-1)!}{((m+1)+n)!} \\&=&\frac{(m+n-1)!}{(m-1)!\;(n-1)!}\frac{(m+1)!\;(n-1)!}{(m+n+1)!} \\&=&\frac{(m+n-1)!}{(m+n+1)!}\frac{(m+1)!}{(m-1)!} \\&=&\frac{1}{(m+n+1)(m+n)}\frac{(m+1)m}{1} \\&=&\frac{(m+1)m}{(m+n+1)(m+n)} \end{eqnarray} $$

Variance of the beta distribution (second central moment)

$$ \begin{eqnarray} \mathbf{V}[X]&=&\mathbf{E}[X^2]-\mathbf{E}[X]^2 \\&=&\frac{(m+1)m}{(m+n+1)(m+n)}-\left\{\frac{m}{m+n}\right\}^2 \\&=&\frac{(m+1)m}{(m+n+1)(m+n)}-\frac{m^2}{\left(m+n\right)^2} \\&=&\frac{(m+n)(m+1)m-(m+n+1)m^2}{(m+n+1)(m+n)^2} \\&=&\frac{(m^2+m+mn+n)m-(m^3+m^2n+m^2)}{(m+n+1)(m+n)^2} \\&=&\frac{m^3+m^2+m^2n+mn-m^3-m^2n-m^2}{(m+n+1)(m+n)^2} \\&=&\frac{mn}{(m+n+1)(m+n)^2}\;\cdots\;\text{variance of the beta distribution} \end{eqnarray} $$
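
The closed forms \(\frac{m}{m+n}\) and \(\frac{mn}{(m+n+1)(m+n)^2}\) can be compared with SciPy's beta distribution, whose shape parameters \(a,b\) correspond to \(m,n\) here (my addition; the values of \(m,n\) are arbitrary examples):

```python
# Compare the derived mean and variance of Beta(m, n) with scipy.stats.beta.
from scipy import stats

m, n = 2.0, 5.0
mean_formula = m / (m + n)
var_formula = m * n / ((m + n + 1) * (m + n) ** 2)
mean_scipy, var_scipy = stats.beta.stats(m, n, moments="mv")
print(mean_formula, float(mean_scipy))  # both ≈ 0.2857
print(var_formula, float(var_scipy))    # both ≈ 0.02551
```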

Lower bound of the KL divergence (via Jensen's inequality)

Lower bound of the KL divergence (via Jensen's inequality)

$$ \begin{eqnarray} p_i&:&\text{probability of the event }x_i\text{ under the true distribution }P(X) \\q_i&:&\text{probability of the event }x_i\text{ under an arbitrary distribution }Q(X)\text{ (}Q\text{: the distribution given by a model of the true distribution }P\text{, e.g. a prediction or a coding scheme)} \end{eqnarray} $$ $$ \begin{eqnarray} D_{KL}(P||Q)&=&\sum_{i=1}^{k} p_i\log{\left(\frac{p_i}{q_i}\right)} \\&=&\sum_{i=1}^{k} p_i\left\{-\log{\left(\frac{q_i}{p_i}\right)}\right\} \;\cdots\;\log{\left(\frac{A}{B}\right)}=\log{\left(\left(\frac{B}{A}\right)^{-1}\right)}=-\log{\left(\frac{B}{A}\right)} \\&\geq&-\log{\left(\sum_{i=1}^{k} p_i\frac{q_i}{p_i}\right)} \\&&\;\cdots\;-\log{\left(x\right)}\text{ is a convex function (convex downward; the check is given below)} \\&&\;\cdots\;\href{https://shikitenkai.blogspot.com/2020/08/blog-post.html}{\sum_{i=1}^{n}\phi_if(x_i)\geq f\left( \sum_{i=1}^{n}\phi_i x_i \right)\;\text{(Jensen's inequality for a convex function)}} \\D_{KL}(P||Q) &\geq&-\log{\left(\sum_{i=1}^{k} p_i\frac{q_i}{p_i}\right)} \\&=&-\log{\left(\sum_{i=1}^{k} q_i\right)} \\&=&-\log{\left(1\right)}\;\cdots\;\text{the }q_i\text{ are probabilities, so }\sum_{i=1}^{k} q_i=1 \\&=&0 \end{eqnarray} $$
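
A small numerical illustration of \(D_{KL}(P\|Q)\geq0\) (my addition; it draws random, strictly positive probability vectors with NumPy):

```python
# Empirically check D_KL(P||Q) = Σ p_i log(p_i/q_i) >= 0, with equality when Q = P.
import numpy as np

rng = np.random.default_rng(0)

def kl(p, q):
    return float(np.sum(p * np.log(p / q)))

for _ in range(5):
    p = rng.dirichlet(np.ones(6))  # random probability vector, positive with probability 1
    q = rng.dirichlet(np.ones(6))
    print(kl(p, q))   # always >= 0
print(kl(p, p))       # 0 when Q equals P
```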

Checking that \(-\log{\left(x\right)}\) is a convex function

$$ \begin{eqnarray} \frac{\mathrm{d}^2}{\mathrm{d}x^2} \left\{ -\log{\left(x\right)} \right\} &=&-\frac{\mathrm{d}^2}{\mathrm{d}x^2} \left\{ \log{\left(x\right)} \right\} \;\cdots\; \left\{cf(x)\right\}^\prime=c\left\{f(x)\right\}^\prime \\&=&-\frac{\mathrm{d}}{\mathrm{d}x} \left\{ \frac{1}{x} \right\} \;\cdots\;\left\{\log{\left(x\right)}\right\}^\prime=x^{-1}=\frac{1}{x} \\&=&-\left(-\frac{1}{x^2}\right) \;\cdots\;\left(x^{-1}\right)^\prime=-x^{-2}=-\frac{1}{x^2} \\&=&\frac{1}{x^2} \gt0\;\cdots\;\text{on the domain }x\gt0\text{ the second derivative is always positive, so }-\log{\left(x\right)}\text{ is a convex function (convex downward).} \end{eqnarray} $$

Lower bound of the KL divergence using the relation \(\log_{\mathrm{e}}{x}\leq(x-1)\)

Proof of Jensen's inequality

Jensen's inequality

When \(\phi_1,\phi_2,\cdots\phi_n\) satisfy \(0\lt\phi_i\) and \(\sum_{i=1}^{n}\phi_i=1 \), and \(x_1,x_2,\cdots x_n\) is a sequence of real numbers, the following holds for a convex function \(f(x)\). $$ \begin{eqnarray} \sum_{i=1}^{n}\phi_if(x_i)\geq f\left( \sum_{i=1}^{n}\phi_i x_i \right) \;\cdots\;\text{when }f(x)\text{ is a convex function (convex downward).} \\\left( \sum_{i=1}^{n}\phi_if(x_i)\leq f\left( \sum_{i=1}^{n}\phi_i x_i \right) \;\cdots\;\text{when }f(x)\text{ is a concave function (convex upward).} \right) \end{eqnarray} $$

Convex function (convex downward)

Properties of a convex function (convex downward)

For any \(x_1, x_2\), if \(f(x)\) is a convex function (convex downward), the line segment \(g(x)\) joining \((x_1, f(x_1))\) and \((x_2, f(x_2))\) lies above the graph of \(f(x)\).
Conversely, if for any \(x_1, x_2\) the segment \(g(x)\) joining \((x_1, f(x_1))\) and \((x_2, f(x_2))\) lies above \(f(x)\), then \(f(x)\) is a convex function (convex downward).

Equation of the line segment \(g(x)\) drawn across a convex function (convex downward)

$$ \begin{eqnarray} x_t&=&\left(1-t\right)x_1+tx_2\;\left(\text{where } 0\leq t \leq 1\right) \;\cdots\;\text{a parametrization of }x\text{ by }t\text{, with }x_1\text{ as the base point and the span from }x_1\text{ to }x_2\text{ as the unit} \\&=&\sum_{i=1}^{2}\phi_ix_i\;\cdots\;\phi_i=\left\{1-t,\;t\right\}\text{, with }0\leq\phi_i\text{ and }\sum_{i=1}^{2}\phi_i=1 \\ \\ \\ g(x)-f(x_1)&=&\frac{f(x_2)-f(x_1)}{x_2-x_1}\left(x-x_1\right) \;\cdots\;\text{equation of the line with slope }\frac{f(x_2)-f(x_1)}{x_2-x_1}\text{ through the point }(x_1, f(x_1)) \\g(x)&=&\frac{f(x_2)-f(x_1)}{x_2-x_1}\left(x-x_1\right)+f(x_1) \\&=&\frac{f(x_2)-f(x_1)}{x_2-x_1}\left(x_t-x_1\right)+f(x_1)\;\cdots\;\text{write }x\text{ as }x_t\text{, expressed in terms of }t \\&=&\frac{f(x_2)-f(x_1)}{x_2-x_1}\left\{\left(1-t\right)x_1+tx_2-x_1\right\}+f(x_1)\;\cdots\;x_t=\left(1-t\right)x_1+tx_2 \\&=&\frac{f(x_2)-f(x_1)}{x_2-x_1}\left(x_1-tx_1+tx_2-x_1\right)+f(x_1) \\&=&\frac{f(x_2)-f(x_1)}{x_2-x_1}\left(-tx_1+tx_2\right)+f(x_1) \\&=&\frac{f(x_2)-f(x_1)}{x_2-x_1}t\left(x_2-x_1\right)+f(x_1) \\&=&\left\{f(x_2)-f(x_1)\right\}t+f(x_1) \\&=&tf(x_2)-tf(x_1)+f(x_1) \\&=&tf(x_2)+\left(1-t\right)f(x_1) \\&=&\left(1-t\right)f(x_1)+tf(x_2) \\&=&\sum_{i=1}^{2}\phi_if(x_i)\;\cdots\;\phi_i=\left\{1-t,\;t\right\}\text{, with }0\leq\phi_i\text{ and }\sum_{i=1}^{2}\phi_i=1 \end{eqnarray} $$
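
The chord-above-the-graph property can be spot-checked numerically; the sketch below (my addition; NumPy assumed) uses the convex function \(f(x)=-\log{\left(x\right)}\) from the KL-divergence section:

```python
# Check that the chord g(x_t) = (1-t)f(x1) + t·f(x2) stays above f(x_t)
# for the convex function f(x) = -log(x).
import numpy as np

f = lambda x: -np.log(x)
x1, x2 = 0.5, 4.0                      # arbitrary points in the domain
t = np.linspace(0.0, 1.0, 11)
x_t = (1 - t) * x1 + t * x2
chord = (1 - t) * f(x1) + t * f(x2)
print(np.all(chord >= f(x_t) - 1e-12))  # True: the segment lies above the graph
```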

Proof of Jensen's inequality by mathematical induction

Proof for \(n=2\)

$$ \begin{eqnarray} g(x_t)&\geq&f(x_t)\;\cdots\;\text{by the property of a convex function (convex downward), }g(x)\text{ lies above }f(x). \\\left(1-t\right)f(x_1)+tf(x_2)&\geq&f\left(\left(1-t\right)x_1+tx_2\right)\;\cdots\;x_t=\left(1-t\right)x_1+tx_2 \\\sum_{i=1}^{2}\phi_if(x_i)&\geq&f\left(\sum_{i=1}^{2}\phi_ix_i\right)\;\cdots\;\text{the case }n=2. \end{eqnarray} $$

Proof for \(n=k+1\), assuming the case \(n=k\)

We transform the expression for \(n=k+1\). $$ \begin{eqnarray} \sum_{i=1}^{k+1}\theta_if(x_i) &=&\sum_{i=1}^{k}\theta_if(x_i)+\theta_{k+1}f(x_{k+1})\;\cdots\;0\leq\theta_i\text{ and }\sum_{i=1}^{k+1}\theta_i=1 \\&=&\Theta_k\sum_{i=1}^{k}\frac{\theta_i}{\Theta_k}f(x_i)+\theta_{k+1}f(x_{k+1}) \;\cdots\;\Theta_k=\sum_{i=1}^{k}\theta_i=1-\theta_{k+1}\lt1,\;\sum_{i=1}^{k}\frac{\theta_i}{\Theta_k}=\frac{1}{\Theta_k}\sum_{i=1}^{k}\theta_i=1 \\&\geq&\Theta_k f\left(\sum_{i=1}^{k}\frac{\theta_i}{\Theta_k}x_i\right)+\theta_{k+1}f(x_{k+1}) \;\cdots\;\text{the case }n=k\text{ is assumed }\left(\sum_{i=1}^{k}\phi_if(x_i)\geq f\left(\sum_{i=1}^{k}\phi_ix_i\right)\right)\text{, so }\sum_{i=1}^{k}\frac{\theta_i}{\Theta_k}f(x_i)\geq f\left(\sum_{i=1}^{k}\frac{\theta_i}{\Theta_k}x_i\right). \end{eqnarray} $$

We regard this as an instance of the \(n=2\) case. $$ \begin{eqnarray} \Theta_k f\left(\sum_{i=1}^{k}\frac{\theta_i}{\Theta_k}x_i\right)+\theta_{k+1}f(x_{k+1}) &=&\lambda_1 f\left(a_1\right)+\lambda_2 f(a_2) \;\cdots\; \lambda_j=\left\{ \Theta_k,\;\theta_{k+1} \right\} ,\;a_j=\left\{ \sum_{i=1}^{k}\frac{\theta_i}{\Theta_k}x_i,\;x_{k+1} \right\} \\&=&\sum_{j=1}^{2}\lambda_j f(a_j) \\&\geq&f\left( \sum_{j=1}^2\lambda_j a_j \right) \;\cdots\;\text{the case }n=2\text{ was proved above.} \end{eqnarray} $$

Next we transform \(\sum_{j=1}^{2}\lambda_j a_j\). $$ \begin{eqnarray} \sum_{j=1}^{2}\lambda_j a_j &=&\Theta_k \sum_{i=1}^{k}\frac{\theta_i}{\Theta_k}x_i + \theta_{k+1} x_{k+1} \\&=&\Theta_k\frac{1}{\Theta_k} \sum_{i=1}^{k}\theta_ix_i + \theta_{k+1} x_{k+1} \;\cdots\;\sum_{i=1}^n Cx_i=C\sum_{i=1}^nx_i \\&=&\sum_{i=1}^{k}\theta_ix_i + \theta_{k+1} x_{k+1} \\&=&\sum_{i=1}^{k+1}\theta_ix_i \end{eqnarray} $$

Substituting this result back:

$$ \begin{eqnarray} f\left( \sum_{j=1}^2\lambda_j a_j \right) &=&f\left( \sum_{i=1}^{k+1}\theta_ix_i \right) \end{eqnarray} $$

Therefore, assuming the case \(n=k\), Jensen's inequality also holds for \(n=k+1\). $$ \begin{eqnarray} \sum_{i=1}^{k+1}\theta_if(x_i)&\geq&f\left( \sum_{i=1}^{k+1}\theta_ix_i \right) \end{eqnarray} $$

Conclusion from the proofs for \(n=2\) and \(n=k+1\)

By mathematical induction, Jensen's inequality holds.
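
Finally, a numerical spot-check of \(\sum_{i=1}^{n}\phi_if(x_i)\geq f\left(\sum_{i=1}^{n}\phi_ix_i\right)\) for random weights (my addition; NumPy assumed, again with the convex function \(f(x)=-\log{\left(x\right)}\)):

```python
# Spot-check Jensen's inequality for random weights φ_i and points x_i > 0.
import numpy as np

rng = np.random.default_rng(1)
f = lambda x: -np.log(x)

for _ in range(5):
    n = rng.integers(2, 8)
    phi = rng.dirichlet(np.ones(n))        # φ_i > 0, Σ φ_i = 1
    x = rng.uniform(0.1, 10.0, size=n)     # points in the domain of f
    lhs = np.sum(phi * f(x))
    rhs = f(np.sum(phi * x))
    print(lhs >= rhs - 1e-12, lhs, rhs)    # always True
```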