Translation of "最小二乗法" (least squares method) into English:
Dictionary Japanese-English
最小二乗法 - translation: least squares method; the method of least squares
Examples (External sources, not reviewed)
普通の最小二乗法が我々が最初にとるアプローチです | Ordinary least squares is the first approach we'll take. |
そしてその結果 残差の二乗和を得る そしてこの最小二乗法では | And as a result we get the sum of squared residuals, and in this least squares method |
そして最小二乗法とはなんなのかを理解する事 | And to understand what the least squares method actually is |
ここが最小二乗法の考えが登場する所です そのアイデアはとてもシンプル | This is where the idea of least squares comes in, and the idea is very simple. |
その数値を最小化する訳です 残差の二乗和を | So we minimize that quantity, the sum of squared residuals. |
勾配降下法を 二乗誤差のコスト関数を 最小化するために適用する という事 勾配降下法を | What we're going to do is apply gradient descent to minimize our squared error cost function. |
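The pair above describes applying gradient descent to minimize a squared-error cost function. As a minimal sketch of that idea (the one-parameter model, data, and learning rate below are my own illustrative assumptions, not taken from the quoted course), fitting y = w·x by gradient descent looks like this:

```python
# Minimal sketch: gradient descent on a squared-error cost for the model y = w * x.
# The data, learning rate, and step count are illustrative assumptions.

def squared_error_cost(w, xs, ys):
    """Mean squared error of the model y = w * x over the data."""
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def gradient_descent(xs, ys, w=0.0, lr=0.1, steps=100):
    """Repeatedly step opposite the gradient of the squared-error cost."""
    n = len(xs)
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n
        w -= lr * grad
    return w

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]  # exactly y = 2x, so w should converge toward 2
w = gradient_descent(xs, ys)
```

Because this cost is convex in w, the iterates converge to the least-squares value of w for any sufficiently small learning rate.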
最小化したい だから残差の二乗和を最小化したい 単回帰の時みたいに | We want to minimize, so we want to minimize the sum of squared residuals, just like in simple regression. |
記法を思い出して下さい 二乗和と二乗和の平均 | Please recall the notation: the sum of squares and the mean of the sum of squares. |
するとこの場合 最小化問題は 10掛ける (u − 5)の二乗 | Now if I take this objective function and multiply it by 10, then in this case my minimization problem is the minimum over u of 10 times (u − 5) squared, plus 10. |
トレーニング手本xiとの二乗距離が 最小になるクラスタ重心を選びとった物と考える事が出来る だがもちろん 距離の二乗を最小化しようと | So we can think of c(i) as picking the cluster centroid with the smallest squared distance to training example x(i), but of course, whether we minimize the squared distance |
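The k-means assignment step described in this example (pick for each point the centroid with the smallest squared distance) can be sketched as follows; the sample point and centroids below are made-up assumptions for illustration:

```python
# Sketch of the k-means assignment step: c(i) is the index of the centroid
# closest, in squared distance, to training example x(i).
# Squaring does not change which centroid wins, matching the remark above.

def squared_distance(a, b):
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

def assign_cluster(x, centroids):
    """Return the index of the centroid with the smallest squared distance to x."""
    return min(range(len(centroids)), key=lambda j: squared_distance(x, centroids[j]))

centroids = [(0.0, 0.0), (5.0, 5.0)]
c = assign_cluster((4.0, 4.5), centroids)  # nearest centroid is index 1
```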
重回帰での回帰係数の計算 ここでも単回帰の時と同じように最小二乗法やってるんだが | Computing regression coefficients in multiple regression, where again we're doing least squares just as we did in simple regression. |
小さな法律家が肩に乗っていて | With a little lawyer riding on your shoulder |
前に定義した二乗誤差の目的関数です そして最急降下法と | It's the squared-error objective function we defined earlier, and with steepest descent |
0 0 1 2 2 2 です 最小二乗法による回帰直線y = bx + aの 傾きbと切片aを計算してください | For the data set (0, 0), (1, 2), and (2, 2), compute the slope b and intercept a of the least squares regression line y = bx + a. |
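The exercise above can be checked directly: the least-squares line y = bx + a has b = Σ(x − x̄)(y − ȳ) / Σ(x − x̄)² and a = ȳ − b·x̄. A short Python check (the helper function is my own, not part of the quoted course):

```python
# Least-squares slope b and intercept a for the line y = b*x + a,
# applied to the data set from the exercise: (0,0), (1,2), (2,2).

def least_squares(xs, ys):
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    sxx = sum((x - x_bar) ** 2 for x in xs)
    b = sxy / sxx
    a = y_bar - b * x_bar
    return b, a

b, a = least_squares([0, 1, 2], [0, 2, 2])
# Here x̄ = 1, ȳ = 4/3, Σ(x−x̄)(y−ȳ) = 2, Σ(x−x̄)² = 2,
# so b = 1 and a = 1/3: the fitted line is y = x + 1/3.
```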
AB 59の二乗 OK そして最後は 観客 | AB: 59 squared. OK, and finally? Audience: |
2X二乗 | 2x squared. |
1つだけ線形から外れています 二乗誤差を最小化しようとすると | Only one point falls off the line, and when we try to minimize the squared error |
シグマの二乗 サンプルの分散はSDの二乗 | Sigma squared; the sample variance is the SD squared. |
そこで まず最初に x 二乗 それから | So, first of all, x squared, and then |
乗法 | Multiplication |
2X二乗と | And 2x squared |
ではk means法の最初の小テストです | So here is the first quiz on the k-means method. |
分母には Xの二乗和とYの二乗和 | In the denominator, the sum of squares of X and the sum of squares of Y. |
距離を最小化しようと 同じciの値になるはず でも普通は二乗をつける | But of course, minimizing the squared distance and minimizing the distance should give you the same value of c(i); we usually put in the square just as a convention for k-means. |
二つの方法があり その数字を最小公倍数と呼びます 9と12の最も小さい倍数であり 共通でもあります | So let's think about what that number is; there are two ways of coming up with what we call the least common multiple, the smallest multiple that 9 and 12 have in common. |
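The least-common-multiple example above (the LCM of 9 and 12) is easy to verify; here is a sketch using the classic identity lcm(a, b) · gcd(a, b) = a · b:

```python
from math import gcd

def lcm(a, b):
    """Least common multiple via the identity lcm(a, b) * gcd(a, b) == a * b."""
    return a * b // gcd(a, b)

# 9 = 3*3 and 12 = 2*2*3, so the smallest common multiple is 2*2*3*3 = 36.
result = lcm(9, 12)  # 36
```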
X二乗の項をやりましょう 4x二乗と | Let's do the x squared terms: 4x squared and |
二乗して 足して それが二乗の和です | Square them and add them up; that is the sum of squares. |
16t二乗 10t 84 | 16t squared, 10t, 84. |
37の二乗 OK | 37 squared, OK. |
23の二乗 OK | AB 23 squared, OK. Audience |
以下のような最適化問題があるとする 実数のuを (u − 5)の二乗 + 1 を 最小化するように選ぶ | Here is what I mean, to give you a concrete example: suppose I had a minimization problem, to minimize over a real number u the quantity (u − 5) squared, plus 1. |
分散 シグマ二乗は 0.5の二乗 つまり 0.25となる | The variance, sigma squared, is 0.5 squared, which is 0.25. |
まだ残っているのは メートルの二乗 秒の二乗 | What still remains is meters squared and seconds squared |
二つの数の乗算をするのに 小学校で習うやり方以外にどんな方法ができるかい | What other ways can you multiply two numbers, besides the method you learn in elementary school? |
これは残差の二乗和 Residual Sum of Squares 覚えてる 我々が最小化しようとしているのは | This is the residual sum of squares, the Residual Sum of Squares; remember, it's what we are trying to minimize. |
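The residual sum of squares (RSS) recurring in these examples, the quantity that least squares minimizes, is just Σ(yᵢ − ŷᵢ)². A minimal sketch with made-up observed and predicted values:

```python
# Residual sum of squares: the sum of squared differences between
# observed y values and the model's predicted y values.

def rss(observed, predicted):
    return sum((y - y_hat) ** 2 for y, y_hat in zip(observed, predicted))

# Residuals are 0.5, -1.0, 0.5, so RSS = 0.25 + 1.0 + 0.25 = 1.5.
value = rss([1.0, 2.0, 3.0], [0.5, 3.0, 2.5])
```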
最小値を見つけ取り除く方法です | It's a method of finding the minimum and removing it. |
987の二乗は974,169 | AB 987 squared is 974,169. |
二乗すると ウヒャー | Square it and, whoa! |
タクシーは二人の乗客を乗せた | The taxi picked up two passengers. |
ですから x二乗からx二乗足す3にすると | So if we go from x squared to x squared plus 3 |
要素単位での A の 二乗になるので 1の二乗は | This gives us the element-wise squaring of A, so 1 squared is |
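The element-wise squaring of a matrix A mentioned above (in Octave/MATLAB notation, `A .^ 2`) can be sketched in plain Python on a nested list; with NumPy arrays, `A ** 2` would behave the same way. The matrix below is a made-up example:

```python
# Element-wise squaring of a matrix A, represented as a list of rows.
# Each entry is squared independently, so an entry of 1 stays 1,
# matching the "1 squared is..." remark in the example.

def elementwise_square(A):
    return [[x ** 2 for x in row] for row in A]

A = [[1, 2], [3, 4]]
A_squared = elementwise_square(A)  # [[1, 4], [9, 16]]
```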
8,500を二乗すれば | If we square 8,500 |
では y x二乗は | Now, what about y = x squared? |
2の二乗は4だ | 2 squared is 4. |
乗法の 加法での分散 | The distribution of multiplication over addition |