Cost function (loss function)
Source: Machine Learning by Andrew Ng, Stanford Coursera lecture
Date taken: 2021.01.24
www.coursera.org/learn/machine-learning/home
In the previous lecture, we tried to predict housing prices.
A cost function is a way to measure the difference between the real values and the estimated values, so we can measure the accuracy of our hypothesis function by using a cost function.
We want to minimize the cost function J to find the regression line that represents the data best.
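For reference, the squared-error cost function used in this lecture is:

$$
J(\theta_0, \theta_1) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2,
\qquad h_\theta(x) = \theta_0 + \theta_1 x
$$

where $m$ is the number of training examples.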
Let's consider a regression line with a single parameter (one variable).
For each value of theta1, we can plot the corresponding point J(theta1).
In fact, we can plot J for any value of theta1, and in this problem it looks like the following graph:
So we can find the minimum value of J in this problem.
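As a minimal sketch of this idea in Python (the lecture itself uses Octave; the toy dataset below is my own assumption, chosen so that the optimum is theta1 = 1), we can evaluate and plot J(theta1) with theta0 fixed at 0:

```python
import numpy as np
import matplotlib.pyplot as plt

# Toy training data (assumption, not from the lecture): y = x exactly,
# so the best single-parameter line h(x) = theta1 * x has theta1 = 1.
x = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 3.0])
m = len(x)

def J(theta1):
    """Squared-error cost for h(x) = theta1 * x (theta0 fixed at 0)."""
    return np.sum((theta1 * x - y) ** 2) / (2 * m)

thetas = np.linspace(-1, 3, 101)
costs = [J(t) for t in thetas]

plt.plot(thetas, costs)
plt.xlabel("theta1")
plt.ylabel("J(theta1)")
plt.show()
# The curve is a convex bowl with its minimum at theta1 = 1, where J = 0.
```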
But what about the multivariate problem?
It will look like this:
We can find the minimum value J* in three dimensions, on the cost surface over theta0 and theta1.
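Again as a rough sketch (same toy-data assumption as above), we can draw the 3D cost surface J(theta0, theta1) with matplotlib:

```python
import numpy as np
import matplotlib.pyplot as plt

# Same assumed toy data as in the previous sketch.
x = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 3.0])
m = len(x)

# Grid of (theta0, theta1) pairs to evaluate the cost over.
theta0 = np.linspace(-2, 2, 100)
theta1 = np.linspace(-1, 3, 100)
T0, T1 = np.meshgrid(theta0, theta1)

# Accumulate the squared error of h(x) = theta0 + theta1 * x over the data.
J = np.zeros_like(T0)
for xi, yi in zip(x, y):
    J += (T0 + T1 * xi - yi) ** 2
J /= 2 * m

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(T0, T1, J, cmap="viridis")
ax.set_xlabel("theta0")
ax.set_ylabel("theta1")
ax.set_zlabel("J")
plt.show()
# The surface is a bowl whose lowest point sits at (theta0, theta1) = (0, 1).
```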
Next time, we will learn how to find this minimum value (by gradient descent).