Andrew Ng's Coursera Machine Learning Specialization, Supervised Machine Learning: Regression and Classification, Part 1

Practice quiz: Supervised vs unsupervised learning

Question 1: Which are the two common types of supervised learning? (Choose two)

【Correct】Regression
【Explanation】Regression predicts a number from among potentially infinitely many possible values.
【Incorrect】Clustering
【Correct】Classification
【Explanation】Classification predicts from among a limited set of categories (also called classes). These could be a limited set of numbers or labels such as "cat" or "dog".

Question 2: Which of these is a type of unsupervised learning?

Classification
【Correct】Clustering
Regression
【Explanation】Clustering groups data into groups, or clusters, based on how similar each item (such as a hospital patient or shopping customer) is to the others.

Practice quiz: Regression

Question 1: For linear regression, the model is f_{w,b}(x) = wx + b. Which of the following are the inputs, or features, that are fed into the model and with which the model is expected to make a prediction?

w and b.
【Correct】x
m
(x,y)
【Explanation】The input feature x is fed into the model to generate a prediction f_{w,b}(x).
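
To make the role of the input concrete, here is a minimal Python sketch of the model f_{w,b}(x) = wx + b; the parameter values and the house-price setting are purely illustrative and not taken from the course labs.

```python
def predict(x, w, b):
    """Return the model's prediction f_{w,b}(x) = w*x + b for an input feature x."""
    return w * x + b

# Illustrative values only: w = 200, b = 100, house size x = 1.2 (in 1000 sqft)
print(predict(1.2, w=200, b=100))  # 340.0, i.e. a predicted price of $340,000
```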

Question 2: For linear regression, if you find parameters w and b so that J(w,b) is very close to zero, what can you conclude?

【Correct】The selected values of the parameters w and b cause the algorithm to fit the training set really well.
The selected values of the parameters w and b cause the algorithm to fit the training set really poorly.
This is never possible -- there must be a bug in the code.
【Explanation】When the cost J(w,b) is small, the model fits the training set well.
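
The cost in question is the squared error cost used throughout the course. Below is a minimal NumPy sketch of it, with made-up training data chosen so that the given w and b fit the data exactly and the cost comes out to zero.

```python
import numpy as np

def compute_cost(x, y, w, b):
    """Squared error cost J(w,b) = (1 / (2m)) * sum_i (f_{w,b}(x_i) - y_i)^2."""
    m = x.shape[0]
    errors = (w * x + b) - y              # prediction minus target for every example
    return np.sum(errors ** 2) / (2 * m)

# Made-up data lying exactly on y = 200*x + 100, so the cost is 0.
x_train = np.array([1.0, 2.0, 3.0])
y_train = np.array([300.0, 500.0, 700.0])
print(compute_cost(x_train, y_train, w=200.0, b=100.0))  # 0.0
```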

Practice quiz: Train the model with gradient descent

Question 1: Gradient descent is an algorithm for finding values of the parameters w and b that minimize the cost function J. When \frac{\partial J(w,b)}{\partial w} is a negative number (less than zero), what happens to w after one update step?

【Correct】w increases.
w decreases
It is not possible to tell if w will increase or decrease.
w stays the same
【Explanation】The learning rate \alpha is always a positive number, so if you take w minus \alpha times a negative number, you end up with a new value of w that is larger (more positive).
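
The sign argument can be checked numerically with a single update step; the numbers below are arbitrary and only meant to show the direction of the change.

```python
alpha = 0.1                  # learning rate, always positive
w = 2.0
dJ_dw = -4.0                 # suppose the partial derivative of J with respect to w is negative
w_new = w - alpha * dJ_dw    # 2.0 - (0.1 * -4.0) = 2.4
print(w_new)                 # 2.4, so w increased
```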

Question 2: For linear regression, what is the update step for parameter b?

【Correct】b = b - \alpha \frac{1}{m}\sum_{i=1}^{m}\left(f_{w,b}(x^{(i)}) - y^{(i)}\right)
【Explanation】The update subtracts the learning rate times \frac{\partial J(w,b)}{\partial b}, which for the squared error cost is the average of the prediction errors.
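
Putting the update rules for w and b together, here is a minimal NumPy sketch of one simultaneous gradient descent step (the function and variable names are illustrative), followed by a tiny made-up run that converges toward the line y = 2x.

```python
import numpy as np

def gradient_descent_step(x, y, w, b, alpha):
    """One simultaneous update of w and b for linear regression with
    the squared error cost J(w,b) = (1/(2m)) * sum_i (w*x_i + b - y_i)^2."""
    m = x.shape[0]
    errors = (w * x + b) - y
    dJ_dw = np.sum(errors * x) / m   # partial derivative of J with respect to w
    dJ_db = np.sum(errors) / m       # partial derivative of J with respect to b
    return w - alpha * dJ_dw, b - alpha * dJ_db

# Made-up data lying on y = 2x; repeated steps drive (w, b) toward (2, 0).
x_train = np.array([1.0, 2.0])
y_train = np.array([2.0, 4.0])
w, b = 0.0, 0.0
for _ in range(1000):
    w, b = gradient_descent_step(x_train, y_train, w, b, alpha=0.1)
print(round(w, 2), round(b, 2))      # 2.0 0.0
```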

Source: https://www.cnblogs.com/chuqianyu/p/16438303.html
