Machine Learning: Regression


Provider: Coursera
Coursera's offerings on this site have an average rating of 7.2 out of 10 (from 6 reviews).


Description

When you enroll in courses through Coursera, you can choose between a free plan and a paid plan:

  • Free plan: audit only, no certification. You will have access to all course materials except graded items.
  • Paid plan: commit to earning a Certificate, a trusted, shareable way to showcase your new skills.

About this course: Case Study - Predicting Housing Prices

In our first case study, predicting house prices, you will create models that predict a continuous value (price) from input features (square footage, number of bedrooms and bathrooms, ...). This is just one of the many places where regression can be applied. Other applications range from predicting health outcomes in medicine, stock prices in finance, and power usage in high-performance computing, to analyzing which regulators are important for gene expression.

In this course, you will explore regularized linear regression models for the task of prediction and feature selection. You will be able to handle very large sets of features and select between models of various complexity. You will also analyze the impact of aspects of your data (such as outliers) on your selected models and predictions. To fit these models, you will implement optimization algorithms that scale to large datasets.

Learning outcomes: by the end of this course, you will be able to:
  • Describe the input and output of a regression model.
  • Compare and contrast bias and variance when modeling data.
  • Estimate model parameters using optimization algorithms.
  • Tune parameters with cross validation.
  • Analyze the performance of the model.
  • Describe the notion of sparsity and how LASSO leads to sparse solutions.
  • Deploy methods to select between models.
  • Exploit the model to form predictions.
  • Build a regression model to predict prices using a housing dataset.
  • Implement these techniques in Python.


Created by: University of Washington
  • Taught by: Emily Fox, Amazon Professor of Machine Learning, Statistics
  • Taught by: Carlos Guestrin, Amazon Professor of Machine Learning, Computer Science and Engineering

Basic info:
  • Course 2 of 4 in the Machine Learning Specialization
  • Commitment: 6 weeks of study, 5-8 hours/week
  • Language: English
  • How to pass: pass all graded assignments to complete the course.
  • Average user rating: 4.8 stars

Coursework

Each course is like an interactive textbook, featuring pre-recorded videos, quizzes and projects.

Help from your peers

Connect with thousands of other learners and debate ideas, discuss course material, and get help mastering concepts.

Certificates

Earn official recognition for your work, and share your success with friends, colleagues, and employers.

University of Washington Founded in 1861, the University of Washington is one of the oldest state-supported institutions of higher education on the West Coast and is one of the preeminent research universities in the world.

Syllabus


WEEK 1


Welcome



Regression is one of the most important and broadly used machine learning and statistics tools out there. It allows you to make predictions from data by learning the relationship between features of your data and some observed, continuous-valued response. Regression is used in a massive number of applications ranging from predicting stock prices to understanding gene regulatory networks.

This introduction to the course provides you with an overview of the topics we will cover and the background knowledge and resources we assume you have.


5 videos, 2 readings


  1. Reading: Slides presented in this module
  2. Video: Welcome!
  3. Video: What is the course about?
  4. Video: Outlining the first half of the course
  5. Video: Outlining the second half of the course
  6. Video: Assumed background
  7. Reading: Software tools you'll need


Simple Linear Regression



Our course starts from the most basic regression model: just fitting a line to data. This simple model for forming predictions from a single, univariate feature of the data is appropriately called "simple linear regression".

In this module, we describe the high-level regression task and then specialize these concepts to the simple linear regression case. You will learn how to formulate a simple regression model and fit the model to data using both a closed-form solution as well as an iterative optimization algorithm called gradient descent. Based on this fitted function, you will interpret the estimated model parameters and form predictions. You will also analyze the sensitivity of your fit to outlying observations.

You will examine all of these concepts in the context of a case study of predicting house prices from the square feet of the house.
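
To make the two fitting approaches concrete, here is a minimal NumPy sketch (not the course's own notebook code): the closed-form least squares solution next to a plain gradient descent loop. The data values, step size, and iteration count are illustrative assumptions.

    import numpy as np

    # Toy data: square feet (in 1000s) and price (in $100k); values are made up.
    x = np.array([1.0, 1.5, 2.0, 2.5, 3.0])
    y = np.array([2.5, 3.3, 4.2, 4.9, 5.6])

    # Approach 1: closed-form least squares solution for y = w0 + w1 * x.
    w1 = ((x * y).mean() - x.mean() * y.mean()) / ((x ** 2).mean() - x.mean() ** 2)
    w0 = y.mean() - w1 * x.mean()

    # Approach 2: gradient descent on the residual sum of squares (RSS).
    g0, g1 = 0.0, 0.0          # initial intercept and slope
    eta = 0.01                 # step size, small enough for this data scale
    for _ in range(20000):
        resid = y - (g0 + g1 * x)
        g0 += eta * 2 * resid.sum()          # -dRSS/dw0 = 2 * sum(resid)
        g1 += eta * 2 * (resid * x).sum()    # -dRSS/dw1 = 2 * sum(resid * x)

    print(w0, w1)  # closed-form estimates
    print(g0, g1)  # gradient descent converges to (nearly) the same values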


25 videos, 5 readings


  1. Reading: Slides presented in this module
  2. Video: A case study in predicting house prices
  3. Video: Regression fundamentals: data & model
  4. Video: Regression fundamentals: the task
  5. Video: Regression ML block diagram
  6. Video: The simple linear regression model
  7. Video: The cost of using a given line
  8. Video: Using the fitted line
  9. Video: Interpreting the fitted line
  10. Video: Defining our least squares optimization objective
  11. Video: Finding maxima or minima analytically
  12. Video: Maximizing a 1d function: a worked example
  13. Video: Finding the max via hill climbing
  14. Video: Finding the min via hill descent
  15. Video: Choosing stepsize and convergence criteria
  16. Video: Gradients: derivatives in multiple dimensions
  17. Video: Gradient descent: multidimensional hill descent
  18. Video: Computing the gradient of RSS
  19. Video: Approach 1: closed-form solution
  20. Reading: Optional reading: worked-out example for closed-form solution
  21. Video: Approach 2: gradient descent
  22. Reading: Optional reading: worked-out example for gradient descent
  23. Video: Comparing the approaches
  24. Reading: Download notebooks to follow along
  25. Video: Influence of high leverage points: exploring the data
  26. Video: Influence of high leverage points: removing Center City
  27. Video: Influence of high leverage points: removing high-end towns
  28. Video: Asymmetric cost functions
  29. Video: A brief recap
  30. Reading: Fitting a simple linear regression model on housing data

Graded: Simple Linear Regression
Graded: Fitting a simple linear regression model on housing data

WEEK 2


Multiple Regression



The next step in moving beyond simple linear regression is to consider "multiple regression", where multiple features of the data are used to form predictions.

Specifically, in this module, you will learn how to build models of more complex relationships between a single variable (e.g., 'square feet') and the observed response (like 'house sales price'). This includes things like fitting a polynomial to your data, or capturing seasonal changes in the response value. You will also learn how to incorporate multiple input variables (e.g., 'square feet', '# bedrooms', '# bathrooms'). You will then be able to describe how all of these models can still be cast within the linear regression framework, but now using multiple "features". Within this multiple regression framework, you will fit models to data, interpret estimated coefficients, and form predictions.

Here, you will also implement a gradient descent algorithm for fitting a multiple regression model.
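
As a rough sketch of this framework (not the assignment implementation), the example below casts two made-up inputs, square feet and bedrooms, as columns of a feature matrix H and fits the weights both via the normal equations and via gradient descent; all data values and hyperparameters are assumptions.

    import numpy as np

    # Toy inputs: square feet (in 1000s) and number of bedrooms; values made up.
    sqft = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 2.0])
    beds = np.array([2.0, 3.0, 3.0, 4.0, 5.0, 4.0])
    price = np.array([2.7, 3.4, 4.1, 5.1, 5.8, 4.5])   # in $100k

    # Stack features into a matrix H: one row per observation, one column per
    # feature; a constant column of ones gives the intercept.
    H = np.column_stack([np.ones_like(sqft), sqft, beds])

    # Approach 1: closed-form solution of the normal equations H^T H w = H^T y.
    w_closed = np.linalg.solve(H.T @ H, H.T @ price)

    # Approach 2: gradient descent; the gradient of RSS(w) is -2 H^T (y - H w).
    w = np.zeros(H.shape[1])
    eta = 0.002
    for _ in range(200000):
        w += eta * 2 * (H.T @ (price - H @ w))

    print(w_closed)  # [intercept, weight on sqft, weight on beds]
    print(w)         # should closely match the closed-form solution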


19 videos, 5 readings


  1. Reading: Slides presented in this module
  2. Video: Multiple regression intro
  3. Video: Polynomial regression
  4. Video: Modeling seasonality
  5. Video: Where we see seasonality
  6. Video: Regression with general features of 1 input
  7. Video: Motivating the use of multiple inputs
  8. Video: Defining notation
  9. Video: Regression with features of multiple inputs
  10. Video: Interpreting the multiple regression fit
  11. Reading: Optional reading: review of matrix algebra
  12. Video: Rewriting the single observation model in vector notation
  13. Video: Rewriting the model for all observations in matrix notation
  14. Video: Computing the cost of a D-dimensional curve
  15. Video: Computing the gradient of RSS
  16. Video: Approach 1: closed-form solution
  17. Video: Discussing the closed-form solution
  18. Video: Approach 2: gradient descent
  19. Video: Feature-by-feature update
  20. Video: Algorithmic summary of gradient descent approach
  21. Video: A brief recap
  22. Reading: Exploring different multiple regression models for house price prediction
  23. Reading: Numpy tutorial
  24. Reading: Implementing gradient descent for multiple regression

Graded: Multiple Regression
Graded: Exploring different multiple regression models for house price prediction
Graded: Implementing gradient descent for multiple regression

WEEK 3


Assessing Performance



Having learned about linear regression models and algorithms for estimating the parameters of such models, you are now ready to assess how well your considered method should perform in predicting new data. You are also ready to select amongst possible models to choose the best performing.

This module is all about these important topics of model selection and assessment. You will examine both theoretical and practical aspects of such analyses. You will first explore the concept of measuring the "loss" of your predictions, and use this to define training, test, and generalization error. For these measures of error, you will analyze how they vary with model complexity and how they might be utilized to form a valid assessment of predictive performance. This leads directly to an important conversation about the bias-variance tradeoff, which is fundamental to machine learning. Finally, you will devise a method to first select amongst models and then assess the performance of the selected model.

The concepts described in this module are key to all machine learning problems, well beyond the regression setting addressed in this course.
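
The behavior this module analyzes is easy to observe empirically. The hedged sketch below, using synthetic data and an arbitrary 80/20 split, fits polynomials of increasing degree on the training set only and prints training and test error side by side.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data from a noisy quadratic; purely illustrative.
    x = rng.uniform(0, 4, 60)
    y = 1.0 + 0.5 * x - 0.2 * x ** 2 + rng.normal(0, 0.3, x.size)

    # Random training/test split (here roughly 80/20).
    idx = rng.permutation(x.size)
    train, test = idx[:48], idx[48:]

    for degree in (1, 2, 4, 8):
        coef = np.polyfit(x[train], y[train], degree)   # fit on training data only
        train_rss = np.sum((y[train] - np.polyval(coef, x[train])) ** 2)
        test_rss = np.sum((y[test] - np.polyval(coef, x[test])) ** 2)
        print(degree, round(train_rss, 3), round(test_rss, 3))

    # Training RSS shrinks as the degree grows; test RSS typically bottoms out
    # near the true complexity (degree 2) and then worsens as the model overfits.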


14 videos, 2 readings


  1. Reading: Slides presented in this module
  2. Video: Assessing performance intro
  3. Video: What do we mean by "loss"?
  4. Video: Training error: assessing loss on the training set
  5. Video: Generalization error: what we really want
  6. Video: Test error: what we can actually compute
  7. Video: Defining overfitting
  8. Video: Training/test split
  9. Video: Irreducible error and bias
  10. Video: Variance and the bias-variance tradeoff
  11. Video: Error vs. amount of data
  12. Video: Formally defining the 3 sources of error
  13. Video: Formally deriving why 3 sources of error
  14. Video: Training/validation/test split for model selection, fitting, and assessment
  15. Video: A brief recap
  16. Reading: Exploring the bias-variance tradeoff

Graded: Assessing Performance
Graded: Exploring the bias-variance tradeoff

WEEK 4


Ridge Regression



You have examined how the performance of a model varies with increasing model complexity, and can describe the potential pitfall of complex models becoming overfit to the training data. In this module, you will explore a very simple, but extremely effective, technique for automatically coping with this issue. This method is called "ridge regression". You start out with a complex model, but now fit the model in a manner that not only incorporates a measure of fit to the training data, but also a term that biases the solution away from overfitted functions. To this end, you will explore symptoms of overfitted functions and use this to define a quantitative measure to use in your revised optimization objective. You will derive both a closed-form and gradient descent algorithm for fitting the ridge regression objective; these forms are small modifications of the original algorithms you derived for multiple regression. To select the strength of the bias away from overfitting, you will explore a general-purpose method called "cross validation".

You will implement both cross-validation and gradient descent to fit a ridge regression model and select the regularization constant.
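
The following sketch ties the module's pieces together: a closed-form ridge fit plus K-fold cross validation to select the penalty strength. It is an illustration under stated assumptions (synthetic data, polynomial features, and the common convention of leaving the intercept unpenalized), not the course's reference code.

    import numpy as np

    def ridge_closed_form(H, y, l2_penalty):
        # Closed-form ridge solution: solve (H^T H + lambda * I_mod) w = H^T y.
        # I_mod zeroes the first diagonal entry so the intercept (assumed to be
        # the column of ones H[:, 0]) is not penalized -- one common convention.
        I_mod = np.eye(H.shape[1])
        I_mod[0, 0] = 0.0
        return np.linalg.solve(H.T @ H + l2_penalty * I_mod, H.T @ y)

    def kfold_cv_error(H, y, l2_penalty, k=5):
        # Average validation RSS over k folds for a given penalty strength.
        folds = np.array_split(np.arange(H.shape[0]), k)
        total = 0.0
        for val in folds:
            tr = np.setdiff1d(np.arange(H.shape[0]), val)
            w = ridge_closed_form(H[tr], y[tr], l2_penalty)
            total += np.sum((y[val] - H[val] @ w) ** 2)
        return total / k

    # Synthetic data and degree-7 polynomial features; purely illustrative.
    rng = np.random.default_rng(1)
    x = rng.uniform(0, 1, 40)
    y = np.sin(4 * x) + rng.normal(0, 0.2, x.size)
    H = np.column_stack([x ** d for d in range(8)])   # H[:, 0] is all ones

    # Pick the penalty with the lowest cross-validation error.
    penalties = np.logspace(-6, 2, 9)
    best = min(penalties, key=lambda lam: kfold_cv_error(H, y, lam))
    print("selected l2_penalty:", best)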


16 videos, 5 readings


  1. Reading: Slides presented in this module
  2. Video: Symptoms of overfitting in polynomial regression
  3. Reading: Download the notebook and follow along
  4. Video: Overfitting demo
  5. Video: Overfitting for more general multiple regression models
  6. Video: Balancing fit and magnitude of coefficients
  7. Video: The resulting ridge objective and its extreme solutions
  8. Video: How ridge regression balances bias and variance
  9. Reading: Download the notebook and follow along
  10. Video: Ridge regression demo
  11. Video: The ridge coefficient path
  12. Video: Computing the gradient of the ridge objective
  13. Video: Approach 1: closed-form solution
  14. Video: Discussing the closed-form solution
  15. Video: Approach 2: gradient descent
  16. Video: Selecting tuning parameters via cross validation
  17. Video: K-fold cross validation
  18. Video: How to handle the intercept
  19. Video: A brief recap
  20. Reading: Observing effects of L2 penalty in polynomial regression
  21. Reading: Implementing ridge regression via gradient descent

Graded: Ridge Regression
Graded: Observing effects of L2 penalty in polynomial regression
Graded: Implementing ridge regression via gradient descent

WEEK 5


Feature Selection & Lasso



A fundamental machine learning task is to select amongst a set of features to include in a model. In this module, you will explore this idea in the context of multiple regression, and describe how such feature selection is important for both interpretability and efficiency of forming predictions.

To start, you will examine methods that search over an enumeration of models including different subsets of features. You will analyze both exhaustive search and greedy algorithms. Then, instead of an explicit enumeration, we turn to lasso regression, which implicitly performs feature selection in a manner akin to ridge regression: a complex model is fit based on a measure of fit to the training data plus a measure of overfitting different from that used in ridge. This lasso method has had impact in numerous applied domains, and the ideas behind the method have fundamentally changed machine learning and statistics. You will also implement a coordinate descent algorithm for fitting a lasso model.

Coordinate descent is another general optimization technique, which is useful in many areas of machine learning.
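
A minimal sketch of lasso coordinate descent with normalized features appears below. It uses the standard soft-thresholding update and leaves the intercept unregularized; the synthetic data, penalty value, and stopping tolerance are assumptions for illustration, not the assignment code.

    import numpy as np

    def lasso_coordinate_descent(H, y, l1_penalty, tol=1e-6):
        # Coordinate descent for lasso, assuming each column of H has unit
        # 2-norm and column 0 is the (unregularized) intercept. The update is
        # soft thresholding: w_j = sign(rho_j) * max(|rho_j| - l1_penalty/2, 0),
        # where rho_j = H[:, j] . (residual with feature j excluded).
        w = np.zeros(H.shape[1])
        while True:
            max_step = 0.0
            for j in range(H.shape[1]):
                resid_j = y - H @ w + H[:, j] * w[j]   # residual ignoring feature j
                rho = H[:, j] @ resid_j
                if j == 0:                              # intercept: no shrinkage
                    new_wj = rho
                else:
                    new_wj = np.sign(rho) * max(abs(rho) - l1_penalty / 2, 0.0)
                max_step = max(max_step, abs(new_wj - w[j]))
                w[j] = new_wj
            if max_step < tol:
                return w

    # Toy example with irrelevant features; purely illustrative.
    rng = np.random.default_rng(2)
    X = rng.normal(size=(100, 5))
    y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + rng.normal(0, 0.1, 100)

    H = np.column_stack([np.ones(100), X])
    H = H / np.linalg.norm(H, axis=0)       # normalize columns to unit 2-norm

    w = lasso_coordinate_descent(H, y, l1_penalty=1.0)
    print(w)   # weights on the irrelevant features are driven exactly to 0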


22 videos, 4 readings


  1. Reading: Slides presented in this module
  2. Video: The feature selection task
  3. Video: All subsets
  4. Video: Complexity of all subsets
  5. Video: Greedy algorithms
  6. Video: Complexity of the greedy forward stepwise algorithm
  7. Video: Can we use regularization for feature selection?
  8. Video: Thresholding ridge coefficients?
  9. Video: The lasso objective and its coefficient path
  10. Video: Visualizing the ridge cost
  11. Video: Visualizing the ridge solution
  12. Video: Visualizing the lasso cost and solution
  13. Reading: Download the notebook and follow along
  14. Video: Lasso demo
  15. Video: What makes the lasso objective different
  16. Video: Coordinate descent
  17. Video: Normalizing features
  18. Video: Coordinate descent for least squares regression (normalized features)
  19. Video: Coordinate descent for lasso (normalized features)
  20. Video: Assessing convergence and other lasso solvers
  21. Video: Coordinate descent for lasso (unnormalized features)
  22. Video: Deriving the lasso coordinate descent update
  23. Video: Choosing the penalty strength and other practical issues with lasso
  24. Video: A brief recap
  25. Reading: Using LASSO to select features
  26. Reading: Implementing LASSO using coordinate descent

Graded: Feature Selection and Lasso
Graded: Using LASSO to select features
Graded: Implementing LASSO using coordinate descent

WEEK 6


Nearest Neighbors & Kernel Regression



Up to this point, we have focused on methods that fit parametric functions (like polynomials and hyperplanes) to the entire dataset. In this module, we instead turn our attention to a class of "nonparametric" methods. These methods allow the complexity of the model to increase as more data are observed, and result in fits that adapt locally to the observations.

We start by considering a simple and intuitive example of a nonparametric method, nearest neighbor regression: the prediction for a query point is based on the outputs of the most related observations in the training set. This approach is extremely simple, but can provide excellent predictions, especially for large datasets. You will deploy algorithms to search for the nearest neighbors and form predictions based on the discovered neighbors. Building on this idea, we turn to kernel regression. Instead of forming predictions based on a small set of neighboring observations, kernel regression uses all observations in the dataset, but the impact of these observations on the predicted value is weighted by their similarity to the query point. You will analyze the theoretical performance of these methods in the limit of infinite training data, and explore the scenarios in which these methods work well versus struggle. You will also implement these techniques and observe their practical behavior.
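
Both methods fit in a few lines. The sketch below (synthetic 1-d data; a Gaussian kernel and bandwidth chosen arbitrarily) contrasts a k-nearest-neighbor average with a distance-weighted kernel (Nadaraya-Watson) average.

    import numpy as np

    def knn_predict(x_query, X_train, y_train, k=5):
        # k-nearest-neighbor regression: average the targets of the k
        # training points closest (in Euclidean distance) to the query.
        dists = np.linalg.norm(X_train - x_query, axis=1)
        nearest = np.argsort(dists)[:k]
        return y_train[nearest].mean()

    def kernel_predict(x_query, X_train, y_train, bandwidth=0.3):
        # Kernel regression: a weighted average of ALL training targets,
        # with Gaussian weights that decay with distance to the query.
        dists = np.linalg.norm(X_train - x_query, axis=1)
        weights = np.exp(-dists ** 2 / (2 * bandwidth ** 2))
        return (weights @ y_train) / weights.sum()

    # Synthetic 1-d data; purely illustrative.
    rng = np.random.default_rng(3)
    X = rng.uniform(0, 3, (200, 1))
    y = np.sin(2 * X[:, 0]) + rng.normal(0, 0.1, 200)

    q = np.array([1.5])
    print(knn_predict(q, X, y, k=5))   # local average over 5 neighbors
    print(kernel_predict(q, X, y))     # smooth, distance-weighted average
    # Both predictions should land near sin(3.0), i.e. about 0.14.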


13 videos, 2 readings


  1. Reading: Slides presented in this module
  2. Video: Limitations of parametric regression
  3. Video: 1-Nearest neighbor regression approach
  4. Video: Distance metrics
  5. Video: 1-Nearest neighbor algorithm
  6. Video: k-Nearest neighbors regression
  7. Video: k-Nearest neighbors in practice
  8. Video: Weighted k-nearest neighbors
  9. Video: From weighted k-NN to kernel regression
  10. Video: Global fits of parametric models vs. local fits of kernel regression
  11. Video: Performance of NN as amount of data grows
  12. Video: Issues with high-dimensions, data scarcity, and computational complexity
  13. Video: k-NN for classification
  14. Video: A brief recap
  15. Reading: Predicting house prices using k-nearest neighbors regression

Graded: Nearest Neighbors & Kernel Regression
Graded: Predicting house prices using k-nearest neighbors regression

Closing Remarks



In the conclusion of the course, we will recap what we have covered. This represents both techniques specific to regression, as well as foundational machine learning concepts that will appear throughout the specialization. We also briefly discuss some important regression techniques we did not cover in this course.

We conclude with an overview of what's in store for you in the rest of the specialization.


5 videos, 1 reading


  1. Reading: Slides presented in this module
  2. Video: Simple and multiple regression
  3. Video: Assessing performance and ridge regression
  4. Video: Feature selection, lasso, and nearest neighbor regression
  5. Video: What we covered and what we didn't cover
  6. Video: Thank you!
