---
product_id: 99580971
title: "Learning From Data"
price: "£31.80"
currency: GBP
in_stock: true
reviews_count: 8
url: https://www.desertcart.co.uk/products/99580971-learning-from-data
store_origin: GB
region: Great Britain
---

# Learning From Data

**Price:** £31.80
**Availability:** ✅ In Stock

## Quick Answers

- **What is this?** Learning From Data, a machine learning textbook by Yaser S. Abu-Mostafa, Malik Magdon-Ismail, and Hsuan-Tien Lin
- **How much does it cost?** £31.80 with free shipping
- **Is it available?** Yes, in stock and ready to ship
- **Where can I buy it?** [www.desertcart.co.uk](https://www.desertcart.co.uk/products/99580971-learning-from-data)

## Best For

- Customers looking for quality international products

## Why This Product

- Free international shipping included
- Worldwide delivery with tracking
- 15-day hassle-free returns

## Description

Learning From Data, by Yaser S. Abu-Mostafa, Malik Magdon-Ismail, and Hsuan-Tien Lin. Free shipping on qualifying orders.


## Technical Specifications

| Specification | Value |
|---------------|-------|
| Best Sellers Rank | #120,507 in Books; #18 in Computer Vision & Pattern Recognition; #47 in Computer Neural Networks; #313 in Artificial Intelligence & Semantics |
| Customer Reviews | 4.6 out of 5 stars (283 reviews) |

## Images

![Learning From Data - Image 1](https://m.media-amazon.com/images/I/81k-OvvReZL.jpg)

## Customer Reviews

### ⭐⭐⭐⭐⭐ Learning From Data: A Great Crash Course on Machine Learning
*by T***5 on September 30, 2013*

Learning From Data by Yaser S. Abu-Mostafa et al. is a good intro to both a theoretical and practical approach to understanding modeling. Let's make things clear: this is a textbook, not a passive read. At about 200 pages, it is on the slim side for a textbook, but as the authors note in the preface, the book is "a short course, not a hurried course".

**Complexity:** Despite not having much modeling experience, I found the book relatively easy to understand, although some of it did go over my head. It is a very good choice as an introduction to the field. The authors do an incredible job of weaving narrative into the knowledge early on, although this becomes less common later in the book. By that I mean most sections contain examples relating the topic to real-life applications, which prevents the math from becoming too abstract. And there is a good amount of math. Before reading this book, I would recommend having taken multivariable calculus, since the authors use gradients and other such tools fairly frequently. If you have taken enough math but think you are a bit rusty, the book takes care of that: the authors have been kind enough to include a "Table of Notation" at the back to let you refresh yourself if you come across an unfamiliar symbol. So if you forget what a downward-pointing triangle means, there is still hope for you!

**Chapter 1: The Learning Problem.** In this chapter the authors provide the basic background to learning from data. As stated earlier, this is where they do some of their best work connecting the theory to real-life examples. The writing style in some of these examples is almost like prose, which makes for a much more enjoyable and memorable read. They summarize some of the main types of learning and define key terms and principles like error and noise.

**Chapter 2: Training versus Testing.** This chapter begins to explain the theory of generalization, the associated error, and numerical approximations of generalization. I would still categorize this chapter as background knowledge that will be used more in later parts of the book.

**Chapter 3: The Linear Model.** This is really the meat of the book. If you want to quickly learn how to do a regression, jump straight to this chapter. It covers both linear and logistic regression and also touches on nonlinear transformations.

**Chapter 4: Overfitting.** This chapter deals with the more advanced aspects of modeling. As the authors put it, "the ability to deal with overfitting is what separates professionals from amateurs". While I can safely say that I am still an amateur, it was nice to be exposed to some of the more advanced concerns of the field. Overfitting, for those who don't know, is fitting the data more closely than is warranted, often by using more degrees of freedom than necessary, e.g., making a 10th-order approximation of data whose underlying function is actually only 2nd order. To someone new to modeling it can be very tempting to increase the order of an approximation, because it might seem that higher order means higher accuracy. This section does a good job of explaining why that isn't the case by introducing the idea of an overfit penalty, the increase in error from overfitting a curve.

**Chapter 5: Three Learning Principles.** This chapter is different. Very different. Instead of continuing to introduce other types of models, the authors use the fundamentals taught in earlier chapters to discuss three key principles that are useful in modeling: Occam's razor, sampling bias, and data snooping. Occam's razor is well known; the book uses Albert Einstein's explanation of the term, "An explanation of the data should be made as simple as possible, but no simpler." This relates to the previous chapter's problem of overfitting. Sampling bias covers errors in modeling that come from having data that is not representative of the overall population. Data snooping covers deciding to make a prediction after looking at the data rather than before. This section was probably the most interesting to read. Unlike the previous few chapters, the authors return to relying on real-world examples; they even use the historical example of the false prediction of the 1948 US presidential election between Truman and Dewey. Again this makes the information much more memorable, and it was by far my favorite chapter of the book.

**Problems/Exercises:** I only briefly skimmed a few of the exercises and problems included in this book, but they did help improve my understanding of the topics. According to the authors the provided practice problems are "easy", so if you want more of a challenge you will need to look elsewhere.

**Criticisms:** In the middle chapters of the book, the authors use fewer real-world examples and less prose-style writing. This is unfortunate because, to someone unfamiliar with the field, that style provided a hook into the more math-heavy sections. Also, while the authors are very detailed and thorough in explaining different theories and types of models, they do not do a great job of listing the strengths and weaknesses of each. If I were required to build one of the models explained in this book, I probably could, but if I were asked to choose which model would be best for a given situation, I would probably be unable to do so. The authors should have been clearer about which models are used in different situations and provided guidelines for selecting among them (beyond Occam's razor). This would better connect the material to real-world use and be more beneficial to readers.

**Conclusion:** Learning From Data provides a quick but thorough overview of modeling and machine learning. If you would like to learn more about the subject and have the required math background, it is a very good place to start. It will give you the background, main models, errors, and principles necessary not only to learn the language of the field but also to critique and even create your own models. I highly recommend it. Score: 4.5/5
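The overfit penalty the reviewer describes is easy to see for yourself. The sketch below is a hypothetical illustration, not code from the book: it fits noisy samples of a genuinely 2nd-order target function with a 2nd-order and a 10th-order polynomial and compares their errors on fresh data. The target function, sample sizes, and noise level are all made up for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# The underlying function is genuinely 2nd order: f(x) = 1 + 2x - 3x^2.
def f(x):
    return 1 + 2 * x - 3 * x ** 2

# A small noisy training set and a larger fresh test set.
x_train = rng.uniform(-1, 1, 15)
y_train = f(x_train) + rng.normal(0, 0.5, x_train.size)
x_test = rng.uniform(-1, 1, 200)
y_test = f(x_test) + rng.normal(0, 0.5, x_test.size)

def out_of_sample_error(degree):
    # Least-squares polynomial fit on the training set,
    # mean squared error measured on data the fit never saw.
    coeffs = np.polyfit(x_train, y_train, degree)
    pred = np.polyval(coeffs, x_test)
    return float(np.mean((pred - y_test) ** 2))

e2 = out_of_sample_error(2)
e10 = out_of_sample_error(10)
print(f"degree 2 test MSE:  {e2:.3f}")
print(f"degree 10 test MSE: {e10:.3f}")  # typically larger: the overfit penalty
```

The degree-10 fit usually tracks the training noise and pays for it out of sample, which is exactly the "higher order is not higher accuracy" point the review makes.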

### ⭐⭐⭐⭐⭐ Great book gets even better when coupled with online materials
*by C***Y on April 24, 2019*

I've owned a much-loved copy of this book for several years. It does an incredibly good job of explaining why, and when, statistical learning methods work. It also introduces the mathematical prerequisites required to go out and further explore the field. Having said that, I found the topic coverage somewhat limited and the approach surprisingly abstract. For example, there is no mention of software methods, or of popular learning models like neural networks. The text strongly hints that readers who solve the included problems will gain additional and valuable insight, but solutions to these are not readily available, which makes self-study that much harder. It's issues like these that left me disinclined to strongly recommend this book in the past.

Recently I discovered an open secret that many other people seem to have known for the better part of a decade. A website associated with the book provides additional online chapters, along with a freely available video lecture series from one of the authors, recorded live in a real classroom at Caltech in 2012. The accompanying homework and final exams, with solution sets, are also available to all! The new chapters available online seem to be written in the same accessible style as the book, and they definitely address my complaint about limited topic coverage. The homework problems also put a kibosh on the criticism that the book is too abstract. They seem particularly well chosen, and if you actually solve them you will gain real experience with these learning techniques, all in the language and software platform of your choice. The solution keys tell you when you need to go back and work on a problem some more, or whether you got it right. The video lectures I have watched are all spectacularly good. It's rare to find such a gifted teacher, and I even found myself watching him cover familiar ground just for the pleasure of observing the exposition.

So, is it acceptable to give this book a "five stars!" rating for things that are not actually in it? I don't really know, and frankly, I think the question is irrelevant here. If you are willing to use the online resources, this book is among the very best introductions to machine learning available. If you choose to ignore them, it is still an outstanding introduction to the mathematical foundations of the subject. While I wish there were a second edition with the new chapters, I can settle for my own printed copies of the downloaded documents for now!

### ⭐⭐⭐⭐ In line with Caltech's online course (as expected!)
*by O***Y on April 30, 2013*

The book accompanies Caltech's online course of the same name (https://telecourse.caltech.edu/index.php) and is pretty much in line with the online lectures. The course covers machine learning in general and focuses on the theory of learning, with introductory material on various learning algorithms incorporated into the chapters as they become relevant. I'm giving this 4 stars because it is indeed a short course on learning from data, as the cover says. This is not an in-depth book on learning algorithms, although it does of course cover some of these in reasonable depth, but always from a computer-science angle toward the theory of learning rather than in a practical, applied 'engineering' light. The book and the online course itself are actually of very high quality (despite the course being free); the professor's lectures are very well structured, organised, and explained. Finally, the price is OK but a little high given its total knowledge content, although admittedly I am maybe a little miserly :)

---

## Why Shop on Desertcart?

- 🛒 **Trusted by 1.3+ Million Shoppers** — Serving international shoppers since 2016
- 🌍 **Shop Globally** — Access 737+ million products across 21 categories
- 💰 **No Hidden Fees** — All customs, duties, and taxes included in the price
- 🔄 **15-Day Free Returns** — Hassle-free returns (30 days for PRO members)
- 🔒 **Secure Payments** — Trusted payment options with buyer protection
- ⭐ **TrustPilot Rated 4.5/5** — Based on 8,000+ happy customer reviews

**Shop now:** [https://www.desertcart.co.uk/products/99580971-learning-from-data](https://www.desertcart.co.uk/products/99580971-learning-from-data)

---

*Product available on Desertcart Great Britain*
*Store origin: GB*
*Last updated: 2026-04-24*