Supervised Learning

"Supervised learning is a machine learning technique whereby the algorithm is first presented with training data which consists of examples which include both the inputs and the desired outputs; thus enabling it to learn a function. The learner should then be able to generalize from the presented data to unseen examples."
Sewell (2006)

"Supervised learning is a machine learning technique for creating a function from training data. The training data consist of pairs of input objects (typically vectors), and desired outputs. The output of the function can be a continuous value (called regression), or can predict a class label of the input object (called classification). The task of the supervised learner is to predict the value of the function for any valid input object after having seen a number of training examples (i.e. pairs of input and target output). To achieve this, the learner has to generalize from the presented data to unseen situations in a "reasonable" way (see inductive bias)."
Wikipedia (2006)
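
To make the definitions above concrete, the short sketch below (a minimal illustration only, not taken from any of the quoted sources; the toy data are invented) learns from pairs of inputs and desired outputs in both senses distinguished by the Wikipedia entry: a least-squares line for a continuous target (regression) and a 1-nearest-neighbour rule for a class label (classification), then applies both learned functions to an unseen input.

import numpy as np

# Training data: pairs of input objects and desired outputs (toy values).
X_train = np.array([[1.0], [2.0], [3.0], [4.0]])   # input objects (vectors)
y_reg   = np.array([1.9, 4.1, 6.0, 8.2])           # continuous targets -> regression
y_cls   = np.array([0, 0, 1, 1])                   # class labels       -> classification

# Regression: learn a linear function by ordinary least squares.
A = np.hstack([X_train, np.ones((len(X_train), 1))])   # add an intercept column
w, *_ = np.linalg.lstsq(A, y_reg, rcond=None)           # fitted weights

# Classification: a 1-nearest-neighbour rule stores the training pairs
# and labels a new input with the output of its closest training example.
def predict_class(x):
    distances = np.linalg.norm(X_train - x, axis=1)
    return y_cls[np.argmin(distances)]

# Generalise from the presented data to an unseen input.
x_new = np.array([2.5])
print("regression output:", w[0] * x_new[0] + w[1])   # a continuous value
print("predicted class label:", predict_class(x_new)) # a discrete label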

"The problem of supervised learning involves learning a function from examples of its inputs and outputs."
Russell and Norvig (2003), page 650

"The examples above describe what is called the supervised learning problem. It is called “supervised” because of the presence of the outcome variable to guide the learning process."
Hastie, Tibshirani and Friedman (2001), page 2

"In supervised learning, a teacher provides a category label or cost for each pattern in a training set, and seeks to reduce the sum of the costs for these patterns. How can we be sure that a particular learning algorithm is powerful enough to learn the solution to a given problem and that it will be stable to parameter variations? How can we determine if it will converge in finite time or if it will scale reasonably with the number of training patterns, the number of input features or the number of categories? How can we ensure that the learning algorithm appropriately favors “simple” solutions [...] rather than complicated ones [...]?"
Duda, Hart and Stork (2001), page 16
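
On the last point in the quotation above, one common way of making a learner favour "simple" solutions is to add a complexity penalty to the training cost. The sketch below is only an illustration of that general idea, not the quoted authors' method; it uses ridge regression, where a penalty on the squared weights biases the fit towards smaller, smoother solutions. The data and penalty strength are invented for the example.

import numpy as np

# Toy training pairs (invented for illustration).
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([1.9, 4.1, 6.0, 8.2])

# Ridge regression: minimise  sum of squared errors + lam * ||w||^2.
# The penalty term expresses a preference for "simpler" (smaller-weight) solutions.
A = np.hstack([X, np.ones((len(X), 1))])   # design matrix with an intercept column
lam = 0.1                                   # penalty strength (arbitrary choice here)
I = np.eye(A.shape[1])
I[-1, -1] = 0.0                             # conventionally, do not penalise the intercept
w = np.linalg.solve(A.T @ A + lam * I, A.T @ y)
print("ridge weights:", w)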
