"Your soul yearns for both stagnation and progress, both desires that maul at one another. Let us see how long your will can keep them in check."
Let us return to the topic of supervised learning from back on 6th July. Supervised learning can be divided into two types of problems:
Regression: target variable is a continuous numerical value that, in theory, can be any real number (y ∈ ℝ), though practical applications might have limitations on the range of possible values. Common loss functions used in regression problems measure the difference between the predicted value and the actual value (e.g., MSE). An example of a regression problem is predicting the temperature based on weather data.
Classification: target variable is a categorical label that can be binary (y ∈ {0, 1}) or multi-class (y ∈ C, where C represents the possible categories for the specific problem). Common loss functions used in classification problems measure the penalty for misclassifying data points (e.g., cross-entropy). An example of a classification problem is classifying emails as spam or not spam.
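To make the two loss families concrete, here is a minimal Python sketch of MSE (regression) and binary cross-entropy (classification). The function names and the sample numbers are my own illustrations, not anything prescribed by a particular library:

```python
import math

def mse(y_true, y_pred):
    """Mean squared error: average squared difference between prediction and target."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def binary_cross_entropy(y_true, p_pred, eps=1e-12):
    """Binary cross-entropy: heavily penalises confident wrong predictions."""
    total = 0.0
    for y, p in zip(y_true, p_pred):
        p = min(max(p, eps), 1 - eps)  # clip to avoid log(0)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

# Regression: predicting temperatures (each prediction off by 1 degree)
print(mse([21.0, 18.5], [20.0, 19.5]))            # 1.0

# Classification: spam (1) vs. not spam (0), with predicted probabilities
print(binary_cross_entropy([1, 0], [0.9, 0.2]))   # small loss: both predictions lean correct
```

Note how MSE punishes the *distance* from the target, while cross-entropy punishes *misplaced confidence*: predicting 0.99 for a true 0 costs far more than predicting 0.6.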
Digging deeper into classification problems, the supertype can be further broken down into two sub-problems:
Binary: the target variable has exactly 2 categories. An example of a binary classification problem is spam filtering (spam vs. non-spam).
Multi-Class: the target variable has more than 2 categories. An example of a multi-class classification problem is image classification (class 1, class 2, …, class n).
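For the multi-class case, a common pattern is to turn the model's raw scores into a probability over all categories with softmax, then apply cross-entropy against the true class. A small sketch (the three-class logits are made-up numbers for illustration):

```python
import math

def softmax(logits):
    """Convert raw scores into probabilities that sum to 1."""
    m = max(logits)                       # subtract max for numerical stability
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def categorical_cross_entropy(probs, true_index):
    """Loss is the negative log-probability assigned to the correct class."""
    return -math.log(probs[true_index])

# Hypothetical image classifier with 3 classes; class 0 is the true label
logits = [2.0, 0.5, 0.1]
probs = softmax(logits)
print(probs)                                   # class 0 gets the highest probability
print(categorical_cross_entropy(probs, 0))     # low loss, since the model favours class 0
```

Binary classification is just the two-class special case of this, which is why the binary and categorical cross-entropy formulas coincide when C has two elements.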
If a single data point can carry several labels at once, you might have a Multi-Label (y ⊆ C) problem on your hands. Here, y represents a set of labels, and C represents the set of all possible labels.
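In practice, a label set y ⊆ C is usually encoded as a 0/1 indicator vector over C, one entry per possible label. A small sketch with hypothetical email tags as the label universe:

```python
# Hypothetical label universe C for tagging emails
ALL_LABELS = ["urgent", "work", "personal", "newsletter"]

def to_indicator(labels):
    """Encode a label set y (subset of ALL_LABELS) as a 0/1 indicator vector."""
    label_set = set(labels)
    return [1 if label in label_set else 0 for label in ALL_LABELS]

# An email tagged both "urgent" and "work": y = {"urgent", "work"}
print(to_indicator({"work", "urgent"}))   # [1, 1, 0, 0]
```

This reframes multi-label classification as |C| independent binary decisions, one per label, which is why binary losses such as cross-entropy are typically applied per-entry here.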