
Machine learning

Machine learning definition

Machine learning is a type of artificial intelligence that imitates human learning: the software improves from experience and makes increasingly accurate predictions without being explicitly programmed to do so. A machine learning model is trained on existing data and uses the patterns it finds there to predict outputs for new, unseen inputs. Machine learning is everywhere in daily life: it recommends TV shows based on what you liked before, fixes your typos by suggesting the word you probably meant, and finds the best route to work based on your and other people’s traffic data.

Machine learning types

  • Supervised learning. The algorithm receives both input data and the desired outputs (labels), so it can find patterns linking the two and learn to make predictions for new inputs (a short supervised-learning sketch follows this list).
  • Unsupervised learning. The algorithm receives only input data, with no labeled outputs, and discovers patterns or groupings in the data on its own. Its results are typically harder to validate than those of supervised learning, but it needs no human labeling effort.
  • Semi-supervised learning. A middle ground between the two: the algorithm trains on a small amount of labeled data (clear input-output pairs) together with a larger amount of unlabeled data. This is useful when labeling every example would be too slow or expensive.
  • Reinforcement learning. Used to solve complex problems with a reward system: the machine is rewarded for good decisions in a complex, uncertain environment and penalized for mistakes. Its goal is to maximize the total reward it collects, so it uses trial and error to find the best sequence of actions (a short reinforcement-learning sketch also follows this list).
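
Here is a minimal sketch of the supervised case, assuming Python with the scikit-learn library; the toy viewing-history data, feature meanings, and labels are invented purely for illustration.

    # Minimal supervised-learning sketch (assumes scikit-learn is installed).
    # Toy data: each example is [hours watched, liked a similar show?] and the
    # label says whether the viewer finished the recommended series (1) or not (0).
    from sklearn.linear_model import LogisticRegression

    X_train = [[5, 1], [1, 0], [4, 1], [0, 0], [3, 1], [2, 0]]   # inputs
    y_train = [1, 0, 1, 0, 1, 0]                                  # desired outputs (labels)

    model = LogisticRegression()
    model.fit(X_train, y_train)   # the algorithm finds patterns linking inputs to labels

    # Predictions for new, unseen viewers -- likely [1 0] with this toy data.
    print(model.predict([[6, 1], [1, 0]]))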
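
And a minimal reinforcement-learning sketch in plain Python: an epsilon-greedy agent that learns by trial and error which of two actions yields the higher average reward. The reward probabilities are invented for illustration.

    # Minimal reinforcement-learning sketch: trial and error with rewards.
    import random

    reward_prob = [0.3, 0.8]   # hidden from the agent: action 1 pays off more often
    estimates = [0.0, 0.0]     # agent's running estimate of each action's value
    counts = [0, 0]
    epsilon = 0.1              # how often the agent explores a random action

    for step in range(1000):
        if random.random() < epsilon:
            action = random.randrange(2)                          # explore
        else:
            action = max(range(2), key=lambda a: estimates[a])    # exploit best estimate
        reward = 1.0 if random.random() < reward_prob[action] else 0.0
        counts[action] += 1
        # Incremental average: nudge the estimate toward the observed reward.
        estimates[action] += (reward - estimates[action]) / counts[action]

    print(estimates)   # ends up near [0.3, 0.8], so the agent learns to prefer action 1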