Another post starts, you beautiful people! Continuing our Machine Learning track, today we will apply the Naive Bayes Classifier, but before that we need to understand Bayes' Theorem. So let's first understand Bayes' Theorem.

Bayes' Theorem works on conditional probability. Conditional probability is the probability that something will happen, given that something else has already occurred. Using conditional probability, we can calculate the probability of an event using its prior knowledge. Below is the formula for calculating the conditional probability:

P(H|E) = P(E|H) * P(H) / P(E)

where

- P(H) is the probability of hypothesis H being true. This is known as the prior probability.
- P(E) is the probability of the evidence (regardless of the hypothesis).
- P(E|H) is the probability of the evidence given that the hypothesis is true.
- P(H|E) is the probability of the hypothesis given that the evidence is there.

We can understand the above concept with the classic example of a coin that I su...
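If you like to see formulas as code, here is a minimal sketch of the same calculation in plain Python. The function name `bayes_posterior` and all the numbers are my own illustrative assumptions, just to show how the prior, the likelihood, and the evidence combine into the posterior:

```python
# Bayes' Theorem: P(H|E) = P(E|H) * P(H) / P(E)
# The values below are made-up, illustrative numbers.

def bayes_posterior(prior_h, likelihood_e_given_h, evidence_e):
    """Return the posterior P(H|E) from the prior, likelihood, and evidence."""
    return likelihood_e_given_h * prior_h / evidence_e

# Hypothetical example: some evidence E observed for a hypothesis H.
p_h = 0.01              # P(H): prior probability that H is true
p_e_given_h = 0.9       # P(E|H): probability of the evidence if H is true
p_e_given_not_h = 0.05  # P(E|~H): probability of the evidence if H is false

# P(E) via the law of total probability: P(E) = P(E|H)P(H) + P(E|~H)P(~H)
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

posterior = bayes_posterior(p_h, p_e_given_h, p_e)
print(f"P(H|E) = {posterior:.4f}")  # prints P(H|E) = 0.1538
```

Notice how a fairly strong likelihood (0.9) still gives a modest posterior (about 0.15) because the prior is small; that interplay between prior and evidence is the whole point of Bayes' Theorem.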