Machine Learning

Normal Distribution: A Detailed Step-by-Step Explanation

Normal Distribution: A Detailed Step-by-Step Explanation. By Bindeshwar Singh Kushwaha, PostNetwork Academy. Introduction: Random Variables. A random variable (r.v.) is a function that assigns a numerical value to each outcome of a random experiment. There are two main types of random variables. Discrete Random Variable: takes countable values (e.g., number of heads in 3 coin […]

Normal Distribution: A Detailed Step-by-Step Explanation Read More »
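The excerpt above defines random variables as a lead-in to the normal distribution. As a companion sketch (the standard textbook density \(f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2/(2\sigma^2)}\), not code from the post), it can be evaluated directly:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of the normal distribution N(mu, sigma^2) at x."""
    coeff = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return coeff * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

# The standard normal density peaks at x = 0 with value 1/sqrt(2*pi)
print(round(normal_pdf(0.0), 4))
```

The density is symmetric about \(\mu\), so `normal_pdf(1.0)` equals `normal_pdf(-1.0)` for the standard normal.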

Support Vector Machines Made Easy | SVM Explained with Example

Support Vector Machine (SVM): A Simple Numerical Example – Detailed Explanation. Author: Bindeshwar Singh Kushwaha, PostNetwork Academy. Introduction: Type and Purpose of SVM. Type of algorithm: supervised machine learning, used for classification and regression (SVR). Discriminative model: finds decision boundaries. Known as a maximum-margin classifier. Purpose: find the optimal hyperplane that separates classes

Support Vector Machines Made Easy | SVM Explained with Example Read More »
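The excerpt describes SVM as a maximum-margin classifier. A minimal numerical sketch of that idea, using a hypothetical hyperplane \(w \cdot x + b = 0\) and toy points (the post's actual numbers are not shown in the excerpt):

```python
import math

# Hypothetical separating hyperplane w.x + b = 0 (illustrative values,
# not the post's worked example): w = (1, 1), b = -3.
w = (1.0, 1.0)
b = -3.0

def decision(x):
    return w[0] * x[0] + w[1] * x[1] + b

# Toy points with labels +1 / -1.
points = [((2.0, 2.0), +1), ((1.0, 1.0), -1), ((3.0, 2.0), +1), ((0.0, 1.0), -1)]

# Every point is classified correctly: sign(decision) matches the label.
assert all((decision(x) > 0) == (y > 0) for x, y in points)

# Geometric margin of the closest point: |w.x + b| / ||w||.
norm_w = math.sqrt(w[0] ** 2 + w[1] ** 2)
margin = min(abs(decision(x)) / norm_w for x, _ in points)
print(round(margin, 4))
```

Training an SVM amounts to choosing \(w, b\) that maximise this minimum margin; here the hyperplane is simply given.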

Naive Bayes Classification Algorithm for Weather Dataset

Naive Bayes Classification Algorithm for Weather Dataset. Author: Bindeshwar Singh Kushwaha | PostNetwork Academy. Introduction to Naive Bayes Classifier. Naive Bayes is a probabilistic classification algorithm. It is based on Bayes' Theorem and the naive independence assumption. Suppose we have a feature vector \(\mathbf{X} = (x_1, x_2, \dots, x_n)\) and a class \(y\). Bayes' Theorem:

Naive Bayes Classification Algorithm for Weather Dataset Read More »
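Under the naive independence assumption, the class score factors into \(P(y)\prod_i P(x_i \mid y)\), and prediction picks the class with the largest product. A small sketch on toy weather records (illustrative values, not the post's dataset):

```python
# Toy weather records (illustrative stand-in for the post's dataset):
# (outlook, windy, play)
data = [
    ("sunny", "false", "no"), ("sunny", "true", "no"),
    ("overcast", "false", "yes"), ("rain", "false", "yes"),
    ("rain", "true", "no"), ("overcast", "true", "yes"),
    ("sunny", "false", "yes"), ("rain", "false", "yes"),
]

def naive_bayes_score(outlook, windy, cls):
    """Unnormalised P(cls) * P(outlook|cls) * P(windy|cls) from counts."""
    rows = [r for r in data if r[2] == cls]
    prior = len(rows) / len(data)
    p_outlook = sum(r[0] == outlook for r in rows) / len(rows)
    p_windy = sum(r[1] == windy for r in rows) / len(rows)
    return prior * p_outlook * p_windy

# Classify a new day (outlook=sunny, windy=false) by comparing class scores.
scores = {c: naive_bayes_score("sunny", "false", c) for c in ("yes", "no")}
prediction = max(scores, key=scores.get)
print(prediction)
```

In practice the counts get Laplace smoothing so a zero count never wipes out the whole product; that safeguard is omitted here for brevity.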

Fitting of Poisson Distribution

Fitting of Poisson Distribution. Bindeshwar Singh Kushwaha, PostNetwork Academy. Introduction. Master the technique of fitting the Poisson distribution to real-world frequency data. This tutorial shows a step-by-step method to calculate theoretical frequencies for observed datasets. Key Concepts & Techniques. Introduction to Fitting: fit a theoretical Poisson distribution to experimental data to derive expected frequencies.

Fitting of Poisson Distribution Read More »
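The fitting procedure the excerpt outlines comes down to two steps: estimate \(\lambda\) by the sample mean, then compute expected frequencies \(N \cdot e^{-\lambda}\lambda^x / x!\). A sketch on illustrative count data (not the post's table):

```python
import math

# Observed frequencies of counts x = 0..4 (illustrative data).
observed = {0: 109, 1: 65, 2: 22, 3: 3, 4: 1}
N = sum(observed.values())  # total number of observations

# Step 1: estimate the Poisson mean lambda from the sample mean.
lam = sum(x * f for x, f in observed.items()) / N

# Step 2: expected frequency for each x is N * P(X = x)
#         = N * e^(-lam) * lam^x / x!
expected = {x: N * math.exp(-lam) * lam ** x / math.factorial(x)
            for x in observed}

print(round(lam, 4))
print({x: round(e, 2) for x, e in expected.items()})
```

Comparing `observed` against `expected` (for example with a chi-square statistic) then tells you how well the Poisson model fits.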

Gradient of Softmax + Cross-Entropy w.r.t. Logits

Gradient of Softmax + Cross-Entropy w.r.t. Logits. Author: Bindeshwar Singh Kushwaha, PostNetwork Academy. Goal: we want to compute $$ \frac{\partial L}{\partial z_j} $$ Notation: logits \(z = [z_1, z_2, \dots, z_C]\); softmax \(\hat{y}_i = \frac{e^{z_i}}{\sum_{k=1}^{C} e^{z_k}}\); cross-entropy loss \(L = -\sum_{i=1}^{C} y_i \log \hat{y}_i\), where \(y_i\) is one-hot. [Insert Neural Network Diagram Here] Loss

Gradient of Softmax + Cross-Entropy w.r.t. Logits Read More »
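The standard result of this derivation (not shown in the excerpt itself) is that the combined gradient simplifies to \(\frac{\partial L}{\partial z_j} = \hat{y}_j - y_j\). A sketch that checks the analytic form against a finite-difference estimate:

```python
import math

def softmax(z):
    m = max(z)  # shift by the max for numerical stability
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

def cross_entropy(z, y):
    """L = -sum_i y_i log softmax(z)_i, with y one-hot."""
    p = softmax(z)
    return -sum(yi * math.log(pi) for yi, pi in zip(y, p))

z = [2.0, 1.0, 0.1]   # example logits
y = [1.0, 0.0, 0.0]   # one-hot target

# Analytic gradient: dL/dz_j = softmax(z)_j - y_j
grad = [p - t for p, t in zip(softmax(z), y)]

# Verify each component with a central finite difference.
eps = 1e-6
for j in range(len(z)):
    zp = z[:]; zp[j] += eps
    zm = z[:]; zm[j] -= eps
    numeric = (cross_entropy(zp, y) - cross_entropy(zm, y)) / (2 * eps)
    assert abs(numeric - grad[j]) < 1e-6

print([round(g, 4) for g in grad])
```

Note that the gradient components sum to zero: both \(\hat{y}\) and the one-hot \(y\) sum to 1, so their difference must.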

Understanding Neural Networks: Softmax, Cross-Entropy, and Backpropagation

Understanding Neural Networks: Softmax, Cross-Entropy, and Backpropagation. Author: Bindeshwar Singh Kushwaha, PostNetwork Academy. Neural Network with Softmax + Cross-Entropy. Input layer: the network receives 3 input features, denoted \(x_1, x_2, x_3\). Hidden layer: 2 neurons with activations \(a^{(1)}\) and \(a^{(2)}\). Output layer: 2 outputs \(z^{(3)}, z^{(4)}\), passed through softmax. Softmax

Understanding Neural Networks: Softmax, Cross-Entropy, and Backpropagation Read More »
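The excerpt describes a 3-input, 2-hidden, 2-output network with softmax outputs. A forward-pass sketch of that shape, with hypothetical weights (the post's actual values are not in the excerpt):

```python
import math

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def softmax(z):
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

# Hypothetical weights and biases (illustrative only).
W1 = [[0.1, 0.2, 0.3],   # hidden neuron a(1)
      [0.4, 0.5, 0.6]]   # hidden neuron a(2)
b1 = [0.1, 0.2]
W2 = [[0.7, 0.8],        # output logit z(3)
      [0.9, 1.0]]        # output logit z(4)
b2 = [0.0, 0.1]

x = [1.0, 0.5, -1.0]     # input features x1, x2, x3

# Hidden activations a(1), a(2): sigmoid of the weighted sums.
a = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
     for row, b in zip(W1, b1)]
# Output logits z(3), z(4), then softmax probabilities.
z = [sum(w * ai for w, ai in zip(row, a)) + b
     for row, b in zip(W2, b2)]
p = softmax(z)

print([round(v, 4) for v in p])
```

Backpropagation then runs this computation in reverse, pushing \(\hat{y} - y\) back through `W2`, the sigmoid, and `W1`.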

Label Encoding and One-Hot Encoding in Machine Learning

📘 Label Encoding and One-Hot Encoding. Author: Bindeshwar Singh Kushwaha. 🎯 Encoding Categorical Features. 🔹 Label Encoding: assigns each category an integer value; suitable for ordinal data (e.g., size: small, medium, large). Tool: LabelEncoder from sklearn.preprocessing. Example (Titanic): encoding Sex as 0 (male), 1 (female). 🔹 One-Hot Encoding: converts categories into binary columns (one per

Label Encoding and One-Hot Encoding in Machine Learning Read More »
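A minimal sketch of both encodings on a toy ordinal column, written without sklearn (sklearn's LabelEncoder and OneHotEncoder implement the same ideas with more safeguards):

```python
# Toy ordinal column.
sizes = ["small", "large", "medium", "small", "large"]

# Label encoding: each category -> an integer, here in explicit ordinal order.
order = {"small": 0, "medium": 1, "large": 2}
label_encoded = [order[s] for s in sizes]
print(label_encoded)   # one integer per row

# One-hot encoding: one binary column per category.
categories = sorted(set(sizes))   # ['large', 'medium', 'small']
one_hot = [[int(s == c) for c in categories] for s in sizes]
print(one_hot[0])      # row for 'small': 1 only in the 'small' column
```

Label encoding imposes an order (fine for sizes, misleading for unordered categories like port of embarkation); one-hot avoids that at the cost of extra columns.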

K-Nearest Neighbors (KNN) Classifier and Imputation using KNN

K-Nearest Neighbors (KNN) Classifier and Imputation using KNN Author: Bindeshwar Singh Kushwaha What is K-Nearest Neighbors (KNN)? KNN is a supervised machine learning algorithm. It is easy to understand and does not involve complex math. Commonly used for classification tasks, especially with labeled data. We’ll use the Iris dataset, which has flower measurements. Iris Dataset

K-Nearest Neighbors (KNN) Classifier and Imputation using KNN Read More »
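A from-scratch sketch of the KNN vote on toy two-feature samples (hypothetical values standing in for the Iris measurements); KNN imputation applies the same neighbour search, but averages the neighbours' values to fill a missing entry instead of voting on a label:

```python
import math
from collections import Counter

# Toy (petal length, petal width) -> species samples (illustrative values).
train = [
    ((1.4, 0.2), "setosa"), ((1.3, 0.2), "setosa"), ((1.5, 0.3), "setosa"),
    ((4.7, 1.4), "versicolor"), ((4.5, 1.5), "versicolor"), ((4.9, 1.5), "versicolor"),
]

def knn_predict(query, k=3):
    """Majority vote among the k nearest neighbours (Euclidean distance)."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

print(knn_predict((1.5, 0.25)))  # query near the setosa cluster
print(knn_predict((4.6, 1.4)))   # query near the versicolor cluster
```

Because KNN compares raw distances, features on larger scales dominate; scaling the features first (see the preprocessing posts below) usually matters in practice.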

Handling Missing Data and Categorical Features

Handling Missing Data and Categorical Features. By: Bindeshwar Singh Kushwaha. Data Preprocessing Flow: Raw Data → Handle Missing Values → Encode Categorical Variables → Feature Scaling → Preprocessed Data. Overview of Data Preprocessing: load the Titanic dataset from a CSV file; handle missing values using various techniques; encode categorical data for machine learning; save the cleaned dataset

Handling Missing Data and Categorical Features Read More »
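The first two stages of the flow above (impute missing values, then encode categoricals) can be sketched on toy rows; the values are hypothetical, not from the Titanic file:

```python
# Toy rows with one missing 'age' (illustrative values).
rows = [
    {"age": 22.0, "sex": "male"},
    {"age": None, "sex": "female"},   # missing age
    {"age": 38.0, "sex": "female"},
    {"age": 26.0, "sex": "male"},
]

# Stage 1 - handle missing values: fill missing 'age' with the column mean.
known = [r["age"] for r in rows if r["age"] is not None]
mean_age = sum(known) / len(known)
for r in rows:
    if r["age"] is None:
        r["age"] = mean_age

# Stage 2 - encode the categorical 'sex' column as integers.
mapping = {"male": 0, "female": 1}
for r in rows:
    r["sex"] = mapping[r["sex"]]

print(rows[1])
```

Mean imputation is only one option; median imputation, a constant, or the KNN imputation from the previous post are common alternatives.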

What is Data Preprocessing

What is Data Preprocessing? Why It Matters in Machine Learning! Author: Bindeshwar Singh Kushwaha. Real-World Data Challenges: missing or incomplete values; inconsistent formatting and typos; mixed data types (text, numeric, dates); categorical variables needing encoding; scale variations and outliers. What is Data Preprocessing? A set of techniques to clean and prepare raw data. Essential for

What is Data Preprocessing Read More »

© PostNetwork. All rights reserved.