Regularization techniques

Regularization is widely used because it improves a model’s performance. There are many regularization methods, and I will go through some of them in this post. First, what is regularization? Regularization is a technique that helps a machine learning model avoid overfitting and enhances its generalization to unseen data. Regularization techniques 1. L1, L2 regularization L2 and L1 regularization put a constraint on the model’s weights and biases....

Fri August 11, 2023 · 3 min · 455 words · Me
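The L1 and L2 penalties mentioned in the teaser above can be sketched as extra terms added to a loss function. This is a minimal illustration, not code from the post; the function name, the mean-squared-error base loss, and the sample data are assumptions.

```python
import numpy as np

# Minimal sketch (assumed example): adding L1 and L2 penalty terms to a
# mean-squared-error loss for a linear model with weight vector w.
def regularized_loss(w, X, y, l1=0.0, l2=0.0):
    """MSE loss plus optional L1 (absolute-value) and L2 (squared) penalties."""
    residuals = X @ w - y
    mse = np.mean(residuals ** 2)
    return mse + l1 * np.sum(np.abs(w)) + l2 * np.sum(w ** 2)

# Tiny assumed dataset, for illustration only.
X = np.array([[1.0, 2.0], [3.0, 4.0]])
y = np.array([1.0, 2.0])
w = np.array([0.5, -0.5])

base = regularized_loss(w, X, y)               # plain MSE, no penalty
penalized = regularized_loss(w, X, y, l2=0.1)  # adds 0.1 * sum(w**2) = 0.05
```

Larger `l1` or `l2` values push the optimizer toward smaller weights, which is what constrains the model and discourages overfitting.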

Visualizing optimization algorithms

1. Gradient Descent Gradient descent is an optimization algorithm that iteratively adjusts model parameters to minimize a function, typically a loss function. By moving in the direction of steepest decrease, determined by the negative gradient, it finds the parameters that best fit the data. Equation: θ = θ - α * ∇J(θ) Description: θ: Parameters (weights) of the model being optimized. α (alpha): Learning rate; determines the size of the steps taken during optimization....

Mon May 1, 2023 · 4 min · 784 words · Me
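The update rule θ = θ - α * ∇J(θ) from the teaser above can be sketched in a few lines. This is an assumed illustration, not code from the post; the quadratic objective J(θ) = (θ - 3)², its gradient 2(θ - 3), and the parameter values are all chosen here for demonstration.

```python
# Minimal sketch of gradient descent: theta = theta - alpha * grad_J(theta).
def gradient_descent(theta, grad, alpha=0.1, steps=100):
    """Repeatedly step opposite the gradient to minimize the objective."""
    for _ in range(steps):
        theta = theta - alpha * grad(theta)
    return theta

# Assumed objective J(theta) = (theta - 3)^2, minimized at theta = 3.
grad_J = lambda t: 2.0 * (t - 3.0)
theta_star = gradient_descent(0.0, grad_J)  # converges toward 3.0
```

Each iteration shrinks the distance to the minimum by a constant factor here (1 - 2α), which is why a suitable learning rate α matters: too large and the iterates diverge, too small and convergence is slow.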