This project aims to understand and solve optimal control problems and differential equations governed
by Hamiltonian systems, using tools such as the Hamiltonian formulation and Pontryagin's
maximum principle (PMP). Hamiltonian ...
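As a brief sketch of the framework this abstract refers to: for a Hamiltonian $H(x, p)$ with state $x$ and costate $p$, the dynamics follow Hamilton's canonical equations, and PMP characterizes the optimal control as a pointwise maximizer of the control Hamiltonian (notation here is the standard textbook one, not taken from the project itself):

```latex
\dot{x} = \frac{\partial H}{\partial p}, \qquad
\dot{p} = -\frac{\partial H}{\partial x},
\qquad
u^{*}(t) \in \arg\max_{u} \, H\bigl(x(t), p(t), u\bigr).
```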
Optimization techniques play a vital role in improving the performance of machine learning
models by minimizing error functions and enhancing predictive accuracy. Among these
techniques, gradient descent is one of the ...
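To make the method concrete, a minimal gradient descent loop might look like the following (all names and the example function are illustrative, not from the original project):

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Minimize a function via fixed-step gradient descent, given its gradient."""
    x = x0
    for _ in range(steps):
        # move opposite the gradient, scaled by the learning rate
        x = x - lr * grad(x)
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is f'(x) = 2(x - 3);
# the iterates contract toward the minimizer x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

With this step size each iteration shrinks the distance to the minimizer by a constant factor, so the choice of learning rate controls both stability and speed.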
This project explores fundamental optimization concepts developed through the frameworks of
duality theory and sensitivity analysis. It presents the principles of conjugate and Fenchel duality
and illustrates how ...
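As a pointer to the central definition behind this framework: the conjugate (Fenchel transform) of a function $f$ and the resulting Fenchel–Young inequality are (standard notation, not quoted from the project):

```latex
f^{*}(y) = \sup_{x} \bigl\{ \langle y, x \rangle - f(x) \bigr\},
\qquad
f(x) + f^{*}(y) \ge \langle x, y \rangle \quad \text{for all } x, y,
```

with equality precisely when $y$ is a (sub)gradient of $f$ at $x$; this inequality is the seed of weak duality, and sensitivity analysis reads optimal dual variables as derivatives of the optimal value with respect to constraint perturbations.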
Stochastic Gradient Descent (SGD) is a fundamental optimization method widely used in
machine learning due to its efficiency in handling large-scale and high-dimensional data. Unlike
batch gradient descent, which ...
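The contrast with batch gradient descent can be sketched as follows: instead of averaging the gradient over the whole dataset, SGD updates the parameter using one sample's gradient at a time. This is a minimal illustration with made-up names and data, not the project's own code:

```python
import random

def sgd(data, grad_sample, w0, lr=0.005, epochs=20, seed=0):
    """Stochastic gradient descent: one per-sample gradient step per update."""
    rng = random.Random(seed)
    w = w0
    for _ in range(epochs):
        rng.shuffle(data)              # visit samples in random order each epoch
        for sample in data:
            w = w - lr * grad_sample(w, sample)
    return w

# Example: fit y ≈ w * x by least squares on data generated with true slope 2;
# the per-sample gradient of (w*x - y)^2 with respect to w is 2*x*(w*x - y).
data = [(x, 2.0 * x) for x in range(1, 11)]
w_hat = sgd(data, lambda w, s: 2 * s[0] * (w * s[0] - s[1]), w0=0.0)
```

Each update touches a single sample, which is why the cost per step is independent of the dataset size, at the price of noisier progress than full-batch descent.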
This document provides a comprehensive overview of fundamental concepts and methods in optimization,
focusing on both unconstrained and constrained problems. It begins with the formulation and solution of
linear ...
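Assuming the truncated sentence refers to linear programming, the geometric fact it builds on, that a bounded feasible LP attains its optimum at a vertex of the feasible polytope, can be illustrated with a small two-variable solver by vertex enumeration (the problem data below are invented for illustration):

```python
from itertools import combinations

def solve_lp_2d(c, A, b):
    """Maximize c·x subject to A x <= b for a 2-variable LP.

    Enumerates intersections of pairs of constraint boundaries (the candidate
    vertices), keeps the feasible ones, and returns the best.
    """
    best, best_val = None, float("-inf")
    for (a1, b1), (a2, b2) in combinations(zip(A, b), 2):
        det = a1[0] * a2[1] - a1[1] * a2[0]
        if abs(det) < 1e-12:           # parallel boundaries: no intersection
            continue
        # solve the 2x2 system a1·v = b1, a2·v = b2 by Cramer's rule
        x = (b1 * a2[1] - b2 * a1[1]) / det
        y = (a1[0] * b2 - a2[0] * b1) / det
        if all(ai[0] * x + ai[1] * y <= bi + 1e-9 for ai, bi in zip(A, b)):
            val = c[0] * x + c[1] * y
            if val > best_val:
                best, best_val = (x, y), val
    return best, best_val

# maximize 3x + 2y  subject to  x + y <= 4,  x <= 2,  x >= 0,  y >= 0
A = [(1, 1), (1, 0), (-1, 0), (0, -1)]
b = [4, 2, 0, 0]
point, value = solve_lp_2d((3, 2), A, b)   # optimum at (2, 2) with value 10
```

Enumerating vertices is exponential in general, which is exactly why practical methods such as the simplex algorithm walk between adjacent vertices instead of listing all of them.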