- International Journal of Pure and Applied Sciences
- Volume:10 Issue:1
Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms
Authors : Erkan Kıymık, Ali Emre Öztürk
Pages : 209-227
DOI : 10.29132/ijpas.1467644
Publication Date : 2024-06-30
Article Type : Research Paper
Abstract : This work investigates the integration of multiplicative calculus into gradient descent algorithms, including the Adaptive Gradient algorithm (AdaGrad), Root Mean Squared Propagation (RMSProp), Nesterov Accelerated Gradient (NAG), and Momentum, to optimize exponential-quadratic-logarithmic composite functions under a positivity constraint. The research, conducted across five scenarios within the Constrained and Unconstrained Testing Environment (CUTEst), compares these multiplicative methods with their classical counterparts in a variety of constrained environments (bounded, quadratic, and other types) as well as unconstrained environments. The results demonstrate the significant superiority of multiplicative-based algorithms, especially in unconstrained and bound-constrained scenarios, and indicate their potential for complex optimization tasks. Statistical analysis supports the observed performance advantages, pointing to significant opportunities for optimization strategies in positive domains.
Keywords : Gradient Descent, Optimization, Multiplicative Calculus, Machine Learning, Composite Functions, Non-Newtonian Calculus
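To illustrate the core idea behind the methods the abstract describes, the following is a minimal sketch (not the authors' exact formulation) of plain multiplicative gradient descent: the additive update x - lr * f'(x) is replaced by the multiplicative update x * (f*(x))^(-lr), where f*(x) = exp(f'(x) / f(x)) is the multiplicative derivative. Because the update multiplies by a positive factor, iterates remain positive whenever the starting point is, which is why these methods suit positive domains. The test function and learning rate below are illustrative choices, not taken from the paper.

```python
import math

def multiplicative_derivative(f, x, h=1e-6):
    """Multiplicative derivative f*(x) = exp(f'(x) / f(x)), for f(x) > 0.
    f'(x) is approximated with a central finite difference."""
    df = (f(x + h) - f(x - h)) / (2.0 * h)
    return math.exp(df / f(x))

def multiplicative_gradient_descent(f, x0, lr=0.1, steps=1000):
    """Iterate x <- x * (f*(x))^(-lr); iterates stay positive if x0 > 0."""
    x = x0
    for _ in range(steps):
        x *= multiplicative_derivative(f, x) ** (-lr)
    return x

# Illustrative positive-domain objective: f(x) = exp((ln x - 1)^2),
# an exponential-logarithmic composite minimized at x = e.
f = lambda x: math.exp((math.log(x) - 1.0) ** 2)
x_min = multiplicative_gradient_descent(f, x0=5.0)
```

The momentum, NAG, AdaGrad, and RMSProp variants studied in the paper would modify the exponent of the multiplicative factor analogously to how they modify the additive step in classical gradient descent.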