AdamZ: an enhanced optimisation method for neural network training

Published Version: Open Access text, available under a Creative Commons Attribution licence.
Accepted Version: text restricted to repository staff (copyright not yet checked, which may affect availability).


Zaznov, I., Badii, A., Kunkel, J. and Dufour, A. ORCID: https://orcid.org/0000-0003-0519-648X (2025) AdamZ: an enhanced optimisation method for neural network training. Neural Computing and Applications. ISSN 1433-3058 doi: 10.1007/s00521-025-11649-w

Abstract/Summary

AdamZ is an advanced variant of the Adam optimiser, developed to enhance convergence efficiency in neural network training. This optimiser dynamically adjusts the learning rate by incorporating mechanisms to address overshooting and stagnation, which are common challenges in optimisation. Specifically, AdamZ reduces the learning rate when overshooting is detected and increases it during periods of stagnation, utilising hyperparameters such as overshoot and stagnation factors, thresholds, and patience levels to guide these adjustments. While AdamZ may lead to slightly longer training times compared to some other optimisers, it consistently excels in minimising the loss function, making it particularly advantageous for applications where precision is critical. Benchmarking results demonstrate the effectiveness of AdamZ in maintaining optimal learning rates, leading to improved model performance across diverse tasks.
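The abstract describes AdamZ's core idea: shrink the learning rate when overshooting is detected and grow it during stagnation, governed by overshoot/stagnation factors, thresholds, and patience levels. A minimal sketch of such a rule is shown below; this is an illustration inferred from the abstract, not the authors' implementation, and all parameter names (`overshoot_factor`, `stagnation_threshold`, etc.) are assumptions.

```python
def adjust_learning_rate(lr, loss_history, *,
                         overshoot_factor=0.5,      # assumed: shrink factor on overshoot
                         stagnation_factor=1.2,     # assumed: growth factor on stagnation
                         overshoot_threshold=1.1,   # assumed: relative loss-jump trigger
                         stagnation_threshold=1e-4, # assumed: minimum loss variation
                         patience=5):               # assumed: window of recent steps
    """Sketch of an AdamZ-style rule: adapt lr from recent loss values.

    Shrinks lr when the latest loss jumps well above the recent minimum
    (overshooting); grows lr when losses barely move over the patience
    window (stagnation); otherwise leaves lr unchanged.
    """
    if len(loss_history) < patience + 1:
        return lr  # not enough history to judge yet
    recent = loss_history[-(patience + 1):]
    # Overshooting: latest loss exceeds the recent minimum by the threshold ratio.
    if recent[-1] > overshoot_threshold * min(recent):
        return lr * overshoot_factor
    # Stagnation: the loss has been flat across the whole patience window.
    if max(recent) - min(recent) < stagnation_threshold:
        return lr * stagnation_factor
    return lr
```

In practice such a rule would be called once per epoch (or per step) to rescale the base learning rate fed to an Adam-style update; the factors and thresholds would be tuned per task, as the abstract notes.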


Item Type: Article
URI: https://centaur.reading.ac.uk/id/eprint/124435
DOI: 10.1007/s00521-025-11649-w
Refereed: Yes
Divisions: Henley Business School > Finance and Accounting; Science > School of Mathematical, Physical and Computational Sciences > Department of Computer Science
Publisher: Springer
