Insights on the different convergences in Extreme Learning Machine / De Falco, Davide Elia; Calabrò, Francesco; Pragliola, Monica. In: Neurocomputing, ISSN 0925-2312, 599 (2024). DOI: 10.1016/j.neucom.2024.128061
Insights on the different convergences in Extreme Learning Machine
De Falco, Davide Elia; Calabrò, Francesco; Pragliola, Monica
2024
Abstract
Neural Networks (NNs) are a powerful tool in approximation theory because of the existence of Universal Approximation (UA) results. In recent decades, significant attention has been given to Extreme Learning Machines (ELMs), typically employed for the training of single-layer NNs and for which a UA result can also be proven. In a generic NN, the design of the optimal approximator can be recast as a non-convex optimization problem that turns out to be particularly demanding from the computational viewpoint. However, when ELM is adopted, the optimization task reduces to a (possibly rectangular) linear problem. In this work, we detail how to design a sequence of ELM networks trained on a target dataset. Different convergence procedures are proposed and tested on reference datasets constructed to be equivalent to approximation problems.
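The abstract's key computational point, namely that with ELM the hidden-layer parameters are fixed at random and only the output weights are computed, by solving a possibly rectangular linear system, can be illustrated with a minimal NumPy sketch. The function names, the tanh activation, the plain least-squares solver, and the toy target sin(2πx) below are illustrative assumptions and do not reproduce the specific procedures studied in the paper.

```python
import numpy as np

def elm_fit(X, y, n_hidden=100, activation=np.tanh, seed=None):
    """Single-hidden-layer network trained in the ELM fashion:
    random, frozen hidden-layer weights; output weights from a
    (possibly rectangular) linear least-squares problem."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Hidden-layer parameters are sampled once and never trained.
    W = rng.standard_normal((n_features, n_hidden))
    b = rng.standard_normal(n_hidden)
    H = activation(X @ W + b)          # hidden-layer feature matrix, shape (N, n_hidden)
    # Output weights: least-squares solution of H @ beta ≈ y.
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def elm_predict(X, W, b, beta, activation=np.tanh):
    return activation(X @ W + b) @ beta

# Toy approximation problem: recover f(x) = sin(2*pi*x) from samples.
x = np.linspace(0.0, 1.0, 500).reshape(-1, 1)
y = np.sin(2 * np.pi * x).ravel()
W, b, beta = elm_fit(x, y, n_hidden=100, seed=0)
print(np.max(np.abs(elm_predict(x, W, b, beta) - y)))  # max approximation error
```

In this sketch the matrix H has as many rows as training samples and as many columns as hidden neurons, so the system H beta ≈ y is rectangular whenever the two differ, which is precisely the linear problem the abstract refers to.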
File: DeFalcoCalabroPraglila_Neurocomputing2024.pdf (Adobe PDF, 1.28 MB), open access, license: public domain
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.