People | Locations | Statistics |
---|---|---|
Naji, M. | | |
Motta, Antonella | | |
Aletan, Dirar | | |
Mohamed, Tarek | | |
Ertürk, Emre | | |
Taccardi, Nicola | | |
Kononenko, Denys | | |
Petrov, R. H. | Madrid | |
Alshaaer, Mazen | Brussels | |
Bih, L. | | |
Casati, R. | | |
Muller, Hermance | | |
Kočí, Jan | Prague | |
Šuljagić, Marija | | |
Kalteremidou, Kalliopi-Artemi | Brussels | |
Azam, Siraj | | |
Ospanova, Alyiya | | |
Blanpain, Bart | | |
Ali, M. A. | | |
Popa, V. | | |
Rančić, M. | | |
Ollier, Nadège | | |
Azevedo, Nuno Monteiro | | |
Landes, Michael | | |
Rignanese, Gian-Marco | | |
Fernandez-Martinez, R. (cooperation score: 37%)
Publications (3)
- 2017: Carbide distribution based on automatic image analysis for cryogenically treated tool steels
- 2016: Mechanical Behavior of PLA/Clay Reinforced Nanocomposite Material Using FE Simulations: Comparison of an Idealized Volume against the Real Electron Tomography Volume
- 2014: Methodology based on genetic optimisation to develop overall parsimony models for predicting temperature settings on annealing furnace
Methodology based on genetic optimisation to develop overall parsimony models for predicting temperature settings on annealing furnace
Abstract
Developing better prediction models is crucial for the steelmaking industry to improve the continuous hot dip galvanising line (HDGL). This paper presents a genetic-algorithm-based methodology in which a wrapper-based scheme is optimised to generate overall parsimony models for predicting temperature set points in a continuous annealing furnace on an HDGL. The optimisation includes a dynamic penalty function to control model complexity and an early stopping criterion during the optimisation phase. The resulting models (multilayer perceptron neural networks) were trained on a database obtained from an HDGL operating in the north of Spain. The number of neurons in the single hidden layer, the selected inputs and the training parameters were adjusted to achieve the lowest validation and mean testing errors. Finally, a comparative evaluation is reported to highlight our proposal's range of applicability, showing that it develops models with lower prediction errors, higher generalisation capacity and less complexity than a standard method.
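
The abstract describes a wrapper-style search in which a genetic algorithm selects the input variables and the hidden-layer size of a multilayer perceptron, subject to a dynamic complexity penalty and an early stopping criterion. The sketch below is only an illustration of that general idea, not the paper's implementation; the fitness definition, the penalty schedule, the patience-based stopping rule and the names `fitness` and `ga_wrapper` are all assumptions introduced for the example.

```python
"""Minimal GA-wrapper sketch (illustrative only, not the published method).

Each individual encodes a boolean input mask plus a hidden-layer size for an
MLP regressor. Fitness = cross-validated MSE + a dynamic complexity penalty
that grows with the generation number; the search stops early when the best
fitness stalls for `patience` generations.
"""
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)


def fitness(X, y, mask, n_hidden, generation, penalty_0=0.01):
    """Validation error plus a penalty proportional to model size (assumed form)."""
    if mask.sum() == 0:                      # an empty input subset is invalid
        return np.inf
    model = MLPRegressor(hidden_layer_sizes=(n_hidden,),
                         max_iter=500, random_state=0)
    mse = -cross_val_score(model, X[:, mask], y, cv=3,
                           scoring="neg_mean_squared_error").mean()
    complexity = int(mask.sum()) * n_hidden  # rough proxy for network size
    return mse + penalty_0 * (1 + generation) * complexity


def ga_wrapper(X, y, pop_size=20, generations=30, patience=5):
    n_features = X.shape[1]
    # individual = (feature mask, hidden-layer size)
    pop = [(rng.random(n_features) < 0.5, int(rng.integers(2, 20)))
           for _ in range(pop_size)]
    best, best_fit, stall = None, np.inf, 0
    for g in range(generations):
        scores = [fitness(X, y, m, h, g) for m, h in pop]
        order = np.argsort(scores)
        if scores[order[0]] < best_fit:
            best, best_fit, stall = pop[order[0]], scores[order[0]], 0
        else:
            stall += 1
        if stall >= patience:                # early stopping of the search
            break
        parents = [pop[i] for i in order[:pop_size // 2]]
        children = []
        while len(parents) + len(children) < pop_size:
            i, j = rng.choice(len(parents), size=2, replace=False)
            (m1, h1), (m2, h2) = parents[i], parents[j]
            cut = int(rng.integers(1, n_features))
            mask = np.concatenate([m1[:cut], m2[cut:]])         # one-point crossover
            mask = mask ^ (rng.random(n_features) < 0.05)       # bit-flip mutation
            hidden = int(np.clip((h1 + h2) // 2 + rng.integers(-2, 3), 2, 30))
            children.append((mask, hidden))
        pop = parents + children
    return best, best_fit
```

Called as `(mask, n_hidden), score = ga_wrapper(X, y)` on a feature matrix and a temperature set-point target, the search returns a selected input subset and hidden-layer size. Any real application of the paper's approach would replace the assumed penalty weight, cross-validation scheme and GA operators with the values tuned to the furnace database.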