Please use this identifier to cite or link to this item: http://artemis.cslab.ece.ntua.gr:8080/jspui/handle/123456789/19874

| Title: | Expansion of Multilayer Perceptrons Through Progressive Neuron Addition |
|---|---|
| Authors: | Χατζής, Νικόλας; Μαραγκός, Πέτρος |
| Keywords: | Machine Learning; Neural Networks; Multi-Layer Perceptron; Neural Network Expansion |
| Issue Date: | 13-Oct-2025 |
| Abstract: | This thesis explores strategies for dynamically extending the architecture of Multi-Layer Perceptrons (MLPs) with ReLU activations through progressive neuron addition. Rather than starting with a fixed, large network, we study how models can grow their capacity during training by adding new neurons in a structured and principled way. To address this problem, we develop a framework that separates the role of extenders, which determine how new neurons are initialized and integrated into the existing network, from that of distributors, which decide where these neurons should be placed across layers (an illustrative sketch of a single expansion step is given below the metadata). Within this framework, we introduce multiple variants, including the Partition-Based Extender and the Weight Sharing Extender, as well as distribution strategies such as the Steepest Voting Distributor. We evaluate these approaches on synthetic data and benchmark image tasks (MNIST, FashionMNIST, CIFAR-10, CIFAR-100). The experiments highlight both the strengths and the limitations of progressive expansion. While the dynamically grown networks do not consistently surpass conventionally trained fixed-size models, they achieve better results than strong expansion methods from the literature and overcome challenges, such as neuron inactivity and poor initialization, that existing techniques fail to address. |
| URI: | http://artemis.cslab.ece.ntua.gr:8080/jspui/handle/123456789/19874 |
| Appears in Collections: | Διπλωματικές Εργασίες - Theses |
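The abstract above describes extenders that initialize new neurons and integrate them into an existing ReLU MLP. As a rough illustration only, and not the thesis's actual Partition-Based or Weight Sharing Extenders, the following PyTorch sketch widens one hidden layer in a function-preserving way: existing parameters are copied, new units receive freshly initialized incoming weights and zero outgoing weights. The helper name `widen_hidden_layer`, the initialization scheme, and all layer sizes are hypothetical assumptions.

```python
import torch
import torch.nn as nn

def widen_hidden_layer(fc_in: nn.Linear, fc_out: nn.Linear, n_new: int):
    """Return enlarged copies of two consecutive Linear layers with `n_new`
    extra hidden units between them. Existing parameters are copied; the
    new units get randomly initialized incoming weights and zero outgoing
    weights, so the network computes the same function right after growth."""
    old_h = fc_in.out_features
    new_in = nn.Linear(fc_in.in_features, old_h + n_new)
    new_out = nn.Linear(old_h + n_new, fc_out.out_features)

    with torch.no_grad():
        # Copy the existing neurons' parameters unchanged.
        new_in.weight[:old_h] = fc_in.weight
        new_in.bias[:old_h] = fc_in.bias
        new_out.weight[:, :old_h] = fc_out.weight
        new_out.bias.copy_(fc_out.bias)
        # New neurons: fresh fan-in weights, zero fan-out weights
        # (a common function-preserving choice; assumed here, not taken from the thesis).
        nn.init.kaiming_uniform_(new_in.weight[old_h:], nonlinearity="relu")
        new_in.bias[old_h:].zero_()
        new_out.weight[:, old_h:].zero_()
    return new_in, new_out

# Hypothetical usage: grow the hidden layer of a 784-32-10 ReLU MLP by 8 units.
fc1, fc2 = nn.Linear(784, 32), nn.Linear(32, 10)
fc1, fc2 = widen_hidden_layer(fc1, fc2, n_new=8)   # hidden width is now 40
```

In this sketch the distributor's job (choosing which layer receives the new units) is left out; one would call such a routine on the layer pair selected by a placement strategy like the Steepest Voting Distributor described in the abstract.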
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| thesis_Nikolas_Chatzis (2).pdf | | 6.8 MB | Adobe PDF |
Items in Artemis are protected by copyright, with all rights reserved, unless otherwise indicated.