Publications


A Scale-Free Network for Modeling a Homogeneous Ensemble
2025


Ensemble learning aims to enhance predictions by aggregating multiple learners, facilitating a more accurate capture of the underlying data distribution. This study proposes a scale-free network-based ensemble learning (NEL) algorithm that uses neural networks as base learners, balancing individual base-learner accuracy against diversity. Optimal weights for ensemble integration are computed from the centrality of each node in the heterogeneous network. The scale-free topology of NEL inherently supports ensemble pruning thanks to its power-law degree distribution. Experimental results, averaged over ten runs, show the favorability and robustness of our approach.

Ensemble Learning, Scale-Free Networks, Neural Networks
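As a rough illustration of the centrality-weighting idea in the abstract, the sketch below derives per-learner weights from degree centrality in a small learner network and uses them for a weighted-average ensemble. The adjacency structure, learner names, and the choice of degree centrality are all assumptions for illustration; the paper's actual network construction and centrality measure may differ.

```python
def centrality_weights(adjacency):
    """Normalized degree centrality per node (node -> weight summing to 1)."""
    degrees = {node: len(neighbors) for node, neighbors in adjacency.items()}
    total = sum(degrees.values())
    return {node: d / total for node, d in degrees.items()}

def ensemble_predict(predictions, weights):
    """Centrality-weighted average of per-learner prediction vectors."""
    n = len(next(iter(predictions.values())))
    out = [0.0] * n
    for learner, preds in predictions.items():
        w = weights[learner]
        for i, p in enumerate(preds):
            out[i] += w * p
    return out

# Hypothetical 3-learner network: m1 is linked to both others,
# so it receives the largest integration weight.
adjacency = {"m1": ["m2", "m3"], "m2": ["m1"], "m3": ["m1"]}
weights = centrality_weights(adjacency)      # m1 -> 0.5, m2/m3 -> 0.25
preds = {"m1": [0.9, 0.1], "m2": [0.7, 0.3], "m3": [0.8, 0.2]}
combined = ensemble_predict(preds, weights)
```

Because weight mass concentrates on high-centrality nodes, low-centrality learners contribute little, which is also what makes pruning the network's tail natural.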
A Scale-Free Network-Based Genetic Algorithm with Balanced Exploration and Exploitation
2025


Genetic algorithms (GAs) are widely used to solve complex computational problems. This paper introduces a novel network-based approach that uses phenotypic and genotypic similarities to establish inter-chromosomal links. Exploiting the scale-free property of the Barabási-Albert (BA) model, we dynamically assign authority nodes to drive the evolutionary process. Comparative evaluations against a panmictic GA and a previously developed network-based genetic algorithm demonstrate favorable results, based on the average best fitness over 50 runs on eight well-known benchmark functions.

Genetic Algorithms, Scale-Free Networks, Optimization
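The BA-style growth that gives a population its scale-free, authority-dominated structure can be sketched as preferential attachment followed by selecting the highest-degree nodes as authorities. Function names and parameters here are hypothetical; the paper's actual linking by phenotypic/genotypic similarity is not reproduced.

```python
import random
from collections import Counter

def ba_network(n, m, seed=0):
    """Grow a Barabási-Albert style graph: each new node attaches to m
    existing nodes chosen with probability proportional to their degree."""
    rng = random.Random(seed)
    edges = []
    repeated = []                  # endpoints repeated once per incident edge
    targets = list(range(m))       # first new node links to nodes 0..m-1
    for source in range(m, n):
        edges.extend((source, t) for t in targets)
        repeated.extend(targets)
        repeated.extend([source] * m)
        chosen = set()             # sample m distinct, degree-biased targets
        while len(chosen) < m:
            chosen.add(rng.choice(repeated))
        targets = list(chosen)
    return edges

def authorities(edges, k):
    """The k highest-degree nodes; candidates for 'authority' roles."""
    degree = Counter()
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    return [node for node, _ in degree.most_common(k)]

edges = ba_network(60, 2)
hubs = authorities(edges, 3)
```

Early nodes tend to accumulate links under preferential attachment, yielding the power-law degree distribution that makes a few hub chromosomes natural drivers of the search.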
Adaptive Neural Topologies: A Dynamic Approach to Learning Network Structures during Training
2024


This paper introduces the Adaptive Topology Neural Network (ATNN), an architecture that dynamically optimizes its connections to emulate the efficiency of biological neural networks. Starting as a fully connected network, ATNN adapts toward a sparser topology by using an edge-importance metric to selectively rewire unimportant connections, enabling skip connections similar to those in ResNet. Empirical evaluations across eight classification datasets demonstrate that ATNN significantly outperforms standard fixed-topology ANNs in MSE, accuracy, precision, and recall.

Neural Networks, Adaptive Topology, Machine Learning
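One minimal way to realize importance-driven sparsification, assuming weight magnitude as the importance proxy (the paper's actual edge-importance metric is not specified here), is magnitude-based pruning of the connection set:

```python
def prune_least_important(edge_weights, keep_fraction=0.8):
    """Rank edges by |weight| and keep only the most important fraction.

    edge_weights: dict mapping an edge (u, v) to its weight; the surviving
    edges are the candidates to keep or rewire into skip connections.
    This is an illustrative sketch, not the ATNN algorithm itself.
    """
    ranked = sorted(edge_weights.items(),
                    key=lambda kv: abs(kv[1]), reverse=True)
    keep = max(1, int(len(ranked) * keep_fraction))
    return dict(ranked[:keep])

# Hypothetical 4-edge layer: the two weakest connections are dropped.
weights = {("i", "j"): 0.9, ("j", "k"): -0.05,
           ("i", "k"): 0.4, ("k", "l"): 0.01}
sparse = prune_least_important(weights, keep_fraction=0.5)
```

Freed connections could then be reallocated as longer-range links between non-adjacent layers, which is one plausible reading of the ResNet-like skip connections the abstract mentions.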