Designing graph neural networks training data with limited samples and small network sizes

Authors: Junior Momo Ziazet, Charles Boudreau, Oscar Delgado, Brigitte Jaumard
Status: Final
Date of publication: 12 September 2023
Published in: ITU Journal on Future and Evolving Technologies, Volume 4 (2023), Issue 3, Pages 492-502
Article DOI: https://doi.org/10.52953/AFYW5455
Abstract:
Machine learning is a data-driven domain, which means a learning model's performance depends on the availability of large volumes of data to train it. However, by improving data quality, we can train effective machine learning models with little data. This paper demonstrates this possibility by proposing a methodology to generate high-quality data in the networking domain. For a given Graph Neural Network (GNN), we designed a training dataset that not only contains a small number of samples but whose samples also feature network graphs of reduced size (10-node networks). Our evaluations indicate that the dataset generated by the proposed pipeline can train a GNN model that scales well to larger networks of 50 to 300 nodes. The trained model compares favorably to the baseline, achieving a mean absolute percentage error of 5-6% while being trained on a much smaller dataset (90 samples in total, versus thousands of samples for the baseline).
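The abstract reports model quality as a Mean Absolute Percentage Error (MAPE) of 5-6% on larger networks. The sketch below, which is illustrative and not taken from the paper, shows how such a metric is typically computed from a model's predicted per-path delays against ground-truth values; the function name and the sample delay values are assumptions chosen for the example.

```python
# Minimal sketch of the MAPE evaluation metric mentioned in the abstract.
# Not the authors' code: the helper and the sample values are hypothetical.
import numpy as np

def mape(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean absolute percentage error, expressed in percent."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0)

# Hypothetical ground-truth and predicted per-path delays (in seconds).
true_delays = np.array([0.012, 0.034, 0.021, 0.045])
pred_delays = np.array([0.013, 0.032, 0.022, 0.047])

print(f"MAPE: {mape(true_delays, pred_delays):.2f}%")  # ~5.9% for these values
```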

Keywords: Data-centric AI, data generation, graph neural networks, network modeling, RouteNet-Fermi
Rights: © International Telecommunication Union, available under the CC BY-NC-ND 3.0 IGO license.
Full article: PDF (English), free of charge.