JAIT 2026 Vol.17(3): 477-487
doi: 10.12720/jait.17.3.477-487

LSGELU: Improved Gaussian Error Linear Units for Enhanced Performance in Simple and Complex Neural Networks

Suwichai Phunsa 1 and Thawatchai Chomsiri 2,*
1. Department of Creative Media, Digital Contents for Development Research Unit, Faculty of Informatics, Mahasarakham University, Mahasarakham, Thailand
2. Department of Information Technology, Research Center of Information Technology for the Future, Faculty of Informatics, Mahasarakham University, Mahasarakham, Thailand
Email: suwichai.p@msu.ac.th (S.P.); thawatchai.c@msu.ac.th (T.C.)
*Corresponding author

Manuscript received August 6, 2025; revised August 18, 2025; accepted November 24, 2025; published March 10, 2026.

Abstract—In this paper, we propose Left-Shifted Gaussian Error Linear Units (LSGELU), a novel modification of the Gaussian Error Linear Unit (GELU) in which the base Gaussian curve is horizontally shifted to produce altered activation responses. Specifically, left shifts move the Gaussian center from x = 0 to x = −1.5 (a 1.5-unit leftward displacement), while right shifts move the center 1.5 units rightward along the x-axis. Displacing the Gaussian kernel modifies the integral that defines the GELU activation, yielding a parameterized activation function whose dip (local trough) can be adjusted to better suit network dynamics. We evaluate the proposed variants on the Modified National Institute of Standards and Technology (MNIST), Street View House Numbers (SVHN), Canadian Institute for Advanced Research-10 classes (CIFAR-10), and CIFAR-100 datasets. The experiments include multiple independent runs and report mean accuracy, standard deviation, and confidence intervals (with degrees of freedom specified). The results indicate that the left-shifted LSGELU variants consistently improve performance for relatively simple architectures (with fewer hidden layers and units), whereas the right-shifted variants, Right-Shifted Gaussian Error Linear Units (RSGELU), are more effective for deeper, more complex convolutional neural networks of varying depth and capacity. These findings suggest that controllable horizontal shifts of the GELU kernel provide a low-cost, interpretable mechanism for adapting activation behavior to model complexity, offering a practical avenue for performance tuning in image classification tasks.
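To make the shift mechanism described in the abstract concrete, the following is a minimal PyTorch sketch, not the authors' reference implementation. It assumes the shifted activation takes the form x·Φ(x − μ), where Φ is the cumulative distribution function of a Gaussian centered at μ, with μ = −1.5 for LSGELU and μ = +1.5 for RSGELU; the class name ShiftedGELU and the exact parameterization are illustrative assumptions.

import torch
import torch.nn as nn


class ShiftedGELU(nn.Module):
    """Sketch of a GELU with a horizontally shifted Gaussian kernel: x * Phi(x - shift).

    shift = -1.5 approximates the LSGELU (left-shifted) variant described above,
    shift = +1.5 the RSGELU (right-shifted) variant. Both values are assumptions
    taken from the abstract, not the paper's reference code.
    """

    def __init__(self, shift: float = -1.5):
        super().__init__()
        self.shift = shift  # center of the Gaussian kernel on the x-axis

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Phi(z) = 0.5 * (1 + erf(z / sqrt(2))) is the standard normal CDF;
        # centering the Gaussian at `shift` means evaluating Phi at (x - shift).
        phi = 0.5 * (1.0 + torch.erf((x - self.shift) / 2.0 ** 0.5))
        return x * phi


# Usage example: drop-in replacement for nn.GELU in a small network.
if __name__ == "__main__":
    lsgelu = ShiftedGELU(shift=-1.5)
    x = torch.linspace(-4.0, 4.0, steps=9)
    print(lsgelu(x))

With shift = 0 this reduces to the standard GELU, so the shift acts as a single interpretable knob that can be matched to model complexity, as the paper proposes.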
 
Keywords—Left-Shifted Gaussian Error Linear Units (LSGELU), Right-Shifted Gaussian Error Linear Units (RSGELU), activation function, neural networks, machine learning

Cite: Suwichai Phunsa and Thawatchai Chomsiri, "LSGELU: Improved Gaussian Error Linear Units for Enhanced Performance in Simple and Complex Neural Networks," Journal of Advances in Information Technology, Vol. 17, No. 3, pp. 477-487, 2026. doi: 10.12720/jait.17.3.477-487

Copyright © 2026 by the authors. This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
