Title: | SMixSL: The Smashed-Mixture Technique for Split Learning With Localizable Features |
Author(s): | Vo Phuc Tinh |
Keywords: | Servers; Training; Mixers; Data models; Computational modeling; Costs; Accuracy; Urban areas; Predictive models; Data privacy; Split learning; Personalized data; Dropout; Smashed-mix; Federated learning |
Abstract: | In recent years, split learning (SL) with personalized data and region-dropout strategies has been proposed to enhance the performance of convolutional neural network (CNN) classifiers. Many SL studies have proposed solutions for edge devices, whose performance degrades as local data heterogeneity across clients increases. In this study, we evaluate mixing strategies for improving SL performance, collectively called Smashed-Mix SL (SMixSL). These strategies include Smashed-Cutout, which prunes patches; Smashed-CutMix, which cuts and pastes patches and mixes labels in proportion to patch area; and Smashed-Mixup, which mixes proportionally without removal. The main idea is that label softening and augmentation can improve the inputs to the server-side model. This study compares the performance and resource-efficiency trade-offs of SL against other state-of-the-art distributed deep-learning variants, such as multi-head split learning (MHSL) and split-federated learning (SFL). The results are encouraging for each mixer in its specific use case. This approach effectively guides the learning model to focus on less discriminative parts of the smashed features, allowing the network to generalize better and localize features more effectively. |
Issue Date: | 2025 |
Publisher: | IEEE |
URI: | https://digital.lib.ueh.edu.vn/handle/UEH/76101 |
DOI: | https://doi.org/10.1109/TETCI.2024.3523698 |
ISSN: | 2471-285X |
Appears in Collections: | INTERNATIONAL PUBLICATIONS |
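The abstract describes Mixup-style mixing applied to smashed features at the split-learning cut layer, with labels softened in proportion to the mix. As a rough illustration only (the function name, shapes, and the Beta-distributed mixing ratio are assumptions following the standard Mixup formulation, not details taken from the paper), such a step might look like:

```python
import numpy as np

def smashed_mixup(smashed_a, smashed_b, onehot_a, onehot_b, alpha=0.2, rng=None):
    """Hypothetical sketch: mix two clients' smashed-feature tensors and
    soften their one-hot labels in the same proportion (Mixup-style)."""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)  # mixing ratio drawn from Beta(alpha, alpha)
    mixed_features = lam * smashed_a + (1.0 - lam) * smashed_b
    mixed_labels = lam * onehot_a + (1.0 - lam) * onehot_b  # softened labels
    return mixed_features, mixed_labels, lam
```

In this sketch the server (or the entity holding the cut-layer activations) would feed `mixed_features` forward and train against the softened `mixed_labels`; the paper's Smashed-Cutout and Smashed-CutMix variants would instead zero out or paste spatial patches of the feature maps.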