Please use this identifier to cite or link to this item:
https://digital.lib.ueh.edu.vn/handle/UEH/76101
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Vo Phuc Tinh | - |
dc.contributor.other | Tran Anh Khoa | - |
dc.contributor.other | Pham Duc Lam | - |
dc.contributor.other | Nguyen Hoang Nam | - |
dc.contributor.other | Duc Ngoc Minh Dang | - |
dc.contributor.other | Duy Dong Le | - |
dc.date.accessioned | 2025-08-28T01:54:00Z | - |
dc.date.available | 2025-08-28T01:54:00Z | - |
dc.date.issued | 2025 | - |
dc.identifier.issn | 2471-285X | - |
dc.identifier.uri | https://digital.lib.ueh.edu.vn/handle/UEH/76101 | - |
dc.description.abstract | In recent years, split learning (SL) with personalized data and region-dropout strategies has been proposed to enhance the performance of classifier convolutional neural networks (CNNs). Many SL studies have proposed solutions for edge devices, which suffer performance degradation as local data heterogeneity across clients increases. In this study, we evaluate mixing strategies, called Smashed-Mix SL (SMixSL), to improve performance in SL. These strategies include Smashed-Cutout, which prunes (zeroes out) patches; Smashed-CutMix, which cuts and pastes patches between samples, mixing the labels in proportion to the patch area; and Smashed-Mixup, which blends samples proportionally without removing any region. The main idea is that enhancing and softening labels can improve the inputs the server receives. This study compares the performance and resource-efficiency trade-offs of SL against other state-of-the-art distributed deep-learning variants, such as multi-head split learning (MHSL) and split federated learning (SFL). The results are very encouraging for the mixers in each specific case. This approach effectively guides the learning model to focus on the less discriminative parts of the smashed-feature transmission, allowing the network to generalize better and improve personalization. | en |
dc.language.iso | eng | - |
dc.publisher | IEEE | - |
dc.relation.ispartof | IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE | - |
dc.rights | IEEE | - |
dc.subject | Servers | en |
dc.subject | Training | en |
dc.subject | Mixers | en |
dc.subject | Data models | en |
dc.subject | Computational modeling | en |
dc.subject | Costs | en |
dc.subject | Accuracy | en |
dc.subject | Urban areas | en |
dc.subject | Predictive models | en |
dc.subject | Data privacy | en |
dc.subject | Split learning | en |
dc.subject | Personalized data | en |
dc.subject | Dropout | en |
dc.subject | Smashed-mix | en |
dc.subject | Federated learning | en |
dc.title | SMixSL: The Smashed-Mixture Technique for Split Learning With Localizable Features | en |
dc.type | Journal Article | en |
dc.identifier.doi | https://doi.org/10.1109/TETCI.2024.3523698 | - |
dc.format.firstpage | 1 | - |
dc.format.lastpage | 13 | - |
ueh.JournalRanking | ISI | - |
item.fulltext | Only abstracts | - |
item.cerifentitytype | Publications | - |
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | - |
item.languageiso639-1 | en | - |
item.grantfulltext | none | - |
item.openairetype | Journal Article | - |
Appears in Collections: | INTERNATIONAL PUBLICATIONS |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
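The three mixing strategies named in the abstract correspond to the well-known Cutout, CutMix, and Mixup augmentations, here applied to smashed feature maps exchanged in split learning. The sketch below is a minimal illustration under that reading, not the authors' implementation; all function names, the patch size, and the `alpha` parameter are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def smashed_mixup(a, b, ya, yb, alpha=1.0):
    """Mixup: convex combination of two smashed feature maps and their labels."""
    lam = rng.beta(alpha, alpha)
    return lam * a + (1 - lam) * b, lam * ya + (1 - lam) * yb

def smashed_cutout(a, patch=4):
    """Cutout: zero out a random square patch of a (C, H, W) feature map."""
    _, h, w = a.shape
    y = rng.integers(0, h - patch + 1)
    x = rng.integers(0, w - patch + 1)
    out = a.copy()
    out[:, y:y + patch, x:x + patch] = 0.0
    return out

def smashed_cutmix(a, b, ya, yb, patch=4):
    """CutMix: paste a patch from b into a; mix labels by the kept-area ratio."""
    _, h, w = a.shape
    y = rng.integers(0, h - patch + 1)
    x = rng.integers(0, w - patch + 1)
    out = a.copy()
    out[:, y:y + patch, x:x + patch] = b[:, y:y + patch, x:x + patch]
    lam = 1.0 - (patch * patch) / (h * w)  # fraction of `a` that survives
    return out, lam * ya + (1 - lam) * yb
```

In a split-learning setting these functions would be applied to the smashed activations (and soft labels) before they are sent to the server, so the server-side head trains on mixed, label-softened inputs.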