Eunjeong Jeong
Communication-efficient on-device machine learning: Federated distillation and augmentation under non-iid private data
E Jeong, S Oh, H Kim, J Park, M Bennis, SL Kim
arXiv preprint arXiv:1811.11479, 2018
Cited by: 593 · Year: 2018
Mix2FLD: Downlink federated learning after uplink federated distillation with two-way mixup
S Oh, J Park, E Jeong, H Kim, M Bennis, SL Kim
IEEE Communications Letters 24 (10), 2211-2215, 2020
Cited by: 53 · Year: 2020
Distilling on-device intelligence at the network edge
J Park, S Wang, A Elgabli, S Oh, E Jeong, H Cha, H Kim, SL Kim, ...
arXiv preprint arXiv:1908.05895, 2019
Cited by: 34 · Year: 2019
Multi-hop federated private data augmentation with sample compression
E Jeong, S Oh, J Park, H Kim, M Bennis, SL Kim
arXiv preprint arXiv:1907.06426, 2019
Cited by: 24 · Year: 2019
Hiding in the crowd: Federated data augmentation for on-device learning
E Jeong, S Oh, J Park, H Kim, M Bennis, SL Kim
IEEE Intelligent Systems 36 (5), 80-87, 2020
Cited by: 13 · Year: 2020
Asynchronous decentralized learning over unreliable wireless networks
E Jeong, M Zecchin, M Kountouris
ICC 2022-IEEE International Conference on Communications, 607-612, 2022
Cited by: 10 · Year: 2022
Personalized decentralized federated learning with knowledge distillation
E Jeong, M Kountouris
ICC 2023-IEEE International Conference on Communications, 1982-1987, 2023
Cited by: 6 · Year: 2023