Communication Efficient Decentralized Collaborative Learning Of Heterogeneous Deep Learning Models Using Distillation And Incentive-Based Techniques

Bibliographic Details
Main Author: Iqbal, Zahid
Format: Thesis
Language: English
Published: 2023
Subjects: QA75.5-76.95 Electronic computers. Computer science
Online Access:http://eprints.usm.my/60077/1/ZAHID%20IQBAL%20-%20TESIS%20cut.pdf
id my-usm-ep.60077
record_format uketd_dc
institution Universiti Sains Malaysia
collection USM Institutional Repository
language English
topic QA75.5-76.95 Electronic computers. Computer science
description Collectively, smart devices hold highly valuable, real-time data that can be used to train effective deep learning models for AI applications. However, because this data is sensitive, people are increasingly concerned about its privacy and unwilling to share it. There is therefore a need to learn from this valuable data in a decentralized fashion: the data stays local to the devices, and the necessary computation is performed on-device by exploiting their own computational resources. Statistical heterogeneity and full model heterogeneity are among the key challenges in applying Decentralized Learning (DL) approaches in real scenarios. Existing DL techniques typically assume that all devices share a homogeneous model architecture. In real applications of DL, however, devices differ in computational resources and business needs, so they may have completely different model architectures. Very limited work has addressed the full model heterogeneity problem. Similarly, some work has addressed statistical heterogeneity, but it is mostly hard to apply in real scenarios or covers only limited use cases.
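The thesis title names distillation as the mechanism that lets devices with completely different architectures still learn from one another. As a minimal illustrative sketch (written for this record, not the author's actual method), the PyTorch snippet below shows a standard knowledge-distillation step: a device updates its local model to match a peer's softened predictions on a shared reference batch, so only logits, never weights or raw data, cross the network. The architectures, the temperature value, and the helper name distill_step are hypothetical.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Two heterogeneous local models (hypothetical architectures for illustration).
local_model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 64), nn.ReLU(), nn.Linear(64, 10))
peer_model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 256), nn.ReLU(), nn.Linear(256, 10))

def distill_step(model, peer_logits, x, optimizer, temperature=2.0):
    # Match the peer's softened class distribution on the shared batch x.
    optimizer.zero_grad()
    loss = F.kl_div(
        F.log_softmax(model(x) / temperature, dim=1),
        F.softmax(peer_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2  # standard scale correction for the softened targets
    loss.backward()
    optimizer.step()
    return loss.item()

# A shared (possibly public, unlabeled) reference batch; random here for brevity.
x_ref = torch.randn(32, 1, 28, 28)
with torch.no_grad():
    peer_logits = peer_model(x_ref)  # only these logits are communicated
optimizer = torch.optim.SGD(local_model.parameters(), lr=0.01)
print(distill_step(local_model, peer_logits, x_ref, optimizer))

Exchanging predictions on a reference set rather than parameters is what removes the homogeneous-architecture assumption: each device can keep whatever model best fits its computational resources and business needs.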
format Thesis
qualification_name Doctor of Philosophy (Ph.D.)
qualification_level Doctorate
author Iqbal, Zahid
title Communication Efficient Decentralized Collaborative Learning Of Heterogeneous Deep Learning Models Using Distillation And Incentive-Based Techniques
granting_institution Universiti Sains Malaysia
granting_department Pusat Pengajian Sains Komputer
publishDate 2023
url http://eprints.usm.my/60077/1/ZAHID%20IQBAL%20-%20TESIS%20cut.pdf