Table of Contents
1 Introduction
2 Inference on Edge Devices
3 Adaptation on Edge Devices
4 Learning on Edge Devices
5 Edge-Server System
Summary
This thesis explores enabling deep learning on edge devices by reducing redundancy and resource consumption. It covers methodologies for efficient inference, adaptation, and learning on resource-constrained hardware, organized around four scenarios: inference on edge devices, adaptation of DNNs to new data, efficient learning in unseen environments, and edge-server systems. The overall goal is to achieve a better trade-off between resource consumption and model accuracy on resource-constrained edge devices.