Courselet Title: Using edge devices for CPU-based inference
Author: Fraida Fund
Contact Email: ffund@nyu.edu
This experiment demonstrates using an edge device for machine learning inference, focusing on image classification. While ML models are typically trained on cloud servers, running inference at the edge can improve privacy, reduce latency, and accommodate limited or unreliable connectivity.
On a device from the CHI@Edge testbed, the experiment walks through the following steps (illustrated with code sketches after the list):
1. Launching a container on an edge device
2. Attaching a public IP address for SSH access
3. Transferring files to and from the container
4. Running inference with a pre-trained image classification model
5. Deleting the container when finished
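For context, the sketch below shows roughly what these steps look like when driven from the python-chi library commonly used with CHI@Edge. It is not the artifact's exact code: the project name, lease ID, container image, and file paths are placeholders, and function names may differ between python-chi versions.

    import chi
    from chi import container, lease

    # Select the edge testbed site and project (placeholder project name)
    chi.use_site("CHI@Edge")
    chi.set("project_name", "CHI-XXXXXX")

    # Placeholder: ID of a device lease created beforehand
    lease_id = "my-lease-id"

    # 1. Launch a container on the reserved edge device
    my_container = container.create_container(
        "edge-inference",
        image="python:3.9-slim",  # placeholder image
        reservation_id=lease.get_device_reservation(lease_id),
    )
    container.wait_for_active(my_container.uuid)

    # 2. Attach a public (floating) IP so the container is reachable over SSH
    public_ip = container.associate_floating_ip(my_container.uuid)

    # 3. Transfer files, e.g. an inference script and a test image
    container.upload(my_container.uuid, "./classify.py", "/root/classify.py")
    container.upload(my_container.uuid, "./image.jpg", "/root/image.jpg")

    # 4. Run the inference script inside the container
    print(container.execute(my_container.uuid, "python /root/classify.py /root/image.jpg"))

    # 5. Clean up: delete the container when finished
    container.destroy_container(my_container.uuid)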
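The classification step itself amounts to loading a pre-trained model and running a single image through it on the CPU. The artifact provides its own notebook and model; the snippet below is only an illustrative sketch using torchvision's pre-trained MobileNetV2 (small enough for a CPU-only edge device), with a placeholder image path. It assumes a recent torchvision release that provides the weights API.

    import time
    import torch
    from torchvision import models
    from PIL import Image

    # Load a pre-trained MobileNetV2 and its matching preprocessing pipeline
    weights = models.MobileNet_V2_Weights.DEFAULT
    model = models.mobilenet_v2(weights=weights)
    model.eval()
    preprocess = weights.transforms()

    # Load and preprocess a test image (placeholder path)
    img = Image.open("image.jpg")
    batch = preprocess(img).unsqueeze(0)

    # Run inference on the CPU and time it
    start = time.time()
    with torch.no_grad():
        logits = model(batch)
    print(f"Inference time: {time.time() - start:.3f} s")

    # Report the top-5 predicted classes
    probs = torch.nn.functional.softmax(logits[0], dim=0)
    top5 = torch.topk(probs, 5)
    for score, idx in zip(top5.values, top5.indices):
        print(f"{weights.meta['categories'][idx]}: {score:.3f}")

Timing the inference call makes it easy to compare the edge device's CPU-only performance against a cloud server running the same model.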
This approach processes data on-site, avoiding transfers of raw data to the cloud. Although edge devices have less processing power than cloud servers, keeping inference close to the data source can be preferable when bandwidth is limited, latency matters, or the data is sensitive.
The experiment provides hands-on experience with edge computing for ML inference, showing how models can be used in resource-constrained environments close to data sources.
Link to Artifact: https://chameleoncloud.org/experiment/share/d6ff6024-5d24-4ff7-966e-e066b2f657a6