Abstract
Editor's notes: This article addresses the optimization of data movement in accelerating machine learning workloads, one of the most critical issues in state-of-the-art computing platforms. It presents a novel in-DRAM accelerator for convolutional neural networks using mixed analog-stochastic optimizations and demonstrates significant energy-efficiency improvements. - Umit Ogras, University of Wisconsin, USA.
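As background for the stochastic-computing techniques the note mentions, the following is a minimal illustrative sketch of unipolar stochastic multiplication (not code from the article): a value in [0, 1] is encoded as a random bitstream whose fraction of 1s equals the value, and multiplying two independent streams reduces to a bitwise AND. The function names and stream length here are illustrative assumptions.

```python
import random

def to_stream(p, n, rng):
    # Encode a value p in [0, 1] as a bitstream of length n:
    # each bit is independently 1 with probability p.
    return [1 if rng.random() < p else 0 for _ in range(n)]

def from_stream(bits):
    # Decode: the fraction of 1s estimates the encoded value.
    return sum(bits) / len(bits)

def sc_multiply(a_bits, b_bits):
    # For independent unipolar streams, bitwise AND implements
    # multiplication, since P(a AND b) = P(a) * P(b).
    return [a & b for a, b in zip(a_bits, b_bits)]

rng = random.Random(0)
n = 10_000
a = to_stream(0.6, n, rng)
b = to_stream(0.5, n, rng)
prod = from_stream(sc_multiply(a, b))  # estimate of 0.6 * 0.5 = 0.3
```

This bit-serial formulation is what makes stochastic computing attractive for in-memory acceleration: a multiply costs a single logic gate per bit, at the price of longer streams for higher precision.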
| Original language | English |
|---|---|
| Pages (from-to) | 47-55 |
| Number of pages | 9 |
| Journal | IEEE Design and Test |
| Volume | 42 |
| Issue number | 1 |
| DOIs | |
| State | Published - 2025 |
Bibliographical note
Publisher Copyright: © 2013 IEEE.
Keywords
- convolutional neural networks
- in-DRAM processing
- stochastic computing
ASJC Scopus subject areas
- Software
- Hardware and Architecture
- Electrical and Electronic Engineering