We do computer architecture research in the Department of Computer Science at The University of Texas at Austin.
In Spring 2023, we meet Fridays @ 3 PM in GDC 5.516. All interested students are welcome.
Matthew Giordano
Caroline Li
Parth Shroff
Zhan Shi (2020)
Kai Wang (2020)
Hao Wu (2020)
Ashay Rane (2019)
Jia Chen (2019)
Akanksha Jain (2016)
Oswaldo Olivo (2016)
Renee St. Amant (2014)
Paul Navrátil (2010)
Alison Norman (2010)
Walter Chang (2010)
Ben Hardekopf (2009)
Teck B. Tok (2007)
Ibrahim Hur (2006)
Sam Guyer (2003)
Rich Cardone (2002)
Daniel Jiménez (2002)
Anjana Subramanian (2019)
Pawan Joshi (2019)
Apollo Ellis (2011)
Karthik Murthy (2010)
Adam Brown (2007)
Kent Spaulding (1998)
Frank Kuehndel (1998)
We're applying deep neural networks and reinforcement learning to design data prefetchers that are more accurate, more timely, and achieve higher coverage.
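To give a flavor of the problem (this is a toy illustration, not our actual models): a prefetcher observes a stream of memory addresses and tries to predict the next one. The classic table-based approach below learns which address delta tends to follow the previous delta; our research replaces such tables with learned (neural/RL) predictors.

```python
from collections import Counter, defaultdict

class DeltaPrefetcher:
    """Toy table-based delta prefetcher (a hypothetical sketch).

    Learns which address delta tends to follow the previous delta,
    then issues a prefetch for the predicted next address.
    """

    def __init__(self):
        # prev_delta -> Counter of deltas observed to follow it
        self.table = defaultdict(Counter)
        self.last_addr = None
        self.last_delta = None

    def access(self, addr):
        """Observe one demand access; return a prefetch address or None."""
        prediction = None
        if self.last_addr is not None:
            delta = addr - self.last_addr
            # Train: record that `delta` followed `last_delta`.
            if self.last_delta is not None:
                self.table[self.last_delta][delta] += 1
            # Predict: the delta most often seen after the current one.
            if self.table[delta]:
                next_delta = self.table[delta].most_common(1)[0][0]
                prediction = addr + next_delta
            self.last_delta = delta
        self.last_addr = addr
        return prediction
```

On a simple strided stream (0, 64, 128, ...) the table quickly learns the stride and prefetches one access ahead; irregular access patterns are where learned models aim to do better.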
We're building out a pipeline that automatically co-optimizes and generates the hardware and software for dense tensor accelerators.
We're re-examining longstanding principles of ISA design, with the goal of providing a richer, more flexible load-store interface that gives software finer-grained control over microarchitectural optimizations, helping to mitigate the memory wall.
C. Sakhuja, Z. Shi, and C. Lin. Leveraging Domain Information for the Efficient, Automated Design of Deep Learning Accelerators. International Symposium on High-Performance Computer Architecture (HPCA). 2023.
I. Shah, A. Jain, and C. Lin. Effective Mimicry of Belady's MIN Policy. International Symposium on High-Performance Computer Architecture (HPCA). 2022. (Finalist, Best Paper Award)
Z. Shi, A. Jain, K. Swersky, M. Hashemi, P. Ranganathan, and C. Lin. A Hierarchical Neural Model of Data Prefetching. International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS). 2020. (Micro Top Picks Honorable Mention)
K. Wang, D. Fussell, and C. Lin. A Fast Work-Efficient SSSP Algorithm for GPUs. 26th Annual Symposium on Principles and Practice of Parallel Programming (PPoPP). 2021.
Copyright © 2023-present