Optimization could cut the carbon footprint of AI training by up to 75%
Deep learning models that power giants like TikTok and Amazon, as well as tools like ChatGPT, could save energy without new hardware or infrastructure.
Fan Lai earns Towner Prize for Outstanding PhD Research
The award recognizes creative and outstanding research achievements.
Researchers cut down on AI's carbon footprint with new optimization framework
Zeus automatically adapts the power usage of deep learning models to chase clean electricity sources throughout the day.
Open source platform enables research on privacy-preserving machine learning
A virtual assortment of user devices provides a realistic training environment for distributed machine learning and protects privacy by learning where data lives.
Multi-institute project "Treehouse" aims to enable sustainable cloud computing
"We are buying thousands of GPUs and running them at full speed, and no one really knows just how much energy is being spent in the process."
Enabling efficient, globally distributed machine learning
A group of researchers at U-M is working on the full big data stack for training machine learning models on millions of devices worldwide.
Four papers with Michigan authors at SIGCOMM 2021
ACM SIGCOMM's annual conference is the world's leading conference in data communications and networking.
Human resilience study to benefit from new data privacy technique
Prof. Mosharaf Chowdhury is leading development of a new machine learning application that will protect the privacy of participants.
Mosharaf Chowdhury named Morris Wellman Professor
Chowdhury is an expert in network-informed data systems design for big data and AI applications.
“Hiding” network latency for fast memory in data centers
A new system called Leap earned a Best Paper award at USENIX ATC ’20 for producing remote memory access speed on par with local machines over data center networks.
Enabling fairer data clusters for machine learning
The researchers’ approach reduces average job completion time by up to 95% under high system load, while treating every job fairly.
Big data, small footprint
How changing the rules of computing could lighten Big Data’s impact on the internet.
Five papers by CSE researchers presented at NSDI
The teams designed systems for faster and more efficient distributed and large-scale computing.
Chowdhury receives VMware Award to further research on cluster-wide memory efficiency
Chowdhury’s work has produced important results that can make memory in data centers both cheaper and more efficient.
Chowdhury wins NSF CAREER award for making memory cheaper, more efficient in big data centers
Chowdhury connects all unused memory in a data cluster and treats it as a single unit.
Two solutions for GPU efficiency can boost AI performance
Chowdhury’s lab multiplied the number of jobs a GPU cluster can finish in a set amount of time.
Designing a flexible future for massive data centers
A new approach recreates the power of a large server by linking up and pooling the resources of smaller computers with fast networking technology.
A breakthrough for large scale computing
New software finally makes ‘memory disaggregation’ practical.
Jack Kosaian selected for NSF Graduate Research Fellowship
Jack has enjoyed involvement in research across diverse domains within the College of Engineering.
Mosharaf Chowdhury receives ACM SIGCOMM Dissertation Award
Prof. Chowdhury bridges the gap between application-level performance and network-level optimizations through the coflow abstraction.
Mosharaf Chowdhury receives Google Faculty Research Award
The project aims to create a new software stack for analytics over geo-distributed datasets.
Eleven New Faculty Join CSE
We're building a bigger, better CSE.