IBM Research takes responsibility for technology and its role in society. Working in IBM Research means you'll join a team that invents what's next in computing, always choosing the big, urgent, and mind-bending work that endures and shapes generations. Our passion for discovery and excitement for defining the future of tech is what builds our strong culture around solving problems for clients and seeing the real-world impact that you can make.
IBM's product and technology landscape includes Research, Software, and Infrastructure. Entering this domain positions you at the heart of IBM, where growth and innovation thrive.
We are seeking a PhD-level summer intern to help design and advance the next generation of Hybrid Cloud AI platforms. The role focuses on working with full-time IBM researchers on highly scalable systems for machine learning (training and inference) that are both novel and impactful. Our research builds on open-source platforms such as llm-d, Kubernetes, and KServe, and explores optimizations across the entire stack, from GPU networking, model scheduling, and serving to AI platform optimization, including inference optimization, performance modeling, and Compound AI / Agentic systems. You will work with an agile team of researchers and engineers developing practical innovations in scalable GenAI systems that can impact thousands of developers and applications worldwide.
- PhD student in Computer Science, Computer Engineering, or a related discipline
- Research background in systems for generative AI (training or inference)
- Experience with distributed systems or microservices for data or ML workloads
- Familiarity with cloud-native platforms (Kubernetes, Docker, or hybrid cloud environments)
- Proficiency in at least one of the following languages: Python, C++, Go, Java, or Rust
- Knowledge of open-source large language model frameworks (e.g., Hugging Face, PyTorch, DeepSpeed)
- Familiarity with open-source serving platforms such as vLLM, llm-d, and KServe
- Research or development experience in GPU networking and large-scale acceleration
- Research or development experience in inference performance modeling and optimization