Happy to share that I got tenured last month!
While every phase in life is special, this one feels a bit more meaningful, and it made me reflect on the past 15+ years in academia. I'd like to thank @UWMadison and @UWMadisonECE for tremendous support throughout the past six years, helping me grow.
I am very grateful to all the teachers I’ve met over 15+ years of research since undergrad. Prof. Sae-Young Chung introduced me to engineering and, in particular, to information theory. Prof. Yung Yi and Prof. Song Chong introduced me to communication network theory, and from Prof. Yung Yi I learned true passion for research. I miss him a lot.
At Berkeley, I learned everything about research from my advisor Prof. Kannan Ramchandran. In particular, I learned that the most important motivation behind great research is endless curiosity and the desire to really understand how things work. From my postdoc mentor Prof. Changho Suh at KAIST, I learned the mindset of perfection, making every single paper count.
During my assistant professorship, I was lucky to have the best colleagues. I learned so much from Rob (@rdnowak) and Dimitris (@DimitrisPapail). I am still learning from Dimitris' unique research taste and Rob's example of how to live as the coolest senior professor. I also learned a lot from the Optibeer folks Steve Wright and Jeff Linderoth, and from my ECE colleagues Ramya (@ramyavinayak) and Grigoris (@Grigoris_c). Thank you all!
I’d like to thank my former students and postdocs too. Daewon and Jy-yong (@jysohn1108) joined my lab early on and worked on many interesting projects. Changhun and Tuan (@tuanqdinh) each joined midway through their PhDs and took on fascinating research problems; in particular, Tuan initiated our lab’s first LLM research five years ago!
Yuchen (@yzeng58), Ziqian (@myhakureimu), and Ying (@yingfan_bot) joined around the same time, and working with them has been the most fun and rewarding part of my job. Each took on a challenging topic and did great work. Yuchen advanced LLM fine-tuning, especially parameter-efficient methods. Ziqian resolved the mystery of LLM in-context learning. Ying explored "a model in a loop," focusing on diffusion models and looped Transformers.
They all graduated earlier this year and are continuing their research at @MSFTResearch and @Google. Best wishes! 🥰
I am also grateful to co-advise Nayoung (@nayoung_nylee), Liu (@Yang_Liuu), and Joe (@shenouda_joe) with Dimitris and/or Rob. Nayoung's work on Transformer length generalization, Liu's on in-context learning, and Joe's on the mathematical theory of vector-valued neural networks are all very exciting. They are all graduating very soon, so stay tuned! (And reach out to them if you have great opportunities!)
I also had the pleasure of working with master's students Ruisu, Andrew, Jackson (@kunde_jackson), Bryce (@BryceYicongChen), and Michael (@michaelgira23), as well as many visiting students and researchers. Thank you for being such great collaborators.
I’d like to thank and introduce the new(ish) members too. Jungtaek (@jungtaek_kim) and Thomas are studying LLM reasoning. Jongwon (@jongwonjeong123) just joined, and interestingly, he was an MS student in Prof. Chung’s lab at KAIST, which makes him my academic brother turned academic son. Ethan (@ethan_ewer), Lynnix, and Chungpa (visiting) are also working on cool LLM projects!
Thank you to @NSF, @amazon, @WARF_News, @FuriosaAI, @kseayg, and KFAS for generous funding. I also learned a lot from leading and working with the AI team at @Krafton_AI, particularly with Jaewoong (@jaewoong_cho), so thank you for that as well.
Last and most importantly, thanks to my family! ❤️
I only listed my mentors and mentees here, not all my amazing collaborators, but thank you all for the great work together.
With that, I’m excited for what’s ahead, and so far, no "tenure blues." Things look the same, if not more exciting... haha!