Research Experience
-
SNU ARC LAB | Internship (2023 - Present)
Contributed to Any-Precision LLM: experimentation, implementation, and core logic optimization.
-
SNU HCI LAB | UROP (2023)
Worked on UMATO, a dimensionality reduction (DR) technique. Optimized UMATO to reach performance comparable to state-of-the-art DR techniques and helped create ZADU, a library for evaluating DR embeddings.
Publications
-
Any-Precision LLM: Low-Cost Deployment of Multiple, Different-Sized LLMs
Paper | Code
Yeonhong Park, Jake Hyun, SangLyul Cho, Bonggeun Sim, Jae W. Lee
International Conference on Machine Learning (ICML) 2024 - Oral Presentation (144/9473 ≈ 1.5%)
Any-Precision LLM enables variable bit-width models, significantly reducing the cost of deploying multiple Large Language Models (LLMs) through lightweight post-training quantization and optimized serving software.
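For intuition only, here is a minimal sketch of the any-precision idea under simplifying assumptions of my own (plain uniform quantization; the paper itself uses a non-uniform, incremental-upscaling scheme): a parent model is stored at a high bit width such that keeping only the top k bits of each weight code yields a working k-bit model.

```python
import numpy as np

def quantize_parent(w: np.ndarray, parent_bits: int = 8):
    """Uniformly quantize weights to a parent bit width.

    Illustrative assumption only: the actual work uses non-uniform
    quantization, not this plain uniform scheme."""
    lo, hi = w.min(), w.max()
    scale = (hi - lo) / (2**parent_bits - 1)
    codes = np.round((w - lo) / scale).astype(np.uint8)
    return codes, lo, scale

def dequantize_at(codes, lo, scale, parent_bits=8, bits=4):
    """Derive a lower-precision model by keeping only the top `bits`
    bits of each parent code -- one stored model, many bit widths."""
    shift = parent_bits - bits
    truncated = (codes >> shift).astype(np.float64)
    # Rescale: each truncated step now spans 2**shift parent steps.
    return lo + truncated * scale * (2**shift)

w = np.random.randn(6)
codes, lo, scale = quantize_parent(w, parent_bits=8)
for b in (8, 4, 3):
    print(b, "bits:", np.round(dequantize_at(codes, lo, scale, 8, b), 3))
```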
-
ZADU: A Python Library for Evaluating the Reliability of Dimensionality Reduction Embeddings
Paper | Code
Hyeon Jeon, Aeri Cho, Jinhwa Jang, Soohyun Lee, Jake Hyun, Hyung-Kwon Ko, Jaemin Jo, Jinwook Seo
IEEE Visualization Conference (IEEE VIS), 2023
ZADU is a Python library that offers efficient and comprehensive evaluation of dimensionality reduction (DR) embeddings through optimized distortion measures.
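To illustrate the kind of distortion measure ZADU bundles, here is a sketch using scikit-learn's trustworthiness, a standard DR quality metric; this is not ZADU's API, just a stand-in for the same class of measurement.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.manifold import trustworthiness

# High-dimensional data and a 2D embedding to evaluate.
X, _ = load_digits(return_X_y=True)
Z = PCA(n_components=2).fit_transform(X)

# Trustworthiness in [0, 1]: how faithfully the embedding's k-NN
# structure reflects the original space (1.0 = no distortion).
score = trustworthiness(X, Z, n_neighbors=20)
print(f"trustworthiness@20: {score:.3f}")
```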
Open Source Contributions
-
flash1dkmeans GitHub
Devised, verified, and implemented a variant of K-means clustering highly optimized for 1D data; it is used directly in quantization work such as Any-Precision LLM to dramatically reduce quantization cost.
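For intuition, a minimal sketch of why 1D K-means admits such optimization (my illustration, not the flash1dkmeans implementation): on sorted data every cluster is a contiguous segment, so Lloyd's assignment step reduces to k−1 binary searches against centroid midpoints, and the mean updates come from a single prefix sum.

```python
import numpy as np

def lloyd_1d(x: np.ndarray, k: int, iters: int = 50) -> np.ndarray:
    """Lloyd's K-means specialized to 1D data (illustrative sketch).

    Clusters of a 1D solution are contiguous ranges of the sorted
    array, so assignments are boundary positions and cluster means
    are recovered from a prefix sum in O(k) per iteration."""
    x = np.sort(x)
    prefix = np.concatenate(([0.0], np.cumsum(x)))
    # Initialize centroids at evenly spaced positions in the sorted data.
    centroids = x[np.linspace(0, len(x) - 1, k).astype(int)].astype(float)
    for _ in range(iters):
        # Assignment: points between adjacent centroid midpoints share
        # a cluster -> k-1 binary searches instead of n*k distances.
        mids = (centroids[:-1] + centroids[1:]) / 2
        bounds = np.concatenate(([0], np.searchsorted(x, mids), [len(x)]))
        # Update: segment means via the prefix sum.
        for j in range(k):
            lo, hi = bounds[j], bounds[j + 1]
            if hi > lo:
                centroids[j] = (prefix[hi] - prefix[lo]) / (hi - lo)
        centroids.sort()
    return centroids

print(lloyd_1d(np.random.randn(1000), k=4))
```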
-
Steadiness & Cohesiveness GitHub
Metrics for evaluating the reliability of dimensionality reduction embeddings, used in ZADU to provide a comprehensive evaluation of DR techniques.
-
UMATO: Uniform Manifold Approximation with Two-phase Optimization GitHub
A novel dimensionality reduction technique that achieves state-of-the-art performance in accuracy, scalability, and stability, preserving both the local and global structure of the data.
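A usage sketch based on the repository's README (the package name umato and the hub_num parameter are taken from there; exact defaults may differ between versions):

```python
# pip install umato  (package name assumed from the repository README)
import umato
from sklearn.datasets import load_digits

X, _ = load_digits(return_X_y=True)

# Two-phase optimization: phase 1 embeds a set of hub points to fix
# global structure; phase 2 refines local neighborhoods around them.
emb = umato.UMATO(hub_num=50).fit_transform(X)
print(emb.shape)  # (n_samples, 2)
```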