Basic Information
Career Trajectory
Bio
- Working on a broad range of text tasks, including NER, closed-domain multi-span question answering, summarization, multi-task learning, self-supervised learning, and few-shot/meta-learning for low-resource tasks.
- Long-standing collaboration with researchers from Columbia University and Google Brain on projects to create fast, scalable attention mechanisms with provable convergence properties and to apply them to diverse tasks in vision, audio, text, and graphs. (ICLR 2022; ICML 2022, 2023)
- Previously worked on self-supervised learning with graph neural networks to predict diseases from single-cell data. (Accepted at ACM CHIL 2020; Spotlight at the ICML GRL+ 2020 workshop; AAAI 2021)
- Applied deep learning to problems in quantum mechanics and statistical physics. (Accepted at the AAAI-MLPS Spring Symposium 2021; two papers accepted at the NeurIPS ML4Physical Sciences workshop)
- Worked on various generative models, including normalizing flows, VAEs, and GANs.
- Created GLYPH + BERT at the Human Language Technology Center of Excellence, JHU, by augmenting BERT with encoded images of Chinese characters, improving NER performance over the baseline BERT model by +1 on Chinese OntoNotes and +3 on Weibo. (Accepted at AAAI 2020)
- Graduated with a PhD in mathematics, with a passion for teaching, learning, and research. My thesis focused on modular Galois deformations and Iwasawa theory.
- Also interested in using mathematical ideas to tackle gerrymandering and other social justice issues.
Research Interests
Papers (24 total)
- Sang Min Kim, Byeongchan Kim, Arijit Sehanobish, Krzysztof Choromanski, Dongseok Shim, Avinava Dubey, Min-hwan Oh. arXiv (2024). Citations: 0.
- arXiv.org (2023).
- International Conference on Machine Learning, Vol. 162 (2022). Citations: 11.
- Conference on Empirical Methods in Natural Language Processing, pp. 332-347 (2022).
- CoRR, pp. 332-347 (2022).
Author Statistics
#Papers: 24
#Citations: 94
H-Index: 6
G-Index: 9
Sociability: 4
Diversity: 0
Activity: 1
Data Disclaimer
The page data are from open Internet sources, cooperative publishers and automatic analysis results through AI technology. We do not make any commitments and guarantees for the validity, accuracy, correctness, reliability, completeness and timeliness of the page data. If you have any questions, please contact us by email: report@aminer.cn