
| Title | Venue, Year |
|---|---|
| Initialization is Critical to Whether Transformers Fit Composite Functions by Reasoning or Memorizing | Neural Information Processing Systems (NeurIPS), 2024 |
| Exploring the Compositional Deficiency of Large Language Models in Mathematical Reasoning | Conference on Empirical Methods in Natural Language Processing (EMNLP), 2024 |
| What Makes Models Compositional? A Theoretical View: With Supplement | International Joint Conference on Artificial Intelligence (IJCAI), 2024 |
| Iterated Learning Improves Compositionality in Large Vision-Language Models | Computer Vision and Pattern Recognition (CVPR), 2024 |
| Reasoning Abilities of Large Language Models: In-Depth Analysis on the Abstraction and Reasoning Corpus | ACM Transactions on Intelligent Systems and Technology (ACM TIST), 2024 |
| The Pitfalls of Next-Token Prediction | International Conference on Machine Learning (ICML), 2024 |
| Will GPT-4 Run DOOM? | IEEE Transactions on Games (IEEE Trans. Games), 2024 |