Exploring Concept Depth: How Large Language Models Acquire Knowledge at Different Layers?

This repository contains the source code for the project website of "Exploring Concept Depth" (COLING 2025).

Please cite our work:

@inproceedings{jin-etal-2025-exploring,
    title = "Exploring Concept Depth: How Large Language Models Acquire Knowledge and Concept at Different Layers?",
    author = "Jin, Mingyu and Yu, Qinkai and Huang, Jingyuan and Zeng, Qingcheng and Wang, Zhenting and Hua, Wenyue and Zhao, Haiyan and Mei, Kai and Meng, Yanda and Ding, Kaize and Yang, Fan and Du, Mengnan and Zhang, Yongfeng",
    editor = "Rambow, Owen and Wanner, Leo and Apidianaki, Marianna and Al-Khalifa, Hend and Eugenio, Barbara Di and Schockaert, Steven",
    booktitle = "Proceedings of the 31st International Conference on Computational Linguistics",
    month = jan,
    year = "2025",
    address = "Abu Dhabi, UAE",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2025.coling-main.37/",
    pages = "558--573"
}

Website License

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
