Bibliography

[1]

S. Robertson and H. Zaragoza, “The Probabilistic Relevance Framework: BM25 and Beyond,” Foundations and Trends in Information Retrieval, vol. 3, no. 4, pp. 333-389, 2009.

[2]

A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, Ł. Kaiser and I. Polosukhin, “Attention Is All You Need,” arXiv, 2017. [Online]. Available: https://arxiv.org/abs/1706.03762.

[3]

European Commission, “ESCO: Multilingual classification of European Skills, Competences, and Occupations,” [Online]. Available: https://esco.ec.europa.eu/en.

[4]

Y. Liu, M. Ott, N. Goyal, J. Du, M. Joshi, D. Chen, O. Levy, M. Lewis, L. Zettlemoyer and V. Stoyanov, “RoBERTa: A Robustly Optimized BERT Pretraining Approach,” arXiv, 2019. [Online]. Available: https://arxiv.org/abs/1907.11692.

[5]

M. Douze, A. Guzhva, C. Deng, J. Johnson, G. Szilvasy, P.-E. Mazaré, M. Lomeli, L. Hosseini and H. Jégou, “The Faiss library,” arXiv, 2024. [Online]. Available: https://arxiv.org/abs/2401.08281.

[6]

Ollama, “Ollama,” [Online]. Available: https://ollama.com/.

[7]

J. Devlin, M.-W. Chang, K. Lee and K. Toutanova, “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding,” arXiv, 2018. [Online]. Available: https://arxiv.org/abs/1810.04805.
