Transformer-Based Semantic Embedding Model for Resume-Job Matching in Intelligent Talent Screening

Authors

  • Yuerong Yan, Shanghai Zizen Consulting Co., Ltd., Shanghai, China

DOI:

https://doi.org/10.70088/mvanpe51

Keywords:

Transformer, Resume-Job Matching, Semantic Embedding, Dual-Tower Model, Talent Screening

Abstract

Addressing the need for semantic matching between resume text and job descriptions, this study investigates the application of Transformer-based semantic embedding models to intelligent talent screening. We construct separate semantic encoding networks for resumes and job postings, and design a dual-tower embedding matching structure with a semantic scoring mechanism to achieve unified semantic representation and matching-based ranking of candidates against job requirements. Experiments on real recruitment datasets validate model performance, and ablation studies analyze the contributions of different semantic features. The model achieves 89.47% accuracy, 88.63% recall, and an F1 score of 89.04%. Removing job constraint semantics reduces accuracy to 84.58%, demonstrating that integrating semantic embedding with constraint fusion significantly enhances recruitment matching effectiveness.
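The dual-tower matching and ranking described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes each tower (one Transformer encoder for resumes, one for job postings) has already produced a fixed-size embedding vector, and scores candidate-job pairs by cosine similarity; the function names `cosine_score` and `rank_candidates` are hypothetical.

```python
import numpy as np

def cosine_score(u: np.ndarray, v: np.ndarray) -> float:
    """Semantic match score: cosine similarity between the two tower embeddings."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def rank_candidates(job_vec: np.ndarray, resume_vecs: list) -> list:
    """Rank candidate resume embeddings against one job embedding,
    highest semantic score first. Returns (candidate_index, score) pairs."""
    scores = [cosine_score(job_vec, r) for r in resume_vecs]
    order = np.argsort(scores)[::-1]
    return [(int(i), scores[int(i)]) for i in order]

# Toy usage with pretend 4-dimensional tower outputs:
job = np.array([0.9, 0.1, 0.0, 0.3])
resumes = [
    np.array([0.1, 0.9, 0.2, 0.0]),  # weak semantic overlap with the job
    np.array([0.8, 0.2, 0.1, 0.3]),  # strong semantic overlap with the job
]
ranking = rank_candidates(job, resumes)
```

In a dual-tower setup, the two encoders are run independently, so job embeddings can be precomputed and candidates scored with a single similarity pass; the constraint-fusion step the paper ablates would adjust these scores with hard job requirements before ranking.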

Published

10 April 2026

Section

Article

How to Cite

Yan, Y. (2026). Transformer-Based Semantic Embedding Model for Resume-Job Matching in Intelligent Talent Screening. Artificial Intelligence and Digital Technology, 3(1), 82-91. https://doi.org/10.70088/mvanpe51