A Structural Examination of Trust Calibration in Artificial Intelligence and the Contingent Influence of AI-Driven Performance Analytics

Authors

  • Dr K. Kavitha

Keywords

Artificial Intelligence, Trust Calibration, AI-Driven Performance Analytics, Employee Performance, Algorithmic Governance, Structural Equation Modeling

Abstract

The rapid institutionalization of artificial intelligence (AI) within talent management architectures has fundamentally reconfigured decision-making processes, employee evaluation mechanisms, and performance governance systems. Despite widespread adoption, empirical understanding of how employees cognitively and affectively calibrate trust toward AI-enabled systems, and how such trust translates into performance outcomes, remains theoretically fragmented and empirically underexplored. Addressing this gap, the present study develops and empirically validates a structural model examining the mediating role of employee trust calibration in artificial intelligence and the contingent moderating influence of AI-driven performance analytics on employee performance. Data were collected from 300 employees working in AI-enabled organizations across technology-intensive and service-oriented sectors. The findings reveal that AI-enabled talent management systems exert a significant positive influence on employee performance indirectly, through trust calibration mechanisms. Moreover, AI-driven performance analytics significantly moderate the trust–performance relationship, amplifying performance outcomes under conditions of high analytical transparency and perceived algorithmic fairness. The study contributes to the emerging literature on algorithmic governance by advancing trust calibration as a critical psychological conduit through which AI technologies shape human performance. Practical implications emphasize the strategic necessity of trust-sensitive AI deployment to ensure sustainable performance gains.
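The mediation and moderation structure described above can be illustrated with a minimal sketch. This is not the authors' structural equation model or data; it is a simplified regression-based analogue on simulated data, with assumed variable names (X = AI-enabled talent management, M = trust calibration, W = AI-driven performance analytics, Y = employee performance) and assumed effect sizes chosen purely for illustration.

```python
# Hedged sketch: a regression-based analogue of a moderated-mediation model,
# fitted on simulated data (not the study's data or its SEM specification).
import numpy as np

rng = np.random.default_rng(42)
n = 300  # mirrors the study's sample size

X = rng.normal(size=n)                       # AI-enabled talent management
M = 0.6 * X + rng.normal(scale=0.5, size=n)  # trust calibration (path a)
W = rng.normal(size=n)                       # performance-analytics moderator
# Performance: trust effect (path b) amplified by the moderator (interaction)
Y = 0.5 * M + 0.2 * W + 0.4 * M * W + rng.normal(scale=0.5, size=n)

def ols(y, *cols):
    """Least-squares coefficients, with an intercept in the first column."""
    Z = np.column_stack([np.ones(len(y)), *cols])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return beta

a = ols(M, X)[1]               # path a: X -> M
coefs = ols(Y, M, W, M * W)    # path b, moderator, and interaction terms
b, b3 = coefs[1], coefs[3]

indirect = a * b               # indirect (mediated) effect of X on Y
print(f"a={a:.2f}, b={b:.2f}, interaction={b3:.2f}, indirect={indirect:.2f}")
```

A positive interaction coefficient here corresponds to the amplification effect the abstract reports: the trust–performance slope steepens as the analytics moderator increases.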


How to Cite

Kavitha, K. (2026). A structural examination of trust calibration in artificial intelligence and the contingent influence of AI-driven performance analytics. International Journal of Engineering Science & Humanities, 16(1), 479–490. Retrieved from https://www.ijesh.com/j/article/view/627
