Presenting a Theoretical Framework for Designing the Immaterial Space of Architecture Based on Trust (TRUST) in the Fourth Wave of Human-Computer Interaction (HCI)
Subject area: Architecture
Seyedeh Mastoureh Mousavi 1, Vahid Shali Amini 2, Mehdi Khakzand 3, Morteza Rahbar 4, Parisa Alimohammadi 5
1 - PhD Candidate, Faculty of Architecture and Urban Planning, Islamic Azad University, Central Tehran Branch, Tehran, Iran.
2 - Assistant Professor, Department of Architecture and Urban Planning, Islamic Azad University, Central Tehran Branch, Tehran, Iran.
3 - Associate Professor, Department of Architecture, Iran University of Science and Technology, Tehran, Iran.
4 - Assistant Professor, Department of Architecture, Iran University of Science and Technology, Tehran, Iran.
5 - Assistant Professor, Department of Architecture and Urban Planning, Islamic Azad University, Central Tehran Branch, Tehran, Iran.
Keywords: Human-Technology Agency, Trust, Virtual Reality, Immaterial Architectural Space, Anthropological Design
Abstract:
The encounter between humans and technology in the fourth wave reflects the deep entanglement of the interactive processes between them in the present era. By analyzing the existing sources and presenting them in the form of a theoretical framework, the present study establishes connections between the involved disciplines and identifies the gap in the existing architectural literature; on this basis, and with the aim of removing bias toward emerging technologies, it proposes a theoretical framework for designing the immaterial architectural space based on trust. Through a systematic review, 242 articles in the fields of human-computer interaction, computer ethics, artificial intelligence, post-pandemic effects, environmental psychology, and related areas, published between 2017 and 2022, were examined; after final screening and analysis in MAXQDA, the following conclusion was reached: in the future, anthropologically based design will play a very important role in confronting emerging technologies. Gaining people's trust by making the design of the immaterial architectural environment ever more anthropological can encourage greater use of and engagement with this new lifestyle.
The fourth wave of human-computer interaction (HCI) refers to the entanglement of a new form of interactive process between humans and emerging technologies and to the presence of these technologies in everyday human life. Emphasizing the emergence of artificial intelligence (AI), social robotics, virtual reality (VR), neural implants, cyber-physical systems, smart spaces, and autonomous vehicles (AVs), recent scientific literature has recognized the fourth HCI wave as necessary for showing how human-machine interaction has become an ambiguous subject as the boundaries between humans and technology blur. By reviewing, analyzing, and conceptually framing this literature, the present study seeks possible relationships between the involved disciplines and the gaps in the architectural literature. The results can be used to (1) identify research gaps in HCI from an architectural perspective in order to eliminate bias against emerging technologies, (2) explain theoretically how to design a virtual architectural space while addressing those gaps, and (3) recommend a path for future studies. In line with the research goal, and to develop theoretical foundations toward a comprehensive concept that integrates components from different fields of knowledge, systematic review and content analysis were used; the latent content and thematic connections of the literature were examined with a structured, precise method. In the first stage, a broad range of literature with a rich variety of keywords was investigated, from which the main keywords were extracted for final analysis and coding across three specific layers and different disciplines. To this end, 242 articles published between 2017 and 2022, as well as 21 doctoral theses published between 2019 and 2022, were reviewed. In the second stage, after final screening, 73 articles were analyzed more closely using thematic analysis in MAXQDA and were categorized into 5 main codes and 17 sub-groups.
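As an illustration of the two-stage review described above, the following Python sketch models the screening funnel on toy data. The record fields, keyword sets, and inclusion judgments are hypothetical stand-ins, since the abstract reports only the resulting counts (242 articles plus 21 theses screened broadly, 73 articles retained for coding).

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    """One reviewed source (article or thesis); fields are illustrative only."""
    title: str
    year: int
    keywords: set = field(default_factory=set)
    passes_final_screen: bool = False  # hypothetical second-stage (full-text) judgment

def two_stage_screen(records, topic_keywords, year_range=(2017, 2022)):
    """Stage 1: keep sources in the review window that match any topic keyword.
    Stage 2: keep only those judged relevant in the closer screening."""
    stage1 = [r for r in records
              if year_range[0] <= r.year <= year_range[1]
              and r.keywords & topic_keywords]
    stage2 = [r for r in stage1 if r.passes_final_screen]
    return stage1, stage2

# Toy usage: in the study, stage 1 covered 242 articles (plus 21 theses)
# and stage 2 retained 73 articles for thematic coding in MAXQDA.
corpus = [Record("Trust in automation", 2019, {"trust", "HCI"}, True),
          Record("Smart home adoption", 2016, {"smart home"}, False)]
broad, final = two_stage_screen(corpus, {"trust", "HCI", "virtual reality"})
print(len(broad), len(final))  # -> 1 1
```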
The findings of the content analysis suggest that knowledge of HCI and its subfields in the fourth wave is still evolving, both theoretically and practically. The importance of HCI and its extensive use across disciplines is reflected in the coding results: it accounted for 59% of the extracted content, followed by "Trust" at 19%, underscoring the weight of the subject. The low frequency of design-related codes can be attributed to anthropological factors, since information about users' awareness, preferences, and needs, and about how these affect the design of virtual architectural space, remains limited. This relationship has received little architectural discussion and therefore lacks a comprehensive definition.
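The reported shares can be reproduced from raw code frequencies with a short calculation. In the sketch below the segment counts, and the code labels other than "HCI" and "Trust", are assumed placeholders standing in for a MAXQDA code-frequency export, chosen only so that the resulting percentages match the 59% and 19% reported above.

```python
from collections import Counter

# Hypothetical coded-segment counts per main code (placeholder values).
code_counts = Counter({"HCI": 590, "Trust": 190, "Ethics": 90,
                       "Design": 80, "Anthropology": 50})

total = sum(code_counts.values())
shares = {code: round(100 * n / total) for code, n in code_counts.items()}
print(shares)  # e.g. {'HCI': 59, 'Trust': 19, 'Ethics': 9, 'Design': 8, 'Anthropology': 5}
```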
The results show that, at this stage, building trust through anthropologically designed environments can encourage further adoption of this new lifestyle. It should be kept in mind that trust in technology and machines is volatile, as it is a feeling that varies across situations. The study therefore emphasizes that anthropologically informed design of virtual and intelligent environments is essential for enhancing human-technology interaction, and that design based on trust-centered anthropological approaches increases the likelihood that emerging technologies will be welcomed and virtual spaces engaged with.