
Enhancing Existing Reality: A Theoretical Analysis of User-focused Mixed Reality Research
Abstract
Background Recent advancements in mixed reality (MR) technologies, driven by improvements in form factor and tracking capabilities, have rapidly transformed head-mounted displays into consumer-facing devices. As MR becomes more accessible, research has increasingly focused on maximising its potential. Interestingly, while much attention is being given to novel MR interactions, comparatively little research holistically examines the user experience (UX) in MR, especially regarding the context of use, the products involved, and the emotional or narrative impacts of MR-enhanced experiences (Forlizzi & Battarbee, 2004).
Methods To address this gap, we employed well-established UX frameworks from Human-Computer Interaction (HCI) literature as an analytical lens, conducting a review of 65 MR user studies from which we identified key components of the MR user experience including interaction type, experience type, users, products, and context of use. These elements are considered essential in understanding the full spectrum of UX.
Results Our findings indicate that MR user-focused research often prioritises the development of new MR interactions that demand significant cognitive load from users while tending to overlook the more familiar, fluent interactions that users are accustomed to in their physical environments. As such, we believe that by focusing on existing physical products and contexts, MR design can support rather than disrupt familiar interactions, thereby reducing cognitive load and enhancing the overall user experience.
Conclusions MR user experience design can benefit from a shift in focus toward supporting existing physical interactions within the MR environment. We suggest that MR UX research consider more deeply the products and contexts beyond the HMD and the immediate virtual content, potentially creating more intuitive and seamless MR experiences and, ultimately, improved user satisfaction and effectiveness. By leveraging MR to enhance rather than replace familiar interactions, designers can create more user-friendly and contextually appropriate MR applications.
Keywords:
Mixed Reality, Augmented Reality, User Experience, Human-Computer Interaction
1. Introduction
The rapid advancement of mixed reality (MR) technologies has ushered in a new era of immersive user experiences (UX), blending the physical and digital worlds in unprecedented ways. As the technology continues to evolve, developing a deeper understanding of the user experience will be crucial for maximising its potential and addressing emerging challenges. However, MR’s unique interaction affordance - blending both physical and digital mediums - has led to an intense research focus on specific individual interactions, and less research that explores a more holistic user experience (Alexandrovsky et al., 2021; Pamparǎu & Vatavu, 2020). Moreover, previous UX research highlights that interaction is just one aspect of the user experience. In reality, it is an inter-connected web of interactions, products, and contexts of use that, in turn, allow users to create emotional narratives in the form of user experiences (Bargas-Avila & Hornbæk, 2011; Forlizzi & Battarbee, 2004; Forlizzi & Ford, 2000; Law et al., 2009). Hollan and Stornetta (1992) made the point that these technologies should go ‘beyond being there’ by satisfying the unmet needs of users in existing situations and creating interfaces that address those needs. Furthermore, early UX research noted that a usability focus on task efficiency often neglected other core components of the user experience (Bargas-Avila & Hornbæk, 2011). This dilemma has been identified in existing MR research, where a strong focus on specific MR interactions has failed to provide a clear and practical understanding for designers of what constitutes a ‘good’ MR UX (Krauß et al., 2021; Speicher et al., 2019).
Despite significant research delving into the taxonomy of technologies, algorithms, and capabilities that comprise MR, determining what aspects of MR constitute UX and which areas MR designers should focus on to improve UX remains an open research question (Krauß et al., 2021; Speicher et al., 2019).
To understand the MR experience, UX research must expand from a single focus on interaction to a comprehensive understanding of the user experience. Hence, in this paper, we begin by analysing existing MR studies with a focus on the entire UX, defining it in this context as a holistic approach to the user’s experience that includes consideration of their context of use, interaction type, and experience type - unlike other reviews, which focus purely on interaction or usability (Dey et al., 2018; Papadopoulos et al., 2021). To achieve this, we draw on well-established UX frameworks that have already been used to analyse and unpack UX.
From a perspective based on Forlizzi and Ford’s influential UX frameworks (Forlizzi & Battarbee, 2004; Forlizzi & Ford, 2000), we review 65 MR studies to identify the following:
(a) The current focus of MR user-centred research.
(b) Aspects of MR UX that are unique and under-researched.
In doing so, we address the following research question:
In what ways can existing user experience design frameworks inform designers about current MR experiences, and do these frameworks apply in an MR context?
We contribute a form of analysis for MR UX that can be used to identify areas for improvement in the current MR design. By applying a UX lens to MR in the way we attempt in this paper, we can identify potential gaps in the current MR design that may be addressed to deliver an enhanced MR UX for users.
2. Related Works
In this section, we review the existing literature on MR to contextualise the current state of research and identify gaps in understanding user experience (UX) in this rapidly evolving field. We also define MR and examine critical studies that have shaped its conceptualisation. Following this, we delve into the numerous literature reviews on MR, augmented reality (AR), and virtual reality (VR), highlighting their contributions and limitations, while aiming to establish a foundation for further exploration of user experience in mixed-reality environments.
2. 1. Mixed Reality
We begin this related works section by succinctly clarifying MR based on the existing literature. Perhaps the most common understanding of MR is based on Milgram and Kishino’s (1994) reality-virtuality (RV) continuum, which conceptualises MR as a continuum from an entirely physical to a virtual environment (Skarbez et al., 2021). Everything between these two points is considered MR, as illustrated in the figure below.
Since the continuum’s conception in 1994, MR technologies have developed considerably, and much research has highlighted that the continuum is predominantly focused on visual displays (Milgram & Kishino, 1994; Speicher et al., 2019), with a focus on visual interactions also present in the literature. For example, Dey et al.’s (2018) review found that 281 studies (96%) focused on augmenting the visual sense. However, researchers argue that the continuum’s leaning toward the visual sense may not be entirely appropriate for MR in its current form (Speicher et al., 2019). Rauschnabel et al.’s (2022) review instead distinguishes MR experiences based on the user’s presence, separating Augmented Reality from Virtual Reality by suggesting users cannot be immersed in both simultaneously. Furthermore, new technologies have opened up interactions for each of the human senses (Speicher et al., 2019) and for multi-user or collaborative settings (Billinghurst & Kato, 2002; Rauschnabel et al., 2022; Speicher et al., 2019).
For this literature review, we consider MR to include the concepts presented in Milgram and Kishino’s (1994) RV continuum. In particular, we align with other well-recognised definitions of MR, which propose overlaying real-world environments with digital, computer-generated objects allowing the user to perceive both the digital and physical environments simultaneously (Azuma, 1997).
2. 2. MR Literature Reviews
Many MR, AR and VR literature reviews have been conducted in the past twenty years. Swan and Gabbard (2005) reviewed 1104 augmented reality papers with a particular focus on user-based experimentation while Costanza et al. (2009) provided a comprehensive review of existing MR applications. Meanwhile, Rokhsaritalemi et al. (2020) reviewed 117 MR articles to construct a framework for necessary components in MR applications, and Merino et al. (2020) reviewed 458 papers to report how evaluations are conducted in MR research. Furthermore, some studies have compared literature with industry professionals to attempt to foster a shared understanding of MR, AR, or XR (Rauschnabel et al., 2022; Speicher et al., 2019).
There have also been reviews conducted on specific technologies, such as research into augmented reality (Billinghurst, 2021; Billinghurst et al., 2014; Billinghurst & Kato, 2002; Goncalves et al., 2021; Tan et al., 2001), or specific clarifications of extended reality, XR and the metaverse (Almoqbel et al., 2022; Ratcliffe et al., 2021; Rauschnabel et al., 2022). Moreover, many reviews have been conducted on specific topics such as interaction, collaboration, usability and analytics (Billinghurst & Kato, 2002; Dey et al., 2018; Nebeling et al., 2020; Papadopoulos et al., 2021), and finally, literature reviews exist on MR in specific domains, such as education or healthcare (Ali et al., 2019; Howard & Davis, 2022; C. R. Nelson & Gabbard, 2023; Viglialoro et al., 2021; Xia et al., 2023).
Nevertheless, while there have been extensive reviews of MR technology and the domains in which it is being utilised, there appears to be a lack of literature focused on understanding the user’s experience and how to design for it (Davis & Aslam, 2024; Krauß et al., 2021). The majority of current literature reviews have made extensive efforts to define the technical components of an MR experience and, in some cases, categorise the types of specific interactions that constitute one. For example, Plopski et al. (2022) discuss how AI is used to improve aspects such as object tracking for user interaction or reduce motion sickness. Meanwhile, Ghamandi et al. (2023) produced a taxonomy of MR tasks that, when viewed in detail, could aid in understanding interactions with an MR system. However, we argue that technical components and interactions do not represent the entire user experience, and in most of these literature reviews, the authors call for more profound research into MR UX that extends beyond specific gaze or gesture interactions (Dey et al., 2018; Swan & Gabbard, 2005).
This challenge is common in the early stages of UX research in the area of novel technology (Bargas-Avila & Hornbæk, 2011). Indeed, early UX research argues that usability research was too intensely focused on a task-oriented view of interaction rather than a focus on more hedonic qualities such as aesthetics or affect (Bargas-Avila & Hornbæk, 2011). UX research tends to consider the users, products, contexts of use and narratives that form user experiences (Forlizzi & Battarbee, 2004; Forlizzi & Ford, 2000). Interactions are considered to make up a piece of this puzzle but are not the only factor in understanding the user’s experience. We, therefore, believe MR research can benefit from UX-positioned analysis, identifying how interactive products, characteristics and context work together in shaping an experience of use (Forlizzi & Battarbee, 2004).
We are not alone in identifying this gap: In other MR literature reviews, it has been noted that despite the rapid innovation of products that enable MR, overall the design of the user experience has been neglected (Pamparǎu & Vatavu, 2020). These reviews have found that user-focused studies mostly explore usability and interactions and that MR development still lacks relevant and applicable knowledge for improving user experience design (Davis & Aslam, 2024; Krauß et al., 2021).
3. Methodology
In our analysis of existing MR literature, we found that many reviews predominantly focus on interactions and technical specifications. To provide a more comprehensive understanding, we decided to analyse MR research through the lens of well-established UX frameworks. First, we define ‘user experience’ and the chosen analytical frameworks.
3. 1. Defining User Experience and Selecting a Framework
Critical analyses of user experience studies tend to approach the topic from the following perspectives: some take a holistic view of product interactions, emphasizing all aspects of product use, while others take a positive viewpoint that excludes a utilitarian, task-related focus to instead explore the joy of use, aesthetics, and values (Bargas-Avila & Hornbæk, 2011). Each of these perspectives has its own deep body of literature, and many UX review papers have pointed out that “UX research is fragmented and complicated by diverse theoretical models with different foci such as pragmatism, emotion, affect, experience, value, pleasure, beauty and hedonic quality” (Law et al., 2009). Law et al. (2009) pointed out that, generally, with the expansion of popular UX concepts and the rapidly increasing interest in them, UX has become something desirable, but without a clear definition of what that ‘something’ is.
However, underneath the majority of HCI user research sit a few key influential articles. Forlizzi and Ford’s (2000) ‘The Building Blocks of Experience’ and Forlizzi and Battarbee’s (2004) ‘Understanding Experience in Interactive Systems’ have been cited 852 and 1586 times respectively on Google Scholar, presenting frameworks that have been used to define, categorize and broadly understand most conceptions of user experience in HCI research (Zimmerman et al., 2010).
In the framework for understanding the user experience, they present three types of interaction between product and user - fluent, cognitive and expressive. Fluent interactions are automatic; cognitive interactions focus on the product at hand, resulting in knowledge or confusion; and expressive interactions help the user form a relationship with the product. They also describe three types of experience - experience, an experience and co-experience - stating that experience is a constant stream of ‘self-talk’ that occurs when interacting with a product, while an experience has a more definitive beginning and end, inspiring behavioural change. Co-experiences, meanwhile, include other users, creating meaning and emotion through product use (Forlizzi & Battarbee, 2004; Forlizzi & Ford, 2000).
Further, they highlight that all experiences consist of three key components—the user, product, and context of use. They discuss how users represent people who influence experience and products represent artefacts that influence experience, and finally, they discuss how these product-user interactions must take place in a ‘context of use’ (Figure 2). They conclude that a successful design will take into consideration all of the components in the user-product interaction: user, product and context of use’ (Forlizzi & Battarbee, 2004; Forlizzi & Ford, 2000).
These articles suggest that designers looking to fully understand experience should learn about the most basic interactions and how experiences unfold and are articulated. Forlizzi’s frameworks have been applied directly and indirectly across service design, game design, and interaction design throughout the past two decades (Arhippainen, 2003; Choi et al., 2016; DiSalvo et al., 2004; Forlizzi et al., 2003; Koskinen et al., 2024; Lee & Forlizzi, 2009). This framework, therefore, provides an appropriate starting place for categorising what a mixed reality user experience consists of and may reveal worthwhile areas of exploration for future MR UX research.
3. 2. The Literature Search
We began this search by identifying as many user-focused MR studies as possible within the scope and timeline of our project, with the goal of comparing each of these studies with the selected framework. We adopted an approach based on the QUOROM method for systematic reviews and evidenced in other MR systematic literature reviews (Figure 3). This approach is explained in detail in the following sections (Quintero et al., 2019; Saadon et al., 2020).
3. 3. Phase 1 - Identification of Potentially Relevant Publications
A Google Scholar search for relevant user studies in MR returned far more research than was feasible for the scope of this review, as it included all domains, technologies and disciplines. Instead, previous literature reviews on MR highlight ISMAR and CHI as suitable venues for presenting a sound body of representative MR research (Kent et al., 2021; Merino et al., 2020). We decided to specifically search the SCOPUS database for the ACM digital library’s SIGCHI group, as it encompasses a wide range of venues including, but not limited to, ISMAR and CHI. We also restricted the article search to 2018 or later, as MR user research from 2005-2014 has already been evaluated, and in our analysis of these literature reviews, we found that the description of MR can change with the technology’s development (Davis & Aslam, 2024; Merino et al., 2020). By selecting articles from 2018 onward, we could ensure the version of MR in question aligned with our expected definitions (Davis & Aslam, 2024; Merino et al., 2020).
We searched the ACM digital library SIGCHI group with the search terms “mixed reality”, “augmented reality”, and “virtual reality”, as well as “user experience”, “user study”, “user value”, and “user needs.” Interestingly, the terms “mixed reality user experience”, “mixed reality user needs”, and “mixed reality user value” did not return many specific results—mostly returning articles containing the words ‘mixed reality’ and ‘user’, or just ‘reality’ and ‘user’. We then introduced the search term ‘mixed reality interaction’, after which many more results became available.
The search returned 783 articles to be screened for inclusion in Phase 2. We imported all results into Mendeley and checked for duplicates, excluding 53 articles. In addition to our search restrictions, we removed any articles without author listings, such as conference proceedings, and any articles outside the year restriction, leaving 715 articles remaining.
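The screening steps above amount to a simple filtering pipeline. The following Python sketch is purely illustrative of that logic - it is not the tooling we used, and the record field names ("title", "year", "authors") are hypothetical rather than drawn from any real reference-manager export:

```python
# Illustrative sketch of the Phase 1 screening steps: de-duplication,
# the 2018+ year restriction, and removal of records without author
# listings. Field names and sample data are invented for illustration.

def screen_records(records, min_year=2018):
    """Keep records that are unique by title, dated min_year or later,
    and carry an author listing."""
    seen_titles = set()
    kept = []
    for rec in records:
        title = rec.get("title", "").strip().lower()
        if not title or title in seen_titles:
            continue  # duplicate or empty title
        if rec.get("year", 0) < min_year:
            continue  # outside the year restriction
        if not rec.get("authors"):
            continue  # e.g. bare conference-proceedings entries
        seen_titles.add(title)
        kept.append(rec)
    return kept

records = [
    {"title": "MR Study A", "year": 2020, "authors": ["X"]},
    {"title": "MR Study A", "year": 2020, "authors": ["X"]},   # duplicate
    {"title": "MR Study B", "year": 2016, "authors": ["Y"]},   # too early
    {"title": "Proceedings of ...", "year": 2021, "authors": []},  # no authors
    {"title": "MR Study C", "year": 2022, "authors": ["Z"]},
]
print([r["title"] for r in screen_records(records)])  # ['MR Study A', 'MR Study C']
```

Deduplicating on normalized titles, as here, is one simple heuristic; reference managers such as Mendeley typically also compare DOIs and author lists.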
The scope of our project required a more defined, representative corpus. Nelson highlights that corpus design should strive for a reasonable representation of the full repertoire of available texts (M. Nelson, 2010). Moreover, Merino et al. (2020), amongst others, highlight that only ISMAR and CHI demonstrate a sound representative body of MR literature (Kent et al., 2021). Thus, we aimed to identify a representative corpus across SIGCHI over the past five years (Figure 4). We began by organizing all the articles according to their venue type, article type, relevant keywords, and publication year. Then, we utilized a title and abstract screening approach, whereby the authors read each title and abstract to determine the technology utilized in the study (AR, VR, XR, etc.) and whether the study included some kind of user test to ensure it was user-focused. In this section, we specifically excluded studies focused on the technical specifications of an MR implementation.
This reduced our selection to 65 articles across ISMAR, VRST, UIST, SUI, MOBILE HCI, CHI, CHIPLAY, and the Journal of Human-Computer Interaction, as visualised in Figure 5.
3. 4. Phase 2 - Analysis and Coding
For the 65 individual articles collected, we completed two phases of coding and analysis in accordance with the types of user-product interactions (Table 1), types of experience and users (Table 2) and products and contexts of use (Table 3) presented in the frameworks (Forlizzi & Battarbee, 2004; Forlizzi & Ford, 2000). These frameworks were selected as they are widely reflected upon in HCI research when a clearer understanding of experience is needed. When considered in conjunction with one another, they present a holistic analysis of experience, which has proven useful in identifying key user experience components (Jensen, 2013; Marti & Iacono, 2016; Ortiz et al., 2011). The authors completed these two phases of coding, with consistency and reliability meetings held to ensure that the applied codes matched between authors. The coding results found in the Appendices were checked multiple times by each author to ensure agreement regarding the results and that the coding was appropriate.
The definitions presented in the previous section acted as the basis for our coding approach. With each paper collected, we noted which interaction and experience type the study was most aligned to on a scale of 0 (not aligned), 0.5 (semi-aligned) and 1 (aligned). A detailed analysis of our approach can be found in the appendices.
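As a hypothetical sketch of how such codes can be aggregated, the 0 / 0.5 / 1 alignment labels might be tallied per framework category as follows. Only the scale itself comes from our method; the category names and paper data below are invented for illustration:

```python
# Hypothetical sketch of aggregating 0 / 0.5 / 1 alignment codes per
# framework category. The coded papers below are invented examples.

from collections import defaultdict

ALIGNMENT = {"not aligned": 0.0, "semi-aligned": 0.5, "aligned": 1.0}

def tally_codes(coded_papers):
    """Sum alignment scores for each coded category across all papers."""
    totals = defaultdict(float)
    for paper in coded_papers:
        for category, label in paper["codes"].items():
            totals[category] += ALIGNMENT[label]
    return dict(totals)

coded_papers = [
    {"id": "P1", "codes": {"cognitive": "aligned", "fluent": "not aligned"}},
    {"id": "P2", "codes": {"cognitive": "semi-aligned", "fluent": "aligned"}},
]
print(tally_codes(coded_papers))  # {'cognitive': 1.5, 'fluent': 1.0}
```

A tally of this shape - total alignment per category - is what underlies summary counts such as "46 of the user studies seemed to include mostly cognitive interactions."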
Finally, we analysed the papers according to the ‘users’, ‘product’ and ‘context of use’. The initial framework discusses how users represent people who influence experience and products represent artefacts that influence experience, and how these product-user interactions must take place in a ‘context of use’ (Forlizzi & Battarbee, 2004; Forlizzi & Ford, 2000). The full outcomes of our analysis can be found in the appendix in greater detail.
4. Results
Here we present the results of our comparison between the studies and the frameworks. These were scored on a scale of not aligned (0), semi-aligned (0.5), and aligned (1) depending on the user experience described. Results are shown in Table 4 and Table 5.
4. 2. 1. Fluent, Cognitive and Expressive Interactions
We found that 46 of the user studies seemed to include mostly cognitive interactions, meaning the user needed to focus on the product at hand and use this to generate some kind of knowledge (Forlizzi & Battarbee, 2004). There is some overlap between cognitive and expressive interactions, in that many studies enabled users to personalize information in MR or form some kind of relationship with the product. However, as MR technologies were often unfamiliar to the participants, almost any interaction required some kind of cognitive focus. Overall, we noted that even simple fluent interactions, such as Schubert et al.’s (2023) magnification window study, become cognitive when performed in MR.
4. 2. 2. Experience, An Experience, Co-Experience
We found that 44 of the studies reviewed were related to the description of ‘an experience’, in that it can be articulated or named and has a beginning and end. This is mostly due to how MR (especially HMD-enabled MR) requires the user to start and stop the experience by wearing or removing the headset. There were also explorations into collaborative or multi-user MR experiences, which align closely with the concept of ‘co-experience’. For example, the initial framework explicitly identified ‘interacting with others in a museum exhibit’ as an example of co-experience, and we found multiple AR studies researching this (or a very similar) use case (Li et al., 2019; Mann & Fryazinov, 2019; Yi & Kim, 2021).
4. 2. 3. Users, Products and Contexts of Use
In Medeiros et al.’s (2023) study, AR interactions were analysed in the context of confined passenger spaces in public transport. Using definitions found in the initial framework, this study encompassed multiple different users, products and contexts of use. We began by noting the MR user, MR product and intended context of use. All of which can be seen in the appendix.
However, if users represent how people influence experience, then both AR-equipped passengers and non-AR-equipped passengers constitute users. Similarly, the immediate product is the AR-enabled head-mounted display, but the seats, windows, and other qualities of the confined passenger space are products, too. Finally, regarding the ‘context of use,’ multiple environmental and social contexts must be considered.
Our results, presented in the appendix, highlight the first impressions of what an MR experience may consist of in terms of users, products, and context of use. However, in applying this framework, we have come to realize the multitude of other users, products, and contexts of use that are often underrepresented in MR research.
5. Analysis
RQ - In what ways can existing user experience design frameworks inform designers about the current MR experience, and do these frameworks apply in an MR context?
Existing UX research has found that by first collecting and understanding user experiences, narratives can be formalized and constructed in the form of a product, which, in turn, creates deeper ongoing beneficial experiences long term (Forlizzi & Ford, 2000). Collecting and understanding user experiences is incredibly complex to begin with, and our review of the literature suggests it is perhaps even more complex in MR. MR poses a unique challenge in that it essentially multiplies the variables in both physical and digital experiences. Each of the three interaction types and experiences, as well as the product, user and context of use, has to be considered across both physical and digital realities.
5. 1. Interaction Types
Shifting the role of MR from direct to supportive interaction
In terms of interaction types, we found that the majority of MR studies focused on new, cognitive interactions, expecting users to interact with MR content that blends physical and digital contexts—a novel interaction for the user that requires some cognitive load to execute. We believe that a focus on fluent interactions can shift the role of MR technology from direct interaction to supportive interaction, improving the user experience by enhancing users’ existing capabilities in physical reality rather than expecting them to develop new capabilities in MR.
In fact, many instruction-based MR user studies are excellent examples of how fluent interactions can be prioritized. In Marques et al.’s (2022) study, experts provided MR-based situated instructions to an on-site collaborator through three different types of notifications—visual, audio, and tactile, finding that tactile notifications were preferred by all participants as they generated a greater level of awareness and attentional allocation for less mental effort (Marques et al., 2022). In Werrlich et al.’s (2018) comparison of HMD-based and paper-based assembly training, they found that HMD assembly training was preferred by all participants. In these examples, MR is adopting a supportive role by situating instructions for the user while they interact with physical products using their existing fluent interactions. Users did not have to change their behaviour directly, but MR enhanced their capabilities passively, improving the UX.
5. 2. Experience Types
Enhancing existing experience
When analysing the articles according to the three types of experiences (experience, an experience, co-experience) we found that most MR user experiences seem to sit within the category of ‘an experience’. Often the experience is something ‘that could be articulated or named’ and ‘has a beginning and end that inspires emotional or behaviour changes’ (Forlizzi & Battarbee, 2004; Forlizzi & Ford, 2000). In one example, MR was used to construct a virtual ‘scenography piece’ of the “Othello” play, which would later be accurately positioned within a concert hall. Once the application registered the stage area, it would begin the MR experience (Schauer & Sieck, 2023). We use this example in particular due to its clear ‘beginning and end’ limited by its physical location; however, we reviewed many other studies of cultural heritage in particular, where MR was used to enhance and deliver ‘an experience’, due to its aim to inspire emotional or behavioural change (Ivanova & Vassilev, 2021; Liu et al., 2022; Silva & Teixeira, 2020; Yuan et al., 2024).
In collaborative MR studies, we found an understandable overlap between ‘experience’ and ‘co-experience’. For example, an MR escape room game was designed to evaluate the potential of MR as a team-building tool (Warmelink et al., 2017). In another, MR was used as a collaborative and intergenerational story-creation tool between two remote users (Healey et al., 2021).
Not dissimilar to our findings on interaction types, there is little exploration into how MR applies to just ‘experience’, which is described mostly as ‘the constant stream of “self-talk” that happens when we interact with products’, citing examples such as ‘walking in a park’ or ‘doing light housekeeping’ (Forlizzi & Battarbee, 2004; Forlizzi & Ford, 2000).
We believe there is value in exploring MR in the context of ‘experience’, where (similar to fluent interactions) MR can be used in a more supportive role to passively enhance the user’s existing experience, rather than effect a direct and cognitive change.
In some cases, MR has been explored as a stress relief tool, where the device offers an array of serene surroundings and background music (Soni & Shete, 2020), while in others, it has been used as a supportive tool for elderly writers, who, while deep in the creative process, could take inspiration from MR visualisations (Ameb et al., 2019). In these examples, MR offers support to the existing user experience without necessarily demanding the direct cognitive attention of a new experience.
5. 3. Building Blocks of Experience
Multiplying experience variables
Coding MR literature according to the user, product, and context of use is where the study of the MR user experience becomes notably complex.
First, how can users be appropriately discussed?
We found that most research focuses on single users, and of course, collaborative works become multi-user. There are, however, interesting grey areas, such as virtual avatars; one study implemented virtual avatars capable of demonstrating sign language to people who were hearing impaired (Luo et al., 2022). These avatars were recordings of real humans, captured for the research. In the framework, users are considered as representing ‘how people influence experience’, and so we ask: should avatars also be considered users?
We suggest it may be worthwhile to expand the focus of users to consider the impacts of other more ‘passive’ users within a given MR experience. This sentiment has been echoed in other MR research, where AR and VR headsets in confined passenger spaces raised the question, “Is there something inherent in the non-occlusive AR experience that makes visible interactions more or less acceptable to both users and bystanders?” (Medeiros et al., 2023).
In MR studies, the HMD is often the focal ‘product’ to be considered in the analysis of the user experience. However, the chosen framework states that products represent ‘how artefacts influence experience’, in which case there are many products in any given experience. By blending digital and physical worlds, MR experiences often require interaction with multiple products simultaneously, such as using an HMD while interacting with a physical object.
In one study, users were able to look through their HMD to view a virtual avatar next to their TV that would perform sign language (Vinayagamoorthy et al., 2019). In this experience, we noted multiple products, including the HMD, the digital interface, the TV, the remote control and even the couch the users were sitting on. As such, there are multiple digital and physical products for consideration in any MR experience. Interestingly, other MR research has highlighted how focusing first on the ‘physical’ products of the experience led to more creative uses of MR and more positive responses from study participants (Yi & Kim, 2021).
Lastly, each MR experience contains a multitude of physical and digital contexts of use. Some studies could be broadly categorised according to their application domain, such as healthcare, maintenance or manufacturing (Palmarini et al., 2018; Park et al., 2020; Viglialoro et al., 2021). Often, however, the same experience also serves training or educational purposes, which have their own wide body of MR literature (Al-Ansi et al., 2023; Ali et al., 2019; Del Pezo Izaguirre et al., 2021; Geigel et al., 2023). Education in the context of the classroom differs from education in the context of remote communication between novice and expert (Johnson et al., 2023; Luo et al., 2022). Finally, there has been much discussion around the ‘social’ interactions that take place in MR, exploring a socialised MR (Ambe et al., 2019; M. Hunter et al., 2021; M. G. Hunter et al., 2022; Soro et al., 2020).
6. Discussion and Future Research
Our review of 65 user-focused MR studies reveals a predominantly inward focus on the MR device and its unique affordances and interactions. This inward focus often overlooks the broader physical experience in which MR is used and as such, we propose that future MR UX research should adopt an outward perspective, with a focus on the physical world and its existing users, products and contexts of use. This outward focus can help designers uncover existing fluent interactions and important contextual elements, which can be leveraged to build user experiences that complement rather than complicate the MR experience. By more deeply integrating users’ existing physical realities, MR can become a supportive tool that enhances rather than disrupts current interactions.
6.1. Considerations for MR Designers
In summarising the results of our study into some helpful considerations for future MR designers, we suggest the following:
- 1. There is value in considering how MR can be used in a supportive role, enhancing the users’ existing experience rather than directly changing their immediate experience.
- 2. This requires considering the broader environmental context, the additional products, and the other users involved in the experience. Rather than focusing specifically on the MR device, MR designers should look outward to the experience the device is being used to assist.
It is worthwhile exploring how MR can function as an adjunct to physical reality, requiring less direct and immediate interaction, leveraging existing fluent interactions, and supporting ongoing experiences. For example, situated instructions can be read just like traditional instructions but are positioned precisely where the user needs them; this allows users to maintain their existing fluent interactions and enables MR to assist without interrupting their current activities (Chen et al., 2023; Hoffmann et al., 2022; Johnson et al., 2023; Kumaravel et al., 2019).
In the context of users, products, and contexts of use, MR design often prioritises augmenting the environment over utilising existing products. These existing products, however, may relate to a familiar context of use, which provides more fluent interaction and enhances the overall experience. For instance, in immersive dance theatre (Kim et al., 2023), users can engage in familiar, fluent interactions in the physical context, and MR can be used as a tool to develop these into expressive interactions in the digital context. This approach reinforces the existing relationship between the user and the physical product while creating new and novel experiences within that same relationship.
In this article, we analysed the MR user experience through the lens of well-established frameworks (Forlizzi & Battarbee, 2004; Forlizzi & Ford, 2000). This approach brought new perspectives to essential and under-researched aspects of the MR user experience. We therefore acknowledge the value of applying well-known HCI research to analyse emerging technology experiences, and we encourage future research to reuse and recontextualise existing HCI frameworks to understand novel and emerging HCI challenges; in many cases, they can offer refreshing perspectives on challenging research problems.
7. Limitations
While we endeavoured to review a wide range of MR research, we noted a limitation in the scope of this review. Specifically, we focused on MR and user experience, but a large number of studies also focus on a particular application domain, such as healthcare, training or education (Koukopoulos et al., 2022; Viglialoro et al., 2021). We encourage future research to utilise the UX framework to analyse MR UX within a specific application domain. Furthermore, we found that identifying the context of use was particularly difficult in MR studies. This is both a limitation and a finding: prior to this type of analysis, it has rarely been discussed how MR complicates UX by encompassing both the physical and digital elements of experience. We therefore suggest these frameworks be utilised in individual user studies to begin categorising the physical and digital elements of an MR user experience, which in turn can lead to a more repeatable form of analysis for future MR UX research.
8. Conclusion
Our exploration of MR UX underscores the complex and multifaceted challenges inherent in designing for this emerging field. Despite rapid advancements in MR technologies, there remains a notable scarcity of MR UX analysis, increasing the difficulty of creating new MR experiences (Krauß et al., 2021; Merino et al., 2020). This gap in knowledge presents significant challenges for designers and researchers looking to enhance user experiences through the unique affordances of MR technologies, and in this article, we addressed this challenge by employing well-known UX frameworks in a new and emerging field. These frameworks, traditionally used to understand, categorise, identify and design user experiences (Forlizzi & Battarbee, 2004; Forlizzi & Ford, 2000), offered a fresh perspective on MR UX, revealing key aspects that warrant deeper exploration in future research. Ultimately, this approach allowed us to identify the interplay between the MR experience components, offering a structured method to analyse and understand the MR user experience. Our review of 65 articles demonstrated a strong focus on cognitive interactions and uncovered an opportunity for deeper exploration into fluent and expressive interactions.
Significantly, studies reporting positive user experiences often emphasised interactions that enhanced the user’s existing physical experiences rather than diverting their attention towards entirely new, unfamiliar experiences. Our findings indicate that MR’s potential is best realised when it builds upon and augments familiar fluent interactions, and the review highlights that these interactions appear to create more satisfying user experiences. Additionally, our analysis revealed the complex nature of MR experiences, encompassing multiple users, products, and contexts of use across two active realities. We observed a predominant focus on the immediate user and product, with limited consideration for other users or products within the overarching experience (Medeiros et al., 2023).
Future MR research should delve deeper into these aspects, looking outward towards physical products with existing fluent interactions that occur in physical contexts of use and recognising the presence of multiple active and passive users. This approach will help create more holistic and satisfying MR experiences, leveraging the full potential of MR technologies to enhance user interaction and engagement.
Acknowledgments
This work was supported by the National Research Foundation of Korea Grant funded by the Korean Government (NRF-2022R1C1C1010883).
Notes
Copyright : This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0/), which permits unrestricted educational and non-commercial use, provided the original work is properly cited.
References
-
Al-Ansi, A. M., Jaboob, M., Garad, A., & Al-Ansi, A. (2023). Analyzing augmented reality (AR) and virtual reality (VR) recent development in education. Social Sciences & Humanities Open, 8(1), 100532.
[https://doi.org/10.1016/J.SSAHO.2023.100532]
-
Alexandrovsky, D., Putze, S., Schwind, V., Mekler, E. D., Smeddinck, J. D., Kahl, D., Krüger, A., & Malaka, R. (2021). Evaluating User Experiences in Mixed Reality. Conference on Human Factors in Computing Systems - Proceedings.
[https://doi.org/10.1145/3411763.3441337]
-
Ali, A. A., Dafoulas, G. A., & Augusto, J. C. (2019). Collaborative Educational Environments Incorporating Mixed Reality Technologies: A Systematic Mapping Study. IEEE Transactions on Learning Technologies, 12(3), 321-332.
[https://doi.org/10.1109/TLT.2019.2926727]
-
Almoqbel, M. Y., Naderi, A., Wohn, D. Y., & Goyal, N. (2022). The Metaverse: A Systematic Literature Review to Map Scholarly Definitions. Proceedings of the ACM Conference on Computer Supported Cooperative Work, CSCW, 80-84.
[https://doi.org/10.1145/3500868.3559448]
-
Ambe, A. H., Brereton, M., Soro, A., Buys, L., & Roe, P. (2019, May). The adventures of older authors: Exploring futures through co-design fictions. In Proceedings of the 2019 CHI Conference on Human Factors in Computing systems (pp. 1-16).
[https://doi.org/10.1145/3290605.3300588]
- Arhippainen, L. (2003). Capturing user experience for product design. The 26th, 1-10.
-
Azuma, R. T. (1997). A survey of augmented reality. In Presence: Teleoperators and Virtual Environments (Vol. 6, Issue 4, pp. 355-385). MIT Press Journals.
[https://doi.org/10.1162/pres.1997.6.4.355]
-
Bargas-Avila, J. A., & Hornbæk, K. (2011). Old wine in new bottles or novel challenges? A critical analysis of empirical studies of User Experience. Conference on Human Factors in Computing Systems - Proceedings, 2689-2698.
[https://doi.org/10.1145/1978942.1979336]
-
Billinghurst, M. (2021). Grand Challenges for Augmented Reality. Frontiers in Virtual Reality, 2, 578080.
[https://doi.org/10.3389/FRVIR.2021.578080/BIBTEX]
-
Billinghurst, M., Clark, A., & Lee, G. (2014). A survey of augmented reality. In Foundations and Trends in Human-Computer Interaction, 8(2-3), 73-272.
[https://doi.org/10.1561/1100000049]
-
Billinghurst, M., & Kato, H. (2002). Collaborative augmented reality. Communications of the ACM, 45(7), 64-70.
[https://doi.org/10.1145/514236.514265]
-
Chen, C., Nguyen, C., Hoffswell, J., Healey, J., & Bui, T. (2023). PaperToPlace: Transforming Instruction Documents into Spatialized and Context-Aware Mixed Reality Experiences. UIST 2023 - Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, 1-21.
[https://doi.org/10.1145/3586183.3606832]
-
Choi, J. O., Forlizzi, J., Christel, M., Moeller, R., Bates, M., & Hammer, J. (2016). Playtesting with a Purpose. Proceedings of the 2016 Annual Symposium on Computer-Human Interaction in Play, 254-265.
[https://doi.org/10.1145/2967934.2968103]
-
Costanza, E., Kunz, A., & Fjeld, M. (2009). Mixed reality: A survey. In Human Machine Interaction (Lecture Notes in Computer Science). Springer.
[https://doi.org/10.1007/978-3-642-00437-7_3]
-
Davis, L., & Aslam, U. (2024). Analyzing consumer expectations and experiences of Augmented Reality (AR) apps in the fashion retail sector. Journal of Retailing and Consumer Services, 76, 103577.
[https://doi.org/10.1016/J.JRETCONSER.2023.103577]
-
Del Pezo Izaguirre, E., Abasolo, M. J., & Collazos, C. A. (2021). Educational Methodologies for Hearing Impaired Children Supported by Mobile Technology and Extended Reality: Systematic Analysis of Literature. Revista Iberoamericana de Tecnologias Del Aprendizaje, 16(4), 410-418.
[https://doi.org/10.1109/RITA.2021.3135202]
-
Dey, A., Billinghurst, M., Lindeman, R. W., & Swan, J. E. (2018). A systematic review of 10 Years of Augmented Reality usability studies: 2005 to 2014. Frontiers Robotics AI, 5(APR), 329739.
[https://doi.org/10.3389/FROBT.2018.00037/BIBTEX]
-
DiSalvo, C., Hanington, B., & Forlizzi, J. (2004). An accessible framework of emotional experiences for new product conception. Design and Emotion, 256-287.
[https://doi.org/10.1201/9780203608173-c46]
-
Forlizzi, J., & Battarbee, K. (2004, August). Understanding experience in interactive systems. In Proceedings of the 5th conference on Designing interactive systems: processes, practices, methods, and techniques (pp. 261-268).
[https://doi.org/10.1145/1013115.1013152]
-
Forlizzi, J., Disalvo, C., & Hanington, B. (2003). On the Relationship between Emotion, Experience and the Design of New Products. The Design Journal, 6(2), 29-38.
[https://doi.org/10.2752/146069203789355507]
-
Forlizzi, J., & Ford, S. (2000). The building blocks of experience: an early framework for interaction designers. Proceedings of the 3rd Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques (pp. 419-423).
[https://doi.org/10.1145/347642.347800]
-
Geigel, J., Warfield, T., Ma, Y. S., Roach, D., & Foster, S. (2023). Music, Motion, and Mixed Reality: An Interdisciplinary, Problem-Based Educational Experience. Proceedings - SIGGRAPH 2023 Educator's Forum (pp. 1-2).
[https://doi.org/10.1145/3587424.3595581]
-
Ghamandi, R. K., Hmaiti, Y., Nguyen, T. T., Ghasemaghaei, A., Kattoju, R. K., Taranta, E. M., & LaViola, J. J. (2023). What And How Together: A Taxonomy On 30 Years Of Collaborative Human-Centered XR Tasks. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), (pp. 322-335). IEEE.
[https://doi.org/10.1109/ISMAR59233.2023.00047]
-
Goncalves, G., Monteiro, P., Coelho, H., Melo, M., & Bessa, M. (2021). Systematic Review on Realism Research Methodologies on Immersive Virtual, Augmented and Mixed Realities. IEEE Access, 9, 89150-89161.
[https://doi.org/10.1109/ACCESS.2021.3089946]
-
Healey, J., Wang, D., Wigington, C., Sun, T., & Peng, H. (2021). A Mixed-Reality System to Promote Child Engagement in Remote Intergenerational Storytelling. Proceedings - 2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2021, (pp. 274-279).
[https://doi.org/10.1109/ISMAR-Adjunct54149.2021.00063]
-
Hoffmann, S., Pinatti De Carvalho, A. F., Schweitzer, M., Abele, N. D., & Wulf, V. (2022). Producing and Consuming Instructional Material in Manufacturing Contexts: Evaluation of an AR-based Cyber-Physical Production System for Supporting Knowledge and Expertise Sharing. Proceedings of the ACM on Human-Computer Interaction, 6(CSCW2), 1-36.
[https://doi.org/10.1145/3555091]
-
Hollan, J., & Stornetta, S. (1992). Beyond being there. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 119-125).
[https://doi.org/10.1145/142750.142769]
-
Howard, M. C., & Davis, M. M. (2022). A meta-analysis and systematic literature review of mixed reality rehabilitation programs: Investigating design characteristics of augmented reality and augmented virtuality. Computers in Human Behavior, 130, 107197.
[https://doi.org/10.1016/J.CHB.2022.107197]
-
Hunter, M. G., Soro, A., Brown, R. A., Harman, J., & Yigitcanlar, T. (2022). Augmenting Community Engagement in City 4.0: Considerations for Digital Agency in Urban Public Space. Sustainability, 14(16), 9803.
[https://doi.org/10.3390/su14169803]
-
Hunter, M., Soro, A., & Brown, R. (2021). Enhancing Urban Conversation for Smarter Cities - Augmented Reality as an enabler of digital civic participation. Interaction Design and Architecture(s), 48, 75-99.
[https://doi.org/10.55612/s-5002-048-004]
-
Ivanova, B., & Vassilev, T. (2021, June). A mixed reality approach to visualizing cultural heritage artefacts: Mixed reality approach to cultural heritage. In Proceedings of the 22nd International Conference on Computer Systems and Technologies (pp. 107-111).
[https://doi.org/10.1145/3472410.3472432]
-
Jensen, J. F. (2013). IT and experiences: user experience, experience design and user-experience design. In Handbook on the experience economy (pp. 179-208). Edward Elgar Publishing.
[https://doi.org/10.4337/9781781004227.00016]
-
Johnson, J. G., Sharkey, T., Butarbutar, I. C., Xiong, D., Huang, R., Sy, L., & Weibel, N. (2023, April). UnMapped: Leveraging Experts' Situated Experiences to Ease Remote Guidance in Collaborative Mixed Reality. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (pp. 1-20).
[https://doi.org/10.1145/3544548.3581444]
-
Kent, L., Snider, C., Gopsill, J., & Hicks, B. (2021). Mixed reality in design prototyping: A systematic review. Design Studies, 77, 101046.
[https://doi.org/10.1016/J.DESTUD.2021.101046]
-
Kim, Y. J., Lu, J., & Höllerer, T. (2023). Dynamic Theater: Location-Based Immersive Dance Theater, Investigating User Guidance and Experience. Proceedings of the ACM Symposium on Virtual Reality Software and Technology, VRST, (pp. 1-11).
[https://doi.org/10.1145/3611659.3615705]
-
Koskinen, H. M. K., Savioja, P., Mannonen, P., & Aikala, M. (2024, October). The Process is Under Control! Understanding the Building Blocks of User Experience in Operator Work. In Proceedings of the 13th Nordic Conference on Human-Computer Interaction (pp. 1-12).
[https://doi.org/10.1145/3679318.3685394]
-
Koukopoulos, D., Dafiotis, P., Sylaiou, S., Koukoulis, K., & Fidas, C. (2022, October). XR technologies for self-regulated student exhibitions in art education: Survey and first design considerations. In 2022 International Conference on Interactive Media, Smart Systems and Emerging Technologies (IMET) (pp. 1-8). IEEE.
[https://doi.org/10.1109/IMET54801.2022.9929450]
-
Krauß, V., Jasche, F., Saßmannshausen, S. M., Ludwig, T., & Boden, A. (2021, December). Research and practice recommendations for mixed reality design-different perspectives from the community. In Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology (pp. 1-13).
[https://doi.org/10.1145/3489849.3489876]
-
Thoravi Kumaravel, B., Anderson, F., Fitzmaurice, G., Hartmann, B., & Grossman, T. (2019, October). Loki: Facilitating remote instruction of physical tasks using bi-directional mixed-reality telepresence. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (pp. 161-174).
[https://doi.org/10.1145/3332165.3347872]
-
Law, E. L. C., Roto, V., Hassenzahl, M., Vermeeren, A. P., & Kort, J. (2009, April). Understanding, scoping and defining user experience: a survey approach. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 719-728).
[https://doi.org/10.1145/1518701.1518813]
- Lee, M. K., & Forlizzi, J. (2009). Designing adaptive robotic services. Proc. of IASDR'09, 1-10.
-
Li, X., Chen, W., & Wu, Y. (2019, October). Distance-driven user interface for collaborative exhibit viewing in augmented reality museum. In Adjunct Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (pp. 42-43).
[https://doi.org/10.1145/3332167.3357109]
-
Liu, Z., Yan, S., Lu, Y., & Zhao, Y. (2022, April). Generating Embodied Storytelling and Interactive Experience of China Intangible Cultural Heritage "Hua'er" in Virtual Reality. In CHI Conference on Human Factors in Computing Systems Extended Abstracts (pp. 1-7).
[https://doi.org/10.1145/3491101.3519761]
-
Luo, L., Weng, D., Songrui, G., Hao, J., & Tu, Z. (2022, April). Avatar interpreter: improving classroom experiences for deaf and hard-of-hearing people based on augmented reality. In CHI Conference on Human Factors in Computing Systems Extended Abstracts (pp. 1-5).
[https://doi.org/10.1145/3491101.3519799]
-
Mann, L., & Fryazinov, O. (2019). 3D printing for mixed reality hands-on museum exhibit interaction. In ACM siggraph 2019 posters (pp. 1-2).
[https://doi.org/10.1145/3306214.3338609]
-
Marques, B., Silva, S., Dias, P., & Sousa-Santos, B. (2022, November). Which notification is better? comparing visual, audio and tactile cues for asynchronous mixed reality (mr) remote collaboration: A user study. In Proceedings of the 21st International Conference on Mobile and Ubiquitous Multimedia (pp. 276-278).
[https://doi.org/10.1145/3568444.3570587]
-
Marti, P., & Iacono, I. (2016, September). Anticipated, momentary, episodic, remembered: the many facets of User eXperience. In 2016 federated conference on computer science and information systems (fedcsis) (pp. 1647-1655). IEEE.
[https://doi.org/10.15439/2016F302]
-
Medeiros, D., Dubus, R., Williamson, J., Wilson, G., Pöhlmann, K., & Mcgill, M. (2023). Surveying the Social Comfort of Body, Device, and Environment-Based Augmented Reality Interactions in Confined Passenger Spaces Using Mixed Reality Composite Videos. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 7(3), 25.
[https://doi.org/10.1145/3610923]
-
Merino, L., Schwarzl, M., Kraus, M., Sedlmair, M., Schmalstieg, D., & Weiskopf, D. (2020, November). Evaluating mixed and augmented reality: A systematic literature review (2009-2019). In 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (pp. 438-451). IEEE.
[https://doi.org/10.1109/ISMAR50242.2020.00069]
- Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE TRANSACTIONS on Information and Systems, 77(12), 1321-1329. https://search.ieice.org/bin/summary.php?id=e77-d_12_1321&category=D&year=1994&lang=E&abst=.
-
Nebeling, M., Speicher, M., Wang, X., Rajaram, S., Hall, B. D., Xie, Z., ... & Kulkarni, R. (2020, April). MRAT: The mixed reality analytics toolkit. In Proceedings of the 2020 CHI Conference on human factors in computing systems (pp. 1-12).
[https://doi.org/10.1145/3313831.3376330]
-
Nelson, C. R., & Gabbard, J. L. (2023, October). Augmented Reality Rehabilitative and Exercise Games (ARREGs): A Systematic Review and Future Considerations. In 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (pp. 1016-1025). IEEE.
[https://doi.org/10.1109/ISMAR59233.2023.00118]
-
Nelson, M. (2010). Building a written corpus. The Routledge Handbook of Corpus Linguistics, 53-65.
[https://doi.org/10.4324/9780203856949.ch5]
- Ortiz, N., Juan, C., & Aurisicchio, M. (2011). The scenario of user experience. In DS 68-7: Proceedings of the 18th International Conference on Engineering Design (ICED 11), Impacting Society through Engineering Design, Vol. 7: Human Behaviour in Design, Lyngby/Copenhagen, Denmark, 15.-19.08. 2011 (pp. 182-193). https://www.designsociety.org/publication/30674/THE+SCENARIO+OF+USER+EXPERIENCE.
-
Palmarini, R., Erkoyuncu, J. A., Roy, R., & Torabmostaedi, H. (2018). A systematic review of augmented reality applications in maintenance. Robotics and Computer-Integrated Manufacturing, 49, 215-228.
[https://doi.org/10.1016/j.rcim.2017.06.002]
-
Pamparău, C., & Vatavu, R. D. (2020, November). A research agenda is needed for designing for the user experience of augmented and mixed reality: a position paper. In Proceedings of the 19th International Conference on Mobile and Ubiquitous Multimedia (pp. 323-325).
[https://doi.org/10.1145/3428361.3432088]
-
Papadopoulos, T., Evangelidis, K., Kaskalis, T. H., Evangelidis, G., & Sylaiou, S. (2021). Interactions in augmented and mixed reality: an overview. Applied Sciences, 11(18), 8752.
[https://doi.org/10.3390/APP11188752]
-
Park, K. B., Kim, M., Choi, S. H., & Lee, J. Y. (2020). Deep learning-based smart task assistance in wearable augmented reality. Robotics and Computer-Integrated Manufacturing, 63, 101887.
[https://doi.org/10.1016/j.rcim.2019.101887]
-
Plopski, A., Hirzle, T., Norouzi, N., Qian, L., Bruder, G., & Langlotz, T. (2022). The eye in extended reality: A survey on gaze interaction and eye tracking in head-worn extended reality. ACM Computing Surveys (CSUR), 55(3), 1-39.
[https://doi.org/10.1145/3491207]
-
Quintero, J., Baldiris, S., Rubira, R., Cerón, J., & Velez, G. (2019). Augmented reality in educational inclusion. A systematic review on the last decade. Frontiers in Psychology, 10, 1835.
[https://doi.org/10.3389/fpsyg.2019.01835]
-
Ratcliffe, J., Soave, F., Bryan-Kinns, N., Tokarchuk, L., & Farkhatdinov, I. (2021, May). Extended reality (XR) remote research: A survey of drawbacks and opportunities. In Proceedings of the 2021 CHI conference on human factors in computing systems (pp. 1-13).
[https://doi.org/10.1145/3411764.3445170]
-
Rauschnabel, P. A., Felix, R., Hinsch, C., Shahab, H., & Alt, F. (2022). What is XR? Towards a framework for augmented and virtual reality. Computers in human behavior, 133, 107289.
[https://doi.org/10.1016/J.CHB.2022.107289]
-
Rokhsaritalemi, S., Sadeghi-Niaraki, A., & Choi, S. M. (2020). A review on mixed reality: Current trends, challenges and prospects. Applied Sciences, 10(2), 636.
[https://doi.org/10.3390/APP10020636]
-
Saadon, N. F. S. M., Ahmad, I., Pee, A. N. C., & Hanapi, C. (2020, May). The implementation of augmented reality in increasing student motivation: systematic literature review. In IOP Conference Series: Materials Science and Engineering (Vol. 854, No. 1, p. 012043). IOP Publishing.
[https://doi.org/10.1088/1757-899X/854/1/012043]
-
Schauer, S., & Sieck, J. (2023, September). Tracking, visualisation and interaction for virtual reconstruction of cultural heritage in mixed reality. In Proceedings of the 20th International Conference on Culture and Computer Science: Code and Materiality (pp. 1-5).
[https://doi.org/10.1145/3623462.3623464]
-
Schubert, R., Bruder, G., & Welch, G. (2023, October). Intuitive User Interfaces for Real-Time Magnification in Augmented Reality. In Proceedings of the 29th ACM Symposium on Virtual Reality Software and Technology (pp. 1-10).
[https://doi.org/10.1145/3611659.3615694]
-
Silva, M., & Teixeira, L. (2020, November). Developing an extended reality platform for immersive and interactive experiences for cultural heritage: Serralves museum and coa archeologic park. In 2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (pp. 300-302). IEEE.
[https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00084]
-
Skarbez, R., Smith, M., & Whitton, M. C. (2021). Revisiting Milgram and Kishino's Reality-Virtuality Continuum. Frontiers in Virtual Reality, 2, 647997.
[https://doi.org/10.3389/FRVIR.2021.647997/BIBTEX]
-
Soni, A., & Shete, S. (2020, February). Mixed reality for stress relief. In Proceedings of the fourteenth international conference on tangible, embedded, and embodied interaction (pp. 937-942).
[https://doi.org/10.1145/3374920.3374996]
-
Soro, A., Brown, R., Wyeth, P., & Turkay, S. (2020, April). Towards a smart and socialised augmented reality. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1-8).
[https://doi.org/10.1145/3334480.3383002]
-
Speicher, M., Hall, B. D., & Nebeling, M. (2019, May). What is mixed reality?. In Proceedings of the 2019 CHI conference on human factors in computing systems (pp. 1-15).
[https://doi.org/10.1145/3290605.3300767]
- Swan, J. E., & Gabbard, J. L. (2005, July). Survey of user-based experimentation in augmented reality. In Proceedings of 1st international conference on virtual reality (Vol. 22, pp. 1-9).
-
Tan, D., Poupyrev, I., Billinghurst, M., Kato, H., Regenbrecht, H., & Tetsutani, N. (2001, July). The best of two worlds: merging virtual and real for face-to-face collaboration. In IEEE International Conference on Multimedia and Expo (pp. 861-864). IEEE.
[https://doi.org/10.1109/ICME.2001.1237858]
-
Viglialoro, R. M., Condino, S., Turini, G., Carbone, M., Ferrari, V., & Gesi, M. (2021). Augmented reality, mixed reality, and hybrid approach in healthcare simulation: a systematic review. Applied Sciences, 11(5), 2338.
[https://doi.org/10.3390/APP11052338]
-
Vinayagamoorthy, V., Glancy, M., Ziegler, C., & Schäffer, R. (2019, May). Personalising the TV experience using augmented reality: An exploratory study on delivering synchronised sign language interpretation. In Proceedings of the 2019 CHI conference on human factors in computing systems (pp. 1-12).
[https://doi.org/10.1145/3290605.3300762]
-
Warmelink, H., Mayer, I., Weber, J., Heijligers, B., Haggis, M., Peters, E., & Louwerse, M. (2017, October). AMELIO: Evaluating the team-building potential of a mixed reality escape room game. In Extended abstracts publication of the annual symposium on computer-human interaction in play (pp. 111-123).
[https://doi.org/10.1145/3130859.3131436]
-
Werrlich, S., Daniel, A., Ginger, A., Nguyen, P. A., & Notni, G. (2018, October). Comparing HMD-based and paper-based training. In 2018 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (pp. 134-142). IEEE.
[https://doi.org/10.1109/ISMAR.2018.00046]
-
Xia, X., Liang, J., Zhao, R., Zhao, Z., Wu, M., Li, Y., & Liang, H. N. (2023, October). Cross-Reality Interaction and Collaboration in Museums, Education, and Rehabilitation. In 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (pp. 815-820). IEEE.
[https://doi.org/10.1109/ISMAR-Adjunct60411.2023.00180]
-
Yi, J. H., & Kim, H. S. (2021). User experience research, experience design, and evaluation methods for museum mixed reality experience. Journal on Computing and Cultural Heritage (JOCCH), 14(4), 1-28.
[https://doi.org/10.1145/3462645]
-
Yuan, Q., Chen, K., Yang, Q., Pan, Z., Xu, J., & Yao, Z. (2024). Exploring intuitive visuo-tactile interaction design for culture education: A Chinese-chess-based case study. International Journal of Human-Computer Interaction, 40(8), 2099-2119.
[https://doi.org/10.1080/10447318.2023.2223863]
-
Zimmerman, J., Stolterman, E., & Forlizzi, J. (2010, August). An analysis and critique of Research through Design: towards a formalization of a research approach. In proceedings of the 8th ACM conference on designing interactive systems (pp. 310-319).
[https://doi.org/10.1145/1858171.1858228]