Publications

Janet Johnson, Tommy Sharkey, Cynthia Butarbutar, Danica Xiong, Lauren Sy, Ruijie Huang, Nadir Weibel

CHI 2023

Abstract:

Collaborative Mixed Reality (MR) systems that help extend expertise for physical tasks to remote environments often situate experts in an immersive view of the task environment to bring the collaboration closer to collocated settings. In this paper, we design UnMapped, an alternative interface for remote experts that combines a live 3D view of the active space within the novice's environment with a static 3D recreation of the expert's own workspace, leveraging the expert's existing spatial memories within it. We evaluate the impact of this approach on single and repeated use of collaborative MR systems for remote guidance through a comparative study. Our results indicate that despite giving experts only a limited understanding of the novice's environment, the UnMapped interface increased performance and communication efficiency while reducing experts' task load. We also outline the various affordances of providing remote experts with a familiar and spatially stable environment from which to assist novices.




CAPT Matthew D. Tadlock, MD, LCDR Erik J. Olson, MD, Danilo Gasques, MS, Roland Champagne, RN, MPH, MBA, CDR Michael J. Krzyzaniak, MD, CDR Shawn A. Belverud, MD, LCDR Vijay Ravindra, MD, Jakob Kerns, BA, LCDR Pamela M. Choi, MD, Jennifer Deveraux, BS, Janet Johnson, MS, Thomas Sharkey, MS, Michael Yip, PhD, Nadir Weibel, PhD, CAPT (R) Konrad Davis, MD

Surgery 2022

Abstract:

Background: Most telemedicine modalities have limited ability to enhance procedural and operative care. We developed a novel system to provide synchronous, bidirectional, expert mixed reality-enabled virtual procedural mentoring. In this feasibility study, we evaluated mixed reality mentoring of combat casualty care-related procedures in a re-perfused cadaver model.

Methods: Novices received real-time holographic mentoring from experts using augmented reality via HoloLens (Microsoft Inc, Redmond, WA). The experts maintained real-time awareness of the novice's operative environment using virtual reality via HTC Vive (HTC Corp, Xindian District, Taiwan). Additional cameras (in both environments) and novel software created the immersive, shared, 3-dimensional mixed reality environment in which the novice and expert collaborated. The novices were prospectively randomized to either mixed reality or audio-only mentoring. Blinded experts independently evaluated novice procedural videos using a 5-point Likert scale-based questionnaire. Nonparametric variables were evaluated using the Wilcoxon rank-sum test and comparisons using χ² analysis; significance was defined at P < .05.

Results: Fourteen surgeon and nonsurgeon novices performed 69 combat casualty care-related procedures (38 mixed reality, 31 audio), including various vascular exposures, 4-compartment lower leg fasciotomy, and emergency neurosurgical procedures; 85% were performed correctly, with no difference between groups. Upon video review, mixed reality-mentored novices showed no difference in procedural flow and forward planning (3.67 vs 3.28, P = .21) or in the likelihood of performing individual procedural steps correctly (4.12 vs 3.59, P = .06).

Conclusion: In this initial feasibility study, our novel mixed reality-based mentoring system successfully facilitated the performance of a wide variety of combat casualty care-relevant procedures using a high-fidelity re-perfused cadaver model. The small sample size and limited variety of novice types likely limited the ability of holographically mentored novices to demonstrate improvement over the audio-only control group. Despite this, using virtual, augmented, and mixed reality technologies for procedural mentoring demonstrated promise, and further study is needed.




Tommy Sharkey, Timothy Wood, Robert Twomey, Ying Wu, Amy Eguchi, Monica Sweet

CHI 2022 Interactivity

Abstract:

The increasing sophistication and availability of Augmented and Virtual Reality (AR/VR) technologies hold the potential to transform how we teach and learn computational concepts and coding. This project develops a platform for creative coding in virtual and augmented reality. The Embodied Coding Environment (ECE) is a flow-based visual coding system designed to increase physical engagement with programming and lower the barrier to entry for novice programmers. It is conceptualized as a merged digital/physical workspace where the spatial representation of code, the visual outputs of the code, and user interactions and edit histories are co-located in a virtual 3D space.




Tommy Sharkey, Timothy Wood, Robert Twomey, Ying Wu, Amy Eguchi, Monica Sweet

CSEDU 2022

Abstract:

Eight middle- and high-school Computer Science (CS) teachers in San Diego County were interviewed about the major challenges their students commonly encounter in learning computer programming. We identified strategic design opportunities -- that is, challenges and needs that can be addressed in innovative ways through the affordances of Augmented and Virtual Reality (AR/VR). Thematic Analysis of the interviews yielded six thematic clusters: Tools for Learning, Visualization and Representation, Pedagogical Approaches, Classroom Culture, Motivation, and Community Connections. Within the theme of visualization, focal clusters centered on visualizing problem spaces and using metaphors to explain computational concepts, indicating that an AR/VR coding system could help users to represent computational problems by allowing them to build from existing embodied experiences and knowledge. Additionally, codes clustered within the theme of learning tools reflected educators’ preference for web-based IDEs, which involve minimal start-up costs, as well as concern over the degree of transfer in learning between block- and text-based interfaces. Finally, themes related to motivation, community, and pedagogical practices indicated that the design of an AR coding platform should support collaboration, self-expression, and autonomy in learning. It should also foster self-efficacy and learners’ ability to address lived experience and real-world problems through computational means.




How Are Computational Concepts Learned and Taught?: A Thematic Analysis Study Informing the Design of an Augmented Reality Coding Platform

Tommy Sharkey, Timothy Wood, Robert Twomey, Ying Wu, Amy Eguchi, Monica Sweet

CSEDU 2021 Poster

Abstract:

How could social, collaborative, immersive coding experiences anchored in 3D physical space transform the teaching and learning of computational concepts? The concept of implementing a 3D block or node-based coding platform in Augmented Reality (AR) in support of collaborative and body-based engagement is motivated by foundational work on the benefits of pair programming (Hanks et al, 2011) and collaborative learning (Johnson & Johnson, 1987). It is also motivated by findings that science, math, and even computational concepts are fundamentally rooted in sensorimotor experience (Lindgren & Johnson-Glenberg, 2013; Lakoff & Núñez, 2000; Black et al, 2012) or body syntonicity (Papert, 1993). Further, extensive evidence suggests that body movements in general support brain function and cognition important for learning, memory, creativity, and problem-solving (Basso & Suzuki, 2017; Oppezzo & Schwartz, 2014; Statton et al, 2015; Thomas & Lleras, 2009).

A first step in designing such a platform is understanding the fundamental challenges faced by students learning to code. To address this, we conducted Zoom interviews with eight high school and middle school computer science (CS) educators in San Diego County. We asked open-ended questions, prompting them to speak at length and give unprompted evaluations of tools, platforms, and approaches. We present the thematic analysis of these interviews in this abstract.

Most participants had used block-based programming tools (namely, Scratch) and universally found them powerful for promoting algorithmic thinking by abstracting syntax into an “intuitive” interface. However, two major drawbacks were raised: (1) as projects grow large, the interface becomes cumbersome, and collapsing chunks of code exacerbates rather than mitigates this problem; and (2) high school students often perceive block-based programming as “childish” and “unprofessional,” causing them to avoid it.

Rather than formally teaching core programming concepts (e.g., loops, conditionals), some participants would have students first search online forums and videos and then focus on projects that allow them to apply what they have learned. The rationale for this approach prioritized engagement through project-based learning and mastery of the skills and tools necessary to solve coding challenges. In contrast, other teachers stated that project-intensive approaches can prevent students from developing a solid foundation in programming fundamentals.

Surprisingly, none of our participants described another common programming metaphor: flow-based programming, where nodes in a flow diagram are connected by curves to show how the information/algorithm flows through time/space. These environments can often be found in professional tools (Unreal, Unity, Rhino/Grasshopper, Blender) for coding complex event-based behavior. Despite never mentioning these tools, a number of participants did use visual diagrams and flow charts as a valuable external means for understanding program flow.

Using the relationships between these themes, we can identify the properties a tool needs to aid CS education. The physical metaphors that come from block-based coding need to be paired with the capacity for complexity exhibited by text-based coding. Additional research needs to be conducted to evaluate the affordances of flow-based coding as a third modality. This needs to be packaged in an IDE that enables self-motivated, creative endeavors.




ARTEMIS: A Collaborative Mixed-Reality System for Immersive Surgical Telementoring

Danilo Gasques, Janet G Johnson, Tommy Sharkey, Yuanyuan Feng, Ru Wang, Zhuoqun Robin Xu, Enrique Zavala, Wanze Xie, Xinming Zhang, Konrad Davis, Michael Yip, Nadir Weibel

CHI 2021

Abstract:

Traumatic injuries require timely intervention, but medical expertise is not always available at the patient's location. Despite recent advances in telecommunications, surgeons still have limited tools to remotely help inexperienced surgeons. Mixed Reality hints at a future where remote collaborators work side-by-side as if co-located; however, we still do not know how current technology can improve remote surgical collaboration. Through role-playing and iterative prototyping, we identify collaboration practices used by expert surgeons to aid novice surgeons, as well as technical requirements to facilitate these practices. We then introduce ARTEMIS, an AR-VR collaboration system that supports these key practices. Through an observational study with two expert surgeons and five novice surgeons operating on cadavers, we find that ARTEMIS supports remote surgical mentoring of novices through synchronous point, draw, and look affordances and asynchronous video clips. Most participants found that ARTEMIS facilitates collaboration despite existing technology limitations explored in this paper.




Do You Really Need to Know Where "That" Is? Enhancing Support for Referencing in Collaborative Mixed Reality Environments

Janet G Johnson, Danilo Gasques, Tommy Sharkey, Evan Schmitz, Nadir Weibel

CHI 2021

Abstract: 

Mixed Reality has been shown to enhance remote guidance and is especially well-suited for physical tasks. Conversations during these tasks are heavily anchored around task objects and their spatial relationships in the real world, making referencing (the ability to refer to an object in a way that is understood by others) a crucial process that warrants explicit support in collaborative Mixed Reality systems. This paper presents a 2x2 mixed factorial experiment that explores the effects of providing spatial information and system-generated guidance to task objects. It also investigates the effects of such guidance on the remote collaborator's need for spatial information. Our results show that guidance increases performance and communication efficiency while reducing the need for spatial information, especially in unfamiliar environments. Our results also demonstrate a reduced need for remote experts to be in immersive environments, making guidance more scalable and expertise more accessible.




ARTEMIS: Mixed-Reality Environment for Immersive Surgical Telementoring

Nadir Weibel, Danilo Gasques, Janet Johnson, Thomas Sharkey, Zhuoqun Robin Xu, Xinming Zhang, Enrique Zavala, Michael Yip, Konrad Davis

CHI 2020 Poster

Abstract:

The golden hour following a traumatic injury is the window in which surgical treatment is most likely to prevent mortality and morbidity. However, large-scale disasters increasingly overwhelm local medical systems, leaving patients without timely access to medical expertise. To respond to this problem, medical experts are exploring telemedicine, which typically relies on synchronous audiovisual communication. While current telemedicine approaches have limited ability to support the physical care required for trauma, AR technology allows medical professionals to see their patients while also seeing additional digital information. In this paper we describe ARTEMIS (Augmented Reality Technology to Enable reMote Integrated Surgery), an immersive AR-VR telementoring infrastructure that allows experienced surgeons to remotely aid less experienced medical professionals in the field. ARTEMIS provides immersive Mixed Reality visual aids by tracking a patient in real time and showing a reconstructed 3D point cloud in a VR environment. Expert surgeons can interact with the 3D point cloud representation of the patient; instruct the remote novice through real-time 3D annotations projected in AR on the patient's body and through hand maneuvers shown in AR via an avatar of the expert surgeon; and project short video clips of specific procedures into the AR space for the novice to follow.




Thomas Sharkey, Janet Johnson, Danilo Gasques, and Nadir Weibel. 2019. I Want to Be a Surgeon! Role Playing for Remote Surgery in Mixed Reality. WISH ’19.




Danilo Gasques, Janet G. Johnson, Tommy Sharkey, and Nadir Weibel. 2019. What You Sketch Is What You Get: Quick and Easy Augmented Reality Prototyping with PintAR. CHI EA ’19




Danilo Gasques, Janet G. Johnson, Tommy Sharkey, and Nadir Weibel. 2019. PintAR: Sketching Spatial Experiences in Augmented Reality. DIS '19 Companion.




K.L. Davis, D. Gasques, Y. Zhang, W. Xie, J. Johnson, Y. Feng, Z. Xu, J. Riback, T. Sharkey, M. Yip, N. Weibel. ARTEMIS, Augmented Reality Technology to Enable reMote Integrated Surgery: A Review of Technical Consideration and Study Design. 2019 Military Health System Research Symposium (MHSRS 2019), Orlando, Florida, August 2019.