Other Publications

This list has not been updated since 2017. Please see my Google Scholar page and group website for an up-to-date list of publications.

2017

Megidish, B., Zuckerman, O., & Hoffman, G. (2017)

Animating Mechanisms: A Pipeline for Authoring Robot Gestures

Companion Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction (HRI '17 LBR)

Abstract

Designing and authoring gestures for socially expressive robots has become an increasingly important problem in recent years. In this demo we present a new pipeline that enables animators to create gestures for robots in a 3D animation authoring environment, without knowledge of computer programming. The pipeline consists of an exporter for 3D animation software and an interpreter running on a System-on-Module that translates the exported animation into motor control commands.
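The demo's source code is not included on this page; purely as an illustration of the interpreter stage described above, the minimal Python sketch below maps exported keyframe tracks to timed motor commands. The joint names, keyframe format, and motor interface are hypothetical, not the published pipeline.

```python
import time

# Hypothetical exported animation: one keyframe track per joint,
# each keyframe a (timestamp_seconds, angle_degrees) pair.
EXPORTED_ANIMATION = {
    "neck_pan":  [(0.0, 0.0), (0.5, 30.0), (1.0, 0.0)],
    "neck_tilt": [(0.0, 10.0), (1.0, -10.0)],
}

def interpolate(track, t):
    """Linearly interpolate a joint angle at time t from its keyframe track."""
    if t <= track[0][0]:
        return track[0][1]
    for (t0, a0), (t1, a1) in zip(track, track[1:]):
        if t0 <= t <= t1:
            return a0 + (a1 - a0) * (t - t0) / (t1 - t0)
    return track[-1][1]

def send_motor_command(joint, angle):
    """Placeholder for the board-specific motor interface (e.g., PWM or serial)."""
    print(f"{joint}: {angle:.1f} deg")

def play(animation, rate_hz=50):
    """Stream interpolated joint angles to the motors at a fixed control rate."""
    duration = max(track[-1][0] for track in animation.values())
    start = time.time()
    while (t := time.time() - start) <= duration:
        for joint, track in animation.items():
            send_motor_command(joint, interpolate(track, t))
        time.sleep(1.0 / rate_hz)

if __name__ == "__main__":
    play(EXPORTED_ANIMATION)
```

In a real deployment, the send_motor_command placeholder would call the System-on-Module's actual motor driver, and the keyframe data would come from the animation exporter rather than being hard-coded.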


Görür, O. C., Rosman, B. S., Hoffman, G., & Albayrak, S. (2017)

Toward Integrating Theory of Mind into Adaptive Decision-Making of Social Robots to Understand Human Intention

HRI 2017 Workshop on the Role of Intentions in Human-Robot Interaction

Abstract

We propose an architecture that integrates Theory of Mind into a robot’s decision-making to infer a human’s intention and adapt to it. The architecture implements human-robot collaborative decision-making for a robot that accounts for variability in a human’s emotional and intentional states. This research first implements a mechanism for stochastically estimating a human’s belief over the state of the actions that the human could possibly be executing. We then integrate this information into a novel stochastic human-robot shared planner that models the human’s preferred plan. Our contribution lies in our model’s ability to handle two conditions: 1) when the human’s intention is estimated incorrectly and the true intention may be unknown to the robot, and 2) when the human’s intention is estimated correctly but the human does not want the robot’s assistance in the given context. A robot that integrates this model into its decision-making process would better understand a human’s need for assistance and therefore adapt to behave less intrusively and more reasonably in assisting its human companion.
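The paper's full decision-making model is not reproduced here; as a loose sketch of the intention-estimation step only, a minimal Bayesian filter over hypothesized human intentions could look like the following. The intention labels, observations, and likelihoods are invented placeholders, not the model from the paper.

```python
# Minimal Bayesian filter over hypothesized human intentions.
# Intentions, observations, and likelihoods are illustrative placeholders.

INTENTIONS = ["wants_help", "works_alone", "distracted"]

# P(observation | intention): hand-tuned placeholder likelihoods.
LIKELIHOOD = {
    "reaches_for_part": {"wants_help": 0.6, "works_alone": 0.3, "distracted": 0.1},
    "looks_at_robot":   {"wants_help": 0.7, "works_alone": 0.2, "distracted": 0.1},
    "looks_away":       {"wants_help": 0.1, "works_alone": 0.3, "distracted": 0.6},
}

def update(belief, observation):
    """One Bayes update: posterior proportional to likelihood times prior, then normalize."""
    posterior = {i: LIKELIHOOD[observation][i] * belief[i] for i in INTENTIONS}
    total = sum(posterior.values())
    return {i: p / total for i, p in posterior.items()}

belief = {i: 1.0 / len(INTENTIONS) for i in INTENTIONS}  # uniform prior
for obs in ["looks_at_robot", "reaches_for_part", "looks_away"]:
    belief = update(belief, obs)
    print(obs, {i: round(p, 2) for i, p in belief.items()})

# A simple adaptation rule: only offer assistance when the robot is confident
# the human actually wants it; otherwise stay unobtrusive.
if belief["wants_help"] > 0.7:
    print("offer assistance")
else:
    print("hold back")
```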


2016

Hoffman, G. (2016)

OpenWoZ: A Runtime-Configurable Wizard-of-Oz Framework for Human-Robot Interaction

AAAI Spring Symposium on Enabling Computing Research in Socially Intelligent Human-Robot Interaction

Abstract

Wizard-of-Oz (WoZ) is a common technique enabling HRI researchers to explore aspects of interaction not yet backed by autonomous systems. A standardized, open, and flexible WoZ framework could therefore serve the community and accelerate research both for the design of robotic systems and for their evaluation.

This paper presents the definition of OpenWoZ, a Wizard-of-Oz framework for HRI, designed to be updated during operation by the researcher controlling the robot. OpenWoZ is implemented as a thin HTTP server running on the robot, and a cloud-backed multi-platform client schema. The WoZ server accepts representational state transfer (REST) requests from a number and variety of clients simultaneously. This “separation of concerns” in OpenWoZ allows addition of commands, new sequencing of behaviors, and adjustment of parameters, all during run-time.
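The OpenWoZ implementation itself is not included on this page; the sketch below only illustrates the general idea of a thin HTTP server on the robot accepting REST requests from wizard clients. It is written with Flask for brevity, and the endpoint names, payload fields, and behavior registry are hypothetical rather than the published OpenWoZ schema.

```python
# Minimal sketch of a thin Wizard-of-Oz HTTP server running on the robot.
# Endpoints, payloads, and the behavior registry are hypothetical examples.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Behaviors can be added or re-parameterized at run-time by wizard clients.
behaviors = {
    "greet": {"speech": "Hello!", "gesture": "wave"},
}

@app.route("/behaviors", methods=["GET"])
def list_behaviors():
    # Let any client inspect the current behavior repertoire.
    return jsonify(behaviors)

@app.route("/behaviors/<name>", methods=["PUT"])
def add_or_update_behavior(name):
    # Add a new behavior or adjust an existing one's parameters on the fly.
    behaviors[name] = request.get_json(force=True)
    return jsonify({"status": "updated", "behavior": name})

@app.route("/trigger/<name>", methods=["POST"])
def trigger(name):
    spec = behaviors.get(name)
    if spec is None:
        return jsonify({"error": "unknown behavior"}), 404
    # In a real system this would dispatch to the robot's speech/motion stack.
    print(f"executing {name}: {spec}")
    return jsonify({"status": "ok"})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

A wizard client could then reconfigure or trigger behaviors at run-time with plain HTTP calls, for example curl -X POST http://<robot>:8080/trigger/greet, with several clients operating simultaneously.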


Roizman, M., Hoffman, G., Ayal, S., Hochman, G., Reifen-Tagar, M., & Maaravi, Y. (2016)

Studying the Opposing Effects of Robot Presence on Human Corruption

11th ACM/IEEE International Conference on Human-Robot Interaction (HRI '16), Late-Breaking Reports

Abstract

Social presence has two opposing effects on human corruption: on the one hand, the collaborative and contagious nature of another person’s presence can lead people to behave more corruptly; on the other hand, the monitoring nature of another person’s presence can decrease corruption. We hypothesize that a robot’s presence can provide the best of both worlds: decreasing corruption by acting as a monitoring presence, without increasing it through collusion. We describe an experimental study currently underway that examines this hypothesis, and report initial findings from pilot runs of our experimental protocol.


Alves-Oliveira, P., Arriaga, P., Hoffman, G., & Paiva, A. (2016)

Boosting Children’s Creativity through Creative Interactions with Social Robots

11th ACM/IEEE International Conference on Human-Robot Interaction (HRI '16), Late-Breaking Reports

Abstract

Creativity is one of the most important and pervasive of all human abilities. However, it seems to decline during the school-age years, a phenomenon termed the “creative crisis”. As developed societies shift from an industrialized economy to a creative economy, there is a need to support creative abilities throughout life. With this work, we aim to use social robots as boosters for creative-driven behaviors with children.


2015

Mizrahi, M., Birnbaum, G. E., Hoffman, G., Sass, O., Reis, H. T., & Finkel, E. J. (2015)

Robotic Attachment: The Effects of a Robot’s Responsiveness on its Appeal as a Source of Consolation

Poster at the 16th Annual Meeting of the Society for Personality and Social Psychology (SPSP '15)


2014

Hoffman, G., Cakmak, M., & Chao, C. (2014)

Timing in Human-Robot Interaction

Workshop summary, 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI '14)

Abstract

Timing plays a role in a range of human-robot interaction scenarios, as humans are highly sensitive to timing and interaction fluency. It is central to spoken dialogue, with turn-taking, interruptions, and hesitation influencing both task efficiency and user affect. Timing is also an important factor in the interpretation and generation of gestures, gaze, facial expressions, and other nonverbal behavior. Beyond communication, temporal synchronization is functionally necessary for sharing resources and physical space, as well as coordinating multi-agent actions. Timing is thus crucial to the success of a broad spectrum of HRI applications, including but not limited to situated dialogue; collaborative manipulation; performance, musical, and entertainment robots; and expressive robot companions. Recent years have seen a growing interest in the HRI community in the various research topics related to human-robot timing. The purpose of this workshop is to explore and discuss theories, computational models, systems, empirical studies, and interdisciplinary insights related to the notion of timing, fluency, and rhythm in human-robot interaction.


2013

Hoffman, G. (2013)

Evaluating Fluency in Human-Robot Collaboration

Robotics: Science and Systems (RSS '13) Workshop on Human-Robot Collaboration. Best Workshop Paper.

Abstract

Please refer to the new journal version of this paper.

Collaborative fluency is the coordinated meshing of joint activities between members of a well-synchronized team. We aim to build robotic team members that can work side by side with humans, displaying the kind of fluency that humans are accustomed to from each other. As part of this effort, we have developed a number of metrics to evaluate the level of fluency in human-robot shared-location teamwork. In this paper we discuss issues in measuring fluency, present both subjective and objective metrics that have been used to measure fluency between a human and a robot, and report on findings along the proposed metrics.
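The metrics themselves are defined in the paper and its later journal version; the sketch below is only an illustration of how objective measures of this kind, assumed here to be percentage of concurrent activity and human/robot idle time, could be computed from timestamped activity intervals.

```python
# Illustrative computation of objective fluency-style measures from timestamped
# activity intervals. The specific measures are assumptions for this sketch;
# see the paper for the actual definitions.

def total(intervals):
    """Sum of interval durations; each interval is a (start, end) pair in seconds."""
    return sum(end - start for start, end in intervals)

def overlap(a_intervals, b_intervals):
    """Total time during which both agents are active."""
    return sum(
        max(0.0, min(a_end, b_end) - max(a_start, b_start))
        for a_start, a_end in a_intervals
        for b_start, b_end in b_intervals
    )

def fluency_metrics(human_active, robot_active, task_start, task_end):
    task_time = task_end - task_start
    return {
        "concurrent_activity_pct": overlap(human_active, robot_active) / task_time,
        "human_idle_pct": 1.0 - total(human_active) / task_time,
        "robot_idle_pct": 1.0 - total(robot_active) / task_time,
    }

# Toy example: a 10-second task with partially overlapping activity.
print(fluency_metrics(
    human_active=[(0.0, 4.0), (6.0, 9.0)],
    robot_active=[(3.0, 7.0)],
    task_start=0.0, task_end=10.0,
))
```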


2012

Hoffman, G., Bauman, S., Elbaz, Y., Gottlieb, O., & Krug, S. (2012)

Evaluating Music Listening with a Robotic Companion

IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), International Workshop on Human-Agent Interaction (iHAI '12)

Abstract

Music listening is a central activity in human culture, and throughout history the introduction of new audio reproduction technologies has influenced the way music is consumed and perceived.

In this work, we discuss a robotic speaker designed to behave both as a reproduction device and as a music-listening companion. The robot is intended to enhance a human’s listening experience by providing social presence and embodied musical performance. In a sample application, it generates segment-specific, beat-synchronized gestures based on the song’s genre, and maintains eye contact with the user.

We describe an experimental human-subject study (n=67), evaluating the effect of the robot’s behavior on people’s enjoyment of the songs played, as well as on their sense of the robot’s social presence and their impression of the robot as an autonomous agent.
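The robot's behavior system is not published here; as a toy illustration of beat-synchronized, genre-dependent gestures, one could schedule gesture commands against a list of beat times, as in the sketch below. Gesture names, genres, and the motion interface are hypothetical, not the system described in the paper.

```python
# Toy sketch: trigger a genre-appropriate gesture on every detected beat.
# Gesture names and the beat-tracking source are placeholders.
import time

GENRE_GESTURES = {
    "rock": ["head_bang", "body_bounce"],
    "jazz": ["head_sway", "shoulder_roll"],
}

def send_gesture(name):
    """Placeholder for the robot's gesture/motion interface."""
    print(f"{time.time():.2f}: {name}")

def perform(beat_times, genre):
    """Sleep until each beat time, then cycle through the genre's gestures."""
    gestures = GENRE_GESTURES.get(genre, ["head_nod"])
    start = time.time()
    for i, beat in enumerate(beat_times):
        delay = start + beat - time.time()
        if delay > 0:
            time.sleep(delay)
        send_gesture(gestures[i % len(gestures)])

# Toy example: beats at 120 BPM for the first four seconds of a rock song.
perform(beat_times=[0.5 * k for k in range(8)], genre="rock")
```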


2011

Hoffman, G. (2011)

On Stage: Robots as Performers

Robotics: Science and Systems (RSS '11) Workshop on Human-Robot Interaction

Abstract

This paper suggests turning to the performing arts for insights that may help the fluent coordination and joint-action timing of human-robot interaction (HRI). We argue that theater acting and musical performance robotics could serve as useful testbeds for the development and evaluation of action coordination in robotics. We also offer two insights from the theater acting literature for HRI: the maintenance of continuous sub-surface processes that manifest in motor action, and an emphasis on fast, inaccurate responsiveness using partial information and priming in action selection.


2010

Hoffman, G., & Weinberg, G. (2010)

Shimon: An Interactive Improvisational Robotic Marimba Player

Extended Abstracts of the ACM International Conference on Human Factors in Computing Systems (CHI '10)

Abstract

Shimon is an autonomous marimba-playing robot designed to create interactions with human players that lead to novel musical outcomes. The robot combines music perception, interaction, and improvisation with the capacity to produce melodic and harmonic acoustic responses through choreographic gestures. We developed an anticipatory action framework and a gesture-based behavior system, allowing the robot to play improvised jazz with humans in synchrony, fluently, and without delay. In addition, we built an expressive non-humanoid head for musical social communication. This paper describes our system, which was used in a performance and demonstration at the CHI 2010 Media Showcase.


Hoffman, G. (2010)

Anticipation in Human-Robot Interaction

AAAI 2010 Spring Symposium: It’s All in the Timing

Abstract

Anticipating the actions of others is key to coordinating joint activities. We propose the notion of anticipatory action and perception for robots acting with humans. We describe four systems in which anticipation has been modeled for human-robot interaction: two in a teamwork setting, and two in a human-robot joint performance setting. In evaluating the effects of anticipatory agent activity, we find in one study that anticipation aids team efficiency, as well as the perceived commitment of the robot to the team and its contribution to the team’s fluency and success. In another study, we see anticipatory action and perception affect the human partner’s sense of team fluency, the team’s improvement over time, the robot’s contribution to the efficiency and fluency, the robot’s intelligence, and the robot’s adaptation to the task. We also find that subjects working with the anticipatory robot attribute more human qualities to it, such as gender and intelligence.

Gray, J., Hoffman, G., Adalgeirsson, S. O., Berlin, M., & Breazeal, C. (2010)

Expressive, Interactive Robots: Tools, Techniques, and Insights Based on Collaborations

HRI 2010 Workshop: What do collaborations with the arts have to say about HRI?

Abstract

In our experience, a robot designer, behavior architect, and animator must work closely together to create an interactive robot with expressive, dynamic behavior. This paper describes lessons learned from these collaborations, as well as a set of tools and techniques developed to help facilitate the collaboration. The guiding principles of these tools and techniques are to allow each collaborator maximum flexibility with their role and shield them from distracting complexities, while facilitating the integration of their efforts, propagating important constraints to all parties, and minimizing redundant or automatable tasks. We focus on three areas: (1) how the animator shares their creations with the behavior architect, (2) how the behavior architect integrates artistic content into dynamic behavior, and (3) how that behavior is performed on the physical robot.


2006

Hoffman, G. (2006)

Acting Lessons for Artificial Intelligence

50th Anniversary Summit of Artificial Intelligence

Abstract

Theater actors have been staging artificial intelligence for centuries. If one shares the view that intelligence manifests in behavior, one must wonder what lessons the AI community can draw from a practice that is historically concerned with the infusion of artificial behavior into such vessels as body and text. Like researchers in AI, actors construct minds by systematic investigation of intentions, actions, and motor processes with the proclaimed goal of artificially recreating human-like behavior. Therefore, acting methodology may hold valuable directives for designers of artificially intelligent systems. Indeed, a review of acting method literature reveals a number of insights that may be of interest to the AI community.


Hoffman, G., & Breazeal, C. (2006)

Robotic Partners’ Bodies and Minds: An Embodied Approach to Fluid Human-Robot Collaboration

AAAI'06 Fifth International Workshop on Cognitive Robotics (CogRob '06)

Abstract

A mounting body of evidence in psychology and neuroscience points towards an embodied model of cognition, in which the mechanisms governing perception and action are strongly interconnected, and also play a central role in higher cognitive functions, traditionally modeled as amodal symbol systems.

We argue that robots designed to interact fluidly with humans must adopt a similar approach, and shed traditional distinctions between cognition, perception, and action. In particular, embodiment is crucial to fluid joint action, in which the robot’s performance must tightly integrate with that of a human counterpart, taking advantage of rapid sub-cognitive processes.

We thus propose a model for embodied robotic cognition that is built upon three propositions: (a) modal, perceptual models of knowledge; (b) integration of perception and action; (c) top-down bias in perceptual processing. We then discuss implications and derivatives of our approach.


Hoffman, G., & Breazeal, C. (2006)

What Lies Ahead? Expectation Management in Human-Robot Collaboration

AAAI 2006 Spring Symposium: To Boldly Go Where No Human-Robot Team Has Gone Before

Abstract

We aim to build robots that go beyond command-and-response and can engage in fluent collaborative behavior with their human counterparts. This paper discusses one aspect of collaboration fluency: expectation management, predicting what a human collaborator will do next and how to act on that prediction. We propose a formal time-based collaborative framework that can be used to evaluate this and other aspects of collocated human-robot teamwork, and show how expectation management can enable a higher level of fluency and improved efficiency in this framework. We also present an implementation of the proposed theoretical framework in a simulated human-robot collaborative task.
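The formal framework is defined in the paper; the following is only a toy sketch of the expectation-management idea, in which the robot predicts the human's next action from observed transition counts and prepares a complementary action in advance. Action names, the transition model, and the complement table are invented for illustration.

```python
from collections import defaultdict

# Toy sketch of expectation management: learn transition counts over the
# human's observed actions, predict the most likely next action, and prepare
# a complementary robot action ahead of time. All names are placeholders.

COMPLEMENT = {
    "fetch_part": "hold_fixture",
    "fasten_screw": "hand_over_screwdriver",
    "inspect": "present_next_part",
}

class ExpectationManager:
    def __init__(self):
        self.transitions = defaultdict(lambda: defaultdict(int))
        self.last_action = None

    def observe(self, human_action):
        """Record an observed human action and update transition counts."""
        if self.last_action is not None:
            self.transitions[self.last_action][human_action] += 1
        self.last_action = human_action

    def predict_next(self):
        """Return the most frequently observed successor of the last action."""
        successors = self.transitions[self.last_action]
        return max(successors, key=successors.get) if successors else None

    def prepare(self):
        """Choose a robot action that complements the predicted human action."""
        prediction = self.predict_next()
        return COMPLEMENT.get(prediction, "wait")

manager = ExpectationManager()
for action in ["fetch_part", "fasten_screw", "inspect", "fetch_part", "fasten_screw"]:
    manager.observe(action)
    print(f"human: {action:14s} -> robot prepares: {manager.prepare()}")
```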
