The 7th International Conference on Movement and Computing Is a Visionary Success
Interdisciplinary conference chaired by Dr. Antonia Zaferiou explores the intersections between human movement and technology
On July 15-17, 2020, a diverse group of innovators gathered virtually for the 7th International Conference on Movement and Computing, also known as MOCO’20, to explore the use of computational technology to support and understand human movement practices (such as computational movement analysis), as well as movement as a means of interacting with computers (such as movement-based interfaces). This interdisciplinary conference was a shared venture between Rutgers University, Mana Contemporary, and Stevens Institute of Technology, with biomedical engineering professor Dr. Antonia Zaferiou serving as the general conference chair and scientific chair.
The virtual conference’s presenters covered a wide range of computational studies, including modeling, representation, segmentation, recognition, classification, and generation of movement information, and provided an interdisciplinary understanding of movement that ranged from biomechanics to embodied cognition and the phenomenology of bodily experience. Conference presentations and session recordings are posted on the conference’s website.
“The MOCO community is a really interesting group of people who span fields including computer science and art and everything in between,” Zaferiou said, describing this year’s event as an inspirational success. The conference drew approximately 142 registrants from across the globe, including participants from France, Germany, Italy, Spain, and Australia.
MOCO’20 featured three keynote addresses that underscored the interdisciplinary nature of the conference. The first keynote speaker was Dr. Dimitris Metaxas, a Distinguished Professor in the Department of Computer Science at Rutgers University, where he directs the Center for Computational Biomedicine, Imaging and Modeling (CBIM). Dr. Metaxas conducts research toward formal methods upon which computer vision, computer graphics, and medical imaging can advance synergistically. In computer vision, he works on the simultaneous segmentation and fitting of complex objects, shape representation, statistical model-based tracking, learning, sparsity, and American Sign Language (ASL) and gesture recognition. In particular, he focuses on human body and shape motion analysis, human surveillance, security applications, ASL recognition, behavior modeling and analysis, and scalable solutions for large, distributed sensor-based networks.
The second keynote speaker was Heidi Latsky, the executive and artistic director of Heidi Latsky Dance, a New York-based, female-run organization dedicated to the creation of relevant, immersive performance art that is accessible to all.
“Heidi works with professional disabled dancers,” Zaferiou explained. “Her most recent work highlights the current dichotomy in which dancers with disabilities now have both increased access, with everything moving online [due to the Covid-19 pandemic], and decreased access, for example, if the only way they had access to WiFi was to go to a coffee shop.”
In her keynote presentation, Latsky facilitated a panel discussion about what it is like to work with her on building interfaces at the intersection of technology and art.
The third and final keynote speaker was Dr. Gil Weinberg, a professor and the founding director of the Georgia Tech Center for Music Technology, where he leads the Robotic Musicianship group. His research focuses on developing artificial creativity and musical expression for robots and augmented humans. Among his projects are Shimon, a marimba-playing robotic musician that uses machine learning for composition, improvisation, and interaction, and a prosthetic robotic arm for amputees that restores and enhances human musical abilities.
“His keynote was about robotic musicianship and human augmentation,” Zaferiou said. “For example, he worked with a drummer who had lost a limb. Standard prosthetics allow only a very rigid connection between the musician and the drumstick. As a drummer, you need the interface between the drumstick and hand to be compliant, giving the drumstick the ability to modulate the force profiles between the stick and drum that drive the expressive quality of the sound. So [Weinberg] created adaptive controllers that allowed the drummer not only to play as he once did, but also at superhuman speeds.”
In addition to the three keynotes, there were other invited presentations and events. The second day included three standout invited presentations: “STEM From Dance,” Rutgers University’s “Dance for Parkinson’s,” and “Living Practice: Choreographer Kyle Marshall,” presented by Mana Contemporary.
The STEM From Dance session was a networking event hosted by CEO Yamilée Toussaint Beach, who founded the organization to improve diversity in the STEM workforce: “STEM From Dance gives girls of color access to a STEM education by using dance to empower, educate, and encourage them as our next generation of engineers, scientists, and techies.” During the event, attendees discussed ways to share their work with the public and with young students to encourage and support a diverse future of STEM leaders.
Dance for Parkinson’s was presented by Dr. Jeff Friedman from the Dance Department at the Mason Gross School of the Arts at Rutgers University. “Rutgers has a group helping people with Parkinson’s disease control their movement through dance class, which includes valuable social interactions. This presentation was a tearjerker, frankly,” Zaferiou explained. “The way they’re interweaving dance technology and health is very exciting and important.”
Later in the day, Mana Contemporary presented an artist talk with choreographer Kyle Marshall. Marshall discussed his award-winning choreography and what it is like to be a Black man in dance right now. “He discussed George Floyd's murder and shared his raw reaction to that horrific crime through his improvisational choreography from the day it happened. We were blown away,” noted Zaferiou.
The conference was proud to host more than 50 presentations that spanned scientific and artistic domains related to movement and computation, with 40 of the presentations having associated peer-reviewed papers and extended abstracts published by the Association for Computing Machinery. Dr. Jean Zu and Dr. Dilhan Kalyon generously supported the conference and made remarks to the international group of participants. Frances Salvo, the Assistant Director of Academic Event Management, helped Zaferiou organize the conference. When the organizing committee shifted to a virtual format, Michael Scalero and Valerie Dumova supported the event by providing a team of Stevens IT student employees to monitor and host Zoom meetings. These IT student employees, along with undergraduate research assistants from Zaferiou’s lab, worked together to host Zoom meetings for concurrent presentation sessions.
In addition to the outside speakers, three Stevens graduate students presented their work in MOCO’s graduate symposium. Mitchell Tillman of the Musculoskeletal Control and Dynamics lab presented his research “Real-time Optical Motion Capture Balance Sonification System,” and Sean Stanford and Mingxiao Liu of the Movement Control Rehabilitation lab presented their research “The Effects of Visual Feedback Complexity on Training the Two-Legged Squat Exercise” and “Inducing Cognition of Secure Grasp and Agency to Accelerate Motor Rehabilitation from an Instrumented Glove,” respectively.
“From a technology standpoint, the MOCO community is very innovative,” Zaferiou said. “Many researchers and practitioners at MOCO are pushing technology to be increasingly interactive. These innovations open the door to use interactive systems to significantly advance the future of fields spanning medicine, robotic construction, gaming, smart homes, and more. These advances foster human-computer collaboration for both task-driven purposes and future avenues of human expression. I am incredibly thankful for the teamwork both within Stevens and with our collaborating institutions that made this international interdisciplinary conference an enormous success, for the first time virtually!”
Explore more videos from the conference:
Invited Research Highlight of Dr. Thomas Papathomas, Laboratory of Vision Research (LVR), Rutgers University
Approaching 21: Naming Things—Maya Man
Human-Sound Interaction: Towards a Human-centred Sonic Interaction Design Approach—Balandino Di Donato, Christopher Dewey, and Tychonas Michailidis
Neural Connectivity Evolution during Adaptive Learning With and Without Proprioception—Harshit Bokadia, Jonathan Cole, and Elizabeth Torres
Movement Computing Education for Middle Grades—Yoav Bergner, Deborah Damast, Allegra Romita, Shiri Mund, and Anne Marie Robson Smock
Let's Resonate! How to Elicit Improvisation and Letting Go in Interactive Digital Art—Jean-François Jego and Margherita Bergamo
MoViz: A Visualization Tool for Comparing Motion Capture Data Clustering Algorithms—Lucas Liu, Duri Long, and Brian Magerko
Learn more about biomedical engineering at Stevens:
Learn more about research in the Department of Biomedical Engineering →