A Whole New World: Stevens Researchers Explore How A.I. Shapes Landscapes, Both Real and Virtual
From Massive Video Games to Minuscule Microchips, Autonomous Tools Are Changing the Way Designers Work
When Dr. Aron Lindberg first started playing video games as a kid back in the 1990s, he had no idea that he was planting the seeds for future research.
“The games were really small and had different levels that were highly constrained, but a human being had designed them bit by bit,” said Dr. Lindberg, an assistant professor at the School of Business. “Today what video game consumers increasingly expect is open-world games, which means you need massive spaces.”
To produce these boundary-pushing games, video game designers are turning to autonomous tools, which use artificial intelligence and machine learning to create these worlds. But, as researchers at Stevens Institute of Technology are discovering, these tools can’t just be applied across the board.
“If you use autonomous tools for everything, you’ll probably end up with very stiff designs that don’t really have a human touch to them,” Dr. Lindberg said. “And if you do it all using manual design, it’ll be very expensive.”
Dr. Lindberg is part of a team of researchers — which includes Dr. Jeffrey Nickerson, associate dean of research at the School of Business — that is exploring the use of these types of autonomous tools across different design industries. In a recent paper in IEEE Computer, the researchers looked at the design process for “Tom Clancy’s Ghost Recon Wildlands,” the largest open-world game ever from video game studio Ubisoft. During play, gamers explore a fictionalized version of Bolivia, with villages and cities scattered throughout a landscape of real landmarks, such as the infamous “Death Road” and Red Lagoon, and varied ecosystems, from the country’s Amazonian jungles to its Andean peaks.
It would have been virtually impossible to create this entire world manually — even with a massive team of designers — so Ubisoft streamlined the process by strategically deploying autonomous tools to build out certain aspects of the game, such as the terrain and road networks. This allowed a smaller team of designers to focus on areas that would have the biggest impact on the game’s narrative.
A.I. is shaping other virtual worlds to keep users engaged as well, Dr. Lindberg said.
“When you play a video game, it’s clear that there is an intentionally designed world within the game,” he said. “Yet, you’re actually encountering a very similar situation when you engage with social media, such as Facebook and Twitter. You don’t have a visual landscape with mountains and rivers, but it’s a carefully designed experience driven by advanced A.I. algorithms.”
A changing creative process
These tools and their varying levels of autonomy have changed designers’ roles and responsibilities. A designer needs to set parameters for the tool, then modify those parameters based on the output.
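That set-parameters, inspect-output, adjust cycle can be sketched in a few lines of Python. Everything here is hypothetical and purely illustrative — the `generate_terrain` function stands in for an autonomous world-building tool, and "roughness" stands in for whatever parameters a real tool exposes.

```python
import random

def generate_terrain(size, roughness, seed=0):
    # Hypothetical stand-in for an autonomous terrain tool: returns a
    # grid of heights whose variation is controlled by `roughness`.
    rng = random.Random(seed)
    return [[rng.uniform(0, roughness) for _ in range(size)]
            for _ in range(size)]

def too_jagged(terrain, limit):
    # The designer's acceptance check: is any jump between neighboring
    # heights steeper than the limit?
    return any(
        abs(row[i] - row[i + 1]) > limit
        for row in terrain
        for i in range(len(row) - 1)
    )

# The loop the article describes: set a parameter, review the tool's
# output, then revise the parameter and regenerate.
roughness = 10.0
terrain = generate_terrain(size=8, roughness=roughness)
while too_jagged(terrain, limit=2.0):
    roughness *= 0.8  # soften the landscape and try again
    terrain = generate_terrain(size=8, roughness=roughness)

print(f"accepted roughness: {roughness:.2f}")
```

The tool does the bulk generation; the designer's judgment enters only through the parameter and the acceptance check — the division of labor Dr. Lindberg describes.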
Though these novel tools are reshaping the design process, the way they are managed isn’t entirely unfamiliar.
“It’s a different way of working,” Dr. Nickerson said. “It’s like delegating to human beings, where you ask them to go off and try something and come back with the results.”
This new type of workplace relationship between humans and machines is just one of the threads Dr. Nickerson is researching through a grant from the National Science Foundation.
“We’re looking at the effects of artificial intelligence on the workplace,” he said. “Our job is to assemble interdisciplinary teams of researchers and help them create a common language for talking about this.”
Dr. Nickerson and Dr. Lindberg, along with the rest of their autonomous tools research team, sought to add to this common language in another recent paper in Communications of the ACM. They proposed what they call a “triple-loop approach” to the design process, describing the back-and-forth learning interactions between humans and machines.
“It speaks to the way people are learning from machines in these environments, and also the way machines are learning from people,” Dr. Nickerson said.
In another example, the research team looked at the design of semiconductor chips, which are found in mobile phones and other electronics. As chips have gotten smaller, yet more complicated, the design process has had to change.
“Designers of the early semiconductor chips from the 1950s and onward would draw their plans on physical blueprints, which grew excessively large, thus prompting a process of digitalization,” Dr. Lindberg said. “Today, if you had to print out a chip design on paper, it would cover all of Hoboken, since modern chips often feature billions of components.”
As with video game designers, chip designers use autonomous tools to generate layout options, inputting and modifying parameters based on the design outcome. Through the cycles of this process, designers learn more about the tool and can improve their parameters, while developers of the tool itself can also improve their algorithms by learning about the mental models of both the machine and the designer.
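The same cycle applies in chip design: the tool proposes a layout, the designer inspects an outcome metric, and the parameters are tightened for the next pass. A minimal sketch, under invented assumptions — `place_components` is a hypothetical stand-in for an autonomous placement tool, and fitting a fixed die width stands in for the real design constraints.

```python
def place_components(n, spacing):
    # Hypothetical stand-in for an autonomous placement tool: lays out
    # n components along a line with the given spacing.
    return [i * spacing for i in range(n)]

def chip_width(positions):
    # The outcome metric the designer inspects after each run.
    return positions[-1] - positions[0] if positions else 0.0

# Iterative refinement: shrink the spacing parameter until the
# generated layout fits the die, learning from each cycle's output.
spacing, die_width = 4.0, 20.0
layout = place_components(n=10, spacing=spacing)
while chip_width(layout) > die_width:
    spacing *= 0.9  # tighten the parameter and regenerate
    layout = place_components(n=10, spacing=spacing)

print(f"spacing={spacing:.2f}, width={chip_width(layout):.2f}")
```

Over many such cycles, the designer builds intuition for how the parameters steer the tool — the learning loop the researchers highlight.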
Human-machine interaction will continue to redefine the design process, as well as other areas of work, which is why Dr. Lindberg refers to this as an “ongoing research journey.” Much like players in the open world of “Ghost Recon Wildlands,” he and Dr. Nickerson have plenty of terrain to explore.