This blog is intended to provide thoughtful commentary on modern systems. Faculty at the School of Systems and Enterprises at Stevens Institute of Technology share their perspectives and insights about systems, society and technology; design, innovation and education; complexity, architecture and modeling; and analytics, big data and visualization.

Complexity and the Price of Control

Michael Pennock | March 7, 2017


“Complexity” is a ubiquitous term, yet no one seems to know quite what it means. Or, more precisely, the word means different things to different people. Rather than teasing out the various nuances of its usage, let’s consider why complexity has risen to buzzword status.

Our fascination with complexity may be symptomatic of a growing recognition of the failings of our industrial-age systems of technical and social control. These failings manifest themselves in myriad ways, but in every case they seem to involve a frustration with our inability to get a particular system, whether technical or social, to behave “the way it is supposed to.”

Complex versus complicated

Let’s assert that the intent of all engineered systems is to tame the natural world in some manner and steer it toward our ends. This can range from the simple (a roof to keep the rain out), to the complicated (an airliner to move us around more quickly), to the abstract (a system of laws and government to help us live and work together). A roof on a house seems to be fairly stable, while tax law seems to be under constant revision. Considering just these two examples of engineered solutions, the explanation seems obvious: people, and perhaps more importantly groups of people, are always changing their behavior in unpredictable ways. In the United States, a slight change in who shows up to vote in an election can shift the entire philosophy of the tax code. This is not an issue for the typical roof; rain and snow are predictable phenomena in comparison. So one might conclude, as some have, that social systems are complex and technical systems are complicated. A complicated system has well-defined cause-and-effect relationships. A complex system does not (Poli 2013).

While the point is well taken, the line may be starting to blur as the social and the technical intermingle. Some have argued that modern technological systems are approaching a level of complexity similar to that of biological systems (Alderson and Doyle 2010). Consider the case of a modern fighter aircraft such as the F-35 (Joint Strike Fighter). Colloquially, it has been referred to as a complex system even though it is engineered to be complicated. Why? There are so many interacting hardware and software components that it becomes difficult to predict the behavior of the system under a broad range of circumstances. Unintended interactions can trigger surprise failures. Consequently, a modern fighter aircraft takes substantially longer to design and develop than its predecessors. The F-35 is years behind schedule and experiencing substantial cost overruns (Browne 2016).

Engineering in the face of complexity

Why not just make the F-35 simpler? In many cases, the design of a technical system is just as subject to the evolving and adaptive behavior of humans as is the tax code. In a military context, adversaries adapt to each other’s actions. Measures beget countermeasures. Thus, the modern fighter aircraft is in a sense an accumulation of countermeasures to an adversary’s real or perceived capabilities. It is difficult to remove one from the design without introducing a vulnerability. Thus, one is left with the extremely challenging and expensive task of engineering the aircraft to accommodate the sum total of an adversary’s relevant capabilities. This is not to mention that the aircraft also has to interoperate with friendly forces, that the program has to retain the support of Congress, and that adversaries will adapt to whatever is ultimately built. These challenges are not limited to military applications. Civil systems such as airliners and automobiles also face continuously evolving demands in terms of performance, safety, and cost, to name a few. Such circumstances have led some to call for a better understanding of engineering complex systems (Ottino 2004).

The price of control

While we may not want to intentionally engineer a system to be complex, we are going to have to deal with the fact that the context in which many systems operate is becoming increasingly complex. To consider why, let us return to the social system. In “The Collapse of Complex Societies,” the archeologist Joseph Tainter (1990) studied the collapse of historical societies such as the Roman Empire and the Mayan civilization. He reached the conclusion that societies are problem-solving organizations. Each time a society faces a crisis, it develops a solution. A modern example is the creation of the U.S. Consumer Financial Protection Bureau (CFPB) in response to the 2008 financial crisis.

However, each solution consumes some of the society’s resources. Since solutions are seldom removed, they accumulate over time. Eventually, so much of the society’s resources is consumed by the implemented solutions that there is not enough left to deal with the next crisis, and the society collapses.
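Tainter’s accumulation argument can be caricatured with a toy model (my own illustration, not Tainter’s actual analysis): each crisis adds a permanent solution with a recurring upkeep cost, and collapse arrives when upkeep leaves too little surplus to fund the next solution.

```python
# Toy sketch of the accumulation argument. Each crisis adds a
# permanent "solution" with recurring upkeep; collapse comes when
# the surplus left after upkeep can no longer fund a new solution.

def crises_survived(budget=100.0, solution_cost=10.0, upkeep_rate=0.2):
    upkeep = 0.0   # recurring cost of all solutions adopted so far
    crises = 0
    while budget - upkeep >= solution_cost:   # can we afford a response?
        crises += 1
        upkeep += upkeep_rate * solution_cost  # the solution is never removed
    return crises

# The heavier the upkeep each solution leaves behind, the sooner
# the society runs out of slack:
print(crises_survived(upkeep_rate=0.2))  # survives more crises
print(crises_survived(upkeep_rate=0.5))  # survives fewer
```

The numbers are arbitrary; the point is only the shape of the dynamic — a fixed budget, irreversible commitments, and an eventual shortfall.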

At first glance, this may seem to be a problem of how a government should be run. Rather, the issue is more fundamental. For example, privatizing the Federal Aviation Administration in the United States, as some have proposed, would not remove the solution or its costs; it would simply shift them to another entity. Unless we suddenly decided to give up air travel entirely, we will still need mechanisms to coordinate the use of airspace. The control function is still required. The choice of private or public is simply a choice of implementation.

In an accessible treatment of this concept, John Casti (2012) adapted Ashby’s law of requisite variety to assert that a control mechanism must be at least as complex as the system it controls.  Thus, as the complexity of society increases, the complexity of the control system must increase as well to maintain stability. If that does not occur, there is a mismatch. Casti calls this mismatch a “complexity gap.” The greater the gap, the greater the risk of societal collapse.
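In its simplest counting form, the law of requisite variety says that a regulator with R distinct responses facing D distinct disturbances cannot hold the outcome to fewer than ⌈D/R⌉ states. A minimal sketch of that bound (my own illustration, not Casti’s formulation):

```python
import math

def min_outcome_variety(disturbances: int, responses: int) -> int:
    """Ashby's law of requisite variety in counting form: even the
    best regulator with `responses` distinct actions still lets at
    least ceil(disturbances / responses) distinct outcomes through."""
    return math.ceil(disturbances / responses)

# A controller as varied as its environment can pin the outcome down:
print(min_outcome_variety(8, 8))   # 1
# Halve the controller's variety and a "complexity gap" opens:
print(min_outcome_variety(8, 4))   # 2
```

Read this way, a widening gap between the variety of society and the variety of its control mechanisms is exactly the mismatch Casti warns about.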

Is the complexity of society increasing? Given the definitional issues described above, complexity is not such an easy thing to measure. However, it stands to reason that as technology evolves and society learns more, its behavior becomes more complex. Technology affords a society more degrees of freedom.

In a future blog post, I will discuss approaches that some have proposed for managing this complexity.

Stay tuned. 


--Michael Pennock
Assistant Professor, Systems Engineering

Professor Pennock’s current research interests involve modeling of enterprise systems and systems of systems, multi-scale modeling, and model uncertainty. His research application domains include health care, national security and finance.




  • Alderson, D. L., & Doyle, J. C. (2010). Contrasting views of complexity and their implications for network-centric infrastructures. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 40(4), 839-852.
  • Browne, R. (2016, April 27). John McCain: F-35 is 'a scandal and a tragedy'. CNN.
  • Casti, J. L. (2012). X-events: The collapse of everything. Harper Collins.
  • Ottino, J. M. (2004). Engineering complex systems. Nature, 427(6973), 399.
  • Poli, R. (2013). A note on the difference between complicated and complex social systems. Cadmus, 2(1), 142.
  • Tainter, J. (1990). The collapse of complex societies. Cambridge University Press. 

The Long Road to Fully Autonomous Vehicles

Yeganeh M. Hayeri | November 11, 2016

Uber is testing autonomous cars throughout Pittsburgh. And Google and Tesla have tests of their own in different cities. Technology has put us on a road to self-driving vehicles, but we have a long way to go before we reach full autonomy.

There are different levels of vehicle autonomy and automated applications associated with each level. Examples of autonomy today include adaptive cruise control, which enables a vehicle to adjust itself if it’s getting too close to a leading vehicle, and lane departure warning, which causes the steering wheel to shake if the vehicle shifts out of lane. These automated features promote safety on the road.
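To make the “adjust itself if it’s getting too close” behavior concrete, here is a minimal sketch of one step of a toy adaptive cruise controller — my own simplification for illustration, not any production algorithm. It blends the spacing error against a desired time gap with the speed difference to the leading vehicle:

```python
def acc_accel_command(own_speed, lead_speed, gap,
                      desired_time_gap=1.5, k_gap=0.5, k_speed=0.8):
    """One control step of a toy adaptive cruise controller.

    own_speed, lead_speed are in m/s; gap is metres to the lead vehicle.
    Returns a commanded acceleration in m/s^2 (negative means brake).
    """
    desired_gap = own_speed * desired_time_gap   # keep ~1.5 s of headway
    gap_error = gap - desired_gap                # positive: room to spare
    speed_error = lead_speed - own_speed         # positive: leader pulling away
    return k_gap * gap_error + k_speed * speed_error

# Too close at equal speeds: the controller commands braking.
print(acc_accel_command(25.0, 25.0, gap=20.0))   # negative
# Comfortable gap, faster leader: it accelerates to close up.
print(acc_accel_command(25.0, 27.0, gap=60.0))   # positive
```

Real systems add sensor fusion, comfort limits, and fail-safes, but the core feedback idea — measure the gap, compare to a target, command a correction — is the same.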

The ultimate goal is to get to self-driving vehicles. Meanwhile, we want to minimize any potential negative consequences that autonomous vehicle systems – especially self-driving cars – could impose as we move forward. Consider a lesson from the past: the U.S. freeway system was supposed to solve the nation’s traffic congestion problems, yet the Los Angeles area, home to the largest freeway system, has the worst traffic congestion and the consequent pollution problems in the nation.

My work as a researcher revolves around impact analysis: how autonomous technologies will change our transportation system and society as a whole. How do the new technologies change travel behavior? What would be the impact on other modes of transportation? What policies could accelerate the advancement and implementation of autonomous technologies while minimizing the unintended negative consequences they might impose? Given that the transportation system is the backbone of our society and that it is changing rapidly, practically everything we can think of will change: the economy, housing, real estate markets, energy security, and urban planning, to name a few.

In my research, I develop models to study the impact of autonomous vehicle systems on various other systems. This field of research is novel and exciting, but it comes with challenges – namely, that data in this realm are scarce. As a result, we need to balance the few datasets we have against assumptions and engineering judgment to maintain the reliability of our models.

Transportation systems as the backbone of our economy

Everything that you can think of in terms of transportation – from how our food and products are delivered and accessed by consumers to modes of transport including trains, planes, buses and cars – is essential for a working economy and society. Because the transportation system is the backbone of our society and is changing rapidly, new autonomous vehicle technologies can change practically everything you can think of, including travel behavior, environmental issues, insurance policies and even existing transportation training (e.g., driver’s license training, law enforcement training).

Here are specific examples of a few changes to expect:

  • Volume. A significant percentage of people cannot drive, whether because of a disability or age; children and the elderly, for example, do not drive. Imagine how this significant part of the population can benefit from self-driving cars. A different way of looking at it: how much additional traffic will be on the roads? What will traffic congestion look like with the additional demand?
  • Environmental issues. Autonomous vehicles can still use fossil fuels, so if there are more cars on the road, how much more pollution will be created? If, however, this technology is paired with electric vehicles or other alternative fuel sources, what is the impact on greenhouse gas emissions and particulate matter?
  • Accident liability. How are insurance companies going to adapt? Who is going to be liable in case of car crashes? The manufacturing company? The person behind the wheel? How will our insurance premiums change?
  • Infrastructure. Departments of Transportation (DOTs) will need to start (some already have) looking into their infrastructure, particularly the older infrastructure. With connected vehicle technologies especially, a significant amount of equipment will need to be upgraded around the nation (e.g., signal controllers).
  • Training. Will DOTs have to train their staff differently? Are driver’s licenses necessary if cars are self-driving? What is going to happen to training curricula for law enforcement? And what about driver training, like students taking driver’s education in high school? What happens if an examinee takes a highly automated car to the driving test? Will officers be able to conduct the test appropriately with the various automated packages installed on vehicles?
  • Lanes. A standard lane on our highways is generally 12 feet wide for safety. If cars are self-driving in 20 years, the premise is that there will be zero fatal accidents and therefore safer roadways. Will we really need SUVs to feel safe then? Cars might become smaller as a result, and the technology will do a much better job of keeping vehicles in their own lanes. This might open a new window for lane design: narrowing lanes to provide more space, possibly increasing the number of lanes on stretches of highway where bottlenecking is a recurring problem.

Exciting possibilities

A few years ago in Pennsylvania, backup cameras in cars were taped over before driver’s tests. Nowadays, when most cars have backup cameras, there is no taping, and drivers can use the camera when they are being tested on parking. And by 2018, all new cars will be required to have backup cameras. This story illustrates how travel behaviors and policies will need to evolve with the implementation of autonomous vehicle technologies.

If autonomous vehicle technology is married with ride-sharing concepts like Uber and Lyft, then it will produce additional benefits to society. But a lot has to come together for this to become a reality. And we have to consider U.S. culture: Will citizens want to let go of their cars, and the independence that comes with them, and rely on ride sharing instead? Are we culturally there yet?

Our infrastructure also has to be aligned for the changes to come. With that comes questions on how transportation and government agencies will support new technologies. There are political obstacles as well; every state and jurisdiction is different when it comes to transportation regulations. It’s hard to predict what it’s all going to look like, but a self-driving car society is rather exciting and provides researchers, myself included, a vast opportunity for finding solutions to novel and exciting questions.


-- Yeganeh M. Hayeri
Assistant Professor, Systems Engineering

Professor Hayeri’s research focuses on transportation systems, connected and automated vehicles, and infrastructure, climate and energy security. As part of the project, Connected and Autonomous Vehicles 2040 Vision, she studied the impacts of autonomous and connected vehicles on infrastructure, design, communications, investment decisions, freight, driver licensing, real time data usage and workforce training.


Building Humane Systems

Gregg Vesonder | September 26, 2016


Why does a door need an instruction manual? Consider that most 30-year-olds have probably opened at least 100,000 doors in their lifetime. Yet most building entry doors carry a one-word instruction manual: PUSH or PULL. Often this instruction goes unread, and the person PUSHes when they should PULL. The doors of the School of Systems and Enterprises at Stevens Institute of Technology have similar instructions, and yet some of the smartest people I know ignore the directive!

So if we have difficulty designing doors that are usable without directions, what hope do we have of designing our interactions with complex systems?

Let’s consider some doors that we have no difficulty opening. The first are the doors in our homes that have knobs. When we approach a doorknob, it practically screams, “turn me!” Usability folks would say that a doorknob affords turning. This usability concept is called affordance: a well-designed interface makes its operation readily understood by the user. The door is designed in such a way that its use is obvious. Another type of door, like the ones found in supermarkets, opens automatically. The designer has simplified the interaction of the door with the user – in this example, the shopper – by automating it.

It’s about user experiences

So how does the design of doors relate to the design of interactions with complex systems?  Well-designed doors consider the context in which they are used and how they are used.  Supermarket doors are automatic because you are either laden with packages or pushing a shopping cart. Modern user design has formalized this by focusing on the user experience and addressing the issues through User Centered Design (UCD).

User experience focuses on interaction in the large. Rather than optimizing how an individual interacts with technology, it is about understanding how this interaction affects the larger task that the person or team is trying to accomplish, while attending to the environment in which the system is used. At the outset of the design process, questions are asked to identify user needs. Is it a noisy factory or a quiet research center? Is the user or the team fatigued or fresh? What do the users want to optimize: efficiency, error tolerance, or effectiveness (accuracy)? Should the system be easy to learn, or engaging?[1]

The process for addressing these needs is UCD.  Naturally, UCD involves the users of the system; they play a key role in the complete design process. 

Here are five steps necessary for ensuring well-designed interactions with cyber-physical systems:

  • Identify the user. First we must identify the user in established systems or define a user in new systems. In either case it is often best to understand the user through ethnography: studying users where they use, or will use, the system.
  • Determine user needs. You want to observe what users do, not what they report they do. Observe them actually doing the task rather than discussing it.
  • Create a prototype. Enlightened with information on user needs, you begin the design process by creating successively more realistic prototypes of the user experience.  In the beginning these prototypes will be low fidelity, hand drawn or comic strip renditions of the experience.  Each of these renditions is shown to the user and feedback is collected in an increasingly formal manner with increasingly more refined prototypes.  
  • Ask questions. Key questions asked throughout the process are: “Can the user do what they want to do when they want to do it?” and “When they do it, does the system do what the user wants it to do?” An answer of no to either question indicates the user experience has failed.
  • Evaluate. Evaluation in UCD is a continuous process.  Review of the user experience does not stop when the system has been deployed.  It is a continuous process to further refine the user experience.

In my description of User Centered Design you will notice that unique skills are necessary.  Psychologists, sociologists, creative designers and anthropologists may be needed to build humane systems. Take your local social scientist out to lunch to begin the process!  These resources also may help:

  • Moggridge, B. Designing Interactions. MIT Press, 2007, ISBN 0-262-13474-8. Great humane design.
  • Shneiderman, B., Plaisant, C., Cohen, M., Jacobs, S. and Elmqvist, N. Designing the User Interface (6th edition). Pearson, 2016, ISBN 013438038X. The classic UCD textbook.
  • Norman, D. The Design of Everyday Things. Basic Books, 2013, ISBN 0465050654. If you only read one of these books, read this one. Life changing (at least it was for me).
  • Raskin, J. The Humane Interface. Addison-Wesley, 2000, ISBN 0201379376. By the inventor of the original Macintosh interface.

Of course you also can contact me, that is, once I understand how to open my lab door.  Later!


--Gregg Vesonder
Industry Professor; Director, Research, Systems and Software Division

Professor Vesonder’s current research interests include: software engineering and system development, cyber-physical and socio-technical systems, Smart Cities, human computer interaction, and evolvability. He has over 35 years of industry experience, including serving as Executive Director of the Cloud Platforms Research Department at AT&T Labs Research, which focused both on cloud platforms and mobile and pervasive systems. Today, he is both a Bell Labs and an AT&T Fellow.

[1] Quesenbery, W.  Dimensions of usability: Defining the conversation, driving the process.  Proceedings of the UPA, 2003.