Our Coming Robot Overlords

by Steven Novella, May 31 2010

The recent oil spill in the Gulf has prompted a great deal of wringing of hands – how do such disasters happen? David Brooks discusses in the New York Times that the cause is primarily due to the fact that our modern technological civilization is becoming too complex for us to manage adequately. The Deepwater Horizon oil rig is just one example of a piece of technology that is beyond the mastery of any single person. But there are also nuclear power plants, computer operating systems, jet airliners, financial systems, operating rooms, and numerous other examples.

Brooks concludes:

So it seems important, in the months ahead, to not only focus on mechanical ways to make drilling safer, but also more broadly on helping people deal with potentially catastrophic complexity. There must be ways to improve the choice architecture — to help people guard against risk creep, false security, groupthink, the good-news bias and all the rest.

This seems reasonable. Certainly we need to get better at managing such complexity, by having clear lines of authority and responsibility, proper risk assessment, and a thorough understanding of group dynamics.

I also wrote previously about The Checklist Manifesto – Atul Gawande offers his own solution, the humble checklist. He convincingly argues that using checklists improves communication among individuals engaged in complex tasks, systematizes best practices, and keeps details from being forgotten. Don’t rely on training and human memory, he argues – have a system.

These are all excellent and practical ideas. They amount to devising methods for human beings to optimize their management of increasing complexity. And these strategies all work.

But I am left with the feeling, taking the longer view, that they are also stop-gap measures. Technological complexity will continue to increase, and it seems likely that complexity will outstrip our desperate attempts to manage it by modifying human behavior. Even if we become experts at mastering complexity, there are practical limits to what we can do.

This also relates to the notion of interdependency. As technology advances, people have to specialize in narrower and narrower slices of that technology. It used to be that craftsmen would construct an entire house – from foundation to finishing. Now a contractor will contract out to dozens of specialists who each build one part of the house – an architect, a site engineer, a foundation layer, a framer, a roofer, a plumber, an electrician, a drywaller, a painter, a finishing carpenter, a well driller, a landscaper, and perhaps even a decorator. You would be hard pressed to find a single crew that can build an entire house anymore.

Medicine is another example – we still need generalists, but more and more they are becoming like contractors – guiding the overall management of health, but farming out to specialists the management of specific diseases and procedures.

The solution that we are increasingly turning to is computers. We are automating those checklists, using computers to communicate among specialists, and building expert systems to guide those specialists. Computer programs are increasingly involved in design and engineering, risk management, and even the writing of other computer programs.
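The "automated checklist" idea can be sketched in a few lines. This is an illustrative toy, not any real rig or hospital system – the class, step names, and error behavior are all invented – but it shows the core of Gawande's argument: the system, not human memory, enforces that nothing is skipped and nothing is done out of order.

```python
# Illustrative toy: an automated checklist that refuses to confirm a
# step until every earlier step has been confirmed.

class Checklist:
    def __init__(self, name, steps):
        self.name = name
        self.steps = list(steps)          # ordered step descriptions
        self.done = [False] * len(steps)  # which steps are confirmed

    def confirm(self, index):
        """Confirm one step; every earlier step must already be confirmed."""
        if not all(self.done[:index]):
            raise RuntimeError(f"confirm steps before {self.steps[index]!r} first")
        self.done[index] = True

    def complete(self):
        return all(self.done)

predrill = Checklist("pre-drill", ["test valves", "check pressure", "brief crew"])
for i in range(len(predrill.steps)):
    predrill.confirm(i)
assert predrill.complete()
```

Trying to confirm "brief crew" before "test valves" raises an error rather than silently letting the omission through – which is exactly the property a paper checklist relies on humans to provide.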

As end users we are becoming increasingly comfortable relying upon computers for our needs. The GPS is the perfect example of this. Rather than consulting a map, learning the roads, and planning a route – you simply plug in your destination and mindlessly follow turn-by-turn directions. They are wonderful devices, extremely useful, and that is part of my point.
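Under the hood, "plug in your destination" is graph search over a road network. A minimal sketch using Dijkstra's algorithm on a toy map – the road names and distances here are invented for illustration, and real navigation systems add traffic data, heuristics, and far larger graphs:

```python
# Toy route planner: Dijkstra's shortest-path search over a small,
# invented road graph (edge weights are distances in miles).
import heapq

def shortest_route(graph, start, goal):
    """Return (distance, path) for the shortest route, or (inf, []) if none."""
    queue = [(0, start, [start])]
    visited = set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, miles in graph.get(node, {}).items():
            if neighbor not in visited:
                heapq.heappush(queue, (dist + miles, neighbor, path + [neighbor]))
    return float("inf"), []

roads = {
    "home":    {"Main St": 2, "Route 9": 5},
    "Main St": {"downtown": 4},
    "Route 9": {"downtown": 2},
}
dist, path = shortest_route(roads, "home", "downtown")
```

The driver sees only the turn-by-turn output; the map-reading and route-planning the article describes has been absorbed into the search.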

Computers are already evolving from useful to indispensable. The question is – are they tools to make us more productive and free us from drudgery so that we can engage in more creative endeavors? Or are they making us lazy and dependent by coddling us? Perhaps a combination of both.

One extreme vision of a dystopian future run by computer nannies is Wally. On board the ship that carries the remnants of humanity, people float around on recliners, endlessly engrossed in video entertainment, while the ship’s systems see to their every need.

I am not predicting such a future, nor am I advocating that we simply extrapolate from current trends to their absurd conclusion. But it is an interesting thought experiment – what will ultimately happen as our civilization becomes more and more complicated, and we need to rely more and more on computers? Some think we will merge with computers – we will become them, and will be able to expand our own mental abilities as needed. Perhaps.

We may also reach an equilibrium, at least for a while, between relying on computers and using computers to enhance our own abilities and productivity. We may also see subcultures in which every permutation plays itself out.

We also should not assume that as technology progresses complexity will necessarily increase as well. We may pass through a technological era of maximal complexity, but as our knowledge and technological prowess progress, complexity may be replaced by elegant simplicity. Think of the elaborate systems that were in place to, say, publish a magazine, that have now all been replaced by the relative simplicity of desktop publishing. Perhaps technology brings an ebb and flow of complexity, rather than a continuous increase.

It also seems that humans will, to some extent, titrate their own complexity. As technology makes our lives simpler, we find new things to do that add complexity, up to the limit of our tolerance; then we search for ways to simplify again.

All of these factors make predicting the future of technology and complexity extremely difficult. But it is interesting to think about.

20 Responses to “Our Coming Robot Overlords”

  1. MadScientist says:

    Until all investigations are concluded, we can’t really say much about BP’s blowout. All parties are pointing to each other saying the other is to blame. Until we know what failed and what happened leading up to the failure, it would be silly to attempt to prescribe anything except perhaps a blanket moratorium on new drilling of offshore exploration and production wells. Otherwise the usual questions after events such as these would run along the lines of “was the gear used adequate, and if not is there a need to develop gear suitable for use in such circumstances”.

    I also imagine that would be wringing of hands rather than ‘ringing’ – my hands don’t go “ding-dong” and I don’t know of anyone who does have ringing hands.

  2. Scott Carnegie says:

    a-hem, it’s WALL-E :)

  3. MadScientist says:

    Oh, I have to complain about your use of the word “complexity” as well – it’s being used in a vague manner. For example, the mention of the “elaborate systems” for publication vs. “desktop publishing”. I would argue that the computers and software used amount to far more complex a system than the older mechanical devices even though the job of, say, the layout editor may be slightly easier and you do not need as many humans on the job.

    Looking at heavy aircraft, the controls are far more complex today than they were 40 years ago and yet techniques and tools have been devised to help the pilot and for over 30 years there has been a huge effort to cut down on the cockpit crew – for example a 747 once had the pilot, copilot, and radio operator (aka navigator) but now you only have the pilot and copilot. Complexity however is going up and not down.

    As for oil production, it is largely a mechanical process. The easily achieved improvements have been achieved – for example, various processes are now monitored by machines so that humans don’t have to run around reading gauges and writing down and analyzing numbers. The machines will even tell you about sensor faults and so on. Offshore operations are made much more difficult by the fact that you do not have easy access to the equipment on the sea floor. Even in a case where you have power and control cables running down to equipment on the sea floor, in the event of a catastrophe there is a very good chance that your control cables will be destroyed. Due to the sheer number of mechanical components (which you simply cannot replace with a computer program), I’d bet that offshore oil production operations remain fairly complex and difficult to manage.

  4. MadScientist says:

    Oh, and as for the checklist – do you really think an oil rig hasn’t got them? Nor does having a checklist guarantee that things go the way they should, for example when Lockheed dropped a satellite in its integration bay. Having more lists of things to do doesn’t necessarily make a place safer or lead to fewer mistakes.

  5. Brian M says:

    Another example in the same vein as a single contractor building a house is our entire lives. There was a time where you would build your own house, farm your own food, rear and educate your own children, and raise and hunt livestock.

    I think that this specialization is definitely a good thing. No longer does a single harvest failure cause death or famine. Instead, the overall productivity is averaged, and everyone gets a slice of the pie.

    When trying to better enhance communication, you need a medium that all can understand. Checklists are great, as everyone understands them (or can understand them with very little explaining). One example of a horrible implementation can be found in IT governance. There are a dozen models to follow, which are all effectively checklists. Except each of these checklists is hundreds of pages long. Some are even arranged into books! To understand it, you have to get certified in that specific governance methodology. How can one expect to have everyone get educated in this just so they can communicate in it? In many ways, it’s more complicated than the technology it’s trying to govern.

    You mentioned desktop publishing (likely referring to blogs). You forget how many specializations are behind the scenes. The developers who made the app likely don’t know it entirely, as it was probably done by a dozen or more programmers. That site is then hosted on an application server, which was written by many. That app server runs a plugin (or many) that were each written by dozens. That server itself was written by dozens if not hundreds more people. The server then has to be expertly administered by admins, dozens if it’s a hosting company. And then there’s the internet… It’s a large onion with plenty of abstraction. Everyone in that chain must be an expert in their area, and then everything runs smoothly. Running a blog is a good example of a group working together in harmony. Every individual piece does its job.

    The problems always arise when there are non-experts doing expert positions.

    In any case, I think specialization is far more important than any individual knowing everything. I think it’s a reality we need to accept, and then deal with accordingly.

  6. Alan says:

    David Brooks discusses in the New York Times that the cause is primarily due to the fact that our modern technological civilization is becoming too complex for us to manage adequately. The Deepwater Horizon oil rig is just one example of a piece of technology that is beyond the mastery of any single person

    If there is one thing you can count on in this world it is that David Brooks will always find some way of rationalizing away the failures of what, for lack of a better term, might be called “Modern Business Culture.” So, to him it’s never that the system itself has problems let alone that individuals involved may have warped or broken it for their own selfish ends. No, it’s always something else — in this case Brooks is making what amounts to a long winded version of “it was an act of God.”

    But, to paraphrase an old saying we shouldn’t ever ascribe an event to some complicated and convenient rationalization when good old human incompetence and selfishness will do. There is already good evidence that the ultimate blame for the explosion goes back to a few high placed managers who put their own interests before safety. The project was far behind schedule and costing millions a day — that’s the sort of situation that can cost those in charge their careers (or, at least, that nice year-end bonus). So, despite growing evidence that safety was being compromised they just crossed their fingers and ordered full steam ahead.

    Note that I’m not suggesting there was deliberate wrongdoing, just a lot of self-serving rationalization where risks were conveniently underestimated. And, given that our “Modern Business Culture” is biased toward short term gain and being (or, at least, looking) successful at all costs that provided an ideal atmosphere for these rationalizations to occur.

    Brooks uses the Challenger disaster as a supposed example of complexity outstripping human ability, but that clearly isn’t the proper interpretation. It’s not that the engineers and managers were so overwhelmed that they couldn’t see possible negative consequences. Quite the contrary, the O-Ring issue was an ongoing concern and, (in)famously, the managers in charge were told point-blank in a meeting the night before launch that there was serious risk of mission failure. But, conveniently assuming that since they had gotten away with it before that meant the danger was minimal, they rationalized away any concerns and the next day seven astronauts died.

    The indications for the Deepwater Horizon disaster are much the same — there was clear and compelling evidence that the blowout preventer was damaged. Likewise, there were other equipment issues that further reduced the safety margin. But, again, those in charge dismissed concerns since — in terms of their careers and in the business mindset in general — for them it was a better bet to go ahead and hope for the best than stop everything and be automatically seen as a failure.

    Mind you, I am not suggesting that the issues of complexity aren’t a concern, only that they should not be used as an excuse to ignore a long overdue discussion over the values that rule our business culture. If the managers involved with the Deepwater Horizon had, for example, been operating in an environment that put safety first and rewarded people for doing the right thing, not the expedient thing, this disaster would have been prevented.

  7. kirk says:

    Australopithecus afarensis had a much better track record of managing natural selection than Homo – they successfully produced Homo by their technology management after all. I think they had checklists. But their computers were mostly 8-bit with minimal RAM.

  8. Max says:

    This post conflates two systemic problems: corporate culture and increasing complexity. Corporate culture includes “risk creep, false security, groupthink, the good-news bias and all the rest,” which lead to textbook engineering ethics cases like the Challenger, Bhopal, and probably the Gulf oil leak. Checklists and computers won’t fix those problems.

    On top of that, there’s the problem of increasing complexity. We often introduce complexity to reduce the chance of small problems, but as a result we increase the chance of huge problems. For example, relying on GPS may be safer than reading a map, but now a solar flare can disrupt everyone’s GPS.

  9. Max says:

    Some things are just inherently safer than others. Somebody already wrote a parody about a wind turbine explosion that caused a catastrophic wind leak.
    There’s a video of a wind turbine explosion on YouTube. It’s pretty startling, but no miners were trapped, no towns were flooded, and no oil or radiation was leaked.

  10. Max says:

    I once sat through a corporate ethics class that presented a scenario where an intimidating project manager was putting pressure on the team to meet a deadline, and it was obvious that he wasn’t interested in hearing any bad news, so basically the team withheld the bad news from him until it was too late.
    In the discussion that followed, NONE of the blame fell on the manager. It all fell on the team for not informing him. That’s corporate culture for ya.

    • MadScientist says:

      Haha. I remember numerous shouting matches with managers over the years. Basically any competent person will tell the manager he’s an idiot and refuse to cooperate until issues are addressed. Unfortunately a lot of people are struggling and need that money and are afraid of losing their job because they’re not doing their part to fetch the manager his bonus. It is not surprising that the bigger the bonus at stake, the worse the manager will be – I’ve often wondered if they have brain damage because they really don’t care at all about anyone else on the planet. Now maybe if I’d done as told I wouldn’t still be a homeless starving scientist – or everyone else in my profession might consider me incompetent due to some accident which I should have prevented.

  11. jrpowell says:

    We don’t need to wait for a detailed analysis of the recent oil disaster to enact new rules requiring relief wells to be dug SIMULTANEOUSLY with the main wells. It is quite apparent that relief wells are the only remedy to underwater oil gushers, and thus should be in place on day one of oil extraction.

  12. Priscilla Blevinglirk says:

    I think we’re running ahead of ourselves when we start declaring the root cause of the BP oil disaster before the case has been properly investigated. In particular, there are rumours that BP skimped on its own safety checks, ignoring potential failures. Also, while the basic problem, namely that we’ve got a leak at the ocean floor, is certainly more than we can handle for now, and may well demand a very complex solution, it doesn’t really strike me as being caused by the complexity of the plant per se.

  13. lordweird says:

    If the oil wells / blowout preventers were built to be more resilient (i.e. multiple backup valves, relief wells, automated pressure testing, etc.), the accident likely would not have occurred. This lesson can be applied to any complex system. With increasing complexity, it is best to have increasing resiliency or failure is likely.

  14. bandsaw says:

    As with many science fiction authors, I shudder at the idea that computers are the solution to all the complexity in our technology. I encourage folks to go spend a little time reading ACM’s Forum On Risks To The Public In Computers And Related Systems, and then think about how much they really want computers running critical systems.

  15. Kurt says:

    You would be hard pressed to find a single crew that can build an entire house anymore.

    Still, houses get built just fine. There’s a reason for that. Even though multiple crews are working on a building, they all interact via a price system. This determines how hard they work, how fast, quality of materials and who gets hired in the first place. If disputes occur, people may walk off the job site — which will cost the owner time and money. This creates financial incentives that, among other things, keep workers safe.

    Within a large company like BP, however, there are no internal pricing signals. The worth or cost of a decision may be entirely opaque. A manager can easily assume that a safety concern is just “laziness” on the part of a foreman trying to avoid work.

    In a way, Brooks is right. There is a way to create better “choice architecture” — and that involves discouraging the formation of large firms. (Or more accurately, removing the numerous laws and subsidies that promote concentration of capital.) Unfortunately, as a doctrinaire Beltway pundit, Brooks would, I’m sure, recoil in horror at the idea that Big Business would work better if it wasn’t big at all.

    • Max says:

      How is construction different from BP’s oil drilling? BP has subcontractors too, like Halliburton and Transocean. I don’t know if BP’s workers belong to a union, but they do in many large companies.

  16. Robo Sapien says:

    Human beings use increased complexity of function to achieve increased simplicity of form. Novella’s desktop publishing example works just fine to this end, complex computer systems and software (function) provide a much simpler interface (form) for people to distribute writings. In the same light, GPS systems (function) lead to simpler navigating (form).

    My favorite example: It took the most complex of technologies to produce the world’s simplest machine, the one-atom transistor.