Fri, Oct 19, 2007

Jesse Robbins

Web2Summit: Complexity + Tight Coupling = Catastrophe

Paul Kedrosky and Tim O'Reilly just talked about the "Quant Fund Meltdown" and how complex interactions between computer systems and people resulted in unprecedented hedge fund losses. I spend a lot of time thinking about risk and failure in complex systems, and I've found Charles Perrow's "Catastrophic Potential" model to be very useful. It's pretty straightforward...

[Chart: Perrow's Interaction + Coupling matrix]
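For a rough feel of the model, here is a minimal Python sketch (my illustration, not from the talk): interaction runs from linear to complex, coupling from loose to tight, and catastrophic potential peaks in the complex/tight quadrant. The example systems roughly follow Perrow's published chart.

```python
# Illustrative sketch of Perrow's Interaction/Coupling matrix:
# catastrophic potential is highest where complex interactions
# meet tight coupling.

PERROW_QUADRANTS = {
    # (interaction, coupling): (catastrophic potential, example systems)
    ("linear",  "loose"): ("low",      "most manufacturing, post offices"),
    ("linear",  "tight"): ("moderate", "dams, rail and marine transport"),
    ("complex", "loose"): ("moderate", "universities, R&D firms"),
    ("complex", "tight"): ("high",     "nuclear plants, aircraft -- and, "
                                       "arguably, quant funds"),
}

def catastrophic_potential(interaction: str, coupling: str) -> str:
    risk, examples = PERROW_QUADRANTS[(interaction, coupling)]
    return f"{risk} (e.g. {examples})"

print(catastrophic_potential("complex", "tight"))
# -> high (e.g. nuclear plants, aircraft -- and, arguably, quant funds)
```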

Update: Paul Kedrosky likes Charles Perrow too.



tags: finance, operations, web 2.0 summit

Comments: 4

  Thomas Lord [10.19.07 07:15 PM]

On the basis of the slogan (title) and graph alone -- this is the smartest expression of a critical idea for our times that I've yet seen.

"Complexity + Tight Coupling" -- perfect.

Keep Out, Cheerleading is Dangerous, etc.
-t

  Andy Wong [10.21.07 03:49 PM]

It is well known in software development: Complexity + Tight Coupling = Catastrophe.

There is long-standing wisdom in system management: as complexity goes up, reliability goes down.

Sometimes complexity is inevitable as a result of system development. It is tight coupling that turns bugs into catastrophes.

It is interesting to see that this is also the case in the finance world.

  Adam Green [10.22.07 08:31 AM]

I tend to disagree with the statement that tight and complex should not go together. I think it is better to say that tight and changeable should be avoided. There are lots of very complex, very tight, but operational systems (core banking, for example) which are fine, as their core process model does not change much. Loose coupling can paradoxically increase complexity dramatically at both design time and run time, as the promiscuity of loose connections can be chaotic. Better to say: "if you expect your service, or the interfaces to the service, to change, and there is more than moderate complexity, then design in separation of layers and functions to isolate the effect of changes..."
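A minimal sketch of the layering Adam describes (names are hypothetical, my illustration): core code depends only on a stable adapter, so churn in a volatile external interface is contained in one place.

```python
# Hypothetical illustration of "separation of layers to isolate the
# effect of changes": the rest of the system calls only the adapter.

class ExternalGateway:
    """Volatile third-party interface; its wire format may change."""
    def send(self, payload: dict) -> str:
        return f"sent {payload}"

class GatewayAdapter:
    """Stable layer: all knowledge of the gateway's format lives here."""
    def __init__(self, gateway: ExternalGateway):
        self._gateway = gateway

    def pay(self, amount_cents: int, account: str) -> str:
        # If the gateway's interface changes, only this method is edited;
        # callers of pay() are untouched.
        return self._gateway.send({"amt": amount_cents, "acct": account})

print(GatewayAdapter(ExternalGateway()).pay(500, "acct-42"))
```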

  Thomas Lord [10.22.07 09:37 AM]

I don't think core banking is all that complex. It's sophisticated -- many details are considered and harmonized. It's big and expensive. It has many parts. But people can only manage to agree to build and rely on such a thing when, at core, it is fundamentally simple -- simple enough that people can make pretty substantial promises about how the system will behave in a wide range of conditions.


Also, the main point of loose coupling isn't to make any one system more robust or to lower design time or to lower run-time costs. Rather, the main point is to make the whole collection of systems locally robust, everywhere, in this sense: when things "go wrong" you can only immediately make repairs to the parts of the systems you have at hand. If repairs are needed remotely, now you've got two problems. Things are "loosely coupled" to the extent that, no matter what part of the system you are close to, you can tweak it to do some semblance of its job in a wide variety of hard-to-anticipate adverse operating conditions. Typically, that means being able to find substitutes for remote components. You can see this in miniature in "classic Unix". If you are relying on a script that does "ls | grep | sort" and, suddenly, the grep binary becomes corrupted but you don't have the source so can't fix it, at least you can quickly switch to "ls | awk | sort" and get back to work. The loose coupling of your pipeline to "grep" didn't make the original pipeline easier to design or faster to run or less likely to fail -- but it did give you a better rescue when something did go wrong.

-t
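A small Python rendering of that rescue (my sketch of the same idea; the shell pipeline above is the real point): prefer grep, but when the binary is unavailable, swap in an awk substitute and get back to work.

```python
# Illustrative sketch of the loose-coupling rescue: if grep is
# missing or broken, substitute awk and keep working.
import shutil
import subprocess

def list_grep_sort(pattern: str) -> str:
    listing = subprocess.run(["ls"], capture_output=True, text=True).stdout
    if shutil.which("grep"):
        filt = ["grep", pattern]
    else:
        # grep is gone: a semblance of the same job via awk
        filt = ["awk", f"/{pattern}/"]
    filtered = subprocess.run(filt, input=listing,
                              capture_output=True, text=True).stdout
    return subprocess.run(["sort"], input=filtered,
                          capture_output=True, text=True).stdout

print(list_grep_sort("log"))
```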
