I keep seeing more discussions of the need to tolerate a certain degree of error and inconsistency in complex systems. Just read Carlos' post citing an interview with Jaron Lanier discussing the need to apply error-tolerant pattern matching to protocol design.
This is roughly related to the work on ACID vs. BASE that I've run across from two completely independent sources. One is Rohit Khare's BASE: Best-effort networking, Approximate estimates, Self-centered trust management, and Efficient encoding. The other is Armando Fox et al.'s BASE: Basically Available, Soft state, Eventual consistency. Apparently, these two notions of BASE were coined completely independently.
This is also related to the Internet "Robustness Principle" attributed to Jon Postel: "Be liberal in what you accept, and conservative in what you send." Tim Berners-Lee also cites this as a guiding principle in his design of the World Wide Web, though he calls it the principle of "Tolerance."
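To make the Robustness Principle concrete, here's a minimal sketch of what it might look like in code: a toy protocol field that is lenient about the boolean spellings it accepts but emits exactly one canonical form. (The function names and the set of accepted spellings are my own illustration, not from any real protocol specification.)

```python
# Postel's Robustness Principle applied to a toy boolean protocol field:
# be liberal in what you accept, conservative in what you send.

TRUTHY = {"1", "true", "yes", "on", "t", "y"}
FALSY = {"0", "false", "no", "off", "f", "n"}

def parse_flag(raw: str) -> bool:
    """Liberal on input: tolerate case, whitespace, and many spellings."""
    token = raw.strip().lower()
    if token in TRUTHY:
        return True
    if token in FALSY:
        return False
    raise ValueError(f"unrecognized flag value: {raw!r}")

def serialize_flag(value: bool) -> str:
    """Conservative on output: always emit one canonical spelling."""
    return "true" if value else "false"
```

So `parse_flag(" YES ")` and `parse_flag("on")` both come back as `True`, but anything we send out is normalized to plain `"true"` or `"false"`: more noise tolerated on the way in than on the way out.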
Finally, this is all related to Signal Processing. See my previous post on Signal Processing as the Science of Patterns. Signal Processing differs from Information Processing in that it expects information to be noisy, imperfect, inconsistent, etc., whereas Information Processing is based on an expectation of perfect "digital" information.
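A tiny sketch (my own illustration, not from the post cited above) of the signal-processing attitude toward noise: repeated noisy observations of the same underlying value, averaged together, let the signal emerge while the noise cancels.

```python
# Averaging noisy readings: the signal persists, zero-mean noise cancels.
import random

random.seed(42)  # fixed seed so the sketch is reproducible

TRUE_VALUE = 5.0  # the underlying "perfect" signal

def noisy_reading() -> float:
    # Each observation is the true value plus zero-mean noise.
    return TRUE_VALUE + random.uniform(-1.0, 1.0)

# A single reading can be off by as much as 1.0; the average of many
# readings lands much closer to the true value.
averaged = sum(noisy_reading() for _ in range(10_000)) / 10_000
```

Information Processing would treat each reading as authoritative digital data; Signal Processing expects every individual reading to be wrong and extracts the pattern anyway.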
Taken together, these are all approaches for amplifying order out of disorder, perfection out of imperfection, consistency out of inconsistency, rhythm out of noise: tolerate more noise in what you recognize than in what you generate.