Facebook Changes Crawling Policy (for the better) — they're now expressing their crawling policy in robots.txt rather than in additional contractual terms. This is in response to Pete Warden pointing out that robots.txt is the industry standard, and it will avoid the kind of confusion that landed him with large lawyer bills. The change came from Bret Taylor, their CTO, who was product manager for Google Maps and gets the value of working sanely with developers. (via Pete Warden)
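Part of why robots.txt is the sane choice is that it's machine-readable: a crawler can check its permissions programmatically instead of parsing legal terms. A minimal sketch using Python's standard-library parser — the rules and URLs below are hypothetical examples, not Facebook's actual robots.txt:

```python
# Sketch: checking crawl permissions against a robots.txt policy
# using Python's stdlib parser. Rules and paths are made up for
# illustration, not taken from any real site's robots.txt.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A generic crawler may fetch public pages but not /private/ ones.
print(parser.can_fetch("*", "https://example.com/profile"))    # True
print(parser.can_fetch("*", "https://example.com/private/x"))  # False
```

In production you'd call `parser.set_url(...)` and `parser.read()` against the live robots.txt rather than parsing an inline string, but the permission check is the same one-liner either way.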
Law as Source Code (Sean McGrath) — there's something important coming together in his series of blog posts about law. What we have here are two communities that work with insanely large, complex corpora of text that must be rigorously managed and changed with the utmost care, precision, and transparency of intent. Yet the software community has a much greater set of tools at its disposal to help with that work.