
Site Security Policy

Via: Jeremiah Grossman.

A Site Security Policy has been proposed by Mozilla employee Brandon Sterne. This is an extremely important specification for the web, and it could be a big step forward for web security.

<rant>

Over the last decade websites have transformed into feature-rich web applications, with the introduction of JavaScript and XMLHttpRequest, Flash and whatnot. While great for the user experience, this has also brought huge security implications, with XSS accounting for over 80% of all documented security vulnerabilities in 2007.

While implementing all this fancy new stuff, browser vendors have been slacking on thinking through the security implications, and the responsibility for safe browsing has essentially been put on the user. (Remember the '90s advice to disable cookies while browsing?) Browsers have become better over time, but a single XSS hole on any site can still have devastating effects for you and your users.

With the demand for web development and web developers still far outstripping the supply, education is in a sorry state too. In the job interviews I've conducted it's been rare to find a junior who knows what XSS is, and as for one who can explain the implications of CSRF, well, I've yet to meet one. I don't claim to be a security expert myself, but I feel everybody in the profession of web development should at least be aware of the basic attack vectors and how to prevent them.

"If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization."

- Weinberg's Second Law

</rant>

With the Site Security Policy we're given the ability to lock down certain types of behaviour. It allows us to block JavaScript included from unknown domains (a whitelist approach) and to block HTTP requests initiated by external domains, essentially fixing the CSRF problem. Additionally, it defines a way for a browser to log attempts to violate the policy.
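To give a feel for the idea, here's a rough sketch of what serving such a policy might look like. The header name, directive names and report path below are my own illustrative assumptions, not the exact syntax from the proposal, so read the spec for the real details.

```python
# Illustrative sketch only: the header and directive names below are
# assumptions made for this example, not the actual Site Security
# Policy syntax.
from wsgiref.simple_server import make_server

# Hypothetical policy: only run scripts from our own domain and one
# trusted CDN, and ask the browser to report violations to /ssp-report.
POLICY = "script-src self https://cdn.example.com; report-uri /ssp-report"

def app(environ, start_response):
    headers = [
        ("Content-Type", "text/html; charset=utf-8"),
        # A supporting browser would enforce the whitelist and refuse to
        # run scripts included from any other domain.
        ("X-Site-Security-Policy", POLICY),
    ]
    start_response("200 OK", headers)
    return [b"<html><body>Hello, policy-protected world.</body></html>"]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```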

For the actual implementation details I'd suggest just reading the spec; even though it's still a proposal, it's good reading material.

So what's next?

The spec needs to be finished. While the current policy is distributed through HTTP headers, some people seem to prefer an external file, along the lines of crossdomain.xml or robots.txt. The latter would have my vote, because the policy can then be easily cached, which can save some bytes in the end. It would also make it easier to upload a policy file to a server where no scripting is available, and it allows the policy to be enforced for a complete domain instead of having to add it to each and every script.
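Just to illustrate the difference, here's a small sketch of the external-file approach, with one domain-wide policy published at a well-known path. The file name, policy syntax and cache lifetime are made up for the example, in the spirit of robots.txt and crossdomain.xml.

```python
# Sketch of the "external file" alternative: one policy for the whole
# domain at a well-known path. The path and policy format are
# hypothetical, chosen only to illustrate the caching argument.
from wsgiref.simple_server import make_server

POLICY_FILE = b"script-src self https://cdn.example.com\nreport-uri /ssp-report\n"

def app(environ, start_response):
    if environ["PATH_INFO"] == "/site-policy.txt":
        start_response("200 OK", [
            ("Content-Type", "text/plain"),
            # With a long cache lifetime the browser only re-fetches the
            # policy occasionally, instead of receiving it on every response.
            ("Cache-Control", "max-age=86400"),
        ])
        return [POLICY_FILE]
    # Regular pages no longer need to carry the policy themselves.
    start_response("200 OK", [("Content-Type", "text/html; charset=utf-8")])
    return [b"<html><body>Covered by the domain-wide policy.</body></html>"]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```

Either way the whitelist is the same; the file approach just moves it out of the individual responses and covers every page on the domain at once.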

And last but not least, browser vendors would need to implement it. Sterne works at Mozilla, so that's a good sign already. Personally, I can't wait.
