Robots.txt file to allow only one folder

I have only one folder (/blog/) on my website that I want search engines to index. To restrict every folder except /blog/, I added the following rules to a text file named robots.txt and put it in the root folder of my website.


User-agent: *
Disallow: /
Allow: /blog/
Allow: /index.php
Allow: /favicon.ico
Allow: /robots.txt
Allow: /sitemap.xml
Allow: /sitemap.xml.gz

I tested it with Google Webmaster Tools and it seems to work. Easy as that!
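
This works because Google (and most other major crawlers) honors the non-standard Allow directive, and Google picks the most specific rule, i.e. the one with the longest matching path, so Allow: /blog/ overrides the blanket Disallow: /. If you want to sanity-check the rules before uploading the file, below is a minimal Python sketch that mimics that longest-match behavior. It is only a rough approximation of how a crawler evaluates robots.txt, not a real crawler: wildcards (*) and end-of-URL anchors ($) are ignored, and the sample paths are just placeholders.


# Minimal sketch: check paths against the robots.txt rules above using
# Google's documented longest-match precedence (longest matching path wins,
# Allow wins a tie). Simplified: no wildcard or $ support.

RULES_TXT = """\
User-agent: *
Disallow: /
Allow: /blog/
Allow: /index.php
Allow: /favicon.ico
Allow: /robots.txt
Allow: /sitemap.xml
Allow: /sitemap.xml.gz
"""

def parse_rules(text):
    """Collect (path, allowed) pairs from the 'User-agent: *' group."""
    rules, in_group = [], False
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()
        if not line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            in_group = (value == "*")
        elif in_group and field in ("allow", "disallow") and value:
            rules.append((value, field == "allow"))
    return rules

def is_allowed(path, rules):
    """Longest matching path wins; if nothing matches, the path is allowed."""
    matches = [(p, allowed) for p, allowed in rules if path.startswith(p)]
    if not matches:
        return True
    # Longest path first; on a tie, Allow (True) outranks Disallow (False).
    matches.sort(key=lambda m: (len(m[0]), m[1]), reverse=True)
    return matches[0][1]

rules = parse_rules(RULES_TXT)
for path in ("/blog/some-post", "/blog/", "/private/page.html", "/index.php"):
    print(path, "->", "allowed" if is_allowed(path, rules) else "blocked")


I rolled my own check here because Python's built-in urllib.robotparser evaluates rules in the order they appear in the file (first match wins), so with Disallow: / listed first it would report /blog/ as blocked even though Google allows it.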