Add robots.txt for web robots

Allow all robots to access any part of the site. This may change in the
future, as the HTML snippets generated for posts and pages should not be
directly indexed.

See: http://www.robotstxt.org/robotstxt.html

Signed-off-by: Collin J. Doering <collin.doering@rekahsoft.ca>
Author: Collin J. Doering
Date:   2015-08-05 02:42:12 -04:00
Parent: b41c58246f
Commit: 92acb5344e
2 changed files with 3 additions and 1 deletion

robots.txt (new file, 2 additions)

@@ -0,0 +1,2 @@
+User-agent: *
+Allow: /
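
The commit message notes that the generated HTML snippets for posts and pages should eventually be kept out of search indexes. A sketch of what that future robots.txt might look like is shown below; the /posts/snippets/ path is a hypothetical placeholder, not an actual path used by this site:

# Hypothetical future variant (sketch only): keep the site crawlable but
# stop robots from fetching the generated HTML snippets. The path below
# is an assumed placeholder, not a real path from this repository.
User-agent: *
Disallow: /posts/snippets/

For now, though, everything remains crawlable; excluding the snippets is left as a follow-up change.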

Hakyll site builder (Haskell source, 1 addition and 1 deletion)

@@ -105,7 +105,7 @@ main = do
   hakyllWith myConfig $ do
     -- All Versions ------------------------------------------------------------------------------------------
-    match ("action/**" .||. "files/**" .||. "images/**" .||. "fonts/**") $ do
+    match ("action/**" .||. "files/**" .||. "images/**" .||. "fonts/**" .||. "robots.txt") $ do
       route idRoute
       compile copyFileCompiler