Add robots.txt for web robots

Allow all robots to access any part of the site. This may change in the
future, as the HTML snippets that are generated for the posts and pages
should not be directly indexed.

See: http://www.robotstxt.org/robotstxt.html

Signed-off-by: Collin J. Doering <collin.doering@rekahsoft.ca>
This commit is contained in:
Collin J. Doering 2015-08-05 02:42:12 -04:00
parent b41c58246f
commit 92acb5344e
2 changed files with 3 additions and 1 deletion

robots.txt (new file, 2 additions)

@@ -0,0 +1,2 @@
User-agent: *
Allow: /

@@ -105,7 +105,7 @@ main = do
   hakyllWith myConfig $ do
     -- All Versions ------------------------------------------------------------------------------------------
-    match ("action/**" .||. "files/**" .||. "images/**" .||. "fonts/**") $ do
+    match ("action/**" .||. "files/**" .||. "images/**" .||. "fonts/**" .||. "robots.txt") $ do
       route idRoute
       compile copyFileCompiler
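
The commit message anticipates later excluding the generated HTML snippets from indexing. A future revision of robots.txt could sketch that as follows; the `/snippets/` path is a hypothetical placeholder, not a path from this repository:

```
User-agent: *
Disallow: /snippets/
Allow: /
```

Note that per robotstxt.org only `Disallow` is part of the original robots.txt convention; `Allow` is a widely supported extension, honored by major crawlers such as Googlebot.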