Add robots.txt for web robots

Allow all robots to access any part of the site. This may change in the
future, as the HTML snippets generated for the posts and pages should not
be indexed directly.

See: http://www.robotstxt.org/robotstxt.html
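
For illustration only, a more restrictive version might later look like the
following; the "/snippets/" path is a placeholder, not the actual location
of the generated snippets on this site:

    User-agent: *
    # Hypothetical directory holding the generated post/page snippets
    Disallow: /snippets/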

Signed-off-by: Collin J. Doering <collin.doering@rekahsoft.ca>
Collin J. Doering 2015-08-05 02:42:12 -04:00
parent b41c58246f
commit 92acb5344e
2 changed files with 3 additions and 1 deletion

robots.txt (new file, +2)

@@ -0,0 +1,2 @@
+User-agent: *
+Allow: /


@@ -105,7 +105,7 @@ main = do
   hakyllWith myConfig $ do
     -- All Versions ------------------------------------------------------------------------------------------
-    match ("action/**" .||. "files/**" .||. "images/**" .||. "fonts/**") $ do
+    match ("action/**" .||. "files/**" .||. "images/**" .||. "fonts/**" .||. "robots.txt") $ do
       route idRoute
       compile copyFileCompiler