Add robots.txt for web robots
Allow all robots to access any part of the site. This may change in the future, as the HTML snippets that are generated for the posts and pages should not be directly indexed.

See: http://www.robotstxt.org/robotstxt.html

Signed-off-by: Collin J. Doering <collin.doering@rekahsoft.ca>
parent: b41c58246f
commit: 92acb5344e
robots.txt (new file, 2 additions)
@@ -0,0 +1,2 @@
+User-agent: *
+Allow: /
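The commit message notes this policy may tighten later so that generated HTML snippets are not indexed directly. As a sketch only (the `/snippets/` path is hypothetical; this commit does not show where the snippets are routed), a stricter future robots.txt might look like:

```
User-agent: *
Disallow: /snippets/
Allow: /
```

Per the robots exclusion convention, the most specific matching rule applies, so everything except the disallowed prefix would remain crawlable.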
@@ -105,7 +105,7 @@ main = do
   hakyllWith myConfig $ do
     -- All Versions ------------------------------------------------------------------------------------------
-    match ("action/**" .||. "files/**" .||. "images/**" .||. "fonts/**") $ do
+    match ("action/**" .||. "files/**" .||. "images/**" .||. "fonts/**" .||. "robots.txt") $ do
       route idRoute
       compile copyFileCompiler
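The hunk above extends the `match` rule so `robots.txt` is caught by the same copy-verbatim rule as the static asset directories. Hakyll's `(.||.)` combines patterns by disjunction: an identifier matches if either side matches. The following is a simplified, self-contained model of that behavior (not Hakyll's real `Pattern` implementation; `glob`, `matches`, and `copied` are names invented here for illustration):

```haskell
import Data.List (isPrefixOf)

-- A pattern is modeled as a predicate on file paths.
newtype Pattern = Pattern (FilePath -> Bool)

matches :: Pattern -> FilePath -> Bool
matches (Pattern p) = p

-- Simplified glob: "dir/**" matches anything under dir;
-- any other string matches only itself, literally.
glob :: String -> Pattern
glob g = case break (== '*') g of
  (pre, "**") -> Pattern (pre `isPrefixOf`)
  _           -> Pattern (== g)

-- Pattern disjunction, mirroring Hakyll's (.||.).
(.||.) :: Pattern -> Pattern -> Pattern
Pattern a .||. Pattern b = Pattern (\i -> a i || b i)

-- The pattern from the new version of the match rule.
copied :: Pattern
copied = glob "action/**" .||. glob "files/**" .||. glob "images/**"
    .||. glob "fonts/**" .||. glob "robots.txt"

main :: IO ()
main = do
  print (matches copied "robots.txt")      -- True: newly covered by this commit
  print (matches copied "images/logo.png") -- True: under images/**
  print (matches copied "posts/hello.md")  -- False: posts are compiled, not copied
```

In this model, before the change `robots.txt` would fall through the rule and never be copied into the generated site; adding the literal pattern via `(.||.)` is the minimal fix.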