Disallow Robots Negroni middleware

Negroni middleware that serves a robots.txt file disallowing indexing by search engines whenever you're not running in production.

Usage

package main

import (
    "fmt"
    "net/http"

    "github.com/codegangsta/negroni"
    "github.com/rabeesh/negroni-disallowrobots"
)

func main() {
    mux := http.NewServeMux()
    mux.HandleFunc("/", func(rw http.ResponseWriter, req *http.Request) {
        fmt.Fprintf(rw, "Welcome to the home page!")
    })

    // Set to true in production so search engines are allowed to index the site.
    isProduction := false

    n := negroni.New()
    n.Use(disallowrobots.New(isProduction))
    n.UseHandler(mux)
    n.Run(":5000")
}

X-Robots-Tag info is available here.
