REST API for GeoNet web site data.
Requires Go 1.2.1 or newer (for db.SetMaxOpenConns(n)).
Dependencies are included in this repo using godep vendoring. There should be no need to `go get` the dependencies separately unless you are updating them.
- Install godep (you will need Git and Mercurial installed to do this): https://github.com/tools/godep
- Prefix `go` commands with `godep`.
Run:

```
godep go build && ./geonet-rest
```

Run all tests (including those in sub directories):

```
godep go test ./...
```
- URIs should return a resource and the query parameters should be used to filter (search) for them.
- Use ISO 8601 date-times in UTC, e.g., `2013-05-30T15:15:37.812Z`.
- Use HTTP methods in routes (`GET`, `PUT`, etc.).
- Use camelCase for query and property names. Be consistent with SeisComPML or QuakeML, e.g., `publicID`, not `publicId` or `publicid`.
- The HTTP `Accept` header should be used to determine which data version and format to return (these conventions are sketched in code after this list).
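A minimal sketch of these conventions in a handler, assuming a hypothetical `/quakes` route and an illustrative `intensity` query parameter (neither is necessarily part of this API):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// Illustrative versioned media type; the real constant lives in this repo's code.
const v1GeoJSON = "application/vnd.geo+json;version=1"

// quakesHandler returns quake resources; query parameters filter them,
// e.g. GET /quakes?intensity=weak
func quakesHandler(w http.ResponseWriter, r *http.Request) {
	// The Accept header selects the data version and format.
	if r.Header.Get("Accept") != v1GeoJSON {
		http.Error(w, "unsupported Accept header", http.StatusNotAcceptable)
		return
	}
	intensity := r.URL.Query().Get("intensity") // camelCase query parameter
	// Date-times are ISO 8601 in UTC with millisecond precision.
	modified := time.Now().UTC().Format("2006-01-02T15:04:05.000Z")
	w.Header().Set("Content-Type", v1GeoJSON)
	fmt.Fprintf(w, `{"intensity": %q, "modificationTime": %q}`, intensity, modified)
}

func main() {
	http.HandleFunc("/quakes", quakesHandler)
	http.ListenAndServe(":8080", nil)
}
```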
API documentation is generated from doc{} structs in the code. Run the application and visit http://localhost:8080/api-docs.
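The real doc{} definitions live in this repo's source; purely as a hypothetical illustration of the documentation-as-data idea (every field name and value below is invented):

```go
package main

// doc is a hypothetical sketch only; the field names are invented and do
// not match this repo's actual doc{} type.
type doc struct {
	Title       string            // endpoint name shown in the docs
	URI         string            // example request URI
	Description string            // Markdown description rendered at /api-docs
	Params      map[string]string // query parameter name -> description
}

// quakeDoc is an invented example of documenting an endpoint as data.
var quakeDoc = doc{
	Title:       "Quake",
	URI:         "/quake?publicID=2013p407387",
	Description: "Look up quake information by publicID.",
	Params:      map[string]string{"publicID": "a quake ID, e.g., 2013p407387"},
}
```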
- Make non-breaking additions as required.
- Add to the tests (see the test sketch after this list).
- Add Markdown documentation to the tests and regenerate the API docs.
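For example, a new query could be covered with a test along these lines (a sketch reusing the illustrative `quakesHandler` and `v1GeoJSON` names from the conventions sketch above; both are placeholders):

```go
package main

import (
	"net/http"
	"net/http/httptest"
	"testing"
)

// TestQuakesAccept sketches a check for a non-breaking addition: the
// route and media type are placeholders for whatever was added.
func TestQuakesAccept(t *testing.T) {
	ts := httptest.NewServer(http.HandlerFunc(quakesHandler))
	defer ts.Close()

	req, err := http.NewRequest("GET", ts.URL+"/quakes?intensity=weak", nil)
	if err != nil {
		t.Fatal(err)
	}
	req.Header.Set("Accept", v1GeoJSON)

	res, err := http.DefaultClient.Do(req)
	if err != nil {
		t.Fatal(err)
	}
	defer res.Body.Close()
	if res.StatusCode != http.StatusOK {
		t.Errorf("got status %d, want %d", res.StatusCode, http.StatusOK)
	}
}
```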
- Are you really sure you have to? Discuss widely.
- Copy the current API version code to the next API version (so as to support all queries at the new version).
- Monotonically increment the `Accept` constant, e.g., `application/vnd.geo+json;version=1` -> `application/vnd.geo+json;version=2` (see the sketch after this list).
- Change the tests.
- Update the documentation.
- Make the changes.
- Update the routes.
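A hedged sketch of what the version switch can look like (the constants and dispatch helper here are illustrative, not this repo's actual code):

```go
package main

import "net/http"

// Illustrative version constants; the real Accept constant lives in this repo.
const (
	v1GeoJSON = "application/vnd.geo+json;version=1"
	v2GeoJSON = "application/vnd.geo+json;version=2"
)

// serveQuake dispatches on the Accept header so existing v1 queries keep
// working while v2 is added; quakeV1 and quakeV2 are hypothetical handlers.
func serveQuake(quakeV1, quakeV2 http.HandlerFunc) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		switch r.Header.Get("Accept") {
		case v2GeoJSON:
			quakeV2(w, r) // copied from the v1 code, then changed
		case v1GeoJSON:
			quakeV1(w, r)
		default:
			http.Error(w, "specify a versioned Accept header", http.StatusNotAcceptable)
		}
	}
}
```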
Uses the database from the `ddl` dir. Use `./scripts/init-db.sh` to initialise the DB in a suitable postgres+postgis container.
Either or both of:
- Copy an appropriately edited version of `geonet-rest.json` to `/etc/sysconfig/geonet-rest.json`. This should include write access credentials for accessing the impact database. Refer to `docker-run.sh` for overriding from environment variables.
- Copy an appropriately edited version of `geonet-rest.json` to `/etc/sysconfig/geonet-rest.json`. This should include read-only credentials for accessing the hazard database. Properties can also be set from environment variables (a sketch of the pattern follows).
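A minimal sketch of the file-plus-environment pattern, assuming invented property and environment variable names (see `geonet-rest.json` and `docker-run.sh` for the real ones):

```go
package main

import (
	"encoding/json"
	"io/ioutil"
	"os"
)

// config is a hypothetical shape; the real properties are in geonet-rest.json.
type config struct {
	DatabaseHost string `json:"databaseHost"`
	DatabaseUser string `json:"databaseUser"`
}

// loadConfig reads /etc/sysconfig/geonet-rest.json and then lets
// environment variables override individual properties.
func loadConfig() (config, error) {
	var c config
	b, err := ioutil.ReadFile("/etc/sysconfig/geonet-rest.json")
	if err != nil {
		return c, err
	}
	if err := json.Unmarshal(b, &c); err != nil {
		return c, err
	}
	// DATABASE_HOST is an invented env var name, shown only to
	// illustrate overriding a property from the environment.
	if h := os.Getenv("DATABASE_HOST"); h != "" {
		c.DatabaseHost = h
	}
	return c, nil
}
```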
There are state of health pages available for monitoring with web probes (a sketch of the heartbeat check follows this list):
- http://.../soh - this will return a 500 error if any HeartBeat messages in the DB are old.
- http://.../soh/impact - this will return a 500 error if the measured shaking intensity messages fall below 50. Not all servers may be receiving these messages.
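A minimal sketch of the kind of check behind /soh, assuming an invented table, column, and staleness threshold:

```go
package main

import (
	"database/sql"
	"net/http"
	"time"
)

// sohHandler returns a 500 error if any HeartBeat message in the DB is
// old. The table, column, and five minute threshold are illustrative.
func sohHandler(db *sql.DB) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		var oldest time.Time
		err := db.QueryRow(`SELECT min(time_received) FROM haz.soh`).Scan(&oldest)
		if err != nil || time.Since(oldest) > 5*time.Minute {
			http.Error(w, "HeartBeat messages are old", http.StatusInternalServerError)
			return
		}
		w.Write([]byte("ok"))
	}
}
```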
Logs and metrics can be sent to Logentries and Librato Metrics respectively by setting the appropriate credentials in the config.
Fatal application errors and 4xx and 5xx requests are syslogged.
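For example, the standard library's log/syslog can write these to the local syslog daemon (a sketch; the tag and priority are assumptions):

```go
package main

import (
	"log"
	"log/syslog"
)

func main() {
	// Send application errors to the local syslog daemon.
	w, err := syslog.New(syslog.LOG_ERR, "geonet-rest")
	if err != nil {
		log.Fatal(err)
	}
	logger := log.New(w, "", 0)
	logger.Println("fatal application error: example message")
}
```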
Regions change very rarely and are served with a long surrogate cache time. If the regions are changed they will need to be purged from the CDN.
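For a Fastly-style CDN this amounts to setting a surrogate cache control header on region responses (a sketch; the header value and response body are assumptions):

```go
package main

import "net/http"

// regionsHandler serves region data with a long surrogate cache time
// (value assumed); purge the CDN if the regions change.
func regionsHandler(w http.ResponseWriter, r *http.Request) {
	w.Header().Set("Surrogate-Control", "max-age=86400")
	w.Header().Set("Content-Type", "application/vnd.geo+json;version=1")
	w.Write([]byte(`{"type": "FeatureCollection", "features": []}`))
}
```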
Use this procedure to sync a new DB. It is best done without other processes writing to the DB. Should this occur, manually run the triggers afterwards for any new rows.
Dump all the quake data from a DB using:

```
pg_dump -h 127.0.0.1 -a -U hazard_w -t qrt.event -t qrt.eventhistory -t qrt.quake_materialized -f dump hazard
```
Edit the dump file and add `public` to the search path:

```
SET search_path = qrt, public, pg_catalog;
```
Drop the triggers on `qrt.event` on the target DB (as geoneradmin); cf. the ddl.
Load the dump file:

```
psql ... -f dump
```
Reinstate the triggers.