Help getting the data for this site

This site offers a number of ways of using its data:

Daily PostgreSQL Database Dump

Every night we use pg_dump to create a dump of the site's database, excluding anything that might be personal data of the site's users.

There's one dump file for the schema, and one for the data:

To import that data, create a new PostGIS-enabled database. Then you can import the site's data with:

gunzip -c pg-dump_schema.sql.gz pg-dump_data.sql.gz | psql pombola
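Creating that database might look like the following sketch (the database name pombola is taken from the psql command above; this assumes PostgreSQL and the PostGIS extension are installed locally):

```shell
# Create an empty database and enable the PostGIS extension in it,
# so the schema dump's geometry types can be restored:
createdb pombola
psql -d pombola -c 'CREATE EXTENSION postgis;'
```

After this, the gunzip pipeline above will restore the schema first and then the data into that database.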

Daily JSON Data Dumps

If you prefer to work with an export of the site's core data in JSON format, you can use one of the nightly data dumps that we automatically generate. All of the files below are different ways of presenting the JSON serialization of the Popolo data model. The following sections should explain which one would be most suitable for you:

Popolo formatted JSON

This data is a single JSON object with 'persons' and 'organizations' keys at the top level, with membership information included in each person object:
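As a rough illustration of that structure, here is a hypothetical miniature file in the same shape (field names follow Popolo conventions; the sample values and the file name sample.json are made up), together with a short script that joins each person to their organizations via the embedded memberships:

```shell
# A tiny stand-in for the real dump, with the same top-level shape:
cat > sample.json <<'EOF'
{
  "persons": [
    {"id": "p1", "name": "Jane Doe",
     "memberships": [{"organization_id": "o1", "role": "Member"}]}
  ],
  "organizations": [
    {"id": "o1", "name": "National Assembly"}
  ]
}
EOF

# Resolve each person's memberships against the organizations list:
python3 - <<'EOF'
import json

with open("sample.json") as f:
    data = json.load(f)

orgs = {o["id"]: o["name"] for o in data["organizations"]}
for person in data["persons"]:
    for membership in person.get("memberships", []):
        # prints: Jane Doe - National Assembly
        print(person["name"], "-", orgs[membership["organization_id"]])
EOF
```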

JSON suitable for mongoimport

For large data imports, MongoDB's mongoimport command should be provided with newline-separated JSON objects. You can download the data in this format from these links:
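An import might look like this sketch (the file name and the pombola database and persons collection names are assumptions; adjust them to the dump you downloaded and a running MongoDB instance):

```shell
# Decompress the dump, then import its newline-separated JSON objects,
# one document per line:
gunzip pombola-persons-mongo.json.gz
mongoimport --db pombola --collection persons \
    --file pombola-persons-mongo.json
```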

Person, Organization and Membership data as JSON arrays

This data is intended for use with mongoimport's --jsonArray option, although, depending on your version of the software, the files may be too large to import that way.
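For these files the invocation differs only in the extra flag (again, file and database names here are assumptions):

```shell
# --jsonArray tells mongoimport the file is one JSON array
# rather than newline-separated objects:
mongoimport --db pombola --collection persons --jsonArray \
    --file pombola-persons-array.json
```

If the import fails because the array file exceeds mongoimport's size limit, use the newline-separated dumps described above instead.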


The data from these database dumps or the API may be used under the Creative Commons Attribution-ShareAlike license (CC-BY-SA).


We try to make sure that the information on this site is accurate, but inevitably it may be incomplete or inaccurate; we make no guarantee about the accuracy of the data.