Premium guide
This documentation explains how to use the premium datasets that you can buy on the premium site. The first section describes the format of the CSV files, with an example of how to import them into PostgreSQL; the second explains how to inject a Gisgraphy SQL dump. You can also download the PDF version for offline use.
Before buying, you may want to check how good the coverage is for a country: select the country on the premium site to see how many records we have for each dataset. If these formats are not what you are looking for and you need a custom extract, contact us.
The rows in the CSV files contain fields that reference the ids of rows in other CSV files, just as an SQL table can have a foreign key that references a column (primary key) in another table. For instance, the street CSV file has a 'cityid' field that references the id field of the city CSV file.
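To illustrate how those id fields link the files together (the file names, columns, and values below are invented for the example, not the real dataset layout), a street row can be resolved to its city with standard Unix tools:

```shell
# Two tiny sample TSV files.
# city.tsv   columns: id, name
# street.tsv columns: id, name, cityid (references city.id)
printf '1\tParis\n2\tLyon\n' > city.tsv
printf '10\tRue de Rivoli\t1\n11\tCours Gambetta\t2\n' > street.tsv

# Join field 3 of street.tsv (cityid) against field 1 of city.tsv (id).
# Each output line: cityid, street id, street name, city name.
join -t $'\t' -1 3 -2 1 \
    <(sort -t $'\t' -k3,3 street.tsv) \
    <(sort -t $'\t' -k1,1 city.tsv)
```

In a database, the same lookup would simply be a JOIN on street.cityid = city.id.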
It is very important to note that those XXXID fields are optional and are sometimes not filled. So for an address, you can get the street with the streetid field, which refers to the id field in the street file; from the street row, you can get the city with the cityid field, which refers to the id field in the city file; and then get the adm with the admid field, which refers to the id field in the ADM file.

First, you need to create the tables. A SQL script that reflects the format of the CSV files is provided. As described in the Format section above, the files have fields that reference ids in other files. If you want to inject the files into a database and use the foreign key mechanism, you have to inject the files in this order:
Of course, you don't have to buy all the files; simply skip those you haven't bought. If you buy the street files, you will get a field called 'city' that gives the city the street belongs to, but you won't have more information on the city. If you buy the city files, you will also get alternate names, shape, population, admin center, location, etc. (when available; all fields are optional).

The import of data into Gisgraphy is a very long process. If you don't want to import the data yourself, you can buy a dump of the data that you can inject directly into the PostgreSQL database. Once the dump is downloaded (the link is provided in your dashboard and in the mail received when purchasing), follow the steps below.
If it is not already done, you have to download and install Gisgraphy. The latest version of Gisgraphy is available at https://download.gisgraphy.com/releases/gisgraphy-latest.zip. The installation guide will help you set it up.
These steps are independent of the installation you have chosen: they can be run on Linux, Windows, or Mac, in a Docker container or not. Go to the URL given in the mail to download the dump. Note that you have 30 days to download the files.
For those who want to go quickly, we provide (as is and without any warranty of any kind) a script to inject the dump. To use it, run the following commands (replace the words in uppercase according to your configuration):
To use the dump and inject the data in a Docker image:
Put the dump file into the 'assets/dump' directory:
Build an image with the dump from a Gisgraphy Docker image (replace 'mdppostgres' and 'gisgraphyofficial' according to your configuration). Note that you can also create volumes for the PostgreSQL data dir (/var/lib/postgresql/9.5/main) and the Solr data dir (/usr/local/gisgraphy/solr/data). See https://docs.docker.com/engine/admin/volumes/volumes/ for more info.
For those who want to be guided step by step, follow these instructions:
If you get a 'duplicate entry' error message on the User or Profile table, simply ignore it: it occurs because you already have some users in the database.

This section is optional and can be skipped if no full-text engine (Solr) dump is provided. You only need the full-text engine if you use the geocoding, full-text, or autocompletion web services. If you only use the reverse geocoding, street search, and nearby web services, you can skip this section; it will save disk space and has no impact on performance.

That's all! (Re)start Gisgraphy with the launch.sh script and use it. If you have problems, you can use the contact page or send us a mail.

CSV Dataset
File formats
Example of use
The CSV / TSV datasets are a structured format that can be used in many ways: in Excel, in a database, or with anything that can process a text file or read a TSV file. This section only shows one common use, injecting the files into a PostgreSQL database (the same tutorial could be written for MySQL or Oracle).
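As a rough sketch of such an import (the table and column names below are simplified placeholders, not the exact Gisgraphy schema, which is given by the provided SQL script), TSV files can be loaded into PostgreSQL with psql's `\copy`:

```sql
-- Simplified placeholder schema for illustration only.
CREATE TABLE city   (id bigint PRIMARY KEY, name text);
CREATE TABLE street (id bigint PRIMARY KEY, name text,
                     cityid bigint REFERENCES city (id));

-- With foreign keys in place, referenced tables must be loaded first:
\copy city   FROM 'city.tsv'   WITH (FORMAT text)  -- FORMAT text is tab-delimited
\copy street FROM 'street.tsv' WITH (FORMAT text)
```

`\copy` reads the file from the client side; the server-side `COPY` command works the same way if the files are on the database host.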
Gisgraphy SQL dumps
You can also use the SQL dump with your own software; the SQL schema is available here.

1-Install Gisgraphy
Once done, follow the steps below. All these steps have to be done while Gisgraphy is shut down.
2-Inject the dump
2a-Without Docker
#clone the Github repository
git clone https://github.com/gisgraphy/gisgraphy-docker.git
cd gisgraphy-docker/dump
#put the dump file into the assets/dump/ directory
mkdir -p ./assets/dump
cp /PATH/TO/DUMP_TAR_FILE ./assets/dump/
#run the script
./inject-wo-docker.sh YOUR_POSTGRESQL_PASSWORD /PATH/TO/GISGRAPHY/DIRECTORY/
2b-With Docker
#clone the Github repository
git clone https://github.com/gisgraphy/gisgraphy-docker.git
cd gisgraphy-docker/dump
mkdir -p ./assets/dump
cp /PATH/TO/DUMP_TAR_FILE ./assets/dump/
docker build -t gisgraphydump --build-arg PGPASSWORD=mdppostgres --build-arg BASE_IMAGE=gisgraphyofficial --build-arg SOLR_DIR=/usr/local/gisgraphy/solr/ .
2c-Step by step
Inject the SQL dump file
pg_restore -h 127.0.0.1 -p 5432 -U postgres -Fc -O -v -d gisgraphy /PATH/TO/FILES/dump_localhost.gz
psql -U postgres -h 127.0.0.1 -d gisgraphy -f ./sql/createGISTIndex.sql
Setup the full-text engine (optional)
3-Support