Slow indexing of 300 GB PostGIS table

I am loading about 300 GB of contour line data into a PostGIS table. To speed things up, I read that it is fastest to load the data first and then create the index. Loading the data took about 2 days, but I have now been waiting about 30 days for the index and it is still not finished.

Query:

create index idx_contour_geom on contour.contour using gist(geom);

I ran it in pgAdmin 4, and since then the program's memory consumption has ranged from 500 MB to over 100 GB.

Is it normal for indexing a database like this to take so long?

Any tips on how to speed up the process?

Edit: The data was loaded in 1 × 1 degree (lat/lon) cells (about 30,000 cells), so no line has a bounding box larger than 1 × 1 degree, and most should be much smaller. The data is in the EPSG:4326 projection, and the only attributes are elevation and the geometry (geom).
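In case it helps others hitting the same wait, here is a minimal sketch for checking whether the build is actually making progress. It assumes PostgreSQL 12 or newer, which added the pg_stat_progress_create_index view; older versions do not have it:

-- show the current phase and rough completion of any running CREATE INDEX
select phase,
       round(100.0 * blocks_done / nullif(blocks_total, 0), 1) as pct_blocks,
       round(100.0 * tuples_done / nullif(tuples_total, 0), 1) as pct_tuples
from pg_stat_progress_create_index;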

1 answer


I changed maintenance_work_mem to 1 GB and stopped all other writes to the database (many INSERTs with ANALYZE runs were going on, which consumed a lot of resources). After that, the index built in 23 minutes.
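For reference, a minimal sketch of the sequence that worked, assuming the same table and index names as in the question (1 GB is the value from this answer; tune it to your available RAM):

-- raise the memory available to index builds for this session only
set maintenance_work_mem = '1GB';

-- then build the gist index from the question
create index idx_contour_geom on contour.contour using gist(geom);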


