The webmap currently doesn't go beyond zoom level 7 (8 m/px), so it
makes little sense to create tiles for higher zoom levels. Capping the
zoom also speeds things up and saves CPU time and disk space. For group
‘ren’ on the desktop (GPKG destination dataset), before (max-zoom=9):

    INFO: Exported 4488 features to 4 MVT layers in 00:06:02.926
    INFO: Tile count: 75972 [min=33 B, max=128.16 kiB, sum=15.10 MiB, avg=208 B]

vs. after (max-zoom=7):

    INFO: Exported 4488 features to 4 MVT layers in 00:00:25.548
    INFO: Tile count: 5031 [min=35 B, max=128.16 kiB, sum=4.80 MiB, avg=0.98 kiB]
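As a sanity check on those numbers (a back-of-the-envelope sketch, not
project code, assuming the usual halving of resolution per zoom level
anchored at the 8 m/px figure for zoom 7 quoted above):

    # Resolution halves at every zoom level; zoom 7 is 8 m/px per the above.
    for z in range(10):
        print(f'zoom {z}: {8 * 2 ** (7 - z):g} m/px')
    # Each extra zoom level roughly quadruples the number of tiles covering
    # the extent, consistent with the ~15x drop in tile count observed when
    # going from max-zoom=9 back to max-zoom=7.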
Thanks to Directive (EU) 2019/1024 of the European Parliament and of the
Council of 20 June 2019 on open data and the re-use of public sector
information, and to Commission Implementing Regulation (EU) 2023/138 of
21 December 2022, the mineral register is now available under the terms
of the CC0 1.0 Universal licence, see
https://www.sgu.se/produkter-och-tjanster/geologiska-data/malmer-och-mineral--geologiska-data/mineralrattigheter-och-prospektering/
Since we no longer need to parse SVG images from the webmap, we drop
webmap-download-mrr.py and add layers for expired and forbidden permits
(ut_metaller_industrimineral_forfallna, ut_olja_gas_diamant_forfallna,
bearbetningskoncessioner_forfallna, ut_metaller_industrimineral_forbud,
ut_diamant_forbud) as well as markanvisningar_bk_ansokta.
Unfortunately the GeoPackage file doesn't include peat concessions, so
we drop them from config.yml for now. Should they become of interest,
we can always restore webmap-download-mrr.py and/or order the register
from Bergsstaten, cf. https://resource.sgu.se/bergsstaten/exporter-ur-mrr-info.pdf .
In a future commit we'll fingerprint layers to detect changes.
Comparing modification times is not enough, since some sources (for
instance Naturvårdsverket's SCI_Rikstackande) are updated on the server
even though no objects are added; the source layer remains unchanged,
but the file differs because of OBJECTID changes we are not interested
in.
Rather than adding another cache layer/table for fingerprints, we key
the cache on destination layer names instead of (source_path,
archive_member, layername) triplets, and store the time at which the
import was started instead of source_path's mtime.
There is indeed no value in keeping source_path's exact mtime in the
cache: all we need is a way to detect whether source paths have been
updated in a subsequent run. Thanks to the shared locks, the ctime of
any updated source path will be at least the time at which the locks
were released, thereby exceeding the last_updated value.
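A minimal sketch of that staleness test (the helper name and cache shape
are hypothetical, not the actual implementation):

    import os
    from datetime import datetime, timezone

    def sources_updated_since(source_paths, last_updated):
        """Return True if any source path changed after the cached import
        start time.  Per the ctime argument above, an update racing with
        the previous run is still detected on the next one."""
        for path in source_paths:
            ctime = datetime.fromtimestamp(os.stat(path).st_ctime,
                                           tz=timezone.utc)
            if ctime > last_updated:
                return True
        return False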
files.
modification time.
That way we can avoid the expensive unpack+import when the source
file(s) have not been updated since the last run. The check can be
bypassed with a new flag `--force`.
We use a sequence for the FIDs (primary key) and a UNIQUE constraint on
(source_path, archive_member, layername) triplets, as GDAL doesn't
support multi-column primary keys.
To avoid races between the stat(2) calls, gdal.OpenEx() and updates via
`webmap-download` runs, we place a shared lock on the downloaded files.
One could resort to some tricks to eliminate the race between the first
two, but there is also value in having consistency during the entire
execution of the script (a single source file can be used by multiple
layers, for instance, and it makes sense to use the very same file for
all of them in that case).
We also intersperse dso.FlushCache() calls between _importSource() calls
in order to force the PG driver to call EndCopy(), so that errors are
detected and a rollback is triggered when _importSource() fails.
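The locking could look roughly like this (a sketch; it assumes
`webmap-download` takes an exclusive lock while writing, and the real
script's details may differ):

    import fcntl
    from contextlib import ExitStack

    def open_source_locked(path, stack: ExitStack):
        """Open a downloaded file and take a shared lock on it, so that
        the stat(2)/gdal.OpenEx() sequence can't race with a concurrent
        `webmap-download` run.  The lock is held until the ExitStack
        unwinds, i.e. for the entire execution of the script."""
        f = stack.enter_context(open(path, 'rb'))
        fcntl.flock(f.fileno(), fcntl.LOCK_SH)  # blocks while a download is in progress
        return f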
This avoids duplication when the same source file is used multiple
times (either by the same layer or by multiple layers). This change
breaks webmap-import, but that one will be refactored shortly.
It also breaks webmap-import-mrr.py, which is no longer used since
mineralrattigheter.zip can be downloaded directly from SGU's site.
It appears it is no longer stripped in Bergsstaten's mineralregister.
Apparently there are duplicates for dnr BS 22-28-2000
(bearbetningskoncessioner_beviljade) and BS 23-126-2007,
BS 23-116-2005, BS 23-226-1930, BS 23-149-1960, BS 23-7-2000,
and BS 23-105-2010 (markanvisningar_bk_beviljade).
As of today SK 117-2024 has no contract date set, so the constraint
fails during import.
https://www.skogsstyrelsen.se/skogens-parlor/NVAvtal/?objektid=4020527
Unfortunately the dam register is way too noisy, and it looks like
there is no way to download a pre-curated dam registry with only
noteworthy dams used for production (SvK's dammar_pf.shp comes close
but still misses some), so we manually remove the ones that are more
than 2 km from a production site or a power station.
Own work.
Cf. https://skyddadnatur.naturvardsverket.se/ . We exclude
Nationalstadspark (MB 4 kap 7 §) since the only entry, Kungliga
nationalstadsparken (Ulriksdal-Haga-Brunnsviken-Djurgården) in
Stockholm, lies outside our extent.
Looks like a FID sequence, so not really useful for us…
From Sametinget via Länsstyrelsen.
The former have been deprecated per the following note on
https://www.skogsstyrelsen.se/sjalvservice/karttjanster/geodatatjanster/nerladdning-av-geodata/

    Reminder about vector data
    --------------------------
    As of 1 March, vector data is available in GeoPackage format. The
    shapefiles will be removed on 16 September. For most datasets we
    will remove the county-level layers and keep only the nationwide
    layers.

These GeoPackage layers have geometry type POLYGON but appear to contain
MULTIPOLYGONs as well, so fortunately there is no data loss due to
geometry splitting.
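The geometry-type mix is easy to double-check with a quick scan (a
sketch; the file name is a placeholder):

    from osgeo import ogr
    ogr.UseExceptions()

    ds = ogr.Open('UtfordAvverk.gpkg')  # placeholder file name
    lyr = ds.GetLayer(0)
    print('declared:', ogr.GeometryTypeToName(lyr.GetGeomType()))
    counts = {}
    for feat in lyr:
        name = ogr.GeometryTypeToName(feat.GetGeometryRef().GetGeometryType())
        counts[name] = counts.get(name, 0) + 1
    print('actual:', counts)  # expect both Polygon and Multi Polygon here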
Only the land-based ones for now. Source:
https://www.energimyndigheten.se/energisystem-och-analys/elproduktion/vindkraft/vindbrukskollen/
Layers:
- https://ext-geodatakatalog.lansstyrelsen.se/GeodataKatalogen/srv/api/records/GetMetaDataById?id=ed5814b2-08bf-493a-a164-7819e1b590d6
LST Vindbrukskollen landbaserade vindkraftverk
- https://ext-geodatakatalog.lansstyrelsen.se/GeodataKatalogen/srv/api/records/GetMetaDataById?id=c816bd1e-bc6c-487f-a962-770f05f677b6
LST Vindbrukskollen landbaserade projekteringsområden
- https://ext-geodatakatalog.lansstyrelsen.se/GeodataKatalogen/srv/api/records/GetMetaDataById?id=c290bc31-1af8-497e-a9a5-87fcec55d0ce
LST Vindbrukskollen havsbaserad vindkraft
Webmaps:
- https://vbk.lansstyrelsen.se/
- https://ext-geodatakatalog.lansstyrelsen.se/GeodataKatalogen/srv/swe/catalog.search#/map
The OGRFieldDefn GetTZFlag() / SetTZFlag() methods were added in OGR
3.8.0, cf. https://github.com/OSGeo/gdal/blob/master/NEWS.md#core-3 .
Don't comment out TZ on field definitions. Instead we check the
GDAL/OGR version and ignore TZ on field definitions if the OGR version
is too old.
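The version gate presumably boils down to something like the following
sketch (field name made up; the same pattern, with 3070000 and
Get/SetComment(), applies to the commit below):

    from osgeo import gdal, ogr

    # VersionInfo('VERSION_NUM') yields e.g. '3080000' for GDAL 3.8.0.
    HAS_TZFLAG = int(gdal.VersionInfo('VERSION_NUM')) >= 3080000

    defn = ogr.FieldDefn('mtime', ogr.OFTDateTime)
    if HAS_TZFLAG:
        # On pre-3.8 OGR the TZ setting from the config is simply dropped.
        defn.SetTZFlag(ogr.TZFLAG_UTC)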
The OGRFieldDefn GetComment() / SetComment() methods were added in OGR
3.7.0, cf. https://github.com/OSGeo/gdal/blob/master/NEWS.md#core-5 .
Don't comment out comments on field definitions. Instead we check the
GDAL/OGR version and ignore comments on field definitions if the OGR
version is too old.
This is useful to replace a YYYYMMDD-formatted date with YYYY-MM-DD.
The target field can then be set to not-nullable and its type set to
Date, as OGR_F_SetField*() will take care of the conversion.
We could also do that via an SQL query, but in our case the sources are
not proper RDBMSes, so SQL is emulated anyway.
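For instance (a sketch; the field name is made up):

    def reformat_date(src_feature, dst_feature, name='Beslutdatum'):
        # Reformat YYYYMMDD to YYYY-MM-DD before assignment; since the
        # target field has type Date, OGR_F_SetField*() parses the ISO
        # 8601 string for us.
        v = src_feature.GetFieldAsString(name)
        if v:  # leave NULLs alone
            dst_feature.SetField(name, f'{v[:4]}-{v[4:6]}-{v[6:8]}')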
There are a couple (15) of NULL AnmaldHa values, all from Sep 1, 2005
in Borås, so we don't care about these.
And set them to NULL.
Not sure why I thought there were duplicates, but I must have done
something wrong as there are none in the 667034+223463+214082+72656 =
1177235 features found after removing the spatial filter.
(Commented out for now since Bookworm has only GDAL v3.6.)
Tighten column widths and replace fixed-width numerics with
single-precision floats.
Also, align column names with the documented ones.
The extent is expressed in config['SRS'] in traditional GIS order
(easting/northing, i.e. minX, minY, maxX, maxY), but the destination
layers might be pre-existing and use other SRSes or axis-mapping
strategies.
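A sketch of mapping the configured extent onto a destination layer's
SRS (assumes GDAL >= 3.4 for TransformBounds(); function name made up):

    from osgeo import osr

    def extent_in_layer_srs(extent, src_srs, dst_layer):
        # Pin both sides to traditional GIS (easting/northing) axis order
        # so the minX, minY, maxX, maxY reading of the configured extent
        # holds regardless of the authority's axis definition.
        src = src_srs.Clone()
        src.SetAxisMappingStrategy(osr.OAMS_TRADITIONAL_GIS_ORDER)
        dst = dst_layer.GetSpatialRef().Clone()
        dst.SetAxisMappingStrategy(osr.OAMS_TRADITIONAL_GIS_ORDER)
        ct = osr.CoordinateTransformation(src, dst)
        return ct.TransformBounds(*extent, 21)  # densify edges with 21 points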
(Commented out in config.yml for now since Bookworm has only v3.6.)
We're TRUNCATE'ing the output layers (tables) at every run and aren't
resetting the sequences, so they would soon overflow 32-bit FIDs
("sks:UtfordAvverk" alone has almost 300k features within the extent…).
There are still a few things to do (such as reprojection and geometry
changes) but it's mostly working.
We roll our own ogr2ogr/GDALVectorTranslate()-like function because
GDALVectorTranslate() insists on calling StartTransaction()
(https://github.com/OSGeo/gdal/issues/3403) while we want a single
transaction for the entire destination layer, including truncation,
source imports, and metadata changes.
Surprisingly our version is not much slower than the C++ one. Importing
the 157446 (of 667034) features from sksUtfordAvverk-2000-2015.shp takes
14.3s, while

    ogr2ogr -f PostgreSQL \
        -doo ACTIVE_SCHEMA=postgis \
        --config PG_USE_COPY YES \
        --config OGR_TRUNCATE YES \
        -append \
        -fieldmap "0,-1,-1,-1,-1,1,2,3,4,5,6,7,8,9,10,11,12,13" \
        -nlt MULTIPOLYGON -nlt PROMOTE_TO_MULTI \
        -gt unlimited \
        -spat 110720 6927136 1159296 7975712 \
        -nln "sks:UtfordAvverk" \
        PG:"dbname='webmap' user='webmap_import'" \
        /tmp/x/sksUtfordAvverk-2000-2015.shp \
        sksUtfordAvverk-2000-2015

takes 14s.
Merely opening /tmp/x/sksUtfordAvverk-2000-2015.shp and looping through
its (extent-filtered) features results in a runtime of 4.3s.
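Stripped to its essentials, the home-grown translate loop looks
something like this (a simplified sketch: SetFrom() stands in for the
explicit field map, and names are hardcoded):

    from osgeo import gdal, ogr
    gdal.UseExceptions()

    src = gdal.OpenEx('/tmp/x/sksUtfordAvverk-2000-2015.shp', gdal.OF_VECTOR)
    src_lyr = src.GetLayer(0)
    src_lyr.SetSpatialFilterRect(110720, 6927136, 1159296, 7975712)

    dst = gdal.OpenEx("PG:dbname='webmap' user='webmap_import'",
                      gdal.OF_VECTOR | gdal.OF_UPDATE)
    dst_lyr = dst.GetLayerByName('sks:UtfordAvverk')

    dst.StartTransaction()  # one transaction: truncation + imports + metadata
    dst.ExecuteSQL('TRUNCATE TABLE postgis."sks:UtfordAvverk"')
    defn = dst_lyr.GetLayerDefn()
    for f in src_lyr:
        out = ogr.Feature(defn)
        out.SetFrom(f)  # the real code applies the explicit field map
        out.SetGeometry(ogr.ForceToMultiPolygon(f.GetGeometryRef().Clone()))
        dst_lyr.CreateFeature(out)
    dst.FlushCache()  # force EndCopy() so COPY errors surface before commit
    dst.CommitTransaction()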
We'll need that for layer creation (description, fields, creation
options, etc.).
Unfortunately SGU/Bergsstaten doesn't offer layer files for download,
but it has an online webmap (WMS) at
https://apps.sgu.se/kartvisare/kartvisare-mineralrattigheter.html
so we add a dedicated module to probe and fetch features from it.
We double-checked that the resulting combination of GeoJSON files does
not result in data loss compared to the previous (private) script:

    sort_features() {
        jq -S '.features |= sort_by(.properties.Name, .properties.Layer, .properties.Area)';
    }
    diff -u --color=auto \
        <(ogr2ogr -f GeoJSON -lco COORDINATE_PRECISION=2 /vsistdout/ \
              $dir/Mineralrättigheter.gpkg \
              SE.GOV.SGU.MRR.BEARBETNINGSKONCESSIONER_APPLIED_VY \
          | jq '.name = "MRR:" + .name' | sort_features) \
        <(ogr2ogr -f GeoJSON -lco COORDINATE_PRECISION=2 \
              -nlt MULTIPOLYGON -nlt PROMOTE_TO_MULTI \
              /vsistdout/ mrr/bearbetningskoncessioner_applied.geojson \
          | sort_features)

(and similarly for the other layers).