* Set `MAX_ZOOM` to 7 by default.
* Remove `QUICKSTART_MIN/MAX_ZOOM` -- the two extra env vars added unneeded complexity; `MIN_ZOOM` and `MAX_ZOOM` are sufficient. See also #261. (A configuration example follows this list.)
* Generate the dc-config YAML file with a new `make generate-dc-config` step. It computes the BBOX from the downloaded data file. This step is not needed for planet generation.
* Generate the Imposm replication file only when `DIFF_MODE` is `true`; it is not needed otherwise. If the data source does not support replication, an error is raised.
Closes #904
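For illustration, a minimal sketch of the resulting configuration. The variable and target names come from this changelog; the values and the `area=monaco` argument are purely illustrative:

```bash
# .env excerpt (illustrative values)
MIN_ZOOM=0
MAX_ZOOM=7        # new default; raise it for more detailed tiles
DIFF_MODE=false   # set to true to also generate the Imposm replication file

# Compute the BBOX from the downloaded data file and write the dc-config YAML
# (not needed when generating the whole planet)
make generate-dc-config area=monaco
```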
* Make all data-related targets, such as `download*`, `import-osm`, `import-borders`, and `generate-tiles`, `area`-aware, making it possible for multiple data files to coexist inside the `./data` dir.
* Add a `make download area=... [url=...]` command to automatically download any kind of area by checking Geofabrik, BBBike, and OSM.fr, optionally using a custom URL. Supports `area=planet` too. (Usage examples follow this list.)
* Do not re-download an area with `make download*` if it already exists.
* Automatically rename `<area>-latest.osm.pbf` to `<area>.osm.pbf`.
* If the `area=...` parameter is not given to `make`, check whether there is exactly one `*.osm.pbf` file in `./data`, and if so, use its base name as the `area`.
* Allow many variables to be configured in the `.env` file, overriding the defaults in the tools.
* If `<area>.osm.pbf` exists but `<area>.dc-config.yml` is missing, generate it using the `download-osm make-dc` command.
Also:
* closes #614
* closes #647
* partially addresses #261
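As mentioned above, a few hypothetical invocations of the new download workflow (the area names and the URL are examples, not prescriptions):

```bash
make download area=monaco         # found by checking Geofabrik, BBBike, and OSM.fr
make download area=planet         # the full planet file is supported too
make download area=monaco url=https://example.com/extracts/monaco.osm.pbf
make import-osm area=monaco       # other data-related targets accept the same parameter
make import-osm                   # area= may be omitted if ./data has exactly one *.osm.pbf
```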
* Trims SSD drives and flushes caches before each performance test (sketch below). Unfortunately these steps are still incomplete -- real hardware machines are needed for all of them to take effect.
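A minimal sketch of such a pre-test cleanup on Linux, assuming root access; this illustrates the idea, not the exact script used:

```bash
sudo fstrim --all                            # discard unused blocks on all mounted SSDs
sync                                         # flush dirty pages to disk
echo 3 | sudo tee /proc/sys/vm/drop_caches   # drop page, dentry, and inode caches
```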
* A bit more output in the PR updater
* Adds a script to download multiple areas and compute their test parameters
* Added a large test that uses a combined 76 MB file with equatorial-guinea, liechtenstein, district-of-columbia, and greater-london (one way to build such a file is sketched below)
* Cache Wikidata downloads
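One plausible way to build the combined test file mentioned above, assuming `osmium-tool` is installed; the changelog does not say which tool was actually used:

```bash
osmium merge equatorial-guinea.osm.pbf liechtenstein.osm.pbf \
    district-of-columbia.osm.pbf greater-london.osm.pbf \
    -o combined-test.osm.pbf
```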
The `master-tools` branch is the same as the `master` branch, except that it uses `latest` from the tools repo. This allows us to quickly track whether `master` compiles correctly.
Results now show a table of how long each step took, as well as the change in PostgreSQL database size.
* use `time` to profile each step (sketch after this list)
* query PostgreSQL for the database size
* remove output escaping (forgot to remove it earlier -- it was used by the older system)
* stop early if there are no pull requests (e.g. when this is a fork)
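A sketch of how these two measurements can be taken; the container, user, and target names are illustrative, not the actual profiling harness:

```bash
# Wall-clock time (and, with GNU time's -v, peak memory) for one step
/usr/bin/time -v make import-osm area=monaco

# Database size before and after a step
docker-compose exec postgres psql -U openmaptiles -t \
    -c "SELECT pg_size_pretty(pg_database_size(current_database()));"
```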
A cron-based approach finds pull requests, possibly from forks, that have finished profiling, and posts their results as comments.
See in-depth explanation of how this works at
https://github.com/nyurik/auto_pr_comments_from_forks
* On pull request and on commit, run the base test followed by the test of the change, compare the results, and publish them to the pull request. If the pull request is updated, the resulting comment is updated as well.
* Also save `quickstart.log` as an artifact
Note that due to GitHub workflow security restrictions, it is not possible to post PR comments if the change originated from a fork. I am still looking for workarounds.
To view what would have been posted, open the `PR performance` details at the bottom of the build results, and expand the `Comment on Pull Request` step (and its subitem).
Optimizations: the process keeps two caches -- one for the test data file, and one for the results of the performance run on the "base" revision. If this or another PR has already been run for the same base revision and the same test data, the performance test runs only for the proposed changes, not for the base. (A sketch of the key derivation follows.)
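A rough sketch of how two such cache keys could be derived; the file name `tests/test-areas.txt` and the key layout are hypothetical:

```bash
# Key for the cached test data: changes only when the list of test areas changes
DATA_KEY="testdata-$(sha256sum tests/test-areas.txt | cut -c1-16)"

# Key for the cached base-revision results: merge-base pins the revision,
# DATA_KEY pins the test data it was profiled against
BASE_KEY="baseline-$(git merge-base HEAD origin/master)-${DATA_KEY}"

# On a BASE_KEY cache hit, the base profiling run is skipped and only the
# proposed changes are tested.
```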
Co-authored-by: Tomas Pohanka <TomPohys@gmail.com>