
Fix problems computing feature density for overviews creation #258

Merged
5 commits merged into master on May 27, 2016

Conversation

jgoizueta
Contributor

We still have to determine whether the fix for #257 fixes #256 too (in all the cases I've tried, the error is like the first problem: integer too large, and not like the second: casted to float8).

jgoizueta added 4 commits May 24, 2016 18:18
Fixes #257

23 = 32 bits per integer - 1 sign bit - 8 bits px/tile
* The initial iteration of the recursive feature density query shouldn't
  start beyond the maximum level
* Correct an off-by-one limit in the rest of the iterations
Note that _CDB_Feature_Density_Ref_Z_Strategy returns the first
level for which overviews should not be used, and that in some
cases _CDB_Feature_Density should look beyond the max level to
compute a feature density estimate.
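The `23 = 32 bits per integer - 1 sign bit - 8 bits px/tile` arithmetic in the commit message above can be spelled out as a minimal sketch. The assumption (standard for web mercator tiling) is that at zoom `z` there are `2**z` tiles per axis, so a tile coordinate needs `z` bits, and converting it to 256-pixel units adds 8 more bits, all of which must fit in a signed 32-bit integer:

```python
# A tile coordinate at zoom z takes z bits; multiplying by 256 px/tile
# shifts it left by 8 bits. A signed 32-bit int has 31 usable bits.
INT_BITS = 32
SIGN_BITS = 1
PX_BITS = 8  # log2(256 px per tile)

max_zoom = INT_BITS - SIGN_BITS - PX_BITS
print(max_zoom)  # 23

# Sanity check: the largest pixel coordinate at that zoom still fits.
largest_px_coord = (2**max_zoom - 1) * 256 + 255
assert largest_px_coord < 2**31
```

This is why both the overviews code and `CDB_ZoomFromScale` top out at level 23 when coordinates are plain `integer`s.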
@jgoizueta
Contributor Author

CDB_ZoomFromScale returns a maximum zoom level of 23 (for reasons similar to why we limit the max. overviews level to 23 too?).

    WHEN scaleDenominator <= 100 THEN RETURN 23;

So, if we create overviews for level 23, the base table will never be used by Mapnik (images and utfgrid).

Is this a feature or a bug?

  • Feature: If a table requires overviews for such high z, we better not try to render it ever (all points have to be very close together); overviews will make sure that rendering is fast.
  • Bug: eventually we should be able to see and get info from individual points and not aggregated data

If the bug side wins, we could make the maximum overviews level 22, or extend CDB_ZoomFromScale to return at least level 24 (but I still don't know what implications that may have).

cc @rafatower @javisantana

@javisantana
Contributor

The thing is, why don't we use a real math function? https://github.com/mapbox/postgis-vt-util/blob/master/src/Z.sql
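The linked Z.sql replaces a ladder of `WHEN scaleDenominator <= … THEN RETURN …` clauses with a logarithm of the scale denominator. A rough sketch of that idea in Python follows; the zoom-0 scale denominator `559082264.028` is the conventional web-mercator value for 256-pixel tiles and is an assumption here, not a number taken from this thread:

```python
import math

# Assumed scale denominator at zoom 0 (web mercator, 256 px tiles).
Z0_SCALE = 559082264.028

def zoom_from_scale(scale_denominator: float) -> int:
    """Closed-form zoom level: each zoom step halves the scale denominator,
    so the level is the base-2 log of the ratio to the zoom-0 scale."""
    if scale_denominator <= 0:
        raise ValueError("scale denominator must be positive")
    return max(0, round(math.log2(Z0_SCALE / scale_denominator)))

print(zoom_from_scale(Z0_SCALE))      # 0
print(zoom_from_scale(Z0_SCALE / 2))  # 1
```

Note that a formula like this happily returns values above 23, which is exactly the implication worried about below.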

If the table had x and/or y columns, they were picked up by an inner
select instead of the recursive arguments.
Fixes #256
@jgoizueta
Contributor Author

jgoizueta commented May 26, 2016

Regarding using a formula for Z: there's a PR for that: #197 (but it doesn't return zoom levels).

I'll take a look to see if it might have any implications (especially returning values greater than 23).

@jgoizueta
Contributor Author

jgoizueta commented May 26, 2016

I finally found the cause of the floating point arguments error! (#256)
It happened when the table had columns named x or y. In all cases observed these were floating point columns.

The recursive query introduces the names x and y:

    WITH RECURSIVE t(x, y, z, e) AS (

which are used (unqualified), among other places, in an inner subquery on the table:

      SELECT x*2 + xx, y*2 + yy, t.z+1, (
        SELECT count(*) FROM %1$s
          WHERE the_geom_webmercator && CDB_XYZ_Extent(t.x*2 + c.xx, t.y*2 + c.yy, t.z+1)
      )

And in this case, if the table has an x or y column, those names resolve to that column instead of to the recursive query's arguments.

Here these names must be qualified as t.x and t.y to solve the problem.
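A sketch of the corrected subquery, with the recursive columns qualified (the `%1$s` table placeholder and the surrounding recursive query are as in the snippet above; only the qualification changes):

```sql
-- Qualifying x and y as t.x / t.y makes them refer to the recursive
-- query's columns even when the user table also has columns named x or y.
SELECT t.x*2 + c.xx, t.y*2 + c.yy, t.z + 1, (
  SELECT count(*) FROM %1$s
    WHERE the_geom_webmercator && CDB_XYZ_Extent(t.x*2 + c.xx, t.y*2 + c.yy, t.z + 1)
)
```

In PostgreSQL, an unqualified name inside a correlated subquery binds to the innermost relation that provides it, so a user column `x` on `%1$s` shadows the recursive CTE's `x` unless the latter is written as `t.x`.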

@javisantana
Contributor

good catch!

@jgoizueta
Contributor Author

The maximum CDB_ZoomFromScale problem is being dealt with in #259.

@jgoizueta
Contributor Author

@rafatower please review

@@ -1,3 +1,29 @@
-- Maximum zoom level for which overviews may be created
CREATE OR REPLACE FUNCTION _CDB_MaxOverviewLevel()
Contributor


Why not change the internal representation of coordinates to bigint, to be able to hold them up to z = 31?

Functions such as CDB_XYZ_Extent and others can have bigint counterparts. Both can coexist; the integer version can be marked as "deprecated" and made to issue a warning.

Contributor Author


By using bigints we could raise the maximum overview level from 23 to 29 or 30 (it'd be pointless to support higher levels because using such levels would cause problems elsewhere).
I hadn't thought of that approach, basically because I think having up to 23 overview levels should be enough for most cases, but I can do a quick check to see if it a) is easy to implement and b) doesn't have a performance impact (the feature density query is already quite slow).

@jgoizueta
Contributor Author

jgoizueta commented May 27, 2016

Using bigints to avoid the zoom-23 limit seems feasible but also seems to have an impact on performance. Since checking that will take some time, I'll leave it for later (it's here in this branch), as I prefer to deploy the current fixes ASAP.

See #260

@jgoizueta jgoizueta merged commit a2a1ff6 into master May 27, 2016
@Algunenano Algunenano deleted the 257-max-overviews-level branch November 15, 2019 12:48
Successfully merging this pull request may close these issues.

Error during feature density evaluation for overviews creation