Unexpected Operation Error - new scenario
artlembo


3,404 post(s)
#09-Feb-24 03:48

Danb had presented a case of this cryptic error: it says Unexpected Operation Error. I ran into this as well, during a merge (dissolve) operation. The layer is over 1 GB in size, and I am attempting to merge all the geometries together. I suspect that the error is due to the size of a single polygon being too large.

I also notice that when I create a script to do it, the Merge operation uses GeomUnionAreas. If I change GeomUnionAreas to GeomMergeAreas, the merge happens in seconds. But it appears that any overlapping polygons get removed.

I'm thinking I may try to work with GeomMergeAreas, and may have to split the polygons based on the overlaps. Sort of a clipping. I wondered if anyone had thoughts on the error, as well as the subtle differences between GeomUnionAreas and GeomMergeAreas.
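The two behaviours can be sketched, very roughly, outside Manifold. The Python sketch below is an analogue using shapely, not Manifold's implementation: unary_union plays the role of GeomUnionAreas, while building a MultiPolygon without resolving anything plays the role of a GeomMergeAreas-style combine.

```python
from shapely.geometry import Polygon, MultiPolygon
from shapely.ops import unary_union

a = Polygon([(0, 0), (2, 0), (2, 2), (0, 2)])   # 2x2 square, area 4
b = Polygon([(1, 1), (3, 1), (3, 3), (1, 3)])   # overlaps a by a 1x1 square

# "Merge"-style combine: just concatenate the geometry, overlaps unresolved
merged = MultiPolygon([a, b])
print(merged.area)       # 8.0 -- the overlap is counted twice
print(merged.is_valid)   # False -- components overlap

# Union-style combine: topology resolved into one clean area
union = unary_union([a, b])
print(union.area)        # 7.0 -- the overlap is counted once
print(union.is_valid)    # True
```

The cheap combine never has to compute where the inputs overlap, which is why a merge-style operation is so much faster and why overlapping regions come out looking different.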

Dimitri


7,420 post(s)
#09-Feb-24 06:40

Well, it's pointless to speculate on the cause without full info. Knowing the layer is over 1 GB in size doesn't suffice. Details on the data (what objects: all polygons, or some lines and points? how many? the largest, smallest, and average size? etc.), the coordinate system, the extent, full info on the system's characteristics, and so on are necessary. What's the exact workflow? A query? Post the query text. Not a query? Then list all the settings in all the dialogs used.

If somebody runs into a similar error and doesn't want to share all that data, a good start for solo efforts to guess at the issue is the info listed in the Limitations topic. For example, individual polygons can have up to 2 GB of vertices, so depending on what the original polygons were, it's likely not that.

A lot of things that appear to be mysteries tend to get cleared up quickly when you have full info on all details. And some apparent mysteries get cleared up if you step back and look at the overall picture.

For example, the original post you cite asks if anybody else has noticed an increased appearance of that error message when using 9. Well, sure, that's likely to happen because 9 is used for bigger data that immediately kills 8 or any other classic GIS. With 8 you'd never notice a wide class of errors like trying to do 1 TB operations with only 500 GB of free space because you never get past the initial loading of the starting data. Given 9's ability to handle large data for opening and display on even small machines you more frequently run into such limitations.

ColinD

2,081 post(s)
#09-Feb-24 10:19

I too encountered this error recently while working through https://manifold.net/doc/mfd9/example__find_percentages_of_open_space_in_zip_code_areas.htm using my own data at the step

Slice Open Space Areas to ZCTA Boundaries using the Transform Pane

I was attempting to slice a vegetation map to a 10 km square grid with about 800k cells. Unfortunately M9 would not complete the task, but I was able to get it done in M8, albeit taking a bit longer (85 min).

In searching the error I also came across https://georeference.org/forum/t140384.11#140385


Aussie Nature Shots

Dimitri


7,420 post(s)
#09-Feb-24 13:13

Details?

ColinD

2,081 post(s)
#11-Feb-24 06:20

These are the details:

A 10km square grid 2265 cells

A vegetation map 862943 objects

EPSG 3112 in M9; Lambert Conformal Conic in M8

Needed to cut the vegetation map by the grid cells to determine the % of each individual cell area occupied by the vegetation.

M9: Overlay Intersect, which failed after some time with the quoted error.

M8: Topology Overlay Union, which completed in 5113.656 sec.

This would suggest that hitting the 2 GB limit might not be the issue here unless M9 runs a more complicated process than M8.
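For what it's worth, the per-cell arithmetic after the overlay succeeds is simple. A hedged Python sketch with shapely (hypothetical stand-in data, not M9's Overlay Intersect):

```python
from shapely.geometry import box
from shapely.ops import unary_union

# Hypothetical stand-ins: one 10 km grid cell and two vegetation patches
cell = box(0, 0, 10_000, 10_000)                  # 10 km x 10 km cell
veg = [box(0, 0, 5_000, 10_000),                  # covers the left half
       box(4_000, 0, 6_000, 2_000)]               # overlaps the first patch

# Union the vegetation first so overlaps aren't double-counted,
# then clip to the cell and take the area ratio
veg_union = unary_union(veg)
pct = veg_union.intersection(cell).area / cell.area * 100
print(round(pct, 1))   # 52.0
```

The union-before-clip step matters whenever vegetation polygons can overlap; otherwise the percentage can exceed what is physically there.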

Attachments:
Grid and veg detail.jpg
Grid and veg.jpg


Aussie Nature Shots

tjhb
10,094 post(s)
#11-Feb-24 08:29

Colin,

I think a good way of tackling this task would be to first subdivide the vegetation areas into convex parts, then do the overlay.

(Alternatively, subdivide to triangles. That may produce more extra overhead than extra benefit. Convex parts should be enough.)
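As a rough sketch of the triangle route, assuming shapely as a stand-in (its triangulate is a Delaunay triangulation of the vertices, so concave inputs need clipping afterwards):

```python
from shapely.geometry import Polygon
from shapely.ops import triangulate, unary_union

# A convex area (a square): Delaunay triangulation of its vertices
# partitions it exactly into triangles
area = Polygon([(0, 0), (4, 0), (4, 4), (0, 4)])
tris = triangulate(area)
print(sum(t.area for t in tris))   # 16.0, same as area.area

# For concave areas the Delaunay triangles can spill outside the polygon,
# so clip each triangle back to the input before using the parts
concave = Polygon([(0, 0), (4, 0), (4, 4), (2, 1), (0, 4)])
parts = []
for t in triangulate(concave):
    clipped = t.intersection(concave)
    if clipped.area > 0:
        parts.append(clipped)

# The clipped parts tile the concave polygon exactly
print(abs(unary_union(parts).area - concave.area) < 1e-9)   # True
```

Each part is small and convex-ish, so a subsequent overlay only ever deals with modest pieces rather than one enormous branched area.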

tjhb
10,094 post(s)
#09-Feb-24 14:21

I wondered if anyone had thoughts on the error, as well as the subtle differences between GeomUnionAreas and GeomMergeAreas.

The differences between those two functions are not trivial or subtle.

GeomMergeAreas is just a dumb data operation. It takes numbers, and returns numbers, numbers in this case representing vertices. The resulting topological details are left to rendering.

GeomUnionAreas is a topology operation. It takes areas, and returns one area, with all of the (potential) overlaps and intersections between input areas resolved. So this takes enough resources to resolve neighbourhood, shared triangles, compliance with topological norms, and other things I am not aware of. I don't know the details (thank heavens), but it is not trivial.

So, apples and pears.

What you could do, in a hard/large case, is first normalise geometry on the input set. Then normalise topology [corrected] (that might fail for the same reasons). Then do the union.

If that chokes too, try unioning objects pairwise first (each with its touching neighbours), then normalise geometry, then union the result.
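The pairwise idea can be sketched in Python with shapely (a hypothetical union_pairwise helper, not a Manifold function):

```python
from shapely.geometry import box
from shapely.ops import unary_union

def union_pairwise(geoms):
    """Union a list of geometries tree-fashion: adjacent pairs first,
    then pairs of results, so no single step handles everything at once."""
    while len(geoms) > 1:
        geoms = [unary_union(geoms[i:i + 2])    # union each adjacent pair
                 for i in range(0, len(geoms), 2)]
    return geoms[0]

# Hypothetical input: a strip of overlapping squares
squares = [box(i, 0, i + 1.5, 1) for i in range(10)]
result = union_pairwise(squares)
print(result.geom_type)   # Polygon
print(result.area)        # 10.5 (strip from x=0 to x=10.5, height 1)
```

Because each step only resolves the topology of two inputs, the intermediate results stay manageable; the cost of the big resolution is spread across log-many cheap rounds instead of one expensive one.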

artlembo


3,404 post(s)
#09-Feb-24 15:20

All good thoughts on the solution. And as you suspected, it might fail for the same reasons. I did a deeper dive into the data: the table is actually 2.5 GB, so merging everything into a single record will exceed the limits. So that is the rate-limiting factor. The only other option is to see what generalization accomplishes.

Dimitri


7,420 post(s)
#10-Feb-24 07:08

I did a deeper dive into the data, the table is actually 2.5GB, so merging everything into a single record will exceed the limits.

No, that's incorrect. It will only exceed the limits in the rare case where there are no overlaps or adjacency, that is, in a case where people normally would not use that tool. If this is a highly artificial or extremely unusual use of the tool it might be that, but otherwise it is likely something else. That is why it is important to discuss such issues in the context of the actual data and precise workflow being done.

Some illustrations....

First, a typical case of many areas that are adjacent. These could be parcels, provinces, etc.:

The data size of the above drawing scales with the number of coordinates needed to define all vertices of the area objects in the drawing. If you have a very detailed worldwide drawing with very many adjacent polygons, that could be 2.5 GB.

A typical use of Merge and the GeomUnionAreas aggregate SQL function is to combine those many areas into one or more areas in a dissolve operation using an attribute. Suppose, for example, you use GeomUnionAreas to combine all of the smallest administrative zones in a country into one big country object:

The illustration above shows them all merged into a single object, like the workflow described by Art. The drawing now has much smaller size, because only the outermost coordinates are left. All of the many coords for the inner boundaries are gone. Merging everything into a single record will not exceed the limits.
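The coordinate reduction is easy to see in a small sketch (Python with shapely as a stand-in; the grid of squares is hypothetical):

```python
from shapely.geometry import box
from shapely.ops import unary_union

# Hypothetical stand-in: a 10 x 10 grid of adjacent unit squares
cells = [box(x, y, x + 1, y + 1) for x in range(10) for y in range(10)]
coords_before = sum(len(c.exterior.coords) for c in cells)   # 5 per square

dissolved = unary_union(cells)      # one big square; inner boundaries gone
coords_after = len(dissolved.exterior.coords)

print(coords_before)                  # 500
print(coords_after < coords_before)   # True: only the outline remains
print(dissolved.area)                 # 100.0
```

All the coordinates that defined the shared inner edges vanish in the dissolve, which is why the merged result is usually far smaller than the input, not larger.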

Consider another case where there are many non-adjacent objects but some of the objects overlap:

To illustrate the situation, the above shows a single drawing where all of the areas are in the same drawing. The one big blue rectangular area is colored in blue using thematic formatting with all the other areas in a different color. The big blue rectangular area overlaps many of the smaller areas.

If you do a merge (dissolve) to combine them all into a single area you get the above. Again, the number of coordinates has been reduced, because very many coordinates within what is now one big area are gone.

If there is significant overlap, or even just a few big objects that overlap very many non-overlapping and non-adjacent objects, then again you can't say that merging everything into a single record will exceed the limits. That's because the more overlap there is, the greater the reduction in the total number of coordinates. Just a few big areas could eliminate most of the coordinates, as in the illustration above.

Retaining all the coordinates and thus keeping the total number in a single record extraordinarily large is only likely to happen in a merge (dissolve) with data like this:

That shows a situation with very many non-adjacent and non-overlapping areas. Imagine a file that showed all building footprints in a big country. If you do a merge (dissolve) on that, you'll get a result that looks very much like the original, to the extent that the objects aren't adjacent.

But that's a highly artificial use case. What is the use case, other than trying artificially to exceed really huge limits like 2 GB of vertices in a single object, for, say, taking 2.5 GB of building footprints that are millions of different objects and making them a single branched object with millions of branches?

It seems to me that doing that for some processing reason would be highly inefficient, if not outright broken, logic. I don't see how it would make sense for export, because I don't know any other system that can handle a single object with over 2 GB of vertices.

One more thing: if trying to create single objects with more than 2 GB of vertices is what's going on here (we don't really know, because we don't have the necessary details about the data), that is just one of the ways you can exceed the really huge limits of 9. Other ways include, for example, trying to run an operation that requires a few gigabytes of free space in storage when you only have a few hundred megabytes free. Without knowing the specifics of the data and what is being done in each reported case, you can't say exactly what the issue is in the workflow.

There's not much one can say about what ColinD's report and Art's report might have in common except that it is likely in both cases one of the limits set forth in the Limitations topic has been exceeded. But there's nothing to indicate both reports involve the same limit. To sort that out you need details on data and workflow.

Attachments:
merge_01.png
merge_02.png
merge_03.png
merge_04.png
merge_05.png

Manifold User Community Use Agreement Copyright (C) 2007-2021 Manifold Software Limited. All rights reserved.