Attribute-aware error metrics for simplification #158
Yeah, so there are a few ways to fix this. One is to discard and recompute normals post-simplification, possibly splitting vertices where the crease angle is too sharp. This works around the problem, but of course it's not very convenient. Another is to factor the normal delta into the simplification as an extra error term: this can be done by comparing the normals alongside the edge that's considered for collapse, or by introducing the normal into the quadric weight. It's on my list to experiment more with this; there's a simplify-attr branch in this repository from my last attempt, but when I worked on it at the time it became clear that this isn't very simple, so I decided to take a break and think about it more. This isn't implemented right now. It's definitely worth addressing, but I'm not sure what the best solution is: ideally it's not just the normals that need to be taken into account, and balancing ease of use, performance and quality here is tricky...
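The second approach, comparing normals alongside the edge considered for collapse, can be sketched as a penalty term added on top of the positional error. A minimal illustration only; the function name and the linear weighting are hypothetical, not code from the simplify-attr branch:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Hypothetical sketch: augment the positional collapse error with a normal
// delta term. 'weight' trades geometric fidelity against shading fidelity,
// and tuning it per model is exactly the difficulty discussed above.
float collapseError(float positionalError, const Vec3& na, const Vec3& nb, float weight) {
    // 1 - dot spans [0, 2]: 0 for identical unit normals, 2 for opposite ones
    float normalDelta = 1.0f - dot(na, nb);
    return positionalError + weight * normalDelta;
}
```

A collapse across a sharp crease then ranks worse than a collapse on a smooth region with the same positional error, so it is deferred or rejected by the usual error threshold.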
I also encountered this problem with the mesh in #206 (comment).
@zeux Would you be able to look at this? Thanks for your amazing work on meshoptimizer.
I believe this part of the comment above accurately reflects the plan here:
Since this issue is still open, you can assume I'm going to look into this at some point in the future; when exactly this point will be I can't say, as this requires some further research on how to best integrate the attribute metrics with geometry metrics in a way that is reasonably easy to tune once instead of having to tweak weights per model.
Since comparing attributes is not simple, would there be any other approaches? I wanted to look into this, but I'm a bit lost. Edit: I tried your attribute branch and didn't see any major problems: https://github.com/fire/meshoptimizer/tree/simplify-normal-attribute
Yeah, it needs more work to be production ready with respect to the metric; I think the branch predates some geometric improvements, and it also needs some interface and optimization work. FWIW I plan to resume this in the next few weeks.
Is there a better way to define when normals are close enough? The current check seems to block optimization of any curved surface; only flat planes get optimized. I'm not sure how to allow the first pass of decimation in the chair example while blocking the collapses that fail. I wish there were a way to optimize the indices with the normals on the second try. My thoughts are quad remeshing or isotropic remeshing; that's a lot of work, but it gives the optimizer more room to work.
That's because the metric needs work, I believe; the code in that branch right now is very challenging to tune properly, which is part of why this hasn't been integrated yet. I'm not aware of existing research that's more promising than the general approach used there, but since that code isn't production ready it can have all sorts of issues, and it likely requires taking a path that hasn't been explicitly documented in academia (at least that was the case for geometric error, where the approach meshoptimizer uses is inspired by prior research but doesn't follow it precisely). Remeshing is orthogonal to simplification: it can definitely make topology-aware simplification easier, but it doesn't solve the problem by itself, and you still need attribute awareness within the simplifier to avoid the kind of attribute distortion shown in this thread.
I'll do some literature searches for vertex normal merge, collapse, and flip metrics. If you have any keywords I can search, that'll help too. Edit: a promising paper: https://dl.acm.org/doi/pdf/10.1145/2425836.2425911 Edit: I'm going to use a 6-element truncated 3x3 orientation matrix to store the normal. This uses 6 attributes. It seems to work OK.
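The "6-element truncated 3x3 orientation matrix" can be read as storing two axes of an orthonormal frame built around the normal, with the normal itself recoverable as their cross product. A hedged sketch of one such encoding; the construction and function name are a guess at what the comment describes, not code from the linked branch:

```cpp
#include <array>
#include <cmath>

// Hypothetical sketch: build an orthonormal frame around a unit normal and
// store two of its three axes as six attribute floats. cross(tangent,
// bitangent) recovers the normal exactly.
std::array<float, 6> encodeNormalAsFrame(float nx, float ny, float nz) {
    // pick a helper axis that is not parallel to the normal
    float hx = std::fabs(nx) < 0.9f ? 1.0f : 0.0f;
    float hy = 1.0f - hx;
    // tangent = normalize(cross(helper, n)), with helper = (hx, hy, 0)
    float tx = hy * nz;
    float ty = -hx * nz;
    float tz = hx * ny - hy * nx;
    float tl = std::sqrt(tx * tx + ty * ty + tz * tz);
    tx /= tl; ty /= tl; tz /= tl;
    // bitangent = cross(n, tangent), orthonormal by construction
    float bx = ny * tz - nz * ty;
    float by = nz * tx - nx * tz;
    float bz = nx * ty - ny * tx;
    return {tx, ty, tz, bx, by, bz};
}
```

One design note: interpolating or averaging these six floats during collapses keeps the frame roughly orthonormal for small deviations, which may be why it behaves better than raw normals as a quadric attribute; that rationale is speculation on my part.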
Can you take a moment to see if this is legitimate? The Godot Engine contributors had concerns about applying patches on top of meshoptimizer that aren't merged, and I wanted some motion on this topic. Thanks!
Considering the title of this issue: I have voxel meshes that encode texture-splatting parameters in extra attributes (repurposing color and UV in additional vertex arrays). The problem is that removing vertices in that scenario directly reduces quality even when geometry is preserved. Simplification only seems to care about vertex positions, which means there should be a way to give meshoptimizer more information, or some way to customize the comparison between vertices. I'm wondering if simplification is actually suited to that situation; otherwise it doesn't sound actually... simple (the kind of data I'm storing is packed sets of indices and weights). Does this match the current issue, or should I open another?
Yes, that's the same problem as highlighted in this issue. Attribute-aware simplification will be exposed as a separate function with separate attribute stream inputs.
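To illustrate what weighted attribute streams could mean at the error-metric level, here is a minimal sketch of a combined per-vertex distance where each attribute channel carries its own weight. This is not the meshoptimizer interface; all names here are illustrative:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>

// Hypothetical sketch: a combined vertex "distance" mixing positional and
// attribute deltas. Per-channel weights let splatting indices/weights
// (repurposed from color/UV) count differently from, say, normals.
float combinedError(const float* pa, const float* pb,   // positions, xyz
                    const float* aa, const float* ab,   // attribute channels
                    const float* weights, size_t attributeCount) {
    float e = 0.0f;
    for (int i = 0; i < 3; ++i) {
        float d = pa[i] - pb[i];
        e += d * d;
    }
    for (size_t i = 0; i < attributeCount; ++i) {
        float d = weights[i] * (aa[i] - ab[i]);
        e += d * d;
    }
    return std::sqrt(e);
}
```

A weight of zero makes a channel invisible to the simplifier; a large weight effectively forbids collapsing vertices whose values in that channel differ.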
At the risk of stating the obvious, my suggestion would be to:
Copying the comment from #524 on some future work involved here; the issue will stay open as the algorithm improves further:
Excited to see more progress in this area! After #737, how should we think about the resulting error metric for use with LOD selection? Position-based error is already somewhat nebulous to begin with, and I'm not sure what to do when using simplifyWithAttributes.
I hesitate to give any guidance on this for now, as there's more refinement planned for the attribute metric as well as more experiments around LOD selection. The eventual goal is to make the result usable as a distance-like metric for LOD selection, possibly with some use-specific fudge factors. Before the last update the resulting value was decidedly useless, so much so that Godot currently uses a patch where the attribute metric is used internally but the reported result is purely positional; that is not ideal in many respects either. FWIW, after this update I think the attribute evaluation is closer to what Nanite uses; they use a combination of positional and attribute error pretty much directly to select LODs, treating it as a distance-like metric with a tuned scaler.
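To make the "distance-like metric with a tuned scaler" idea concrete, here is one hedged sketch of LOD selection that projects a reported error value into screen space. The normalization, perspective model, and all names are assumptions for illustration, not meshoptimizer or Nanite code:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical sketch: treat each LOD's reported error as an object-space
// deviation, convert it to screen pixels at the current view distance, and
// pick the coarsest LOD whose deviation stays under a pixel threshold.
// 'errorScale' stands in for the tuned fudge factor mentioned above.
size_t selectLod(const std::vector<float>& lodErrors, // ascending, object units
                 float distance, float fovY, int screenHeight,
                 float pixelThreshold, float errorScale) {
    // screen pixels covered by one unit of object-space error at 'distance'
    float pixelsPerUnit = screenHeight / (2.0f * distance * std::tan(fovY * 0.5f));
    size_t best = 0;
    for (size_t i = 0; i < lodErrors.size(); ++i)
        if (lodErrors[i] * errorScale * pixelsPerUnit <= pixelThreshold)
            best = i; // coarsest LOD still within the threshold
    return best;
}
```

The appeal of a distance-like metric is exactly that it slots into this kind of projection unchanged, whether the error came from positions alone or from a positional/attribute combination.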
Using #158 (comment), to summarize the work that happened since then:
There is some remaining work left. Given that this issue has been open for years and a significant amount of work has gone into the current algorithm, I will close this but note the future work (which will be developed separately as time and priorities permit):
In general, I've exhausted the significant improvements I can make in this area for now, apart from the future work mentioned above. It is of course possible that new ideas surface in the future, and they will be incorporated, but it is time to mark this issue "complete" :) The notion being that the basic capability is now present, and future enhancements are possible in, well, the future.
In the interest of keeping future improvements to this area cross-linked from one place, I will keep referencing this issue in any future PRs in this area. Note that regardless of progress on the future work, at least the next library version will keep the interface/implementation "experimental", with the aim of eventually stabilizing it once the interface is known to be final. This doesn't mean it's not useful in production (in fact the previous version has been used in Godot for a year or two now!); it just means the interface is not necessarily final.
Hi! I'm playing with https://developer.nvidia.com/orca/amazon-lumberyard-bistro dataset and meshoptimizer and I've noticed this particular failcase related to the way the corners were authored.
For example, here is the original chair mesh:
Notice the rounded corners with shared vertices. They survive the first pass of meshopt_simplify to half the number of triangles fine:
However, once the triangles between two sides meeting at 90 degrees get folded away and the sides start sharing vertices, the vertex normals can no longer be correct:
What would be a solution to (automatically) preventing this?
I was thinking of adding additional custom skip code in the 'pickEdgeCollapses' loop if the angle between vertex normals is above a certain threshold, but I'm sure there's a better/simpler solution, perhaps already there? :)
(instead of preventing collapse, could also allow it but duplicate verts so normals aren't shared?)
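The skip idea above, a hard threshold on the angle between endpoint normals, might look like the following in isolation. This is a sketch only, not the actual pickEdgeCollapses code, and the function name is made up:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical skip check: reject a collapse when the angle between the
// endpoint normals exceeds a crease threshold (in radians).
bool allowCollapse(const float na[3], const float nb[3], float maxAngleRadians) {
    // normals are assumed to be unit length
    float d = na[0] * nb[0] + na[1] * nb[1] + na[2] * nb[2];
    // clamp to guard against rounding before acos
    if (d > 1.0f) d = 1.0f;
    if (d < -1.0f) d = -1.0f;
    return std::acos(d) <= maxAngleRadians;
}
```

As the discussion below notes, a hard cutoff like this tends to block all collapses on curved surfaces, which is why folding the normal delta into the error metric ends up being the preferred direction.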
Thanks for the great library!!