Add JS bindings for clusterizer API #738
Conversation
Thanks for the PR! I like the idea of a new clusterizer JS module. However, I'd like to see an attempt at a very minimal and functional (as in not object-oriented) interface that matches the rest of the JS bindings more closely. This would be my initial view based on the README changes:

So in my ideal world we'd shoot for:

... as far as the initial API goes. I'm not sure if buildMeshlets should return the actual meshlet data as a packed Uint32 array or as a JS array of objects - it would be interesting to benchmark both on v8, as this is a place where maybe using JS objects is a reasonable efficiency vs. usability compromise. The interface would be clean and coherent if both buildMeshlets and computeMeshletBounds return an array of JS objects. I'd also be fine with exposing computeClusterBounds if it is useful.
Thanks for the quick response!

Yes, makes sense.

Sounds good. I'll change the behavior.

Totally! I'll remove the default values.

Yeah, that makes sense. I wasn't really sure whether to map the three buffers (meshlets, vertices, triangles) to an array of JS objects or just return the three buffers. I ended up with this hybrid wrapper object that holds the three packed buffers but behaves like an array of JS objects via its accessors. The benefit here is that the data is in one place and can easily be written to WebGPU buffers to inspect the results (my personal use case), while at the same time it's easy to treat the meshlets as JS objects on the host side. Now that I've looked it up, that's actually the same approach the Rust bindings take. I definitely agree.

How about I remove those methods & the bounds array but keep the Meshlets & Meshlet wrappers for now? In the meantime, I'll remove the following functions + tests...

...and let
I incorporated the feedback. The JS API now looks like this (copied from the updated README):

## Clusterizer

To split a triangle mesh into clusters, call:

```ts
buildMeshlets(indices: Uint32Array, vertex_positions: Float32Array, vertex_positions_stride: number, max_vertices: number, max_triangles: number, cone_weight?: number, index_byte_size?: number) => Meshlet[];
```

The algorithm uses position data stored in a strided array; `vertex_positions_stride` represents the distance between subsequent positions in `Float32` units. The maximum number of triangles and number of vertices per meshlet can be controlled via `max_triangles` and `max_vertices`. Additionally, if cluster cone culling is to be used, `cone_weight` can be set to a value between 0 and 1 to balance culling efficiency with other forms of culling. All meshlets are implicitly optimized for better triangle and vertex locality. The algorithm returns an array of `Meshlet` objects:

```js
const meshlets = MeshoptClusterizer.buildMeshlets(indices, positions, stride, /* args */);

console.log(meshlets[0].vertices); // prints the packed Uint32Array of the first meshlet's vertex indices, i.e., indices into the original mesh's vertex buffer
console.log(meshlets[0].triangles); // prints the packed Uint8Array of the first meshlet's indices into its own vertices array
```

A meshlet's underlying packed buffers are exposed through its `buffers` property:

```js
console.log(meshlets[0].buffers.meshlets); // prints the raw packed Uint32Array containing the meshlet data, i.e., the indices into the vertices and triangles arrays
console.log(meshlets[0].buffers.vertices); // prints the raw packed Uint32Array containing the indices into the original mesh's vertices
console.log(meshlets[0].buffers.triangles); // prints the raw packed Uint8Array containing the indices into the vertices array

console.log(meshlets[0].buffers.meshletCount); // prints the number of meshlets - this is not the same as meshlets[0].buffers.meshlets.length because each meshlet consists of 4 unsigned 32-bit integers

// all meshlets are also accessible through the packed buffers
console.log(meshlets[0].buffers.getMeshlet(0).vertices[0] === meshlets[0].vertices[0]); // prints true
```

After generating the meshlet data, it's also possible to generate extra culling data for one or more meshlets:

```ts
computeMeshletBounds(meshlets: Meshlet | Meshlet[], vertex_positions: Float32Array, vertex_positions_stride: number) => Bounds | Bounds[];
```

If more than one meshlet is passed to `computeMeshletBounds`, it returns an array of `Bounds` objects. If bounds are to be computed for more than one meshlet, it might be more efficient to call `computeMeshletBounds` once with all of them instead of once per meshlet:

```js
const meshlets = MeshoptClusterizer.buildMeshlets(indices, positions, stride, /* args */);
const bounds = MeshoptClusterizer.computeMeshletBounds(meshlets, positions, stride);

console.log(bounds[0].center); // prints the center of the first meshlet's bounding sphere
console.log(bounds[0].radius); // prints the radius of the first meshlet's bounding sphere
console.log(bounds[0].coneApex); // prints the apex of the first meshlet's normal cone
console.log(bounds[0].coneAxis); // prints the axis of the first meshlet's normal cone
console.log(bounds[0].coneCutoff); // prints the cutoff angle of the first meshlet's normal cone
console.log(bounds[0].coneAxisS8); // prints the axis of the first meshlet's normal cone in 8-bit SNORM format
console.log(bounds[0].coneCutoffS8); // prints the cutoff angle of the first meshlet's normal cone in 8-bit SNORM format
```

It is also possible to compute bounds of a vertex cluster that is not generated by `buildMeshlets`:

```ts
computeClusterBounds: (indices: Uint32Array, vertex_positions: Float32Array, vertex_positions_stride: number, index_byte_size?: number) => Bounds;
```
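As a usage note, the normal cone fields above are typically consumed by a cone culling test. The sketch below is illustrative, not part of the bindings (`coneCulled` and the sample values are made up); it implements the standard meshoptimizer rejection test, which culls a cluster when `dot(normalize(coneApex - cameraPosition), coneAxis) >= coneCutoff`:

```javascript
// Sketch: rejecting a meshlet via its normal cone.
// `bounds` mirrors the fields returned by computeMeshletBounds.
function coneCulled(bounds, cameraPosition) {
  // direction from the camera to the cone apex
  const dx = bounds.coneApex[0] - cameraPosition[0];
  const dy = bounds.coneApex[1] - cameraPosition[1];
  const dz = bounds.coneApex[2] - cameraPosition[2];
  const len = Math.hypot(dx, dy, dz);
  if (len === 0) return false; // camera at the apex: keep the cluster

  const dot =
    (dx / len) * bounds.coneAxis[0] +
    (dy / len) * bounds.coneAxis[1] +
    (dz / len) * bounds.coneAxis[2];

  // all of the cluster's triangles face away from the camera when the
  // view direction falls inside the anti-normal cone
  return dot >= bounds.coneCutoff;
}

// illustrative bounds: a cluster whose triangles face +z
const bounds = {
  center: [0, 0, 0],
  radius: 1,
  coneApex: [0, 0, 0],
  coneAxis: [0, 0, 1],
  coneCutoff: 0.5,
};

console.log(coneCulled(bounds, [0, 0, 5])); // false - camera sees the front faces
console.log(coneCulled(bounds, [0, 0, -5])); // true - all triangles back-facing, safe to cull
```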
Two more interface comments, hopefully the last:

I need to do a final code review pass as well, but this is looking close. I might merge this as-is after the interface changes above, not sure. Before that, can you also rebase this into separate commits, for example 1) implementation, 2) tests, 3) GitHub Actions changes, 4) documentation? This is large enough that I don't want to just squash-merge the whole PR; this will also make it easier for me to do a final code pass in the PR itself.
True, I removed them now. I initially added the index size because of the
I think so, yeah. Changed it now. However, currently,
Awesome! I gotta run now. I'll see if I get to rebasing the commits later today. Otherwise I'll do it tomorrow.
Should |
I would probably keep |
Force-pushed 51acf60 to dcfa593
done
Thanks! This looks great. The implementation looks good, I think; if I discover minor nits I can fix them post-merge.

One change that I'd like to see before I merge this, though: during computeMeshletBounds, there's repeated reallocation / copying that I think is redundant. Because you are working with MeshletBuffers that store all data contiguously, you can copy the buffers to the Wasm heap and then just address them individually. It's a little more memory, but most of the memory would be the position data, and it means you don't need any sbrk calls per meshlet, or even extractMeshlets.

Also, the code that creates the JS bounds object could be shared between computeMeshletBounds & computeClusterBounds, but I can fix that post-merge as well.
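For illustration, the addressing pattern suggested above can be sketched without the Wasm module: keep the packed meshlet buffer in one place and derive each meshlet's fields by offset instead of reallocating per meshlet. The helper name is hypothetical; only the record layout (4 unsigned 32-bit integers per meshlet, as noted earlier in the thread) is taken from the source:

```javascript
// Each packed meshlet record is 4 uint32s:
// vertexOffset, triangleOffset, vertexCount, triangleCount.
const MESHLET_STRIDE = 4;

// Read one meshlet's fields directly out of the contiguous buffer.
function meshletAt(packedMeshlets, i) {
  const base = i * MESHLET_STRIDE;
  return {
    vertexOffset: packedMeshlets[base + 0],
    triangleOffset: packedMeshlets[base + 1],
    vertexCount: packedMeshlets[base + 2],
    triangleCount: packedMeshlets[base + 3],
  };
}

// illustrative packed data for two meshlets
const packed = new Uint32Array([
  0, 0, 4, 2, // meshlet 0
  4, 6, 3, 1, // meshlet 1
]);

const m1 = meshletAt(packed, 1);
console.log(m1.vertexOffset, m1.triangleCount); // 4 1
```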
Adds WASM bindings and a new JS API for the clusterizer API. The JS API consists of:

- buildMeshlets: generates meshlet data and implicitly optimizes the generated meshlets. Returns packed buffers containing raw meshlet data.
- extractMeshlet: given buffers as returned by buildMeshlets and a meshlet index, returns a meshlet object containing a triangles and vertices array.
- computeClusterBounds: computes bounds for cluster data not generated by buildMeshlets. Returns an object containing the computed bounding data, except for the s8-compressed data.
- computeMeshletBounds: given buffers as returned by buildMeshlets, computes bounds for all meshlets and returns the computed bounding data, except for the s8-compressed data.

Bumps the stack size of the WASM modules to 36 KB because computeMeshletBounds and computeClusterBounds require more than the previously allocated 24 KB.
Adds tests for the new JS clusterizer API. All tests work on a cube with normal data for which 6 clusters - one for each face - are created. Bounds are validated by comparing each cube face's normal to the computed normal cone's axis.
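The validation idea described above (comparing each cube face's normal to the computed cone axis) can be sketched as follows; the helper names are illustrative and not taken from the test code:

```javascript
// Compute a unit face normal from three vertices via the cross product
// of two edge vectors.
function faceNormal(a, b, c) {
  const u = [b[0] - a[0], b[1] - a[1], b[2] - a[2]];
  const v = [c[0] - a[0], c[1] - a[1], c[2] - a[2]];
  const n = [
    u[1] * v[2] - u[2] * v[1],
    u[2] * v[0] - u[0] * v[2],
    u[0] * v[1] - u[1] * v[0],
  ];
  const len = Math.hypot(n[0], n[1], n[2]);
  return [n[0] / len, n[1] / len, n[2] / len];
}

// Component-wise comparison with a tolerance, since the cone axis is
// reconstructed from floating-point data.
function nearlyEqual(a, b, eps = 1e-5) {
  return a.every((x, i) => Math.abs(x - b[i]) < eps);
}

// one triangle of a unit cube's z = 1 face, wound counter-clockwise
// when viewed from +z, so the normal should point along +z
const n = faceNormal([0, 0, 1], [1, 0, 1], [1, 1, 1]);
console.log(nearlyEqual(n, [0, 0, 1])); // true
```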
Adds automated ES5 validation for the new JS clusterizer API. Adds automated tests for the new JS clusterizer API.
Documents the new JS clusterizer API in the JS readme.
Force-pushed dcfa593 to 8bc3573
Awesome! Thanks!
done
done
Thanks for the contribution and for quick iteration!
Hi,

This adds JS bindings for the clusterizer API (`meshopt_buildMeshlets`, `meshopt_optimizeMeshlet`, etc.).

Here's how the JS API works (copied from my changes to `js/README.md`):

## Clusterizer

`MeshoptClusterizer` (`meshopt_clusterizer.js`) implements meshlet generation and optimization.

To split a triangle mesh into clusters, this library provides two algorithms - `buildMeshletsScan`, which creates the meshlet data using a vertex cache-optimized index buffer as a starting point by greedily aggregating consecutive triangles until they go over the meshlet limits, and `buildMeshlets`, which doesn't depend on any other algorithms and tries to balance topological efficiency (by maximizing vertex reuse inside meshlets) with culling efficiency.

The number of triangles and number of vertices per meshlet can be limited with both algorithms using the optional `max_triangles` and `max_vertices` parameters. If not set, they default to the maximum supported number of vertices (255) and triangles (512).

The `buildMeshlets` algorithm uses position data stored in a strided array; `vertex_positions_stride` represents the distance between subsequent positions in `Float32` units.

Additionally, if cluster cone culling is to be used, `buildMeshlets` allows specifying a `cone_weight` as a value between 0 and 1 to balance culling efficiency with other forms of culling. By default, `cone_weight` is set to 0.

Both algorithms return a `Meshlets` object, a helper object to further process meshlets. At its core, a `Meshlets` object is just a wrapper around the typed arrays containing the meshlet data.

To optimize meshlets for better triangle and vertex locality, `optimize` can be called directly on a `Meshlets` instance.

After generating the meshlet data, it's also possible to generate extra culling data for each meshlet and populate a `bounds` array within the `Meshlets` instance.

Meshlet generation, optimization, and culling data generation can be chained as well.

To work with individual meshlets, `Meshlets` objects expose an iterator and support the iterable protocol to iterate over the individual meshlets. Each meshlet is an instance of `Meshlet`, a wrapper around the corresponding subarrays within the owning `Meshlets` instance.

In environments that support them, the experimental Iterator prototype methods (`forEach`, `map`, `reduce`, etc.) can be used on the iterator returned by `meshlets.iterator()` as well. Using Iterator prototype methods in TypeScript requires casting to a `Meshlet` array.

However, be aware that while `Meshlets` is iterable, it is not an actual array and does not support indexing using the `[]` operator. Instead, use the `get` method.

Instead of optimizing or computing bounds for all meshlets, `Meshlet` objects also support processing each meshlet individually. Both operations are chainable.

After populating a meshlet's bounds, they can be inspected through individual `MeshletBounds` instances, which are again wrappers around the underlying subarray in the owning `Meshlets` object.

Alternatively, `MeshoptClusterizer` also exposes a low-level API for each function.
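The iteration behavior described above (iterable via the protocol, element access via `get`, no `[]` indexing) can be illustrated with a minimal stand-in. This mock is not the real `Meshlets` implementation, just a sketch of the protocol the description implies:

```javascript
// Minimal stand-in for the described Meshlets wrapper: iterable via
// Symbol.iterator, element access via get(), no [] indexing.
class MeshletsMock {
  constructor(meshlets) {
    // array of plain meshlet objects, purely for illustration; the real
    // wrapper holds packed typed arrays instead
    this._meshlets = meshlets;
  }
  get count() {
    return this._meshlets.length;
  }
  get(i) {
    return this._meshlets[i];
  }
  *[Symbol.iterator]() {
    for (let i = 0; i < this.count; ++i) yield this.get(i);
  }
}

const meshlets = new MeshletsMock([{ id: 0 }, { id: 1 }]);

for (const m of meshlets) console.log(m.id); // 0, then 1
console.log(meshlets.get(1).id); // 1
console.log(meshlets[1]); // undefined - not an array, so use get() instead of []
```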