Introduction to the project
A little introduction to the project. BRL-CAD represents manifold meshes with a type called a bag of triangles (BoT). These meshes are not always perfect; they can become defective, for instance during import/export or other operations. Defective meshes are undesirable, so we have a problem here, and this project is the solution.
Specifically, we focus on geometric defects such as gaps, overlaps, T-joints, and holes, and on topological defects such as singular vertices, singular edges, and inconsistently oriented triangles in the mesh.
After submitting the proposal
A little while after I submitted the proposal, OpenSCAD's Marius Kintel (nick: kintel) asked on the BRL-CAD IRC channel whether there could be a portable mesh-healing module usable by both BRL-CAD and OpenSCAD. After my proposal was accepted, we decided to go forward with the portable module idea.
Now, how we'd implement the portable module was of interest to all of us. Sean and Daniel from BRL-CAD suggested a few approaches, and we discussed the pros and cons of each.
At this point my mentor Daniel suggested we go with a generic container for an intermediate polygonal mesh, without using generic programming. That is, we define an intermediate structure that the mesh-healing functionality operates on, and implement it differently for the two organizations through a set of functions that fetch attributes from the native structures. This way, the intermediate structure can talk to both native representations.
The method I suggested was, in essence, to create a structure that is a "union" of the attributes of both native structures. The only conversion step would be to set the applicable attributes of the intermediate structure from each organization's mesh; attributes that don't apply to one of the organizations would be set to NULL. This way too, we'd have an intermediate mesh that encompasses the properties of both meshes and can be used in the portable module.
At this point, we are yet to decide on the method to go forward with.
After outlining the approaches for the portable module, the remaining decision was which one to go with. I chose Daniel's method of using an abstract class to represent the intermediate polygonal structure, which accesses the native mesh types to obtain any information; the method I had suggested would tend to clutter and unnecessarily complicate things. One problem arose from the difference between the two organizations' representation formats: BRL-CAD uses an indexed format, while OpenSCAD uses an unindexed one.
If the intermediate mesh does not need to store the vertices in an indexed format, the issue is trivial. That is, as long as the only task of the intermediate polygonal structure is to convert smoothly to the doubly-connected edge list (DCEL) that we will use for the algorithms, the abstract-class method works. This can't be answered yet; once I write the draft algorithms, things will pan out a bit more.
So, primarily, we use the abstract class for storing the geometry and the doubly-connected edge list for processing.
What essentially happens in both applications is that we fire up the command, or invoke the functionality from the GUI, which links to the conversion file specific to each application. The intermediate polygonal_mesh abstract class is just a container for the native mesh types; it has virtual methods implemented specifically for the two applications. With the intermediate container set, we create the DCEL with the help of the virtual functions, and with the DCEL we implement the healing algorithms.
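As a rough sketch, the container idea might look like the following in C++. All names here (PolygonalMesh, getNumVertices, ArrayMesh, and so on) are illustrative assumptions, not the project's actual API; the point is only that the healing code talks to virtual accessors while each application supplies its own implementation over its native mesh type.

```cpp
#include <cassert>
#include <cstddef>

// Hypothetical intermediate container: an abstract class with virtual
// accessors that each application implements over its native mesh.
class PolygonalMesh {
public:
    virtual ~PolygonalMesh() {}
    virtual std::size_t getNumVertices() const = 0;
    virtual std::size_t getNumFaces() const = 0;
    virtual void getVertex(std::size_t i, double out[3]) const = 0;
    // Vertex index of the j-th corner of face f; presents an indexed view
    // even if the native format is unindexed (as OpenSCAD's PolySet is).
    virtual std::size_t getFaceVertex(std::size_t f, std::size_t j) const = 0;
};

// A toy implementation over flat arrays, standing in for a native mesh type.
class ArrayMesh : public PolygonalMesh {
    const double (*verts_)[3];
    const std::size_t (*faces_)[3];
    std::size_t nv_, nf_;
public:
    ArrayMesh(const double (*v)[3], std::size_t nv,
              const std::size_t (*f)[3], std::size_t nf)
        : verts_(v), faces_(f), nv_(nv), nf_(nf) {}
    std::size_t getNumVertices() const { return nv_; }
    std::size_t getNumFaces() const { return nf_; }
    void getVertex(std::size_t i, double out[3]) const {
        out[0] = verts_[i][0]; out[1] = verts_[i][1]; out[2] = verts_[i][2];
    }
    std::size_t getFaceVertex(std::size_t f, std::size_t j) const {
        return faces_[f][j];
    }
};
```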
After the process is done, we'll recalculate the normals for the BRL-CAD mesh, since its structure includes this attribute information. No post-processing is required for the PolySet class in OpenSCAD.
The vertex, edge and face records for the DCEL have been defined. The functions required for obtaining/adding/removing information will be defined as and when we require them while implementing the healing algorithms.
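For illustration, the three DCEL record types might look like this (the field names are assumptions for the sketch, not the project's exact definitions):

```cpp
#include <cassert>
#include <cstddef>

struct DCEL_Edge; // forward declaration for the pointer fields below

// Illustrative DCEL records. Each half-edge knows its twin, its incident
// face, and the next/previous half-edges around that face.
struct DCEL_Vertex {
    double coords[3];
    DCEL_Edge *incident_edge; // one half-edge originating at this vertex
};

struct DCEL_Face {
    DCEL_Edge *outer_edge;    // one half-edge on the face's boundary
};

struct DCEL_Edge {
    DCEL_Vertex *origin;      // vertex this half-edge starts at
    DCEL_Edge *twin;          // oppositely oriented half-edge
    DCEL_Face *incident_face; // face lying to the left of this half-edge
    DCEL_Edge *next;          // next half-edge around incident_face
    DCEL_Edge *previous;      // previous half-edge around incident_face
};
```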
PolygonalMesh abstract class:
We won't need any attributes, since we won't be caching any information; the DCEL does that job. As for functions, for now we define those needed to convert the PolygonalMesh object to the doubly-connected edge list object. Later we will add functions for adding and removing information from the native structures as situations demand. For instance, to add a vertex we would need to know whether it is part of a face or a stand-alone vertex that we will create a face for next.
When we modify the structure of the mesh, i.e. add or remove any information, we'll do it simultaneously for both structures, because we don't want to convert back from the DCEL to the PolygonalMesh intermediate class.
This week, I mainly worked on the conversion files for both organizations. The abstract class declares all the functions required for the conversion to the DCEL. The functions were written taking into account the order in which the data conversion should take place, i.e. setting the attributes of the DCEL so that all dependencies are handled. Some functions, such as setting the incident edge for a vertex, are independent of the native mesh structures and depend only on DCEL attributes that have already been set; these were implemented once, common to both. The other, structure-specific functions were declared and defined in the two derived classes.
The next step was the UI for mesh healing. The heal command could take options to heal only certain types of defects, say gaps and T-joints; that will be taken care of later. For BRL-CAD, this will be an mged command with a Tcl binding. For OpenSCAD, an import command with an attribute indicating that the mesh needs fixing (again, separate options for separate defects can be accommodated) can be used. These commands will invoke the conversion process and then the mesh-healing functions.
I started with the first algorithm in my proposal, the zippering-gaps algorithm. It is meant for gaps within a certain tolerance, which will be chosen appropriately; gaps larger than that are handled by the stitching algorithm.
The algorithm first identifies a boundary (free-edge) chain. We extract free-edge chains from the mesh until only one is left, which is the outer boundary of the mesh. Then, given a start and an end half-edge, we run the zippering-gaps algorithm on the chain.
For each vertex on the boundary chain we look for a feature pair and calculate its error measure, maintaining a priority queue that pops the feature pair with the least error measure. The feature for a vertex can be either a vertex or an edge, depending on the closest edge. The distance from a vertex to the closest edge is the perpendicular distance if an orthogonal projection onto the edge is possible; otherwise it is the distance to the closer endpoint. This way we set either the feature_edge or the feature_vertex for a vertex, but never both. The closest edge must be incident on the unbounded face, that is, it must be a free edge, and it must not be incident on the vertex currently being considered. Also, the line joining the vertex to the closest edge (be it the orthogonal projection or the closer endpoint) must not cross any face other than the unbounded face, i.e. it must be a valid diagonal.
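The vertex-or-edge feature rule above can be sketched as a small helper (an assumed function for illustration, not the project's code): project the vertex onto the candidate edge; if the projection parameter lands inside the segment the feature is the edge, otherwise it is the closer endpoint.

```cpp
#include <cassert>
#include <cmath>

// Distance from vertex P to segment AB, following the rule described in the
// text: perpendicular distance when an orthogonal projection is possible,
// otherwise distance to the closer endpoint.
struct FeatureDist {
    bool on_edge; // true: feature is the edge; false: feature is an endpoint
    double dist;  // the corresponding shortest distance
};

FeatureDist vertexToEdge(const double P[3], const double A[3], const double B[3])
{
    double AB[3] = {B[0]-A[0], B[1]-A[1], B[2]-A[2]};
    double AP[3] = {P[0]-A[0], P[1]-A[1], P[2]-A[2]};
    double ab2 = AB[0]*AB[0] + AB[1]*AB[1] + AB[2]*AB[2];
    double t = (AP[0]*AB[0] + AP[1]*AB[1] + AP[2]*AB[2]) / ab2;

    FeatureDist r;
    r.on_edge = (t >= 0.0 && t <= 1.0);
    // Clamp to the closer endpoint when no orthogonal projection exists.
    double tc = r.on_edge ? t : (t < 0.0 ? 0.0 : 1.0);
    double C[3] = {A[0]+tc*AB[0], A[1]+tc*AB[1], A[2]+tc*AB[2]};
    r.dist = std::sqrt((P[0]-C[0])*(P[0]-C[0]) +
                       (P[1]-C[1])*(P[1]-C[1]) +
                       (P[2]-C[2])*(P[2]-C[2]));
    return r;
}
```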
While zippering the gap, we pop the feature pair with the minimum error measure and continue only if this measure is within the gap tolerance; otherwise we break out of the loop, since every error measure after this will only be greater. The remaining defects are stitched or hole-filled.
Since the feature vertex of a vertex on the free-edge chain could itself be a vertex on the chain, its coordinates, and hence its error measure, could change. Thus every time we pop a new vertex, we need to go through all the elements of the priority queue and:
- Check if there are any invalid features and recalculate the features for those vertices
- Check if the distance measure is up to date.
We pop the feature pair with the minimum error measure. If the feature is a vertex, we move both vertices to the midpoint of the line joining them; if the feature is an edge, we introduce a new vertex at the orthogonal projection of the vertex onto the edge and split the face incident on that edge. In both cases we update the vertex, edge, and face records accordingly.
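One way to realize the pop-and-revalidate loop is a lazy variant of the scan described above: instead of walking the queue, stale entries (whose stored error no longer matches the current one) are refreshed and re-pushed when they surface. This is an assumed sketch with illustrative names, not the project's code.

```cpp
#include <cassert>
#include <queue>
#include <vector>

struct FeaturePair {
    int vertex_id;
    double error; // error measure at the time the pair was pushed
};
struct ByError {
    bool operator()(const FeaturePair &a, const FeaturePair &b) const {
        return a.error > b.error; // smallest error measure on top
    }
};
typedef std::priority_queue<FeaturePair, std::vector<FeaturePair>, ByError> PairQueue;

// current[v] holds the up-to-date error measure for vertex v. Returns the id
// of the next valid pair within tolerance, or -1 when the smallest remaining
// measure already exceeds the tolerance (everything after it is larger).
int popNextValid(PairQueue &pq, const std::vector<double> &current, double tolerance)
{
    while (!pq.empty()) {
        FeaturePair top = pq.top();
        pq.pop();
        if (top.error != current[top.vertex_id]) {
            // Stale entry: the feature moved, so refresh the measure and re-push.
            top.error = current[top.vertex_id];
            pq.push(top);
            continue;
        }
        if (top.error > tolerance)
            return -1;
        return top.vertex_id;
    }
    return -1;
}
```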
As and when we update the DCEL, we update the native structure too. This is important to keep all the data consistent, i.e. the two structures must stay synchronized.
One main concern was determining the closest edge for a vertex v. Checking geometrically whether a line intersects a face might not yield accurate results, so I came up with the following approach.
While considering an edge as a candidate for the closest edge, if it is still eligible (after checking for unbounded-face incidence and non-incidence on the vertex), we check whether the edge's incident face contains the vertex v; if not, we check the same for the edge's twin. If neither check succeeds, we mark the edge ineligible and move on to the next one. If the 2-D projection of the line joining the vertex v to the edge intersects any face, it must also intersect another edge of that face; otherwise the vertex would lie inside the face, which is not possible. So we check whether the 2-D projection of the line joining v to the edge intersects either of the other two edges of the triangle. If it does, we mark the edge ineligible and move on; if it intersects neither, the edge is eligible, so we calculate the shortest distance and update as necessary.
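The segment-crossing test at the heart of this eligibility check can be sketched with the standard orientation predicate (an assumed helper for illustration, not the project's routine): two segments properly cross exactly when each segment's endpoints lie on opposite sides of the other.

```cpp
#include <cassert>

// Signed cross product of (a - o) and (b - o): positive for a counter-
// clockwise turn, negative for clockwise, zero when collinear.
static double cross2d(const double o[2], const double a[2], const double b[2])
{
    return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0]);
}

// True when segments pq and rs properly cross; touching endpoints and
// collinear overlaps return false (those would need separate handling).
bool segmentsCross(const double p[2], const double q[2],
                   const double r[2], const double s[2])
{
    double d1 = cross2d(p, q, r);
    double d2 = cross2d(p, q, s);
    double d3 = cross2d(r, s, p);
    double d4 = cross2d(r, s, q);
    return (d1 * d2 < 0.0) && (d3 * d4 < 0.0);
}
```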
Setting the unbounded face ID to the number of faces caused problems when adding and deleting faces, so it was set to zero and all the functions were updated accordingly.
I will be testing my code with the 'heal' command.
I will be testing the zippering-gaps module this week. So, I first create a function, analyse_heal_bot(), that links the Tcl command function ged_heal() to the zippering-gaps code. This function looks for free-edge chains and passes them to the zippering-gaps function to heal the defects wherever applicable (i.e. within the zippering tolerance). For the defects that are left out, there are stitching and hole filling.
An error while linking C++ code with C code took some time to fix, but the issue was resolved in the end.
The logic of the closest-edge function was modified to handle vertices and edges separately. The external boundary was excluded from being passed to the zippering-gaps function, as healing it is of no use. It was detected using the orientation checker in the Geometry.h header: the external boundary is oriented clockwise, while the inner defect loops are all oriented counter-clockwise. This way we identify a boundary edge for the given mesh, and then check whether a candidate edge (as part of a free-edge chain) is part of the boundary, ignoring it if so.
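The clockwise/counter-clockwise distinction can be sketched with a shoelace-style signed area (an assumed helper for illustration, not the actual Geometry.h routine): a positive signed area means the loop winds counter-clockwise, a negative one means clockwise.

```cpp
#include <cassert>
#include <cstddef>
#include <utility>
#include <vector>

// Signed area of a closed 2-D loop given as an ordered list of points.
// Positive: counter-clockwise winding (inner defect loops, per the text);
// negative: clockwise winding (the external boundary).
double signedArea2D(const std::vector<std::pair<double, double> > &loop)
{
    double area2 = 0.0;
    for (std::size_t i = 0; i < loop.size(); ++i) {
        const std::pair<double, double> &p = loop[i];
        const std::pair<double, double> &q = loop[(i + 1) % loop.size()];
        area2 += p.first * q.second - q.first * p.second; // shoelace term
    }
    return 0.5 * area2;
}
```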
Numerous memory corruption errors popped up, mainly due to not deleting allocated space. They were resolved.
I continued testing this week. Quite a few modifications had to be made to the vertex contraction and edge contraction procedures.
A function was added for checking and setting twins for edges that have been modified. It takes a vertex and gets the two edges containing that vertex that have the unbounded face incident on them. For each of the two, it checks whether the edge and its neighbour need to be squeezed out of the boundary chain, so essentially we delete 0, 2, or 4 edges. By "squeezing out" we mean removing the edge from the boundary chain and setting the references of the affected edges properly. This function is called from the previous function, after checking whether it is needed.
While deleting any record (vertex/edge/face), the references of the other affected records need to be set right, and the IDs in the vertex and face lists need to be updated too.
Quite a few memory corruption errors came up again. So the prototypes of the functions that used 'new' to allocate memory were changed so that the pointer which was initially being returned is now passed in by the caller.
With the previous definition, de-allocation of the memory was implicitly the caller's responsibility: the memory returned by the function had to be de-allocated by the caller, but that is easily missed. With the changed prototype, the caller both allocates and de-allocates the memory, without having to inspect the function for hidden liabilities. A few functions required this change, and it got rid of most of the memory issues. One that persists is an error in the push operation of the priority queue; that is yet to be fixed.
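A minimal before/after illustration of this prototype change, with hypothetical names (Record, makeRecordLeaky, initRecord are not the project's identifiers):

```cpp
#include <cassert>

struct Record { int id; double value; };

// Before: the function allocates with 'new' and quietly hands ownership to
// the caller, who must remember to delete the result -- an easy leak.
Record *makeRecordLeaky(int id)
{
    Record *r = new Record;
    r->id = id;
    r->value = 0.0;
    return r; // caller silently owns this allocation
}

// After: the caller supplies the storage, so allocation and de-allocation
// live in one place and nothing is silently handed over.
void initRecord(Record *out, int id)
{
    out->id = id;
    out->value = 0.0;
}
```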
Meanwhile, work on the stitching algorithm has started. This algorithm takes two chains of edges and stitches them together. One key realization was that the two chains move in opposite directions by virtue of being part of the same loop, so chain A moves towards the 'next' edge and chain B towards the 'previous' edge.
The stitching algorithm's advancing rule is that whichever candidate triangle (made of two vertices from one chain and one from the other) has the least perimeter is added, and the edge pointer is advanced on that chain (next on chain A, previous on chain B).
When a triangle is added, four new half-edges are added. Suppose two vertices from chain A and one vertex from chain B form the triangle: there will be a half-edge from each of the two chain-A vertices to the chain-B vertex, plus their twins. The incident-face reference of the edge that had the unbounded face is set to the newly added face.
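The least-perimeter advancing rule can be sketched as follows (an assumed helper for illustration, not the actual implementation; real code would of course work on the DCEL rather than raw coordinates):

```cpp
#include <cassert>
#include <cmath>

static double dist3(const double p[3], const double q[3])
{
    return std::sqrt((p[0]-q[0])*(p[0]-q[0]) +
                     (p[1]-q[1])*(p[1]-q[1]) +
                     (p[2]-q[2])*(p[2]-q[2]));
}

// Given the current vertex a on chain A and b on chain B, plus the candidate
// next vertex a_next on A and previous vertex b_prev on B, returns 'A' when
// triangle (a, a_next, b) has the smaller perimeter and should be added,
// else 'B' for triangle (a, b_prev, b).
char chainToAdvance(const double a[3], const double a_next[3],
                    const double b[3], const double b_prev[3])
{
    double perimA = dist3(a, a_next) + dist3(a_next, b) + dist3(b, a);
    double perimB = dist3(a, b_prev) + dist3(b_prev, b) + dist3(b, a);
    return perimA <= perimB ? 'A' : 'B';
}
```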
I tested the zippering code and identified issues with it, specifically with vertex contraction. The mistake was in the degenerate case of thin triangles with respect to orientation: when a triangle's vertices are collinear, the notion of orientation becomes ambiguous. But since the triangle has a particular orientation to start with, that orientation needs to be maintained throughout the procedure. To cope with this, the initialization functions were removed from the deleteEdge routine and the actions were slightly tweaked to keep the orientation consistent.
The memory issues were resolved using Valgrind, a debugging tool suggested by my mentor Daniel.
Another issue popped up: the validity of the triangles introduced by vertex and edge contraction was not being checked, which caused complications and incomplete healing. A validity check was therefore added, done by checking whether the triangle's orientation gets inverted. This prevents any "illegal" contractions from happening.
After those issues were resolved, the changes didn't seem to be reflected in the BoT; that was later taken care of, and the function to check for orientation change was added.
The square case was being handled separately because when the gap is a square, none of the vertices gets a feature, and so the gap would not be healed. Even in the square case, care must be taken that the outer boundary (if it happens to be a square) is not picked up, so a condition checking whether the detected square is the external boundary was added.
While finding the closest edge for the feature, we have routines that check whether two lines intersect. It was decided earlier that the 2-D projections of the lines would be checked, so as to deal with skew lines. But this caused meshes to be healed incompletely, which was identified while testing with a sphere broken along its equator: two opposite vertices on the equator didn't get valid feature pairs, because their adjacent vertices had the same 2-D coordinates as those vertices, so a feature could not be assigned. That option was therefore removed; intersections are now checked with the 3-D coordinates themselves, and skew lines will be handled separately.
The case where the mesh is already healed was tested and seen to give the right results.
Most dead code was cleared out, and the includes were refined.
I tested on different meshes modeled in Blender, exported, and then re-imported. The code produced healed meshes, though with some thin triangles. Testing with other models is ongoing.
Weeks 10 and 11
Away from work
This week I started with a few small but essential changes to the code, like taking the tolerance as a user input rather than hard-coding it. I ran into a few build errors for a while, but resolved them soon.
When I tested my code with a torus, it hung even before the healing started, precisely in the function that finds the free-edge chain. This was slightly unusual; previously I'd run into errors only after at least one round of healing. While debugging, on Daniel's suggestion I looked into the init functions for the previous and next edge. As Daniel pointed out, they needed to be done in the same function, but I had done them separately. Even after modifying that, the previous- and next-edge references were being set wrong because of errors in the incident-face references, which in turn came from orientation inconsistencies, specifically the inside-outside kind: with a torus, if the orientation is calculated with respect to the origin, this error arises. Daniel suggested I take a different approach to building the DCEL: start with one triangle, move on to the adjacent triangles, and just flip the common edges. He added that the remaining issue would be meshes with multiple connected components.
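The seed-and-propagate idea can be sketched like this (an assumed implementation with illustrative names, operating on plain index triples rather than the DCEL): two consistently oriented triangles traverse their shared edge in opposite directions, so we walk out from a seed triangle and flip any neighbour whose winding agrees instead of disagreeing. Restarting the walk from every unseen triangle also covers multiple connected components.

```cpp
#include <cassert>
#include <algorithm>
#include <cstddef>
#include <map>
#include <queue>
#include <utility>
#include <vector>

typedef std::vector<std::size_t> Tri; // three vertex indices, wound in order

// True when t and n traverse a shared edge in the SAME direction,
// i.e. their windings are inconsistent.
bool sameDirectionSharedEdge(const Tri &t, const Tri &n)
{
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            if (t[i] == n[j] && t[(i+1)%3] == n[(j+1)%3])
                return true;
    return false;
}

// BFS from each unseen triangle, flipping neighbours that disagree with the
// already-fixed triangle they were reached from.
void makeConsistent(std::vector<Tri> &tris)
{
    std::map<std::pair<std::size_t, std::size_t>, std::vector<std::size_t> > edge2tri;
    for (std::size_t i = 0; i < tris.size(); ++i)
        for (int k = 0; k < 3; ++k) {
            std::size_t u = tris[i][k], v = tris[i][(k+1)%3];
            edge2tri[std::make_pair(std::min(u, v), std::max(u, v))].push_back(i);
        }
    std::vector<bool> seen(tris.size(), false);
    for (std::size_t s = 0; s < tris.size(); ++s) { // one BFS per component
        if (seen[s]) continue;
        std::queue<std::size_t> q;
        q.push(s); seen[s] = true;
        while (!q.empty()) {
            std::size_t i = q.front(); q.pop();
            for (int k = 0; k < 3; ++k) {
                std::size_t u = tris[i][k], v = tris[i][(k+1)%3];
                const std::vector<std::size_t> &nbrs =
                    edge2tri[std::make_pair(std::min(u, v), std::max(u, v))];
                for (std::size_t m = 0; m < nbrs.size(); ++m) {
                    std::size_t j = nbrs[m];
                    if (j == i || seen[j]) continue;
                    if (sameDirectionSharedEdge(tris[i], tris[j]))
                        std::swap(tris[j][1], tris[j][2]); // flip the winding
                    seen[j] = true;
                    q.push(j);
                }
            }
        }
    }
}
```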
Another error I came across: when a mesh was healed with the heal command and the command was then called again on the same mesh, the algorithm hung. I noticed this arose due to the presence of singular vertices, which were introduced because the meshes were not being healed completely; there seems to be some floating-point precision error while computing the orientation. As a temporary fix, since this case is prevalent with thin triangles (needles/caps), a separate case for triangles has been added: the vertex that can be orthogonally projected onto the opposite edge and has the least distance to it is taken as the vertex (with the opposite edge as the feature edge), and edge contraction is performed. As this does not extend to free-edge chains with four or more vertices, those might still be left unhealed.