Hey everyone. FaceLandmarks.com is a little project I put together last weekend while working with Apple's ARKit for iOS face tracking.

For those not familiar: since iOS 11 and the A9 processor were released, all iPhones (i.e. the iPhone 6s and newer) have supported augmented reality capabilities.

When tracking a face with the front camera and the ARKit framework, ARKit generates a face mesh of exactly 1,220 vertices mapped to specific points on the face. These vertices are accessible through ARFaceGeometry, ARFaceAnchor, and ARSCNFaceGeometry, and give developers a foundation for facial tracking in common use cases like social media filters, accessibility features, avatars, virtual try-on, etc.

While ARKit's tech is impressive and the developer experience is smooth, the most frustrating part for me was identifying the vertex indices for specific points on the face mesh. Apple does not provide a comprehensive mapping of these vertices beyond a handful of major face landmarks. Vertex 0 is on the center of the upper lip, for example, but beyond that there is seemingly little rhyme or reason to the vertex numbering.

Developers could export the mesh, open it in 3D rendering software, and identify vertex indices by hand (which is what I originally did), but I decided to make a simple web app that simplifies this process.

FaceLandmarks.com uses Three.js to render a model of the face mesh with clickable vertices, so you can zoom, pan, and easily identify any vertex's index. In the future, I hope to keep adding semantic labels for each vertex (there are about two dozen so far) for searchability.

It was a fun afternoon project, and I hope it may be helpful to others in this niche case. Enjoy!
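P.S. For anyone who wants to poke at the raw mesh data themselves, here's a rough sketch of what reading those vertices looks like in Swift. It assumes a standard ARKit face-tracking setup on a supported device; the class name and view-controller boilerplate are just illustrative.

    import ARKit
    import SceneKit
    import UIKit

    // Minimal sketch: read face-mesh vertex positions each time the face anchor updates.
    class FaceTrackingViewController: UIViewController, ARSCNViewDelegate {
        let sceneView = ARSCNView()

        override func viewDidLoad() {
            super.viewDidLoad()
            sceneView.frame = view.bounds
            view.addSubview(sceneView)
            sceneView.delegate = self
        }

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            // Face tracking is only available on supported hardware.
            guard ARFaceTrackingConfiguration.isSupported else { return }
            sceneView.session.run(ARFaceTrackingConfiguration())
        }

        func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
            guard let faceAnchor = anchor as? ARFaceAnchor else { return }

            // All 1,220 vertices, in the face anchor's local coordinate space (meters).
            let vertices = faceAnchor.geometry.vertices

            // Vertex 0 is on the center of the upper lip; any other landmark has to be
            // looked up by its index, which is the lookup FaceLandmarks.com is meant to speed up.
            let upperLip = vertices[0]
            print("Upper lip (local): \(upperLip)")
        }
    }

From there it's just a matter of knowing which index corresponds to the point you care about.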