Is the facial landmark detection (the 68 points) in the OpenFace project done with a DLib model or with deep neural networks?
I implemented this in Unity, but I can only test the similarity of faces.
I'm looking for a way to implement face recognition with deep neural networks.
Please help me, thank you.
Hi all,
I am currently doing my PhD and I would like to use OpenFace to analyze my data.
I am interested in eye-head coordination during eye movements -- I would like to analyze whether the head follows the eyes during eye movements (and if so, to what extent). For that, I would like to calculate gaze direction and head direction.
1) If I understand correctly, I could use gaze_angle_x and gaze_angle_y for this. But I would like to clarify one point: is this gaze vector relative to the head, or is it independent of the head orientation?
2) For the head direction, is there a way to transform the pose values into a direction vector? If not, can we translate the eye movements into pitch and yaw to get values that are comparable to head movements? A rough sketch of what I have in mind follows below.
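To make 2) more concrete, here is a rough, untested Python sketch of the comparison I have in mind. I am assuming the CSV columns pose_Rx/pose_Ry/pose_Rz (head rotation in radians) and gaze_angle_x/gaze_angle_y (gaze angles in radians), and a Z·Y·X rotation order for the head pose; please correct me if the actual conventions or axis signs are different.

```python
import numpy as np
import pandas as pd

# Per-frame CSV produced by FeatureExtraction (filename is just an example).
df = pd.read_csv("my_video.csv")
df.columns = df.columns.str.strip()  # if I remember correctly, column names have a leading space

def euler_to_direction(rx, ry, rz):
    """Rotate an assumed camera-facing unit vector (0, 0, -1) by the head's
    Euler angles to get a per-frame head direction vector.
    Rotation order and axis conventions are my assumption, not confirmed."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx @ np.array([0.0, 0.0, -1.0])

head_dirs = np.array([euler_to_direction(rx, ry, rz)
                      for rx, ry, rz in df[["pose_Rx", "pose_Ry", "pose_Rz"]].values])

# Express head direction as yaw/pitch so it is directly comparable to the
# gaze angles (both in radians).
head_yaw = np.arctan2(head_dirs[:, 0], -head_dirs[:, 2])
head_pitch = np.arcsin(np.clip(head_dirs[:, 1], -1.0, 1.0))

gaze_yaw = df["gaze_angle_x"].values
gaze_pitch = df["gaze_angle_y"].values

# Simple first pass: how strongly do head yaw and gaze yaw co-vary?
print(np.corrcoef(head_yaw, gaze_yaw)[0, 1])
```

If the gaze angles turn out to be head-relative rather than camera-relative, I would of course need to add the head rotation back in before comparing the two signals.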
Best,
Anaïs Servais