Can you provide more information on the prescription lens add-on?
The prescription lens set contains pairs of high-quality lenses with the following diopters: -6, -5, -4, -3, -2, -1.5, -1, -0.5, 0, 0.5, 1, 1.5, 2, 3, 4. A pair of 0 diopter plano lenses is included with the AdHawk MindLink. As an alternative, users may visit their optician with the AdHawk MindLink glasses to order custom prescription lenses.
Do your MindLink glasses work outside?
The AdHawk MindLink glasses are designed with outdoor eye tracking in mind. The detectors in the system architecture are tuned (both in frequency and wavelength) to only detect light from the scanner modules, so the glasses are robust to sunlight. However, outdoor tracking in bright sunlight comes with challenges like smaller pupils, squinting eyelids and other human factors. All of these can affect the quality of eye tracking.
Do you have a product brochure, data, or spec documents for download?
What is the Quick Start button?
Quick Start is the fastest way to get eye tracking! A Quick Start should be done at the beginning of every session. It performs an autotune to lock on to the location of an individual user's eye, followed by a 1-point calibration. More details about how and why to use Quick Start can be found here.
Where can the AdHawk Hub User Guide and the Android User Guide be found?
Both documents can be found on our resource page.
What is the accuracy, precision and robustness of the AdHawk MindLink?
Gaze accuracy is better than 1 degree Mean Absolute Error (MAE) across a field of view of 40 degrees horizontal and 25 degrees vertical.
Is the data real-time?
Yes. Both the PC and mobile software present the eye tracking data in real time, overlaid on the world-view camera feed. On PC, various other outputs such as pupil size and vergence can also be viewed in real time.
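As an aside, the vergence angle stream relates directly to fixation depth. The sketch below shows the standard geometric conversion under a symmetric-vergence assumption; the 63 mm interpupillary distance is an illustrative default, not an AdHawk parameter.

```python
import math

def depth_from_vergence(vergence_deg: float, ipd_m: float = 0.063) -> float:
    """Estimate fixation depth (metres) from a binocular vergence angle.

    Assumes symmetric vergence: each eye rotates inward by half the
    vergence angle, so depth = (IPD / 2) / tan(vergence / 2).
    """
    half_angle = math.radians(vergence_deg) / 2.0
    return (ipd_m / 2.0) / math.tan(half_angle)

# A vergence angle of ~7.2 degrees corresponds to a fixation depth of
# roughly 0.5 m for a 63 mm interpupillary distance.
print(round(depth_from_vergence(7.2), 2))  # 0.5
```

Larger vergence angles map to nearer fixation depths, which is why vergence is a useful real-time proxy for how far away the wearer is looking.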
Does the MEMS sensor generate something like an image of the eye?
The software does not generate a video of the eye, but it does provide a live view of key eye tracking features - the pupil outline and the glints.
Does AdHawk MindLink come with an API?
The AdHawk MindLink ships with an API that allows users to subscribe to various real-time eye tracking data streams. Available streams include: Gaze Angle (Binocular/L/R), Pupil Position, Vergence Angle, and Pupil Size. An SDK in both C and Python is also included to assist developers with connecting the AdHawk MindLink to their applications.
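Stream-based eye tracking APIs of this kind typically follow a publish/subscribe callback pattern. The sketch below illustrates that pattern with simulated data; the class and field names are illustrative assumptions, not the actual AdHawk SDK interface.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class GazeSample:
    timestamp: float   # seconds
    x_deg: float       # horizontal gaze angle, degrees
    y_deg: float       # vertical gaze angle, degrees

class GazeStream:
    """Minimal publish/subscribe stream: every registered handler is
    invoked once per incoming sample."""

    def __init__(self) -> None:
        self._handlers: List[Callable[[GazeSample], None]] = []

    def subscribe(self, handler: Callable[[GazeSample], None]) -> None:
        self._handlers.append(handler)

    def publish(self, sample: GazeSample) -> None:
        for handler in self._handlers:
            handler(sample)

# Application code registers a handler; the tracker then pushes samples.
received = []
stream = GazeStream()
stream.subscribe(received.append)

# Simulated samples standing in for real-time data from the glasses.
stream.publish(GazeSample(0.000, 1.2, -0.4))
stream.publish(GazeSample(0.002, 1.3, -0.5))
print(len(received))  # 2
```

The same handler-registration shape applies to each stream the FAQ lists (gaze angle, pupil position, vergence, pupil size); an application subscribes only to the streams it needs.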
Can all of the features be initialized / accessed via the API?
Yes, the API allows you to control all aspects of the eye tracker, including initialization and configuration, turning the various data streams on and off, and performing a calibration.
Where can I find the API specifications, and documentation?
The API specification documentation can be found in the resource section of the support site.
Data & Analysis Questions
How do I retrieve the data from the device?
The AdHawk MindLink system uses a tethered connection to the included Android mobile device or a Windows computer. We also have beta software for a tethered connection to a Mac computer. The software supports recording eye tracking gaze data and displays a live preview of a calibrated gaze marker superimposed on the world camera video. The data streams can also be received in real time via the API and written to a CSV file for offline analysis. Data collected on the Android device can be transferred to a PC when it is plugged in, using the AdHawk Hub, via Tools > Download Android Recordings.
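Writing the real-time streams to CSV for offline analysis can be as simple as appending one row per sample. The sketch below shows one way to do this; the column names are assumptions for illustration, not the exact fields the AdHawk software produces.

```python
import csv
import io

# Assumed column layout for the sketch; a real recording would use
# whatever fields the subscribed streams actually deliver.
FIELDS = ["timestamp", "eye", "gaze_x_deg", "gaze_y_deg", "pupil_diameter_mm"]

def write_samples(samples, out) -> None:
    """Append gaze samples (dicts keyed by FIELDS) to a CSV stream."""
    writer = csv.DictWriter(out, fieldnames=FIELDS)
    writer.writeheader()
    for sample in samples:
        writer.writerow(sample)

samples = [
    {"timestamp": 0.000, "eye": "L", "gaze_x_deg": 1.2,
     "gaze_y_deg": -0.4, "pupil_diameter_mm": 3.1},
    {"timestamp": 0.002, "eye": "R", "gaze_x_deg": 1.1,
     "gaze_y_deg": -0.3, "pupil_diameter_mm": 3.2},
]
buf = io.StringIO()   # stands in for an open file on disk
write_samples(samples, buf)
print(buf.getvalue().splitlines()[0])
```

In a real session the `io.StringIO` buffer would be a file opened with `open(path, "w", newline="")`, and rows would be written from inside the stream callback.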
Does the AdHawk MindLink software provide any analysis tools?
The software allows researchers to capture and replay high-speed eye tracking data from the AdHawk MindLink glasses. The recorded data includes gaze angle, pupil position, pupil size, validation data, metadata, and vergence. The software also displays and records video from the world camera with a gaze marker overlaid. Additional analysis features will be added in future software updates, and the recorded data will be compatible with commercially available analysis tools.
Is it possible to synchronize the timing of the visual stimulus presentation and the gaze tracking data in real time so that the stimulus presentation can change depending on the real-time gaze data?
The data streams are delivered by the API with roughly 3 ms of latency, so they can be fed into your own software to control a stimulus in real time. The included software does not insert event markers into the gaze data; however, end users may subscribe to the streams via the API and implement that functionality themselves.
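A gaze-contingent design built on those streams can be sketched as a loop that watches incoming gaze coordinates and triggers a stimulus change when gaze enters a region of interest. The region bounds and sample format below are assumptions for the sketch, not part of the AdHawk software.

```python
def in_roi(x_deg: float, y_deg: float, roi) -> bool:
    """Return True if the gaze angle lies inside a rectangular region
    of interest given as (x_min, y_min, x_max, y_max) in degrees."""
    x_min, y_min, x_max, y_max = roi
    return x_min <= x_deg <= x_max and y_min <= y_deg <= y_max

def run_trial(gaze_samples, roi):
    """Return the index of the first sample whose gaze lands in the ROI;
    a real experiment would swap the stimulus at that moment."""
    for i, (x, y) in enumerate(gaze_samples):
        if in_roi(x, y, roi):
            return i   # trigger the stimulus change here
    return None        # gaze never entered the ROI

# Simulated gaze path drifting toward a target region around the origin.
samples = [(-5.0, 0.0), (-2.0, 1.0), (0.5, 0.2), (0.6, 0.1)]
print(run_trial(samples, roi=(0.0, -1.0, 2.0, 1.0)))  # 2
```

In practice the loop body would run inside the API's stream callback, so the stimulus change happens within the ~3 ms delivery latency plus the display's refresh time.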
Can you see the end user’s data on their computer screens or would it be a bit blurry if the person’s head is moving?
The software displays a real-time video feed from the front-facing camera with a calibrated gaze dot superimposed, and this view can also be recorded for later review. The video is recorded at 30 fps, so rapid head movement may blur it slightly; when the wearer fixates on something, such as a display, its contents will be clearly visible.
What are the details of the front facing camera?
On a PC the resolution is selectable. On the mobile phone the resolution is fixed at 1280x720. On both platforms the frame rate is 30 fps.
What is the camera’s FOV?
82 Degrees diagonal
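A diagonal FOV can be split into horizontal and vertical components if the aspect ratio is known. The sketch below assumes a rectilinear lens and a 16:9 aspect ratio (inferred from the 1280x720 mobile resolution); both assumptions are illustrative, not confirmed camera specs.

```python
import math

def fov_h_v(diag_deg: float, aspect_w: int = 16, aspect_h: int = 9):
    """Split a diagonal field of view into horizontal and vertical
    components, assuming a rectilinear lens and the given aspect ratio."""
    diag_units = math.sqrt(aspect_w ** 2 + aspect_h ** 2)
    half_diag = math.tan(math.radians(diag_deg) / 2.0)
    h = 2 * math.degrees(math.atan(half_diag * aspect_w / diag_units))
    v = 2 * math.degrees(math.atan(half_diag * aspect_h / diag_units))
    return h, v

h, v = fov_h_v(82.0)
print(round(h), round(v))  # 74 46
```

Under these assumptions an 82-degree diagonal corresponds to roughly 74 degrees horizontal by 46 degrees vertical, comfortably wider than the 40 x 25 degree eye tracking field of view quoted above.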
How long is the battery life?
The mobile phone battery life is expected to be approximately 4 hours when recording eye tracking and the front-facing camera, or 18 hours when recording eye tracking only. These are estimates and will vary with the particular circumstances of the eye tracking session. A laptop or other PC should be used while plugged in, providing unlimited operation time.
Are the glasses battery powered?
The glasses themselves don't have a battery; they rely on the included Android device or a computer for power.
There is a problem with my glasses.
My eye tracking data is slightly offset when looking at a close target like my screen.
If the target you are looking at is at a known distance from the eyes, we recommend turning off parallax correction and setting the gaze depth manually. More information about parallax and its effects on eye tracking can be found here.
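The offset arises because the scene camera sits a few centimetres from the eye, so a gaze marker mapped at one assumed depth is angularly displaced for targets at a different depth. The sketch below illustrates the geometry; the 30 mm camera-to-eye offset is an assumption for illustration, not a MindLink specification.

```python
import math

def parallax_error_deg(true_depth_m: float, assumed_depth_m: float,
                       offset_m: float = 0.03) -> float:
    """Angular offset of the gaze marker when the tracker maps gaze at
    an assumed depth but the target sits at a different true depth.

    The camera sees the target at atan(offset / depth) from the eye's
    line of sight; the error is the difference between the two depths'
    angles."""
    return abs(math.degrees(math.atan(offset_m / true_depth_m)
                            - math.atan(offset_m / assumed_depth_m)))

# Looking at a screen 0.5 m away while the tracker assumes 1.0 m gives
# an offset on the order of the system's 1-degree accuracy figure.
print(round(parallax_error_deg(0.5, 1.0), 2))  # 1.72
```

This is why fixing the gaze depth to the known screen distance removes the offset: when the assumed and true depths match, the error term goes to zero.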
Why does the front facing camera preview not work after I plug in my MindLink glasses?
Please ensure that the device is properly connected. If multiple cameras, such as webcams, are connected to the device, check the camera index on the left side of the AdHawk Hub to ensure the AdHawk Front Facing camera is selected.
I get a popup saying no right/left eye detected when trying to Quick Start or calibrate.
We recommend performing a device calibration at the beginning of each session, and at a minimum once per week. If one of the eyes is still not detected after this, you may have selected the wrong nosepiece; to select the right nosepiece for you, please see the Nosepiece Selection Guide. If one or both eyes are still not detected, a report can be submitted.