Hi everyone! I'm really excited to show off a new project we've been working on here at Team CoughDrop. You probably get the gist from the title, so I'll jump right in: this new project means people no longer have to be limited to touch- or scanning-based access options (both of which we think are great for many users) just because they can't afford a dedicated, insurance-funded device. You can see more details in the video here:
From the beginning we've believed in open development as a way to have the broadest impact possible. AAC isn't just some fun game or video player; it's an important part of many people's communication strategy, and we believe that makes it too important to be held back by things like vendor lock-in and proprietary feature competition. CoughDrop is open source, we use open APIs, and users can export anything they create in CoughDrop to a standardized file format for use in other systems.
In addition, as we create libraries that we share across tools, we release those as open source as well. Today we're announcing a new pair of tools that make it possible to track head position or eye gaze without needing to purchase an expensive hardware-based tracker. Keep in mind, these tools will not be quite as accurate as a dedicated solution, and they will most likely drain your battery more quickly, but even so we think a lot of people can benefit from a more affordable way to use alternative access solutions.
These head- and gaze-tracking tools are now included in our apps, and we encourage others to implement them as well. CoughDrop can track your head or eye gaze on any mobile device, and will leverage native tracking libraries whenever they are available inside the apps. Co-VidSpeak, which is fully web-based, also supports head tracking or eye gaze tracking as a way to hit buttons using just the camera. I'm really excited that we can offer these features to more people who can benefit from them. Again, these camera-based trackers will not be quite as accurate as a hardware-based solution (and eye gaze is still more sensitive to head movements than we'd like), but there are many people who don't have access to expensive hardware solutions, and we want to give every user their best chance for success with communication. Please let us know what you think, and please encourage other AAC vendors to check out these libraries and implement the features in their apps as well!
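To give a feel for how camera-based access can drive button selection, here's a minimal sketch of dwell-based selection in JavaScript. Everything here (function names, thresholds, data shapes) is my own illustration, not the actual API of these libraries: it just assumes some tracker hands you a stream of estimated (x, y) screen coordinates, and it fires a selection once the pointer has rested inside one button's bounds long enough.

```javascript
// Hypothetical dwell-selection sketch -- not the real library API.
// `buttons` is a list of on-screen targets with pixel bounds;
// `dwellMs` is how long the gaze must rest on one target to select it.
function makeDwellSelector(buttons, dwellMs) {
  let current = null;   // button the gaze is currently resting on
  let enteredAt = null; // timestamp when the gaze entered that button
  // Call this with each new gaze/head-pointer estimate; it returns the
  // selected button when a dwell completes, or null otherwise.
  return function onGaze(x, y, timestamp) {
    const hit = buttons.find(b =>
      x >= b.left && x < b.left + b.width &&
      y >= b.top && y < b.top + b.height);
    if (!hit || hit !== current) {
      // Gaze moved to a new target (or off all targets): restart the timer.
      current = hit || null;
      enteredAt = hit ? timestamp : null;
      return null;
    }
    if (timestamp - enteredAt >= dwellMs) {
      enteredAt = timestamp; // reset so the same button can repeat later
      return current;        // dwell completed: report a selection
    }
    return null; // still dwelling, nothing selected yet
  };
}
```

In a real implementation you'd also want things like a visual dwell-progress indicator and some smoothing of the raw coordinates, since (as noted above) camera-based estimates are noisier than dedicated hardware.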