Monday, April 20, 2015
Open Steno App Wins Second Prize at Connect Ability Hackathon!
Last weekend, I attended the AT&T Connect Ability Hackathon at the NYU Ability Lab, a competition to create accessible technology with and for people with disabilities over the course of two short days. When I signed up, I was a bit worried that there would be nothing for me to do, since I don't know how to code, but early on Saturday morning I had the great good fortune of running into Jacob Mortensen, a freelance Android developer, and Rocio Alonso, an industrial designer for The Adaptive Design Association. My friend and colleague Stan Sakai was also there captioning the event, and between plenary sessions he was awesome enough to sit at our table and give us a hand with the work.

The challenge was built around four exemplars, people who used various types of accessible technology and who had specific ideas of how it might be improved. One of these exemplars was Paul Kotler, an autistic college student who uses an augmentative and alternative communication (AAC) device to speak via text-to-speech synthesis. Ever since 2010, I've been interested in the possibilities of using steno to improve the speed and efficiency of AAC. I knew that a stenographic solution might not work for Paul due to difficulties with motor planning, but his video spurred me in the direction of wanting to work on a realtime stenographic text-to-speech solution for the Hackathon.
We started with Brent Nesbitt's StenoKeyboard app, an Android-based open source clone of Plover, because we figured that a phone, with its integrated speaker and small display footprint, would offer the easiest and most portable solution. For hardware, we chose the StenoBoard, currently the smallest, cheapest, and most readily available steno system on the market. It's a bit too bulky to be perfectly wearable, but it beat out every other option that could be rigged up over the course of a single weekend.
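For readers curious about what's going on under the hood: steno software like Plover (and clones such as StenoKeyboard) works by looking up each chord, or "stroke," in a dictionary that maps steno outlines to English words, then handing the resulting text to an output stage, which in our case would be the phone's text-to-speech engine. Here's a deliberately tiny sketch of that pipeline in Python. The dictionary entries below are just a handful of illustrative outlines; real steno dictionaries contain tens of thousands of entries and handle multi-stroke outlines, prefixes, and suffixes, and the `speak` step here only returns text rather than calling an actual TTS engine.

```python
# A minimal sketch of the steno-to-speech pipeline.
# The dictionary entries are illustrative; real systems (e.g. Plover)
# load large JSON dictionaries and support multi-stroke outlines.

STENO_DICT = {
    "-T": "the",    # a leading hyphen marks right-hand-only keys
    "KAT": "cat",
    "SAT": "sat",
    "OPB": "on",
    "PHAT": "mat",
}

def translate(strokes):
    """Look up each stroke in the dictionary. Untranslated strokes
    are echoed back raw so the writer can see the misstroke."""
    return " ".join(STENO_DICT.get(s, s) for s in strokes)

def speak(strokes):
    """Translate a stroke sequence and hand it to the output stage.
    On Android this is where the text would go to the platform's
    TextToSpeech engine; here we simply return the sentence."""
    sentence = translate(strokes)
    return sentence

print(speak(["-T", "KAT", "SAT", "OPB", "-T", "PHAT"]))
# prints: the cat sat on the mat
```

Because the dictionary lookup is the whole trick, a trained stenographer can produce entire words and phrases in a single stroke, which is what makes steno so promising for speeding up AAC.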
For a thorough explanation of our design process, please check out our ChallengePost page. We called our project (the modified StenoKeyboard app plus a wearable StenoBoard mount) "StenoSpeak for Android". We worked right up to the submission deadline, and our final system wasn't without its bugs and foibles, but apparently it had enough potential to earn us second prize out of 15 teams competing in the Hackathon! Many, many thanks to Jacob, Rocio, and Stan for working so hard on this. It was a wonderful collaborative experience. Huge thanks also to Brent for StenoKeyboard and Emanuele for StenoBoard, without whom we would have been totally dead in the water.
What's next? We'll see. There are definitely some plans in the works, but our next big objective is to find an AAC user who might be interested in learning steno to help us with future iterations of the project. People with disabilities tend to be some of the earliest adopters and most proficient power users of accessible technology, so I'm hoping to find someone who can join our team as a full and active member while we work on developing this technology into a completely workable and replicable open source product. If you or anyone you know uses AAC to communicate, has full use of their hands, and is willing to spend a few months learning steno with our online textbooks, tutorials, and drilling tools, please get in touch!
Congrats to all of the Hackathon competitors, especially the first prize winner, Cameron Cundiff, with his brilliant alt_text_bot, and the third prize winner, the Tranquil Tracker team, with their seriously cool anxiety-tracking biometric device and app. And, of course, thanks to AT&T and the NYU Ability Lab for putting together this amazing competition!
Check out some photos from our whirlwind hacking weekend:
Rocio's wearable prototype sketches.
Stan modeling our ideal (though non-functional) wearable steno design.
The final (functional) wearable StenoBoard design.
The exultant StenoSpeak Team!