Monday, October 31, 2011

Recording voice, eye-gaze and touch on Android tablets

We presented our codebase, which records voice, eye-gaze and touch on Android tablets, at the Academy of Aphasia annual meeting a few weeks ago. Our poster is here.


Our Architecture and Results

2011 (with A. Marquis and A. Achim) "Aphasia Assessment on Android: recording voice, eye-gaze and touch for the BAT," Academy of Aphasia 49th Annual Meeting, Montréal.

I wanted to make it as easy as possible to reuse our code, so I made a series of videos that walk through the project and explain it in non-technical terms.

The first video talks about the Android side, which simply collects the video, audio and touch data.
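To give a non-authoritative sense of what the tablet side does, here is a minimal sketch of logging touch events from an Android activity. This is not our actual code; the class name, layout resource and log file name are all made up for illustration.

    import android.app.Activity;
    import android.os.Bundle;
    import android.util.Log;
    import android.view.MotionEvent;

    import java.io.File;
    import java.io.FileWriter;
    import java.io.IOException;

    public class StimulusActivity extends Activity {
        private FileWriter touchLog;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.stimulus); // hypothetical layout showing the stimulus picture
            try {
                // One log file per session; the file name is made up.
                touchLog = new FileWriter(new File(getExternalFilesDir(null), "touch_log.csv"));
            } catch (IOException e) {
                Log.e("TouchLogger", "Could not open touch log", e);
            }
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            if (touchLog != null) {
                try {
                    // One CSV row per touch sample: timestamp, x, y, action (down/move/up).
                    touchLog.write(System.currentTimeMillis() + ","
                            + event.getX() + "," + event.getY() + ","
                            + event.getAction() + "\n");
                } catch (IOException e) {
                    Log.e("TouchLogger", "Could not write touch sample", e);
                }
            }
            return super.onTouchEvent(event);
        }
    }

Each row carries a timestamp along with the coordinates, so the touch stream can later be lined up with the audio and video recorded during the same session.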




The second video talks about the "server side," where many of the open source repositories are used and where the really exciting data extraction and analysis takes place.
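To give a rough idea of the kind of extraction that happens on the server, here is a minimal sketch that reads a touch log in the made-up CSV format from the sketch above and reports how long after stimulus onset the first touch arrived. The class name and command-line arguments are assumptions, not part of our codebase.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class TouchLogSummary {
        public static void main(String[] args) throws IOException {
            String logPath = args[0];                        // path to an uploaded touch_log.csv
            long stimulusOnsetMs = Long.parseLong(args[1]);  // assumed to come from the session metadata

            BufferedReader reader = new BufferedReader(new FileReader(logPath));
            String line;
            while ((line = reader.readLine()) != null) {
                String[] fields = line.split(",");
                long timestampMs = Long.parseLong(fields[0]);
                // The first sample at or after stimulus onset gives the reaction time.
                if (timestampMs >= stimulusOnsetMs) {
                    System.out.println("First touch " + (timestampMs - stimulusOnsetMs)
                            + " ms after stimulus onset at (" + fields[1] + ", " + fields[2] + ")");
                    break;
                }
            }
            reader.close();
        }
    }

The same kind of pass over the time-stamped logs could pull out whatever other measures an experiment needs, which is where the interesting analysis starts.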



The third video gives an overview of how to get the code.



The fourth video is quite a bit longer than the others because it shows how you can adapt the project to your own experiment, and also how you can use GitHub to manage your own projects (good for long-distance collaboration and for delegating tasks among team members).



The last video is a quick demo of the touch data our subjects produced for the stimulus "shin," set to a lively rendition of "Parole, Parole" :)
