This project started when my interest in medicine met my passion for design. I have always been fascinated by the medical industry and how technology is being used to deliver better care.
For this project I drew on my understanding of the lack of user engagement during rehabilitation. With this I decided to tackle the following question: how can disruptive technology be used to support Physical Medicine and Rehabilitation (PM&R) for neurodegenerative diseases? More specifically, how could I use technology not only to help with the rehabilitation process but also to increase user engagement within the home environment?
An interactive environment was my solution to the problem. The product would give users an easy set-up that turned any room in their living environment into an interactive rehabilitative space. Due to the project's budget I had to develop a low-fidelity prototype using alternative methods: a torch beam reflected off a mirror served as a low-cost spotlight, which users controlled with their arms (the arms being the focus area of the rehabilitation).
I knew this project would hold some challenges, but who doesn't like a challenge? I began by breaking the project down into its various tasks and creating a Gantt chart that organised those tasks into sprints.
Research into the product's end users was carried out in detail, alongside a comparison of what was already on offer to them. Based on these factors, the designs for the prototype were created.
An evaluation of the designs was made, taking into account factors such as resources, time, and functionality.
A low-fidelity prototype was created to allow testing of the product. User engagement was tested and the prototype passed.
How did it work?
A Microsoft Kinect sensor was used to pick up the user's movement, transforming an ordinary room into a space capable of translating gestures into digital information. That digital information was passed through Unity (the game engine) and sent to an Arduino, which controlled the system. I programmed the system in C/C++ so that the translated gestures drove two servo motors, steering the direction of the reflected light (left, right, up, and down) in response to the user's gestures. Users took part in a simple shape-matching activity, which made the rehabilitative exercise fun and engaging; promoting play within rehabilitation.
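To illustrate the gesture-to-motor stage described above, here is a minimal C++ sketch of how a tracked hand position could be mapped onto servo angles. The function name, coordinate range, and angle range are assumptions for illustration, not the project's actual code: it assumes Unity delivers a normalised hand position in [-1, 1] on each axis, and that the two hobby servos accept angles in the usual 0–180 degree range.

```cpp
#include <algorithm>

// Hypothetical mapping stage: a normalised Kinect hand position
// (x, y in [-1, 1], with (0, 0) at the centre of the tracked area)
// is rescaled onto the 0-180 degree range of the two servos that
// steer the reflected spotlight.
struct ServoAngles {
    int pan;   // left/right servo, degrees
    int tilt;  // up/down servo, degrees
};

ServoAngles handToServo(float x, float y) {
    // Clamp to the tracked range, then linearly rescale [-1, 1] -> [0, 180].
    x = std::clamp(x, -1.0f, 1.0f);
    y = std::clamp(y, -1.0f, 1.0f);
    return ServoAngles{
        static_cast<int>((x + 1.0f) * 90.0f),
        static_cast<int>((y + 1.0f) * 90.0f)
    };
}
```

A hand held at the centre of the tracked area would centre both servos at 90 degrees, while moving the hand to an edge would drive the corresponding servo toward 0 or 180; on the Arduino side, each angle would simply be handed to a servo write call.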