The project was an individual assessment for my MSc's Physical Computing and Prototyping module. The brief was to design and build a novel interactive physical interface prototype that addressed a concrete problem. The prototype had to feature a microcontroller, sensing, actuation, and digital fabrication. This project, which spanned two months, addressed the problem of accidental medicine overdose.
Inspiration came when I was ill and had to take 4 medicines. While alternating between sleep and wakefulness, I couldn't remember when I had taken a dose. I wondered if enough time had passed to take another dose. I wished there was an easy way to ensure I was taking my medicine correctly.
Desk research revealed that others, especially the elderly, experience the same problem. According to the NHS, "the most common form of poisoning in the UK is from medication". In the US, many calls to poison control are from older adults who get their medication confused. When talking with students, I found that those who had experienced serious illnesses or had recovered from surgery had faced the problem. However, most did not see accidental overdose as a problem personally.
After this research, I decided to focus on the risk of accidental overdose. Specifically, my design would address ways to reduce (1) intake of extra doses due to forgetfulness and (2) intake of the wrong medicine due to confusing one medicine for another.
I drew on desk research, discussions with students, usability guidelines, and personal experience to create design principles. Specifically, I designed with situational (dark room), temporary (drowsiness from medication or illness), and permanent (low vision or reduced dexterity due to age) disabilities in mind. The physical interface should (1) make information accessible to those with impaired vision, (2) have a simple, straightforward UI, and (3) communicate meaning through multiple senses (text, colors, sounds).
Personal experience and interest in RFID led to the basic idea to tap medicine on a reader to record doses. To expand upon that idea, I used sketching to visualize potential designs. Sketching was used throughout the design process to explore placement of components and integrate new functionalities.
To map out ways a user could interact with the device, I created a user flow diagram. I shared the diagram with students to get feedback before drafting the final version. Feedback informed changes such as removing redundant steps / conditions and rewording messages for better clarity.
The final user flow began with a message inviting users to interact with the device. A user then scanned a medicine, which triggered a response indicating whether it was ok to take (based on time since last dose). After requesting and receiving confirmation from the user that a dose was taken, the device would record a timestamp for the dose.
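The timing check at the heart of this flow can be sketched in plain C++ (this is my reconstruction, not the project's actual firmware; the function name and parameters are my own):

```cpp
// Hypothetical sketch of the dose-timing rule: a new dose is allowed
// only if the recommended wait time has elapsed since the last
// recorded dose. All times are in milliseconds, matching Arduino's
// millis(); unsigned subtraction handles millis() rollover correctly.
bool doseAllowed(unsigned long now, unsigned long lastDose, unsigned long waitMs) {
    return (now - lastDose) >= waitMs;
}
```

On the device, `now` would come from `millis()`, and confirming a dose would overwrite `lastDose` with the current timestamp.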
Prototyping started with an Arduino, RFID reader + tags (to recognize medicine), and a laptop (to show messages and receive input). Next a display and buttons were added to allow users to input and receive information on the device itself. After that, lights and a buzzer were integrated to supplement the text output with audio and visual cues.
The Arduino prototype was developed in phases. In the first phase, a way to read and record dose time for each medicine was programmed using the Arduino IDE. Medicines were identified using RFID and variables for each medicine (tag ID, name, recommended wait time, time of last dose) were stored using a struct. Additional components were then added one by one.
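A minimal reconstruction of that per-medicine struct might look like the following (field names, the 4-byte tag length, and the sample values are assumptions for illustration, not the project's actual code):

```cpp
#include <cstdint>
#include <cstring>

// Hypothetical per-medicine record holding the variables described
// above: RFID tag ID, name, recommended wait time, and last dose time.
struct Medicine {
    uint8_t tagId[4];          // RFID tag UID (4 bytes is common for MIFARE tags)
    const char* name;          // name shown on the display
    unsigned long waitMs;      // recommended wait between doses, in ms
    unsigned long lastDoseMs;  // millis() timestamp of last dose (0 = none yet)
};

// Example table of known medicines (sample data only).
Medicine medicines[] = {
    {{0xA1, 0xB2, 0xC3, 0xD4}, "Paracetamol", 4UL * 60 * 60 * 1000, 0},
    {{0x11, 0x22, 0x33, 0x44}, "Ibuprofen",   6UL * 60 * 60 * 1000, 0},
};

// Match a scanned tag UID against the table; nullptr means unknown tag.
Medicine* findByTag(const uint8_t* uid) {
    for (Medicine& m : medicines) {
        if (std::memcmp(m.tagId, uid, 4) == 0) return &m;
    }
    return nullptr;
}
```

In the sketch, scanning a tag would call something like `findByTag` to identify the medicine before checking its wait time.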
The form factor was the result of several sketches, paper prototypes, 3D CAD models, and a functional foam core model. I shared these mockups with students for feedback. The final result was a compact box constructed of laser-cut acrylic.
For convenience, the prototype was tested with students using sample medicines. Testing consisted of informal observations in a university-run makerspace: I watched students interact with the device and asked follow-up questions based on what they did.
Overall, users were able to interact with the device with little guidance. However, there were a few problems. For example, users often pressed buttons too early because they didn't realize there was more information on the next screen. This made the machine seem unresponsive and distracted users from important information. To fix this, I decreased the delay time between screens.
The final prototype was presented to professors and students at the Physical Computing and Prototyping module showcase. For the showcase, a poster and video accompanied a live demonstration.