In Part I, we talked about how a feature of the NavvTrack® Care Traffic Control Platform, initially used to help manage the location of IV pumps, might have the potential to monitor and address an even bigger problem: patient falls.
The Navv Systems R&D lab in Ann Arbor is packed with tools and technology. We experiment regularly with numerous WiFi and Bluetooth Low Energy (BLE) devices, Internet-of-Things (IoT) tools, mobile phones, tablets, and computer models. This lets us simulate activities that are key to understanding how our technology behaves, and what impact it can have, in a real-world healthcare environment. Several of our devices are equipped with thermal imaging cameras, which let us monitor the environment in some unique ways.
Even with all this technology in place, we knew there would be some challenges simulating a clinical situation. Like many startups these days, Navv Systems is nearly fully remote, which often means our office/lab is empty on certain days. And yet, to validate this use case, we needed someone not only physically present in the building, but also reliable, obedient, and with a flexible enough work schedule to participate in the simulation.
It took almost no time to realize the perfect helper was right by my side. We have three Great Danes: Eva, Radar, and Nora. Eva is a therapy dog at Mott Children’s Hospital in Ann Arbor, and is a genuine internet celebrity. Radar is a black mantle, the lead squirrel hunter, and “fun police” in our family. Last but not least is Nora, aka Nora the Explorer. She is a fabulous running buddy and often comes to the Navv office with me.
With our responsibilities outlined and understood, we loaded up and headed to Ann Arbor. We have a dog-friendly office, but any time a 125-pound dog trots through the streets, up the stairs, and into the office, she draws some extra attention.
Our combined goal was to demonstrate that the thermal camera, paired with AI software trained to recognize people, could detect not only “room occupancy” (i.e., that a person is present in a room), but also “seat occupancy” (that the person is in a specific spot within the room).
To do this, we created virtual seats around a conference table. The second step was to confirm the thermal detection would recognize not just Paul (the Patient) but also Nora (the Nurse). The first test worked perfectly: as the image below shows, both of us were clearly detected as room occupants.
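For readers curious what “virtual seats” might look like in practice, here is a minimal sketch of the room-versus-seat occupancy logic. It assumes the detection model returns a bounding box for each person it finds in the thermal frame; the seat names, coordinates, and helper functions are hypothetical illustrations, not the actual NavvTrack implementation.

```python
# Minimal sketch of room vs. seat occupancy (hypothetical, for illustration).
# Assumes the thermal AI returns one bounding box per detected person,
# in pixel coordinates of the camera frame.

from dataclasses import dataclass


@dataclass
class Box:
    x1: float
    y1: float
    x2: float
    y2: float

    @property
    def center(self):
        return ((self.x1 + self.x2) / 2, (self.y1 + self.y2) / 2)


# Virtual "seats": named zones drawn around the conference table.
# Coordinates here are made up for the example.
SEATS = {
    "seat_1": Box(40, 120, 110, 200),
    "seat_2": Box(130, 120, 200, 200),
    "seat_3": Box(220, 120, 290, 200),
}


def contains(zone: Box, point) -> bool:
    """True if a point (e.g. a detection's center) falls inside the zone."""
    px, py = point
    return zone.x1 <= px <= zone.x2 and zone.y1 <= py <= zone.y2


def occupancy(person_boxes: list[Box]) -> dict:
    """Room occupancy: any person detected at all.
    Seat occupancy: which seat zone each detection's center falls into."""
    seats_taken = {
        name: any(contains(zone, box.center) for box in person_boxes)
        for name, zone in SEATS.items()
    }
    return {"room_occupied": bool(person_boxes), "seats": seats_taken}


# Example: two detections -- one centered in seat_2, one outside any seat zone.
print(occupancy([Box(140, 130, 190, 195), Box(300, 50, 360, 110)]))
```

The key design point is that “seat occupancy” needs no extra hardware: it is simply the same person detection mapped against zones drawn over the camera’s view.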
As important as it is to detect people (or dogs that think they are people), it is just as important to exclude things that might normally appear in a room. In the thermal image, we can see not only people but a number of other objects giving off heat (a laptop, a cup of coffee, and my iPhone). Importantly, none of these were detected as a person.
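One way to picture that exclusion step is as a simple filter over what the model reports. The sketch below assumes the thermal AI returns a class label and confidence for everything it flags; the labels and threshold are illustrative assumptions, not the actual NavvTrack configuration.

```python
# Hypothetical sketch: warm objects (laptops, coffee cups, phones) should
# never count as occupants. Only detections classified as "person" with
# sufficient confidence are kept.

PERSON_CONFIDENCE_THRESHOLD = 0.6  # illustrative value; tuned on real thermal footage


def person_detections(raw_detections):
    """Keep only detections classified as a person above the threshold."""
    return [
        det for det in raw_detections
        if det["label"] == "person"
        and det["confidence"] >= PERSON_CONFIDENCE_THRESHOLD
    ]


# Example frame: two people plus three warm-but-inanimate objects.
frame = [
    {"label": "person", "confidence": 0.93},      # Paul (the Patient)
    {"label": "person", "confidence": 0.88},      # Nora (the Nurse)
    {"label": "laptop", "confidence": 0.97},
    {"label": "cup", "confidence": 0.71},
    {"label": "cell phone", "confidence": 0.64},
]

print(len(person_detections(frame)))  # -> 2: only the people count as occupants
```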
Finally, we needed to validate the system's ability to detect people in positions other than seated at the table, so I hopped up on the table and lay down as if I were a patient in a bed. This worked as well. Nora, always a quick learner, obediently positioned herself near the table and was detected in one of the “seats”.
Thanks to Nora, we can move on to testing in the hospital and incorporate this feature into the future of our NavvTrack® Care Traffic Control Platform, helping address some very real patient care needs.