Integrating mobile into enterprise imaging and teleradiology workflows
In spring 2019, Dicom Systems collaborated with WinguMD to host a webinar about how the healthcare industry is working to make imaging workflows more welcoming to mobile devices. A representative from Nicklaus Children’s Hospital was also present to give practical feedback. The participants discussed how personal phones and other mobile devices can be woven into the existing infrastructure of enterprise imaging systems, and speculated about where the future of imaging will lead medicine.
- Florent Saint-Clair, Executive Vice President of Dicom Systems
- Manabu Tokunaga, CEO and Co-founder, WinguMD – makers of ZenSnap by WinguMD
- Dr. Nolan Altman, Pediatric Neuroradiologist and Chief, Department of Radiology, Nicklaus Children’s Hospital
Optimizing Personal Devices for Medicine
Florent Saint-Clair: So the context for the conversation today is a well-admired hospital in Miami. Nicklaus Children’s Hospital has been highly innovative and creative in the care they put into using technology to advance patients’ health, especially children’s. Some of the most interesting innovations we’ve done as a company, at Dicom Systems, have involved Nicklaus Children’s, because they push the envelope pretty routinely. We were fortunate to be introduced to Nicklaus Children’s by our partners at Cerner, a company that is completely committed to the world of interoperability. Their commitment to interoperability is what’s allowed us to work wonders in clinical contexts that are typically closed off to other vendors. This has been thanks to the Chief Nursing Information Officer, Elise Hermes, and of course, the leadership and stewardship of Dr. Altman in the adoption of this platform to advance some of the clinical workflows we’re gonna be looking at today.
Our typical customers at Dicom Systems are PACS/MIMPS administrators, network administrators, directors of IT, CIOs, and people concerned with the enterprise infrastructure that serves up images for physicians. Over the past 11 years, Dicom Systems has routinely run about 9 billion images through our systems annually. Some of our largest installations do as many as 5,000 daily commits of exams to our archives. We also power about 12 million annual Telerad reads, not counting relevant priors on top of that number, and we’ve increasingly gotten involved in workflows that involve machine learning (ML). Some of the largest projects we’ve done had as many as 5.5 million exams that needed to be de-identified before they could be fed into a machine learning algorithm to learn new things.* Using the Unifier platform as a kind of universal adapter has allowed us to essentially plug in the ZenSnap app that we will be spending our time talking about today, along with Dr. Altman and Manabu Tokunaga. This interoperability layer plugs into Cerner and into virtually any enterprise imaging node that needs to talk to other nodes, whether over HL7, FHIR, or DICOM. The interoperability layer is the Dicom Systems Unifier platform.
* As of 2021, over 30 billion medical images pass through the Dicom Systems Unifier platform annually.
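De-identification at the scale mentioned above is done with dedicated tooling against the full DICOM confidentiality profiles, but the core idea can be sketched briefly. In this illustrative Python sketch, a plain dict stands in for a DICOM header, the attribute list is a tiny assumed subset of the real profile, and the salt value is a placeholder:

```python
# Minimal sketch of DICOM header de-identification for an ML pipeline.
# Real deployments follow DICOM PS3.15 Annex E confidentiality profiles;
# here a plain dict stands in for the header, and PHI_TAGS is only an
# illustrative subset of the attributes a real profile covers.
import hashlib

PHI_TAGS = {"PatientName", "PatientBirthDate", "PatientAddress"}

def deidentify(header, site_salt="demo-salt"):
    out = dict(header)
    for tag in PHI_TAGS:
        out.pop(tag, None)  # drop direct identifiers outright
    if "PatientID" in out:
        # Replace the MRN with a salted one-way hash so exams from the
        # same patient still group together without exposing the MRN.
        digest = hashlib.sha256((site_salt + out["PatientID"]).encode()).hexdigest()
        out["PatientID"] = digest[:16]
    return out

header = {
    "PatientName": "DOE^JANE",
    "PatientID": "MRN0012345",
    "PatientBirthDate": "20150102",
    "Modality": "MR",
    "StudyDescription": "BRAIN W/O CONTRAST",
}
clean = deidentify(header)
print(clean["Modality"], "PatientName" in clean)
```

The salted hash is one common design choice: it keeps longitudinal linkage within a project while breaking the link back to the live MRN.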
One of the most important things to consider in this conversation is the role of mobile devices in today’s hospital context. All of us, physicians included, are using smartphones for virtually everything, personal and professional. Physicians are extremely resourceful individuals, and it doesn’t matter whether a hospital has sanctioned a method of communication or not; they’re gonna do whatever is best for their patients, even if that includes using a mobile device that doesn’t belong to the hospital. It happens every day, and it is one of the key problems that we solve with the Unifier platform. These are some of the issues that plague not just enterprise imaging, but every department that uses imaging one way or another. The number one issue is unauthorized and unsecured personal mobile devices. That means iPads, iPhones, Android devices. These are devices that typically belong to the individuals using them. And in some cases, believe it or not, even with all the HIPAA education that goes on every day, a lot of people still use email or text, or other unsanctioned methods, to share images with one another. Another key issue is that those same mobile devices have zero enterprise integration. So the EHR functionality and the PACS/MIMPS functionality are siloed away from your mobile devices.
Imaging with Mobile Devices
How do we make this a seamless integration, where everybody wants to use mobile devices, but it’s not a controlled environment that they’re doing it in? One of the key layers is the existing infrastructure. Most hospitals have Lightweight Directory Access Protocol (LDAP) or Active Directory, they have a PACS/MIMPS, and they have an EHR. In this case, Cerner provides both the EHR and the PACS/MIMPS at Nicklaus Children’s. If you are going to use mobile device imaging, you don’t want to have to completely reinvent the infrastructure. The hospital has already invested capital in the acquisition and customization of the EHR and the PACS/MIMPS, so to accommodate mobile devices, we have to be unobtrusive and benign in our impact on the existing infrastructure.

The ability to use DICOM Modality Worklist has typically not been available to mobile devices, because an iPhone or an iPad is not a typical modality that queries a worklist from a RIS or a PACS/MIMPS. We need to reduce human error in data entry on a mobile device by giving the device access to a worklist. Another key aspect of this platform is that you can download the app from iTunes today for free. It leverages an app style that is very natural for people to learn, so there’s virtually no end-user adoption curve. We started in radiology, but adoption is now pervasive throughout the hospital; any other department that could be using this now wants to, because it’s such a natural toolset to use, with no training. This is also sanctioned by the hospital, so the CIO doesn’t have to think twice about it. The fleet of devices now belongs to the hospital: the iPads were purchased by the hospital and deployed to the physicians and technologists to do their jobs with. Security and compliance are taken care of because these are the tools being given to the physicians and caregivers by the hospital.
You can actually view any type of modality that is now available through the integration with the PACS/MIMPS. By leveraging DICOMweb as a standard, WADO-RS, QIDO-RS, STOW-RS, all of those standards that are deployed through the Dicom Systems Unifier are available through this user interface. I’d like to give the mic to Manabu to speak to this a little bit.
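The three DICOMweb services named above map to simple HTTP request shapes: QIDO-RS for search, WADO-RS for retrieval, STOW-RS for storage. As a sketch, the following builds the request URLs with the Python standard library; the base URL is a hypothetical placeholder, and real deployments add authentication and content negotiation on top:

```python
# Sketch of the DICOMweb request shapes: QIDO-RS (search), WADO-RS
# (retrieve), STOW-RS (store). The base URL below is a hypothetical
# placeholder; real gateways also require auth headers and the right
# Accept/Content-Type values.
from urllib.parse import urlencode

BASE = "https://pacs.example.org/dicomweb"   # hypothetical endpoint

def qido_studies_url(patient_id, modality=None):
    """QIDO-RS: search for studies matching the given attributes."""
    params = {"PatientID": patient_id}
    if modality:
        params["ModalitiesInStudy"] = modality
    return "%s/studies?%s" % (BASE, urlencode(params))

def wado_study_url(study_uid):
    """WADO-RS: retrieve all instances of one study by its UID."""
    return "%s/studies/%s" % (BASE, study_uid)

def stow_url():
    """STOW-RS: POST multipart/related DICOM parts here to store them."""
    return "%s/studies" % BASE

print(qido_studies_url("MRN0012345", "US"))
```

Because all three services are plain HTTP, a mobile viewer can consume them with the same networking stack it uses for any other web API, which is why DICOMweb is a natural fit for phones and tablets.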
Manabu Tokunaga: Thank you very much, Florent. I have been in this industry for a long time. Throughout all that time, there are two key problems I was always asked to solve: how I can save time, and how I can accurately communicate what I’ve seen to other people. Now mobile communication has arrived, which is really perfect for addressing that ‘save time and accuracy’ part, and the security part is coming. So that’s how I’ve been making an effort to develop this into a completely cohesive package that can be used in clinical situations. Down the line, we’re also starting to add the patient engagement part, which would make it very easy for the patient, or the clinical side, to provide information, because pretty much everyone has an Android phone or an iPhone. So this is a perfect platform to address all three issues. What people do in a clinical setting right now is come in in the morning, have a conference, a round conference, and talk about the cases; what they want is for that communication to continue throughout the day. And that’s the core of this application.
Department Collaboration Through Imaging
Florent Saint-Clair: This is really the crux of our conversation. When we started discussing this project with Nicklaus Children’s, I asked, why would radiology need photography in the practice of diagnostics? Radiology uses ultrasound, MRI, and nuclear medicine; it uses internal medical imaging to articulate the diagnosis. Why do you need photography? You have a radiology workstation, you have a PACS/MIMPS, you have medical images open on the desktop, and you’re projecting images on the wall, and side by side you now have all of these physicians collaborating on problematic cases, able to see not just the internal images they need to discuss next steps in the patient’s care, but also the outward manifestation of symptoms. Taking a picture of the patient, taking a picture of the skin, in addition to showing the medical images, provides a very effective, holistic approach to collaborating in that room. What was really interesting for us was to witness how our technology was being deployed in a very innovative way in a clinical conference. At any tumor board, any clinical conference, people get together for an hour or 45 minutes in the morning, and the collaboration is typically very effective because they’re all in the same room, in the same context, looking at the same information. And then everybody gets up at 8:00 a.m. They go to do their jobs on different floors, they attend to their patients, and the collaboration that was so good for an hour gets stopped dead in its tracks at 8:00 a.m. Now, using this platform, the collaboration continues throughout the day. It’s almost like you have a Twitter feed on each patient case that was discussed during the clinical conference. Dr. Altman, this would be an interesting point for you to bring to the audience.
Dr. Nolan Altman: Thank you very much, Florent. I’m glad to be part of this program, because having you under the hood of our DICOM infrastructure, with your intricate rules engine, allows us to do many things that Cerner wouldn’t quite allow us to do. And this new endeavor with Manabu lets us do properly things we have been doing for some time, alongside our daily 7:00 a.m. meetings with the residents and the attendings, which we hold across the different disciplines in our department to help everyone stay on the same page. I think all of us realize that our communicator device is our cell phone, and that’s where everything really resides. I’ve been using my personal cell phone to communicate about a lot of different things regarding imaging. From residents sending iPhone pictures off the PACS/MIMPS workstation to me at home, to other attendings within the hospital sending me pictures of cases they may see at other hospitals, everything ends up residing in the iPhone. You have allowed us to do this the proper way: off of our DICOM-integrated PACS/MIMPS worklist, the images all reside in the patient’s worklist, and we can see them quickly and share them with our colleagues. And that’s what I want to show. For quite some time, we’ve been documenting the patient’s skin over the areas of concern, and I’ll show some cases, anywhere from along the patient’s spine, to see if there are underlying neurocutaneous disorders, to patients having interventional procedures. Once the technologists can see the lesion, not only at the hospital but at our 12 satellite centers where ultrasound exams are being done, we get not only the ultrasound images, and at some of our satellites MRI images, but also the clinical pictures, and we can put them all together as another pulse sequence, say, in the case of MR images. We just drop it all into the PACS/MIMPS system and can see all of the images of the patient at one time.
Mobile Devices vs. Specialist Medical Imaging Equipment
Florent Saint-Clair: So Dr. Altman, there’s actually an important distinction in the way that images get created between a mobile device versus an ultrasound. Ultrasound, CT, MR, nuc med, all of those devices typically use something called a DICOM Modality Worklist in order to know what’s next, who the next patient is, and to get the proper demographics from the EHR automatically populated in the DICOM headers. It’s a little bit different when you’re dealing with a mobile device. Historically, when you were using a mobile device, you didn’t really have a place to put metadata into the images, and so you would have orphan images that had to be manually handled, Dicom-ized, and then placed into the PACS/MIMPS, and so on. That also means you have a different kind of workflow that doesn’t involve the worklist, and that’s called an encounter-based imaging event. Encounter-based imaging is a totally different sport, but it can utilize the same underlying infrastructure to provide a worklist in order to reduce human error in data entry. On an iPhone, it would be very easy to introduce typos into the patient’s name or other fields. By providing a worklist to an iPad or an iPhone, as if it were any other conventional imaging equipment in the hospital, you dramatically reduce the time it takes not just to begin an encounter, but also to make it ingestible by the PACS/MIMPS and available to the rest of the enterprise through the PACS/MIMPS viewer. So those two different types of workflows are interesting to discuss. I’ll let you talk about that.
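The typo-reduction argument can be made concrete: when the app starts a capture from a worklist entry, every identifier is copied from the scheduled entry rather than typed. Here is a minimal sketch under assumed names; the `WorklistEntry` fields mirror common Modality Worklist attributes, and the capture record is a simplified stand-in for the DICOM header the app would write:

```python
# Sketch of worklist-driven capture: demographics come from the scheduled
# worklist entry, never from the keyboard, so there is nothing to mistype.
# WorklistEntry mirrors common DICOM Modality Worklist attributes; the
# returned dict is a simplified stand-in for the image's DICOM header.
from dataclasses import dataclass

@dataclass
class WorklistEntry:                 # as returned by a worklist query
    patient_name: str
    patient_id: str
    accession_number: str
    scheduled_procedure: str

def start_capture(entry, device_id):
    """Begin an encounter: copy demographics verbatim from the worklist."""
    return {
        "PatientName": entry.patient_name,        # no hand-typed name
        "PatientID": entry.patient_id,            # no hand-typed MRN
        "AccessionNumber": entry.accession_number, # links back to the order
        "StudyDescription": entry.scheduled_procedure,
        "StationName": device_id,
    }

entry = WorklistEntry("DOE^JANE", "MRN0012345", "ACC-789", "US SOFT TISSUE")
meta = start_capture(entry, "IPAD-RAD-07")
print(meta["PatientID"])
```

Because the accession number rides along from the start, the resulting photos are immediately matchable to the order, which is exactly what makes them ingestible by the PACS/MIMPS without manual reconciliation.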
Manabu Tokunaga: You’ll be hearing about the IHE encounter-based workflow, and Nicklaus Children’s is one of the earlier places that is going to be able to do this, mainly because of the flexibility of the Dicom Systems Unifier. To a ZenSnap user, it just looks like a DICOM Modality Worklist. The Dicom Systems Unifier platform will get all these encounter events, like admission, discharge, or transfer (ADT) events, as well as the radiology events, and will make it work the same way: you look for the patient and take the photo, and we’ll route it to the right place, either into your Podcharts folder or into the DICOM series. So this is going to be a very, very flexible solution, and it’s not just mobile photo capture. We do a lot of this kind of work, and then we integrate it with other Artificial Intelligence (AI) capabilities behind the scenes, to make the context available much more easily and sooner.
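The ADT events mentioned here arrive as HL7 v2 messages, and the patient context the routing layer needs lives in the PID segment. The following is only a toy parse of a simplified A04 message; real feeds need a proper HL7 library that handles escaping, repeating fields, and component separators:

```python
# Toy parse of an HL7 v2 ADT message, showing where patient context lives.
# A real integration engine handles escaping, repeats, and components;
# this sketch only splits segments (carriage returns) and fields (pipes)
# of a simplified, hand-written A04 message.
SAMPLE_ADT = "\r".join([
    "MSH|^~\\&|REG|HOSP|IMG|HOSP|202304011200||ADT^A04|MSG0001|P|2.5",
    "PID|1||MRN0012345||DOE^JANE||20150102|F",
])

def patient_context(message):
    for segment in message.split("\r"):
        fields = segment.split("|")
        if fields[0] == "PID":
            return {
                "mrn": fields[3],                      # PID-3 patient ID
                "name": fields[5].replace("^", " "),   # PID-5 patient name
            }
    raise ValueError("no PID segment found")

ctx = patient_context(SAMPLE_ADT)
print(ctx["mrn"])
```

Once the engine has this context, routing a photo to the right chart is a lookup against the MRN rather than a human decision, which is what lets ADT-driven and order-driven captures land in the same places.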
Florent Saint-Clair: You can take a picture of a document and Optical Character Recognition (OCR) will be able to make that document searchable in your mobile device.
Manabu Tokunaga: Yeah. Not only searchable; we can match it with the Medical Record Number (MRN), and we can locate the MRN, accession number, order number, that sort of thing.
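Matching OCR’d text to an MRN or accession number typically comes down to pattern extraction plus a lookup. As a sketch, the patterns below (an “MRN”-prefixed number and an “ACC”-prefixed accession) are illustrative assumptions; real identifier formats are site-specific:

```python
# Sketch of pulling identifiers out of OCR'd chart text. The two patterns
# (an "MRN"-prefixed number, an "ACC"-prefixed accession) are illustrative
# assumptions; actual MRN and accession formats vary by site.
import re

MRN_RE = re.compile(r"\bMRN[:\s]*([0-9]{6,10})\b")
ACC_RE = re.compile(r"\bACC[:\s#]*([A-Z0-9-]{4,})\b")

def extract_ids(ocr_text):
    mrn = MRN_RE.search(ocr_text)
    acc = ACC_RE.search(ocr_text)
    return {
        "mrn": mrn.group(1) if mrn else None,       # candidate MRN
        "accession": acc.group(1) if acc else None,  # candidate accession
    }

text = "Patient: DOE, JANE  MRN: 0012345  ACC# US-2019-0042"
print(extract_ids(text))
```

In practice, each candidate would then be validated against the worklist or patient index before any image is filed, so an OCR misread cannot route a photo to the wrong chart.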
Florent Saint-Clair: Nicklaus Children’s still owns all those Nikon cameras that they use occasionally. I think you can buy a $40 Bluetooth adapter for a Nikon camera, actually connect it to the iPad, take pictures, and still make them available through the same mechanism.
Manabu Tokunaga: Not only that: you can add an otoscope, you can do fundoscopy, even a microscope can be added to your iPhone with just an attachment, and you would be able to do a much wider range of photo acquisitions, all backed by the modality worklist, all ending up in the right place, the right patient, the right chart.
Dr. Nolan Altman: As the end user, I think it’s very important that we have it all organized, but we just wanna see it quickly and accurately, and that’s what you’ve allowed us to do. While we’ve had many devices that we can use, I think the common worklist and the application of your rules engine are what’s really pushing this product forward.
Florent Saint-Clair: We’ll leave the floor now to the physician in the room. Dr. Altman, these are the cases that you leverage this technology for.
Dr. Nolan Altman: This is a little girl who was seen in an outside imaging center, and she has these different spots on her scalp. The technologist knows that if any patient has a lump or a bump, they need to take a picture of it for us, so we know what we’re looking at. This little girl has these little spots on her scalp, and they’re red, and they’re raised, and we can see that, quickly and cleanly, with the picture that the technologist at the satellite center has taken of this little girl who came in, and she sent it along with the ultrasound images. On the ultrasound that was done, the left side is a gray-scale ultrasound, and the right side is a color Doppler study of the same area. You can see there’s a lot of color in here, and as radiologists, we know that this indicates a vascular malformation. There are different types of vascular malformations that you can see: venous malformations, hemangiomas. But with the picture of the child, we know for sure that this is a hemangioma, and there’s nothing to do about these things. They’ll go away, and we can tell the mom and dad before they leave the imaging area, “This is what it is. You don’t have to worry about it.”
Hemangioma lesion on infant’s head. Source: ZenSnap by WinguMD
Color Doppler study of an infantile hemangioma. Source: ZenSnap by WinguMD
Another patient comes in with a little lump, and we have the technologist do the ultrasound and send us the picture. When we see the clinical image and compare it with the color ultrasound image, which may look much the same as that of a hemangioma, we know in an instant that this isn’t one. This is an infection. It’s an abscess. It needs to be drained. And we then went on, in Interventional Radiology (IR), to drain the abscess on this baby.
Web-based Image Processing
Manabu Tokunaga: The next couple of phases that we are going to look at: one is, of course, the web-based viewer, which is going to integrate the Dicom Systems web-based radiology directory into the app. Since we already have the context of which patient you’re looking at, it will be easy to show all of the relevant radiology images. This also has an additional benefit: sometimes you may have PACS/MIMPS downtime, or for whatever reason you cannot access the PACS/MIMPS, in which case you could use this as a backup mechanism to look at the images as they’re acquired. We are going to show the images as they hit the Dicom Systems server, which is the first stage. Of course, we pull the priors, too, for the given patient. Annotation on screen you have already seen: you can draw arrows, circles, and so on, which of course you are familiar with. We also have a patented size-measurement algorithm, so with the aid of a specifically coded ruler, you’ll be able to actually measure the size of the object. I know that for detailed wound care there is more sophisticated 3D-type equipment, but essentially you can turn your iPhone or iPad into a viable size-measuring device at the price point you pay for them. So that’s really great. Then, Facial AI: when you walk into the doctor’s office, many doctors can tell right away what’s wrong with the patient, and we’re going to bring that capability to AI. By taking the photo, it can tell what the patient might potentially have, to aid in triaging what we are going to be diagnosing. And then we have the Alexa of AI technology. Using our chat mechanism, you can just talk to the engine, and the engine can answer questions like “What’s the blood cell count?”, “Is the lab complete?”, “Is infection indicated?” You can just ask, rather than going into the EHR and trying to find the answers. It’s voice-based, natural-language, smart processing.
That’s another exciting thing that Nicklaus Children’s is getting.
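The patented measurement algorithm itself is not public, but the underlying idea of calibrating against a ruler of known physical length can be sketched as a simple proportion. The numbers below are illustrative assumptions, not values from the product:

```python
# The calibration idea behind ruler-based size measurement, reduced to
# its core: a ruler of known physical length in the frame establishes a
# pixels-to-millimeters scale, which then sizes anything else in the same
# plane. (The actual patented algorithm is not public; this is only the
# basic proportion, with made-up example numbers.)
def measure_mm(object_px, ruler_px, ruler_mm=100.0):
    """Convert an on-screen pixel span to millimeters via the ruler."""
    mm_per_px = ruler_mm / ruler_px   # calibration from the known ruler
    return object_px * mm_per_px

# A 10 cm ruler spanning 800 px gives 0.125 mm/px, so a lesion spanning
# 240 px measures 30 mm.
print(round(measure_mm(240, 800, 100.0), 1))
```

A coded (machine-readable) ruler improves on this by letting the software find the ruler and its endpoints automatically, and a real implementation must also correct for perspective when the ruler and the lesion are not in the same plane.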
Florent Saint-Clair: Interesting use cases. We did get a couple of questions, by the way; we’ll go through them in a minute. But one interesting project that you completed at Nicklaus Children’s involved the many historic pictures taken over the years by various physicians, including in plastic and reconstructive surgery. The ability to Dicom-ize those, process and categorize them, and place them in a metadata-rich environment is also important, right? Going forward you can use the new app, but there’s also a lot of historic imagery that can be handled through this platform.
Manabu Tokunaga: Correct. Imagine that your parents one day give you a box full of loose photos and say, “This is for you. Do whatever you want with it.” Now you want to put them in an album and organize them. Our plastic surgeon at Nicklaus Children’s gave us the equivalent of that, and my task was to put them in the right place. Because they’re digital images, we can tell which clinic they came from. They took a picture of the chart, so I can use OCR and fish out the information. We can get about 90% accuracy in sorting the images: in just a matter of half an hour, we were able to sort about 2,000 images. If anyone in the audience has a bunch of loose photos that have to be handled as part of a migration, do talk to us. We have the technology to help you with that.
Listen to the remainder of the conversation by viewing the full webinar recording.
For more information about Unifier, offered in partnership with ZenSnap by WinguMD, request a demo.
Unifier with AI Conductor for PACS and EHR drives and conducts AI workflows to get the right information to the right location at the right time and in the right format.