Migraine researcher Dr. Alexandre DaSilva may not be able to feel your pain, but thanks to virtual reality, he can see it.
Or more accurately, “visualize” it, since the pain specialist and faculty member at the University of Michigan School of Dentistry and Center for Human Growth and Development wears a Microsoft HoloLens to look at a hologram of brain activity, not the organ.
So far, he’s used recorded activity of patients’ brains in a clinical environment, but soon he’ll be able to use a real-time application on patients wearing sensors on their heads. When those sensors send information wirelessly to the HoloLens computer, it will show DaSilva color-coded areas that indicate whether pain centers in the patients’ brains are activated.
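For readers curious about the mechanics, that color coding amounts to mapping each sensor’s activation level onto a color scale. Here is a minimal Python sketch of the idea; the sensor names, value ranges, and pain threshold are illustrative assumptions, not details of DaSilva’s actual system.

    # Sketch: map normalized brain-sensor readings to display colors.
    # Sensor names, ranges, and the 0.7 threshold are illustrative only.

    def activation_to_color(value):
        """Blend from blue (low activation) to red (high activation)."""
        t = max(0.0, min(1.0, value))                  # clamp to [0, 1]
        return (int(255 * t), 0, int(255 * (1 - t)))   # (R, G, B)

    # Hypothetical wireless readings from sensors on a patient's head.
    readings = {"prefrontal_left": 0.82, "prefrontal_right": 0.35, "motor": 0.12}

    for region, value in readings.items():
        label = "possible pain activation" if value > 0.7 else "baseline"
        print(f"{region:17s} -> RGB{activation_to_color(value)}  ({label})")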
DaSilva and others in medicine use digital platforms — like the Microsoft HoloLens’ augmented reality, which superimposes holograms over the real world, and the Oculus Rift, which simulates a world other than the real one — in ways never imagined by their creators.
The devices are normally used for playing games like Minecraft. The world of medicine is instead using the technologies to treat PTSD and addictions; teach anatomy and educate patients about their conditions; practice robotic surgery skills; plan corrections for skull defects; assess balance as a hedge against falling; rehabilitate disabilities from stroke and other brain injuries; and — in DaSilva’s case — analyze pain.
“We can better understand, in an objective way, when the patient is really suffering, because sometimes they cannot express the pain,” says DaSilva, citing people left speechless after a brain injury as an example. “And the idea is even to decide what the stage of their pain is and block it or intervene with brain stimulation.”
DaSilva knows a lot about such stimulation: His earlier research showed that electrical current applied to certain areas of the brain helps prevent or alleviate pain, including migraine, temporomandibular disorders (TMD), and cancer pain.
To visualize pain in real time, DaSilva combines his neurotechnology research with virtual reality visualization techniques used by specialist Ted Hall and the team of artists and computer graphics experts at the University of Michigan’s 3D Lab.
Hall and his colleagues can use software to modify scans from imaging such as X-rays or PET: they can make what’s transparent appear opaque and vice versa, change the colors, add skin, or cut the scans into parts that can be reassembled and rotated 360 degrees.
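Under the hood, tricks like that are typically done with a “transfer function” that decides which scan intensities render as opaque and which fade out. A minimal sketch follows, with made-up intensity ranges rather than the 3D Lab’s actual settings.

    # Sketch: a simple transfer function for volume rendering.
    # Intensity ranges are made up for illustration.

    def opacity(intensity, visible=(0.6, 1.0)):
        """Fully opaque inside the visible intensity range, transparent outside."""
        lo, hi = visible
        return 1.0 if lo <= intensity <= hi else 0.0

    voxels = [0.2, 0.5, 0.7, 0.9]   # hypothetical normalized scan intensities

    # Swapping the visible range makes what was transparent appear opaque,
    # and vice versa.
    dense_view = [opacity(v, (0.6, 1.0)) for v in voxels]  # e.g., bone
    soft_view = [opacity(v, (0.0, 0.6)) for v in voxels]   # e.g., soft tissue
    print(dense_view, soft_view)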
“We can view and slice through the model in virtual reality using the MIDEN or the Oculus Rift,” says Hall, referring to the Michigan Immersive Digital Experience Nexus in the 3D Lab, which convinces users they’re in a computer-generated, virtual 3-D world, and to Facebook’s Rift, sort of a digital View-Master.
U-M’s 3D Lab has made Ann Arbor a hot spot in Michigan for augmented and virtual reality in medicine, but others around the state are using it, too.
Tricking the Brain
Three years ago, at a Wayne State University computer science class, physical therapist Dr. Sujay Galen pitched his idea for an app to assess people’s risk of falling due to poor balance and found a willing programmer in then-senior Alex Pavlov. The resulting app is based on accepted medical assessments for balance.
Pavlov wrote the app for the Microsoft Kinect, which is normally used with Xbox game consoles. But Kinect can also be used with Windows PCs for user-written apps like the one Galen had in mind.
The Kinect has a camera, depth sensor, and microphone that detect a person’s body, voice, and the space they’re located in. Then, it typically sends that data to the Xbox. But in Galen’s case, it goes to a computer.
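As a rough illustration of how balance might be scored from that data, the sketch below treats the Kinect’s skeletal output as a stream of hand positions and measures forward reach, a staple of clinical balance testing. The data format and the cutoff are assumptions for illustration, not taken from Galen and Pavlov’s app.

    # Sketch: score forward reach from skeletal-tracking frames.
    # Input format and the 15 cm cutoff are illustrative assumptions.

    def forward_reach_cm(hand_positions):
        """Maximum forward travel (z-axis, meters) from the starting pose, in cm."""
        start_z = hand_positions[0][2]
        return max(z - start_z for (_, _, z) in hand_positions) * 100

    # Hypothetical (x, y, z) hand positions captured as a patient reaches forward.
    frames = [(0.10, 1.20, 0.40), (0.10, 1.19, 0.48), (0.11, 1.18, 0.58),
              (0.11, 1.17, 0.62), (0.10, 1.18, 0.55)]

    reach = forward_reach_cm(frames)
    print(f"Forward reach: {reach:.1f} cm ->",
          "flag for clinical review" if reach < 15 else "within expected range")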
After Galen, now interim director of WSU’s Physical Therapy Program, published a journal article about the validity of using the Kinect and Pavlov’s app — dubbed the Interactive Functional Reach Test — to assess balance, he went to a rehabilitation conference.
By chance, he sat next to physical therapist Dr. Mary Roberts and her colleagues from Michigan Medicine, formerly the University of Michigan Health System.
Now Roberts and her team at Michigan Medicine’s Canton health center are planning a study to help people who are weak on one side after a stroke. They’ll use Galen’s app with the Kinect to assess participants’ balance, then try to trick the brains of people in the study into believing their weak arm or leg is getting stronger and can move normally.
They’ll do this with an app that’s being developed by U-M’s 3D Lab to use with the Kinect. It will analyze movements made by the participant’s healthy side. Through digital technology, participants will view their good limb and its mirror image so it looks as though their weak arm or leg is moving normally.
“We hypothesize that by viewing normal movement of your weaker limb, the ‘mirror neuron’ network in the brain will become activated and will ultimately improve the function of your weaker side,” Roberts says.
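Geometrically, the trick is simple: reflect the healthy limb’s tracked joints across the body’s midline so they appear where the weak limb would be. A minimal sketch of the idea follows; the joint names and coordinates are assumptions for illustration, not the 3D Lab’s code.

    # Sketch: mirror the healthy right arm's joints across the body midline
    # so the display shows an apparently normal left-arm movement.
    # Joint names and coordinates are illustrative assumptions.

    def mirror(joint, midline_x=0.0):
        """Reflect an (x, y, z) position across the vertical plane x = midline_x."""
        x, y, z = joint
        return (2 * midline_x - x, y, z)

    right_arm = {"shoulder": (0.20, 1.40, 0.0),
                 "elbow":    (0.25, 1.15, 0.1),
                 "hand":     (0.30, 0.95, 0.3)}

    # What the participant sees rendered on the weak left side:
    for name, pos in right_arm.items():
        print(f"left {name}: {mirror(pos)}")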
Her study also has a twist not typically found in medical research — it’s fun.
She explains that motivating some people with neurological impairment after a stroke can be difficult. A fun, interactive game-like therapy that gets participants to challenge themselves can help. She compares the planned use of the Kinect app to the games played with the Wii video game console — which include simulated activities such as tightrope walking and boxing — that have been used in physical therapy. Like Wii games, her study’s app will give the player immediate visual feedback on a screen.
“It kind of sparks their motivation to beat the clock or beat their score,” Roberts says. “I think it’s just human nature to be competitive.”
Thinking Outside the Lab
Both Galen and Roberts think their apps may have potential for home use. Galen estimates the current cost for the hardware at $300. The cost would be even less if the Kinect is used with a computer-equipped smart TV.
Galen’s vision is to have people use his app in their own homes and have their results uploaded to the cloud, where a medical team can assess a patient’s balance — daily, if necessary — and their potential for falling.
“We can intervene before the fall happens,” he says.
That would be great news for Americans, especially those 65 and older. More than one in four people in this age group fall each year, according to the Centers for Disease Control and Prevention. Of those who do fall, 20 percent break a bone, such as a hip, or sustain a head injury, at an annual nationwide medical cost of $31 billion. Worse, falling once doubles the chance of falling again.
Roberts also sees possibilities for home use by stroke patients like the ones she hopes to enroll in her study.
“Ultimately, it would be great if we could make an app for home,” she says. “If it’s fun and something these people like and is engaging, why not continue it at home?”
Dr. Rajiv Ranganathan, an assistant professor of kinesiology and mechanical engineering at Michigan State University, is another researcher studying the use of virtual reality for people with problems after a stroke. He, too, likes the idea of having therapeutic apps for home use.
“If we can increase the dose [i.e., time spent practicing with the app], hopefully that should result in better outcomes,” he says.
Ranganathan’s research differs from Roberts’ in that his participants practice only with their arms. He is planning a study that will try to trick participants’ brains into believing their efforts are more, or less, effective than they actually are. By seeing their efforts as more effective, some people may become more confident and motivated to move more, he says.
Not everyone responds the same way, though. Some people are motivated when their movements appear to be less effective than they are.
What Ranganathan wants to avoid, he says, is people who decide that “I can reach for my cup of coffee and that’s enough” when they are likely capable of more. To him, using VR in rehab is “sort of like having a coach to push you.”
Beating Your Best Score
Others are using virtual reality to take on different health care challenges.
Theresa Ricketts, a physical therapist at the Beaumont Health Center in Royal Oak, uses it with the Lokomat robotic-assisted walking therapy equipment for patients like Matt Fitzsimons of Richmond. The Lokomat has a built-in computer and video monitor that play games controlled by a user’s movements.
Fitzsimons, 19, who can walk if assisted but has physical and communication impairments from a closed head injury, visits Ricketts three times a week for a 45-minute session on the Lokomat’s treadmill to strengthen his core and lower body muscles and to improve his endurance. But he’s doing more than walking.
In one game, the user’s hips and knees control an avatar on the monitor that competes with a second virtual character to pick up coins. In other games, the user controls the height of a flying avatar or a walker’s position on a road. The Lokomat records the user’s time and distance along with a score. It also displays the user’s personal best score so they can try to beat it.
“The great thing about the games is they’re powered by his muscles,” says Ricketts, adding, “The computer measures the force and the amount of motion the patient puts in, which helps drive the motion of the avatar.”
Ricketts says the avatars push her patients to do better because the game aspect motivates them.
Paula Fitzsimons, Matt’s mother, agrees: “Matt loves doing that because he knows that he has to beat his score.”
Practice Makes Perfect
Also at Beaumont, surgeons and residents training for specialties like general surgery, urology, and obstetrics/gynecology practice robotic surgery skills in virtual reality before operating on people. To do this, they use the MIMIC trainer, which simulates the technical skills needed for the da Vinci Surgical System and scores a user’s performance on them.
Without the VR-based training, the robotic surgery “wouldn’t be safe,” says Dr. Kathryn Ziegler, a surgeon and director of Beaumont’s Applebaum Simulation Learning Institute.
The technical skills practiced with the MIMIC include economy of motion, depth perception, eye-hand-foot control, needle handling, knot tying, and more, says Diane Schuch, director of operations at the institute, which is in Royal Oak. The stereoscopic vision and thumb controls that have a scissors-like action are identical to those on the da Vinci, she says.
Unlike the traditional approach — practicing on a cadaver or a body part from a euthanized animal — the MIMIC automatically assesses a user’s skills and scores them. The hospital is also working on making the results more accessible to supervisors.
In addition to timing users and recording errors such as collisions between instruments or the use of excessive force, the MIMIC surfaces other telltale signs that a user might be having trouble.
“If someone has repeated an exercise over and over again, I’ll be able to see that,” Ziegler says.
While Ziegler focuses on training surgeons, Dr. Hera Kim-Berman uses virtual reality to train dentists at U-M, and she’s studying just how effective it is.
Kim-Berman collected patients’ cone-beam CT (CBCT) skull images showing orthognathic problems and gave them to Hall and the team at U-M’s 3D Lab to create a VR training program. She’s doing a study to figure out whether the VR strategy — which has a user wear an Oculus Rift headset, hold a touch controller, rotate a 3-D image of the skull, take it apart, and then reassemble it — is a better way for dental students to learn the specific anatomy relevant to their specialty.
Right now, students trace a 2-D skull X-ray on paper, then cut it into parts that can be moved around on a table, which is how most students still learn, Kim-Berman says.
Dr. Scott Sakowitz, a dentist who’s training in orthodontics at U-M and helping Kim-Berman with her research, likes the VR approach.
“It definitely allows me to see what’s going on with a patient a lot better. I found I was able to detect more subtle problems than when I was just looking at the X-ray,” he says.
While U-M dental students focus exclusively on images of the skull, health sciences students at Macomb Community College in Clinton Township learn anatomy with the Anatomage virtual dissection table.
Dr. Diane Roose, associate dean for health sciences at the college, demonstrates by calling up the image of a 33-year-old male on the table’s horizontal, touch-enabled screen, which measures about 7 feet by 2 feet. Through a series of finger taps and slides on the screen, she uses the machine’s powerful computer to cut the body in half vertically, make one half disappear, and then rotate the remaining portion to reveal the organs, among many other functions.
The table comes with a female image, too, and images that demonstrate injuries, diseases, and anomalies such as conjoined twins.
Users can also load specific patient data, making the table a useful tool for patient education.
Roose got grant funding to buy the Anatomage table after a conference speaker reported good results using it with students who had previously flunked an anatomy class. Macomb has employed it since the fall for kinesiology, anatomy, and medical terminology students.
With help from tools such as the Anatomage, who knows what’s next for the students? It’s possible one of them may take DaSilva’s research further and figure out a cure for migraines, or build on Galen’s idea and eliminate falls in the elderly completely, or even create a virtual human simulator that functions like a real body.