
ISL a happening place after move south

The Beckman Institute’s Illinois Simulator Laboratory (ISL) was always an exciting place because of its futuristic, highly advanced immersive virtual reality environments. These days the excitement at ISL also comes from the diverse and unique experiments taking place since the lab’s recent move to the south side of campus.

Published on July 21, 2010

At every Beckman Institute Open House and for every VIP tour of the building, the virtual reality environments operated by the Illinois Simulator Laboratory (ISL) were always a must-see for visitors. A six-sided immersive virtual reality environment called the Cube and the flight and driving simulators were just some of ISL’s capabilities that regularly wowed touring scientists from China and Open House schoolchildren alike over the years.

In 2009 and 2010 ISL and another Beckman core facility, the Biomedical Imaging Center (BIC), switched locations, with BIC moving into ISL’s former space in the Institute’s basement and ISL moving to the former BIC building on the south side of campus. The move has given the Illinois Simulator Laboratory more space and more freedom to take on a range of diverse new projects that are as exciting as the environments that play host to them these days.

“It’s like the floodgates have opened up,” ISL director Hank Kaczmarski said. “At the end of our stay in the main building we had to cut projects short and not start new ones because these projects last six months to a year, a timeframe we didn’t have if the move was to follow the tightly choreographed schedule laid out for us. Once we set up here we just told the researchers to come on and they have. We are very stable in this new building. This is just a fantastic building for our lab.”

Taking full advantage of the move, Kaczmarski and his staff have run, are currently running, or are planning experiments for projects involving wheelchair use, the next generation of air traffic control systems, and a virtual surgery lab, to name a few. Kaczmarski said the larger spaces at the south campus location allow ISL to run more experiments simultaneously and afford greater freedom in scheduling lab space. He also said that the new facility was originally built with discrete orthogonal wings to keep MRI machines from interfering with one another, a design that now prevents, for instance, sonic experiments in one lab from bleeding into research in another lab that requires quiet.

“The exciting thing is we have been able to expand beyond one researcher using one environment,” Kaczmarski said. “Now it’s much broader. If the biggest problem we have over the upcoming years is scheduling all these diverse projects, that’s a good problem to have. We have stable spaces and a good understanding of how to modify those spaces without affecting other labs in the building.”

ISL was always known for the cutting-edge research being conducted by people like George Francis and Frances Wang. But the longtime Beckman facility is now taking advantage of its new digs to branch out into other areas, giving researchers like John Rogers of the 3-D Micro and Nanosystems group virtual reality (VR) tools for advancing their work. During a recent tour, Kaczmarski described some of the past and current projects taking place in the lab.

“We finally have an actual dedicated space for virtual surgery,” Kaczmarski said. “We’ve been working on the edges of a major effort with this for a couple of years but now we can get really serious about it.”

Rogers is a pioneer in developing innovative applications; in this project, his group is measuring the forces a surgeon feels during surgery using micro-pressure sensors and then creating a surgical glove that provides sensory feedback to a surgeon during a virtual run-through of a procedure.

“The question is: if we want to provide haptic feedback, some sensation of actually carving into a virtual body, is electro-tactile stimulation a good way to give feedback to the fingertips?” Kaczmarski said. “We’re testing little nano-balloons the Rogers group developed to provide balloon pressure to the virtual surgeon. Is one methodology better than the other, are the two complementary, or do we need to develop something completely different?”
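In outline, such a glove closes a sensing-to-actuation loop: fingertip pressures recorded from a real procedure are replayed as commands to whichever actuator, electro-tactile electrodes or the Rogers group’s nano-balloons, is under test. A minimal Python sketch of that mapping (every name, unit, and value below is a hypothetical illustration, not ISL’s actual software):

```python
def pressure_to_command(pressure_kpa: float, max_kpa: float = 50.0) -> float:
    """Map a recorded fingertip pressure to a normalized actuator
    command in [0, 1]. The same command could drive either an
    electro-tactile electrode or a nano-balloon, which is what makes
    the two feedback methods directly comparable in an experiment."""
    return min(max(pressure_kpa / max_kpa, 0.0), 1.0)

# Replay a short, made-up trace of recorded pressures (kPa) through the mapping.
for reading in [2.1, 7.8, 33.5, 51.0]:
    print(f"{reading:5.1f} kPa -> actuator command {pressure_to_command(reading):.2f}")
```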

Another recent project involved Jacob Sosnoff of the Human Perception and Performance group using ISL’s motion capture suite to study the motions of wheelchair users in a treadmill-like device modified to incorporate a wheelchair.

“It’s been discovered that novice wheelchair users get carpal tunnel and upper shoulder injuries at a very high rate when they are first using the wheelchair,” Kaczmarski said. “We’re looking at the locomotion of novice and expert wheelchair users. An observer in real time can’t see motion irregularities, and the reason is that they happen only once in maybe a hundred revolutions and involve very subtle variances in motion. Our motion capture system is capturing images at a thousand frames a second; researchers can then take the data offline and parse out these subtle differences.”
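The offline analysis Kaczmarski describes boils down to segmenting the thousand-frame-per-second capture into individual wheel revolutions and flagging the rare strokes that deviate from the rider’s typical pattern. A minimal sketch of that idea in Python (the segmentation scheme and threshold are illustrative assumptions, not the lab’s actual pipeline):

```python
import numpy as np

def flag_irregular_revolutions(angles: np.ndarray, z_thresh: float = 3.0) -> np.ndarray:
    """Segment a cumulative wheel-angle trace (degrees, sampled at
    1000 Hz) into revolutions and flag statistical outliers.

    Returns the indices of revolutions whose stroke shape deviates
    strongly from the rider's average stroke."""
    # A new revolution starts every 360 degrees of cumulative rotation.
    rev_ids = (angles // 360).astype(int)
    revs = [angles[rev_ids == r] % 360 for r in np.unique(rev_ids)]

    # Resample every revolution to a common length so strokes can be compared.
    n = 100
    resampled = np.array([
        np.interp(np.linspace(0, 1, n), np.linspace(0, 1, len(r)), r)
        for r in revs if len(r) > 1
    ])

    # Score each revolution by its worst deviation from the mean stroke shape.
    mean_stroke = resampled.mean(axis=0)
    scores = np.abs(resampled - mean_stroke).max(axis=1)
    z = (scores - scores.mean()) / scores.std()

    # A subtle irregularity may appear only once in ~100 revolutions,
    # so only strong outliers are flagged for the researcher to inspect.
    return np.where(z > z_thresh)[0]
```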

Sosnoff is also collaborating with Beckman Institute Director Art Kramer and researcher Mark Neider on a project that involves putting wheelchairs in the CAVE™ immersive virtual reality environment. The CAVE had been playing host to a line of research on pedestrian distraction that used a treadmill set inside the CAVE’s wall-sized screens, which displayed a virtual street crossing.

“We’re replacing the full-size treadmill with a kid’s treadmill and then the kid’s treadmill with a roller assembly for stationary wheelchair testing,” Kaczmarski said. “We’re going to do basically the entire demographic spread, from senior citizens who have a tendency to fall, whom we’re now running, to 6- and 7-year-olds.

“This is exciting because it’s stable space for the CAVE,” Kaczmarski added. “The CAVE is just happy here.”

That project is one of many that show the flexibility Kaczmarski and his staff have when they plan experiments at the new location.

“Torrey Loucks (of Beckman’s Cognitive Neuroscience group) is going to do face recognition for speech in the motion capture suite,” Kaczmarski said. “We’re going to reconfigure that space for it. It’s really nice that we can switch things around. We finally have the space to coordinate quite different research efforts during the course of the data-gathering phase of research, typically about half a semester.”

The expansive, stable spaces mean new uses for facilities like ISL’s virtual reality Immersadesks™ and the flight simulator. A current project features two Immersadesks set up in two separate rooms. Kaczmarski said they are looking at how two operators of a drone aircraft, for example, in locations hundreds or thousands of miles apart, can collaborate effectively.

“We can introduce a little lag in communication,” Kaczmarski said. “We can give different screen information. Maybe one will have infrared video, and one will have normal video. So how best do you, for instance, find somebody on the ground in a search and rescue operation from a moving real-time video scene when two people are collaborating?

“The technology has really gotten ahead of the psychology underlying it. Can we improve the efficiency, the accuracy of what’s going on?”
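The lag Kaczmarski mentions can be injected with a simple relay that holds each message or video frame for a configurable delay before delivering it to the other operator’s station. A minimal Python sketch of that idea (the queue-based design and the delay value are assumptions for illustration, not ISL’s actual software):

```python
import asyncio
import time

async def lagged_relay(inbox: asyncio.Queue, outbox: asyncio.Queue,
                       delay_s: float = 0.25) -> None:
    """Forward items (video frames, annotations, voice packets) from one
    operator's station to the other, delivering each one delay_s seconds
    after it was produced. Varying delay_s lets experimenters measure how
    communication lag affects the pair's search-and-rescue performance.

    The sending side is assumed to enqueue (time.monotonic(), item) pairs.
    """
    while True:
        stamped_at, item = await inbox.get()
        remaining = delay_s - (time.monotonic() - stamped_at)
        if remaining > 0:
            await asyncio.sleep(remaining)  # hold until the lag has elapsed
        await outbox.put(item)
```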

The flight simulator has undergone a few changes since its move from Beckman. Permanent display screens were installed and the system now features an array of eye-tracking cameras.

“With our moving it out here, installing permanent outside-the-cockpit projection screens, and changing to an electronic control panel here rather than the old analog gauges, we are able to reconfigure this for any number of planes,” Kaczmarski said.

And the eye-tracking capability makes the system unique for studying the human factors of aviation.

“This is the most complicated eye-tracking system in the world right now,” Kaczmarski said. “Normally there are six cameras set up here that look at a 45-degree vertical and a 180-degree horizontal field of view. No other system in the world does that.”

The flight simulator is currently playing host to studies of air traffic control communications for NextGen, the FAA’s “comprehensive overhaul” of the nation’s airspace system, involving Beckman researchers Alex Kirlik and Jason McCarley.

“They are trying to see if they can make the communication between the control tower and the cockpit of an airplane less complicated, even as more information is displayed in the cockpit,” Kaczmarski said. “This is to meet the FAA’s mandate to make the air traffic control system more functional. This interactive touch screen shows the kind of spacing you are required to have with the weather that is going on right now, how far you are away from landing, what runways are clear, what taxiways are clear.

“All these kinds of things that normally are communicated verbally will be stationary and retrievable. There will be verbal follow-ups so there is an acknowledgement that you got the information but there will be text because that is what computer screens display best. We even have discrete space now to mock up the air traffic control tower of the Dallas-Fort Worth airport and tie that space in with the actual flight simulator; impossible before, exciting now.”

The Cube, which always impresses first-time visitors with its completely immersive, six-sided 3-D virtual reality environment, is currently playing host to a study that is looking at whether more information can be gained from visualizing sound signals than by using our auditory capabilities. Beckman researchers Mark Hasegawa-Johnson and Tom Huang, along with Kaczmarski and Camille Goudeseune from ISL, are involved with the project, called Adaptation of Tandem HMMs for Non-speech Audio Event Detection.

“We only have two ears, but if you turn all of the sound into pictures, you can really shuffle through it visually,” Kaczmarski said. “You really can’t separate sound noises like you can discrete images.

“So the Department of Homeland Security wants to know if an environment like this can actually be used to cut through the massive amount of sound data that’s coming in. They are gathering way more information than they can analyze, so a month after an attack they might be forced to say ‘we had that information, we just didn’t have the analysts to sort through it.’ Your eyes are so much better at making sense of complex data, but with sound it’s only one sound at a time.”

Kaczmarski pointed to a wallpapered poster covering one wall of the Cube’s waiting room and said that “it takes a wall’s worth of equations to convert sound to images and then the thought processes of computer scientists and artists to visualize the results so DHS analysts can extract meaning from the visual representations.”
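The workhorse behind that conversion is a time-frequency transform such as the short-time Fourier transform, which turns an audio stream into an image whose patterns an analyst can scan at a glance. A minimal SciPy-based sketch (the window size and decibel scaling are generic choices, not the project’s actual equations):

```python
import numpy as np
from scipy.signal import stft
import matplotlib.pyplot as plt

def sound_to_image(samples: np.ndarray, rate: int) -> None:
    """Render an audio signal as a log-magnitude spectrogram, the basic
    'picture of sound' that lets the eye sift many events at once."""
    freqs, times, Z = stft(samples, fs=rate, nperseg=1024)
    db = 20 * np.log10(np.abs(Z) + 1e-10)  # magnitude in decibels
    plt.pcolormesh(times, freqs, db, shading="auto")
    plt.xlabel("time (s)")
    plt.ylabel("frequency (Hz)")
    plt.show()
```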

These projects are just a few examples of what is happening at ISL. Kaczmarski said many more diverse projects are on the horizon.

“We now have Department of Homeland Security, Federal Aviation, and NASA grants, which we’ve never had before,” Kaczmarski said. “We’ve pretty much provided immersive environments for researchers exclusively in the HCII research theme in the past, but now with John Rogers’s group and the virtual surgery project, we’re expanding into other research themes, especially imaging and nanotechnology. We hope over this next year to get into some of the biological imaging if we can get images out of the MRI and ultrasound machines into our immersive environments and be able to get real-time data analysis from these machines. That’s a real goal for the future.”

Kaczmarski said ISL offers researchers not only unique virtual reality environments and other hardware but also the expertise of computer scientists and technicians so that “the researchers only have to concentrate on the design concept of the experiment and then have their students analyze the results. Everything else in the middle is taken care of by us. So we offer these stable yet adaptable environments for their research.”
