Future looks stellar for 3D system
By Joe Bauman
February 9, 2004
USU's Bob Pack is an inventor of a new 3D camera that may have space, military and movie uses.
Laura Seitz, Deseret Morning News
LOGAN — When NASA's Mars rover Opportunity drove toward a rocky outcrop last week, it stopped a foot and a half short. But if it had been equipped with a new 3D imaging system patented by Utah State University, it would not have been an inch short of its goal.
Not that it could have carried the invention, as it is still being developed for commercial uses. Notice of the patent was printed in the Jan. 11 Deseret Morning News: "3D multispectral lidar. Robert Taylor Pack, Logan; Frederick Brent Pack, Waipahu, Hawaii. Assigned to Utah State University, North Logan. . . . Patent No. 6,664,529."
On Friday, the newspaper interviewed the first-listed Pack, who is a research associate professor in civil and environmental engineering and an engineer with USU's Space Dynamics Laboratory. The second Pack is his brother Brent, a retired electrical engineer who lives in Waipahu, Hawaii, and worked for years with the Air Force. "We put our heads together and figured out a way to solve this problem," Bob Pack said.
The problem was integrating contour readings made by laser with visual imagery. A group of faculty members and students at USU, part of the Center for Advanced Imaging LADAR (laser detection and ranging), helped with the project. "The biggest problem is getting the hardware to cooperate," he said. "It involves mechanical engineering, electrical engineering, geomatics (surveying) engineering, computer graphics, data processing, programming."
Not only is the Jet Propulsion Laboratory interested in the device for future Mars exploration, JPL is even considering it for much more distant space adventures. The lab is seeking information to design an orbiter to study Europa, a moon of Jupiter that could host a frozen-over ocean. "We could actually be imaging (Europa's) surface from a satellite," said Pack, interviewed Friday in the USU Engineering Laboratory Building.
Pack has been working on the project for eight years and is ready to make it public, now that the patent has been awarded. The device also has military uses, and it can be employed in video games, movies and robotics. The camera records the appearance of a scene and the exact distance to objects in view; it can take numerous such images; and it can integrate them all in real time.
Pack showed off how the system works by displaying on a computer a photo that he and his associates took at a canyon containing petroglyphs at China Lake Test Range, Calif. The area is part of the Naval Air Warfare Center, and the petroglyphs make up a national heritage site, so they needed special permission to hike there.
With a mouse click, the computer displayed a view of the canyon. Another click and a second photo was added. "There's another scene of the same area, slight overlap," he said, adding a third view. "We just walked up the canyon a bit and pointed back and took another shot."
Because distances to different parts of the scene were known to the computer, the system knew exactly how all the elements fit together. It immediately integrated them into one coherent scene. When Pack used his mouse to drag the viewpoint to a different angle, a grid of distance information appeared, and the perspective shifted as he moved the mouse. When he released the clicker, the image showed the canyon in full color as if it were being viewed in person — but from the new perspective. He had ordered the computer to show the view from an angle never seen in person, and it appeared on the screen immediately, as if a photo had been taken from that direction.
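The fusion Pack demonstrated rests on a standard idea: once each pixel carries a measured distance, an image can be "back-projected" into 3D points, and shots taken from different positions land in one shared coordinate frame. A minimal sketch of that idea, using a simple pinhole camera model with hypothetical parameters (this is an illustration of the general geometry, not the patented system's actual code):

```python
import numpy as np

def backproject(depth, fx, fy, cx, cy):
    """Turn a per-pixel depth map into 3D points in the camera's own frame,
    using a pinhole model (fx, fy = focal lengths; cx, cy = image center)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)  # shape (h, w, 3)

def to_world(points_cam, R, t):
    """Apply a camera pose (rotation R, translation t) so points from
    different shots share one world frame and can be merged into one scene."""
    return points_cam @ R.T + t

# A flat wall 2 m away, seen by a toy 4x4-pixel camera:
depth = np.full((4, 4), 2.0)
pts = backproject(depth, fx=1.0, fy=1.0, cx=2.0, cy=2.0)
world = to_world(pts, np.eye(3), np.zeros(3))  # identity pose: unchanged
```

Once overlapping shots occupy the same world frame, rendering the merged points from a viewpoint no one ever stood at is just a change of camera pose — which is what made the instant perspective shift in Pack's demo possible.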
The system sends out bursts of laser light and times the return of the pulses to the billionth of a second. It does this 1,000 times a second, recording the distance to each part of the scene. At the same time, the camera is taking visual images at 1/1,000th of a second. The computer stores this data in memory, instantly integrating the information and building a detailed, three-dimensional image of the scene.
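The ranging arithmetic behind those timed pulses is straightforward: a pulse travels out and back, so the distance is the speed of light times the round-trip time, divided by two. A hedged sketch of that calculation (illustration only, not code from the patent):

```python
# Time-of-flight ranging: distance = (speed of light * round-trip time) / 2,
# because the pulse covers the distance twice (out and back).
C = 299_792_458.0  # speed of light in meters per second

def distance_from_round_trip(round_trip_seconds):
    """Convert a measured round-trip pulse time to a one-way distance in meters."""
    return C * round_trip_seconds / 2.0

# A pulse returning after 100 nanoseconds came from roughly 15 meters away.
print(distance_from_round_trip(100e-9))  # ~14.99 m
```

The nanosecond-scale timing the article describes is what makes the range readings precise: each nanosecond of round-trip time corresponds to about 15 centimeters of distance.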
If the device were used with a cruise missile or aircraft flying over enemy territory, it would be able to detect a tank hidden beneath a tree. It would know the detailed structure of the scene through laser readings and its color and texture through photography.
Other devices have been used to record static scenes, recording every detail of a precious statue in case it is destroyed someday. But the USU system is much more flexible, able to make such studies instantly while moving. "We can calculate the distance to objects within a centimeter or so," Pack said. "At the same time that round trip (by laser light) is happening, the imager here is collecting the imagery."
In the field, the USU scientists used a laptop computer and their lightweight camera to make scenes. Used in an aircraft, the laser would have to be more powerful, but the system would use the same principles.
© 2004 Deseret News Publishing Company