New technologies tackle landing challenges

  • Published
  • By Laura L. Lundin
  • Air Force Research Laboratory Public Affairs

The Air Force Research Laboratory is demonstrating technologies that will allow Air Force transport aircraft to land in a range of environmental conditions -- anytime and anywhere.

The lab’s Air Vehicles, Human Effectiveness and Sensors directorates here are working with three technologies that, when combined, will help Air Mobility Command pilots land at remote and austere fields in adverse weather conditions.

The directorates are working collaboratively to demonstrate the Autonomous Approach and Landing Capability, or AALC, in conjunction with BAE Systems Platform Solutions, and the Opportune Landing System, or OLS, in conjunction with Boeing Phantom Works and the U.S. Army's Cold Regions Research and Engineering Laboratory in Hanover, N.H.

In clear conditions, pilots generally have no trouble seeing the runway. But when they fly into low-visibility conditions like fog, rain, snow and blowing sand, pilots have difficulty making a safe approach and landing without ground-based navigation aids.

That's where AALC -- a sensor-based, head-up display system -- comes into play. It provides pilots a clear image of the runway to allow safe landings.

Using baseline technology developed by MBDA U.K. Ltd., a HUD developed by BAE Systems U.K., and image processing and fusion developed by BAE Platform Solutions U.S., the system generates a near real-time video image from the objects the imaging radar picks up. The image will be enhanced to appear as if the pilot were landing in daytime on a typical visual approach. The video will appear on the HUD screen and allow the pilot to guide the aircraft in for landing.

OLS will help pilots land in austere locations. The system will analyze satellite imagery to determine an area's suitability for landing operations by looking at length, width and flatness of the area as well as potential obstructions and standing water. Additionally, OLS determines soil type and moisture content to estimate the strength of the area.
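The screening the article describes can be sketched as a simple pass/fail check over those criteria. This is an illustration only: the thresholds, field names and the `is_suitable` helper below are hypothetical, not OLS parameters.

```python
# Illustrative sketch of the kind of screening OLS performs on a candidate
# landing area. Criteria come from the article; every threshold value here
# is a hypothetical placeholder, not an actual OLS or aircraft requirement.

from dataclasses import dataclass

@dataclass
class CandidateSite:
    length_ft: float         # usable length of the strip
    width_ft: float          # usable width of the strip
    max_slope_pct: float     # flatness, expressed as maximum gradient
    has_obstructions: bool   # trees, structures, debris
    has_standing_water: bool
    soil_bearing_psi: float  # strength estimated from soil type and moisture

def is_suitable(site: CandidateSite,
                min_length_ft: float = 3000.0,
                min_width_ft: float = 60.0,
                slope_limit_pct: float = 2.0,
                min_bearing_psi: float = 40.0) -> bool:
    """Apply the article's criteria: dimensions, flatness, obstructions,
    standing water, and soil strength. A site must pass every check."""
    return (site.length_ft >= min_length_ft
            and site.width_ft >= min_width_ft
            and site.max_slope_pct <= slope_limit_pct
            and not site.has_obstructions
            and not site.has_standing_water
            and site.soil_bearing_psi >= min_bearing_psi)

# A long, flat, dry, firm strip passes; shortening it fails the check.
print(is_suitable(CandidateSite(3500.0, 80.0, 1.2, False, False, 55.0)))
```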

"When you add these two programs together, you have the capability to penetrate the weather and battlefield obscurants, so you can go anytime. And OLS will allow landing capabilities anywhere," said James McDowell, the AALC program manager.

"Today, pilots can land in severe weather conditions -- but not without an extensive and well-maintained infrastructure in place," Mr. McDowell said.

For military operations, this necessary infrastructure leads to constraints on the mission by narrowing the landing options, costing the military time and money, he said.

However, the AALC system operates independently of ground-based navigation aids. OLS is a pre-mission planning analysis tool that provides information about potential landing sites. This independence increases operational capabilities.

"Currently, air transport crews are being denied clearance for missions if the weather is bad enough and there is no instrument landing capability at the destination," Mr. McDowell said. "So, getting AALC's capabilities demonstrated is a high priority."

Gary Machovina, principal writer of the AALC concept of operations, said AMC identified a deficiency in mobility operations in Bosnia during 1995 and 1996. The constraints led to delays in deploying and supplying troops in the theater of operations.

"The missions then and now are limited to those areas that can support landings using ground-based navigation aids," said Mr. Machovina, who is with the command’s long-range planning section at Scott Air Force Base, Ill.

"AALC looks very promising and has the potential of opening up the possibilities for operations significantly," he said.

The technology is a "true game-changer," said Douglas Zimmer, deputy program manager with the Human Effectiveness Directorate.

"With AALC providing the pilot with adequate imagery and the dependence on airport infrastructure gone, mobility assets will be free to operate under a majority of atmospheric conditions related to extreme low-visibility," he said.

Presently, AALC works by using a two-dimensional imaging radar system, an infrared camera, and fusion and processing algorithms that combine the best qualities of each sensor. The pilot then sees a two-dimensional view of the fused sensor image of the runway.
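The fusion step can be illustrated with a minimal pixel-level sketch: two same-sized grayscale frames are blended, with extra weight given to whichever sensor responded more strongly at each pixel so its detail survives. The weighting scheme and the `fuse_frames` helper are simple illustrations, not AALC's actual algorithm.

```python
# Minimal sketch of pixel-level image fusion in the spirit the article
# describes: combining a radar frame and an infrared frame. The weighting
# scheme here is a hypothetical illustration, not AALC's algorithm.

def fuse_frames(radar, ir, dominant_weight=0.7):
    """Blend two same-sized grayscale frames (lists of rows, values 0-255).

    Each output pixel is a weighted average, biased toward whichever
    sensor has the stronger local response at that pixel.
    """
    fused = []
    for r_row, i_row in zip(radar, ir):
        row = []
        for r, i in zip(r_row, i_row):
            # give the stronger-responding sensor the dominant weight
            w = dominant_weight if r >= i else 1.0 - dominant_weight
            row.append(round(w * r + (1.0 - w) * i))
        fused.append(row)
    return fused

# Radar dominates the first pixel, IR the second.
print(fuse_frames([[200, 10]], [[50, 120]]))  # [[155, 87]]
```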

Therefore, if an obstacle like a tree were in an aircraft’s path, it would appear only as a shadow or a spot on the display. The pilot could not determine the height of the object, which poses a significant safety hazard.

To address this limitation, the Sensors Directorate is working to modify the system to feature a three-dimensional view. The 3-D radar will display the height of obstacles or terrain in the path of the aircraft, which makes pilots more aware of landing situations.

"The three-dimensional radar is primarily designed to address two issues: providing a safe approach by identifying intervening terrain or obstacles on the final approach and providing information about potential hazards or runway incursions," said Maj. John Koger, a program manager.
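The first of those two issues reduces to a geometry check once the radar can measure height: does an obstacle at a given range reach up into the approach path? The sketch below assumes a nominal 3-degree glideslope and an arbitrary clearance margin; the `penetrates_glidepath` helper and its numbers are hypothetical, not the AALC implementation.

```python
# Illustrative geometry for the hazard check a 3-D radar enables: given an
# obstacle's range from the runway threshold and its measured height, test
# whether it penetrates a nominal final-approach path. The glideslope and
# clearance values are hypothetical placeholders.

import math

def penetrates_glidepath(obstacle_range_ft, obstacle_height_ft,
                         glideslope_deg=3.0, clearance_ft=50.0):
    """Return True if the obstacle reaches within `clearance_ft` of the
    approach path at its distance from the runway threshold."""
    # height of the approach path above the threshold at this range
    path_height = obstacle_range_ft * math.tan(math.radians(glideslope_deg))
    return obstacle_height_ft >= path_height - clearance_ft

# A 60-ft tree 1,000 ft from the threshold is a hazard; a 100-ft tower
# 6,000 ft out sits well below the path and is not.
print(penetrates_glidepath(1000, 60))   # True
print(penetrates_glidepath(6000, 100))  # False
```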

Mr. McDowell said AALC is scheduled for flight test demonstration aboard a C-130H at Edwards AFB, Calif. -- beginning with the 2-D radar -- between October 2006 and February 2007.

Plans are for AMC to receive the technology in fiscal 2010.

Engineers are scheduled to flight test the completed 3-D modifications in late spring to early summer of 2007. Mr. McDowell said the primary focus will be on the radar's ability to identify obstacles or terrain at the correct location and height on final approach.

Mr. Zimmer said, "From what I have seen thus far, the proposed technologies are impressive. The true test will come during our demonstration when the sensors are stressed in actual weather conditions."