Congress of the International Council of the Aeronautical Sciences (ICAS); 3-8 September 2006; Hamburg, Germany

EVS BASED APPROACH PROCEDURES: IR-IMAGE ANALYSIS AND IMAGE FUSION TO SUPPORT PILOTS IN LOW VISIBILITY



Abstract

Based on a recently released flight rule by the Federal Aviation Administration (FAA), a dream of imaging-technology enthusiasts becomes reality: visual guidance (VFR) of landing aircraft in adverse weather is no longer restricted by the performance of the human eye (measured as runway visual range, RVR). Based on the output of imaging sensors, it is now permitted to continue the approach beyond the classical minimum descent altitude (MDA) or decision height (DH) down to about 100 ft, provided that at DH or MDA the runway can be clearly identified within the sensor image. This rule change in aircraft operation will allow bad-weather landings at airfields not equipped with ILS and without an expensive on-board ILS installation. However, after this "quantum leap in flight rule making", new questions arise: how should the pilot be informed of what the imaging sensor currently sees? Usually, a (quite expensive) Head-Up Display (HUD) is required to show the sensed image, which is presented as a raster-scan overlay on the existing stroke-vector symbology. Is this "direct overlay method", which blocks the transparency of the HUD, really the best solution, or are there other options for designing enhanced-vision aircraft guidance concepts and systems? The following contribution offers some hints toward answering these questions and proposes a more transparent EVS display format. A new method for extracting the aircraft position relative to the runway in real time from the on-board imaging sources is presented. Finally, some experimental results from flight trials in the US-American SE-Vision project, in which the described method was implemented and tested, are discussed.
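The rule change the abstract describes can be sketched as a simple decision function. This is a hedged illustration of the logic as summarized above, not the FAA's actual operational criteria or the paper's implementation; the 100 ft floor is the approximate value quoted in the abstract, and the function and parameter names are assumptions.

```python
def may_continue_approach(altitude_ft: float,
                          decision_height_ft: float,
                          runway_visible_to_pilot: bool,
                          runway_identified_in_evs: bool) -> bool:
    """Simplified EVS approach-continuation logic (illustrative only).

    Classic rule: at DH/MDA the pilot must acquire the runway visually
    to continue. EVS-based rule (per the abstract): if the runway is
    clearly identifiable in the sensor image at DH/MDA, the approach
    may continue down to about 100 ft.
    """
    EVS_FLOOR_FT = 100.0  # approximate floor quoted in the abstract

    if altitude_ft > decision_height_ft:
        return True   # still above DH/MDA: approach continues normally
    if runway_visible_to_pilot:
        return True   # classical visual acquisition at or below DH/MDA
    if runway_identified_in_evs and altitude_ft >= EVS_FLOOR_FT:
        return True   # EVS credit: sensor image substitutes for the eye
    return False      # neither eye nor sensor sees the runway: go around
```

For example, with a 200 ft DH, an aircraft at 200 ft with the runway visible only in the EVS image may continue, but the same sensor-only picture at 90 ft no longer suffices under this sketch.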
