OBJECTIVE: Develop and produce AN/ALE-47(V) Countermeasure Dispenser Set Software Test Environment Automated Scenario and Mission Data File Test Software.
DESCRIPTION: The AN/ALE-47(V) Countermeasures Dispenser Set (CMDS) is installed in nearly every active Department of the Navy aircraft (approximately 3,700 aircraft), as well as in several Air Force, Army, and foreign military aircraft. The CMDS is a critical component of the Aircraft Survivability Equipment (ASE) Suite and is integrated with advanced Missile Warning Systems, Radar Warning Receivers, Mission Computers, and advanced expendables. Each Type/Model/Series (T/M/S) of aircraft utilizes a unique ALE-47 Mission Data File (MDF). These MDFs are mission-critical software that must undergo extensive ALE-47 Hardware-in-the-Loop testing prior to fielding. Each platform T/M/S receives an MDF update approximately every 18 to 24 months. The ALE-47 Software Support Activity (SSA) develops and tests approximately 40 MDFs per year, including operational MDFs, flight test MDFs, integration MDFs, and special purpose MDFs. The testing is conducted on the ALE-47 Software Test Environment (STE), a Windows PC-based test station that simulates multiple ALE-47 CMDS aircraft configurations and expendable magazine payloads, with data capture of the CMDS 1553 AV/EW bus, Sequencer Data Link, Cockpit Control Data Link, and Discrete Input/Output (I/O). The test report is a sequential capture of all I/O, including dispense patterns, at 1 millisecond (ms) resolution.
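For illustration only, the sketch below shows one way the sequential I/O capture described above could be represented in C++ (the language of the existing Simulation Manager). Every type and field name is hypothetical and is not drawn from the actual STE software.

```cpp
// Hypothetical sketch of a 1 ms resolution capture record such as the STE test
// report might contain. Names and types are illustrative only; they are not
// taken from the actual STE Simulation Manager source.
#include <cstdint>
#include <vector>

enum class BusSource : uint8_t {
    Avionics1553,        // CMDS 1553 AV/EW bus traffic
    SequencerDataLink,   // sequencer data link
    CockpitControlLink,  // cockpit control data link
    DiscreteIO           // discrete input/output lines
};

struct CaptureRecord {
    uint64_t  timestampMs;    // elapsed test time, 1 ms resolution
    BusSource source;         // which interface produced the sample
    uint16_t  dispenserId;    // dispenser/zone associated with the event, if any
    uint16_t  payloadType;    // expendable type dispensed (0 = none)
    std::vector<uint8_t> raw; // raw bus words or discrete states for post-analysis
};

// A test report is then an ordered sequence of such records that an automated
// checker can scan for dispense timing and substitution behavior.
using TestReport = std::vector<CaptureRecord>;
```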
The test requirement is extensive because every manual, semi-automatic, and automatic program must be tested and verified under differing expendable magazine loadouts and aircrew/platform/threat scenarios. Up to 256 threat-emitter IDs can be sent to the ALE-47 from the Radar Warning Receiver for threat processing, along with Missile Warning threats and aircrew Manual Dispense Program requests. In addition to the dispense requests, there are up to 16 different expendable countermeasure magazine loadouts (with associated inventory) available to be loaded on the aircraft. Aircraft Mission Computers can also send navigational data (i.e., airspeed and altitude) over the Avionics MIL-STD-1553 bus to the ALE-47 for use in additional countermeasure response determination. The number of dispensers used varies between 2 and 18 across aircraft, and testing every possible combination of magazine loadouts, as well as every sequencing of dispense processing requests, can be a substantial effort.
There is a significant opportunity to automate portions of the MDF test process to reduce cost, schedule, and defect rates. The current cost, schedule, and defect rate of an MDF project vary with the complexity of the MDF; an average test effort is estimated at 150 work hours over approximately four weeks. Defect rates are captured during initial developer test and are reduced through the test, fix, test process before the MDF is handed over for Independent Validation & Verification (IV&V), where defects can still occur, normally at a reduced rate because the majority should have been resolved during initial MDF developer test. Defects can also exist in a delivered MDF because it is not possible to run every scenario that could occur during flight. The average number of defects found during developer test is 20; during IV&V the average is 5. Very few defects have been documented in delivered MDFs: over the past 12 months, 2 defects were found during on-aircraft test. The SSA delivers approximately 50 MDFs per year.
The ALE-47 SSA utilizes the STE to conduct this MDF testing. The current test process requires a significant amount of a tester's time to develop platform-specific simulation scripts; operate the STE during test (user interaction may include changing system mode and/or initiating dispense requests); initiate macros (e.g., scripts containing a dispense event); and load, reload, and modify expendable inventories. An automated test environment would greatly reduce the cost and schedule associated with MDF testing. The current STE Simulation Manager Software is written in C++. Testing priorities include: 1) automate the ability to run a test until all magazines are empty; 2) automate the process of verifying that the dispense program (timing) is within specification; 3) automate the process of verifying that program, payload, and zone substitution are per the specification; and 4) automate the process of running each dispense source multiple times in various orders, reloading the inventory as necessary to support the automation.
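The following minimal C++ sketch illustrates testing priorities 1 and 2: driving dispense requests until all simulated magazines are empty and checking each dispense interval against a programmed timing tolerance. The classes, values, and tolerances are purely illustrative stand-ins; a real implementation would hook into the existing STE Simulation Manager interfaces and the 1 ms capture data.

```cpp
// Minimal sketch of the automation loop described in priorities 1 and 2: drive
// dispense requests until every simulated magazine is empty and flag any dispense
// whose measured interval falls outside the programmed timing tolerance.
// All names and values here are hypothetical.
#include <cstdio>
#include <vector>

struct Magazine { int remaining; };                  // simplified inventory model
struct DispenseEvent { double timeMs; int magazine; };
struct TimingSpec { double intervalMs; double toleranceMs; };

bool allEmpty(const std::vector<Magazine>& mags) {
    for (const auto& m : mags) if (m.remaining > 0) return false;
    return true;
}

int main() {
    std::vector<Magazine> mags = {{30}, {30}};        // two hypothetical magazines
    TimingSpec spec{125.0, 5.0};                      // programmed interval +/- tolerance
    std::vector<DispenseEvent> log;

    double t = 0.0;
    int mag = 0;
    while (!allEmpty(mags)) {                         // priority 1: run until empty
        t += spec.intervalMs;                         // stand-in for captured 1 ms data
        log.push_back({t, mag});
        mags[mag].remaining--;
        mag = (mag + 1) % static_cast<int>(mags.size());
    }

    // Priority 2: verify each dispense-to-dispense interval against the spec.
    int failures = 0;
    for (size_t i = 1; i < log.size(); ++i) {
        double delta = log[i].timeMs - log[i - 1].timeMs;
        if (delta < spec.intervalMs - spec.toleranceMs ||
            delta > spec.intervalMs + spec.toleranceMs)
            ++failures;
    }
    std::printf("%zu dispenses, %d timing failures\n", log.size(), failures);
    return failures == 0 ? 0 : 1;
}
```

Under the same assumptions, a single pass over the captured report could also check payload and zone substitution (priority 3) by comparing each record against the MDF's substitution rules.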
Work produced in Phase II may become classified. Note: The prospective contractor(s) must be U.S. Owned and Operated with no Foreign Influence as defined by DOD 5220.22-M, National Industrial Security Program Operating Manual, unless acceptable mitigating procedures have been implemented and approved by the Defense Security Service (DSS). The selected contractor and/or subcontractor must be able to acquire and maintain a secret level facility and Personnel Security Clearances, in order to perform on advanced phases of this contract as set forth by DSS and NAVAIR in order to gain access to classified information pertaining to the national defense of the United States and its allies; this will be an inherent requirement. The selected company will be required to safeguard classified material IAW DoD 5220.22-M during the advance phases of this contract.
PHASE I: Conduct a detailed analysis of the current ALE-47 MDF design structure and test philosophy, and review the existing STE Simulation Manager Software. Determine, develop, and demonstrate capability that can be added to the existing STE software package to automate portions of the test process, including test development, test execution, test validation, and test report generation.
PHASE II: Further develop software designs and demonstrate the prototype software functionality on the ALE-47(V) STE. Conduct a design review of the final proposed software code.
It is probable that the work under this effort will be classified under Phase II (see Description section for details).
PHASE III DUAL USE APPLICATIONS: Develop an Acceptance Test Procedure (ATP) and complete it for Government acceptance. Transition the software tools/modifications to Government-controlled Configuration Management (part of the ATP will be for the Government to compile and load STE III software and any new tools). Transition the capability to PMA 272 for official use in the ALE-47 SSA and for offering to other ALE-47 STE customers. Automated MDF testing could be applicable to other DoD systems that use a mission data file concept to control unique system functionality across aircraft configurations. If commercial companies are developing ALE-47 MDFs, this product would also be useful for the private sector. This topic may benefit organizations such as the Air Force ALE-47 Joint Software Support Activity and Air Force ALE-47 user commands that develop ALE-47 MDFs, which could incorporate the same or similar technology in their test environments.
REFERENCES:
1. AN/ALE-47 Countermeasures Dispenser System [CMDS]. http://www.globalsecurity.org/military/systems/aircraft/systems/an-ale-47.htm
2. Cloer, L. “13 Facts About the ALE-47 Countermeasures Dispenser System (CMDS).” Duotech, July 12, 2017. http://duotechservices.com/13-facts-about-the-ale-47-countermeasures-dispenser-system
3. MIL-STD-1553B, Digital Internal Time Division Command/Response Multiplex Data Bus (1978). http://www.milstd1553.com/
KEYWORDS: Automated Test; ALE-47; Mission Data File; Software Development; Software Design; C++
N181-026
TITLE: Data Science Driven Aircrew Performance Measurement and Proficiency System
TECHNOLOGY AREA(S): Air Platform, Human Systems, Weapons
ACQUISITION PROGRAM: PMA 205 Naval Aviation Training Systems
The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with section 5.4.c.(8) of the Announcement. Offerors are advised foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws.
OBJECTIVE: Develop a software technology to pre-process, fuse, and store data from multiple sources for human performance assessment and proficiency tracking during training. The technology should be able to parse and synchronize disparate data from live, virtual, and constructive (LVC) aviation training system sources such as range instrumentation, aircraft systems, virtual simulators, and constructive applications, and to output automated performance metrics. Develop a human-machine interface that provides visualization tools to facilitate data synthesis by human-in-the-loop users.
DESCRIPTION: Navy leadership has issued guidance to move from reactive decisions to proactive or predictive solutions that leverage data-driven analytics to aid decision-making and proficiency tracking. Agreement across the Department of Defense on the need for quantitative, data-driven decisions is an important first step; however, implementing systems capable of collecting, storing, fusing, analyzing, interpreting, and safeguarding that information is a difficult challenge. Leveraging advances in data science for training performance assessment is a critical domain where technology provides a means to increase accuracy and reduce workload. Instructors do not currently have enough time for a rigorous and detailed performance evaluation of each flight. Research has clearly demonstrated that high workload can negatively affect the accuracy and effectiveness of subjective performance ratings and the subsequent feedback provided to trainees [Ref 2], thereby reducing the quality and quantity of training data that feeds back to decision-makers within the Naval Aviation Enterprise (NAE).
The current state-of-the-practice for performance assessment relies heavily on subjective ratings and is hampered by a manually intensive and time-consuming process. A software tool that provides an automated mechanism to pre-process and fuse multiple data sources for human performance assessment and proficiency tracking in warfighting capabilities would alleviate this burden. Specifically, automated computational methods can assist with timely and continuous calculation of aircrew performance and proficiency and with identification of associated trends. Technical objectives include the design and development of a capability that provides the following (an illustrative data-fusion sketch follows the list):
1) Data interfaces for consumption and processing of a range of disparate data sources used in LVC training system sources such as range instrumentation, aircraft systems (e.g., aircraft flight logs, radar, weapons, communication, imagery), virtual simulators (e.g., High Level Architecture, mission computer, instructor stations, range systems, acoustic processors), and constructive applications (e.g., semi-automated forces, system emulators);
2) An architecture and process for linking available data sources to tactical aircrew performance for data synthesis to inform performance assessments (semi-automated) and/or calculate automated performance metrics;
3) Scalable functionality to support individual, team, and multi-team aircrew compositions and mission sets; and
4) An intuitive human-machine interface that provides visualization tools to facilitate data synthesis by human-in-the-loop users and display automated data outputs.
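As referenced above, the sketch below is a minimal, illustrative example (not a proposed design) of objectives 1 and 2: normalizing timestamped records from disparate LVC sources onto a single mission timeline so that downstream metric calculators see one ordered stream. The source names, clock offsets, and record layout are hypothetical.

```cpp
// Illustrative-only sketch of fusing disparate LVC data sources onto a common
// mission timeline. Each source adapter converts its native timestamps to mission
// time using a per-source clock offset assumed to be known (e.g., from a sync event).
#include <algorithm>
#include <cstdio>
#include <string>
#include <utility>
#include <vector>

struct FusedRecord {
    double      missionTimeSec;  // common timeline after clock alignment
    std::string source;          // e.g., "range", "aircraft", "simulator"
    std::string event;           // parsed event or parameter name
    double      value;           // numeric value where applicable
};

std::vector<FusedRecord> adapt(const std::string& source, double clockOffsetSec,
                               const std::vector<std::pair<double, double>>& native) {
    std::vector<FusedRecord> out;
    for (const auto& [t, v] : native)
        out.push_back({t - clockOffsetSec, source, "parameter", v});
    return out;
}

int main() {
    auto rangeData = adapt("range", 2.0, {{102.0, 1.0}, {107.0, 2.0}});
    auto simData   = adapt("simulator", 0.0, {{101.5, 9.9}, {103.2, 8.8}});

    std::vector<FusedRecord> timeline;
    timeline.insert(timeline.end(), rangeData.begin(), rangeData.end());
    timeline.insert(timeline.end(), simData.begin(), simData.end());
    std::sort(timeline.begin(), timeline.end(),
              [](const FusedRecord& a, const FusedRecord& b) {
                  return a.missionTimeSec < b.missionTimeSec;
              });

    // The ordered stream is what automated metric calculators or a debrief
    // visualization would consume.
    for (const auto& r : timeline)
        std::printf("%8.2f s  %-10s %s = %.2f\n",
                    r.missionTimeSec, r.source.c_str(), r.event.c_str(), r.value);
    return 0;
}
```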
The final product will enable after-action performance reviews and debriefs of training events with full visibility into the details of effects-chain execution, in order to identify errors in mission execution at all levels while reducing the time required for after-action performance reviews relative to present-day training.
The solution should include hardware only if existing systems are inadequate for the task, as there is a desire to avoid the need for additional hardware at fleet training sites.
Work produced in Phase II may become classified. Note: The prospective contractor(s) must be U.S. owned and operated with no foreign influence as defined by DoD 5220.22-M, National Industrial Security Program Operating Manual, unless acceptable mitigating procedures can and have been implemented and approved by the Defense Security Service (DSS). The selected contractor and/or subcontractor must be able to acquire and maintain a secret level facility and Personnel Security Clearances, in order to perform on advanced phases of this project as set forth by DSS and NAVAIR in order to gain access to classified information pertaining to the national defense of the United States and its allies; this will be an inherent requirement. The selected company will be required to safeguard classified material IAW DoD 5220.22-M during the advanced phases of this contract.
PHASE I: Design an architecture and process for linking available data sources to tactical aircrew performance in warfighting capabilities based on fleet tactical recommendations (i.e., Tactics, Techniques, and Procedures (TTP)) and mission-essential task references (e.g., Wing Training Manuals, Training & Readiness Matrices), that is flexible enough to incorporate future tactics and scalable from individual to multi-team performance. Demonstrate the feasibility of implementing a software-based solution to process, parse, and fuse disparate data sources and types (e.g., aircraft data, sensor data, simulator data, video files, range instrumentation data, and voice communication recordings) for a single platform. Design advanced data science approaches (e.g., machine learning, artificial intelligence, voice recognition, image processing) that produce automated and human-in-the-loop data outputs for performance assessment, facilitate feedback, and support longitudinal trend analysis computations. Risk Management Framework guidelines should be considered and adhered to during development to support information assurance compliance. Phase I should include development of prototype plans for Phase II.
PHASE II: Develop and demonstrate a prototype software solution, based on the designed architecture and process, that fuses multiple data sources and types and produces automated and human-in-the-loop data outputs for performance assessment, facilitates feedback, and supports longitudinal trend analysis computations. Evaluate the efficiencies and return-on-investment gains associated with semi-automated and/or automated data processing. Demonstrate software scalability to multiple missions and/or multiple platforms. Develop and evaluate the usability of a human-machine interface that provides visualization tools to facilitate data synthesis by human-in-the-loop users and displays automated data outputs. Risk Management Framework guidelines should be considered and adhered to during development to support information assurance compliance.
It is probable that the work under this effort will be classified under Phase II (see Description section for details).
PHASE III DUAL USE APPLICATIONS: Conduct the testing and integration necessary to support transition to a fleet training site. Implement any outstanding Risk Management Framework guidelines to ensure information assurance compliance; complete the process to seek a standalone Authority To Operate (ATO) and/or support a transition training site in incorporating the developed training solution into an existing ATO, depending on the transition customer’s desire. Continue development to expand the architecture to new data sources and future reference sources on aircrew performance and/or to scale the software to multiple missions and/or multiple platforms. Improvements in technology to collect detailed performance data on operators are applicable to all military and commercial systems where operator reliability is critical to mission success. Successful technology development would be applicable to most military systems, where the large quantities of data produced in training events could be efficiently processed into meaningful performance metrics. Similar applications would be useful in the commercial aviation, space, and maritime industries.
REFERENCES:
1. Ault, F. “Report of the Air-to-Air Missile System Capability Review.” July-November 1968. https://www.researchgate.net/publication/235090392_Report_of_the_Air-to-Air_Missile_System_Capability_Review_July-November_1968_Volume_1
2. Bretz, R. D., Milkovich, G. T. & Read, W. “The current state of performance appraisal research and practice: Concerns, directions, and implications.” Journal of Management, 1992, 18(2), 321-352. http://dx.doi.org/10.1177/014920639201800206
3. Brobst, W. D., Thompson, K. L. & Brown, A. C. “Air Wing Training Study: Modeling Aircrew Training for Acquiring and Maintaining Tactical Proficiency.” A Synthesis of CBA’s Work, October 2006. http://www.dtic.mil/dtic/tr/fulltext/u2/a490976.pdf
4. Fan, Jianqing, Han, Fang, and Liu, Han. “Challenges of Big Data Analysis.” National Science Review, Volume 1, Issue 2, 1 June 2014, pp 293–314. http://nsr.oxfordjournals.org/content/1/2/293.short
5. Griffin, G.R. & Shull, R.N. “Predicting F/A-18 Fleet Replacement Squadron Performance Using an Automated Battery of Performance-Based Tests.” Naval Aerospace Medical Research Laboratory, Naval Air Station, Pensacola, Florida, July 1990. http://www.dtic.mil/dtic/tr/fulltext/u2/a223899.pdf
6. Horowitz, Stanley A., Hammon, Colin P. & Palmer, Paul R. “Relating Flying-Hour Activity to the Performance of Aircrews.” Institute for Defense Analyses, Alexandria, Virginia, December 1987. http://www.dtic.mil/docs/citations/ADA199004
7. Kahneman, D. (1973). Attention and effort (p. 246). Englewood Cliffs, NJ: Prentice-Hall. https://www.princeton.edu/~kahneman/docs/attention_and_effort/Attention_hi_quality.pdf
8. Ellett, Jennifer M. and Khalfan, Shaun. “The Transition Begins: DoD Risk Management Framework: An Overview.” CHIPS: The Department of the Navy’s Information Technology Magazine, April-June 2014. http://www.doncio.navy.mil/chips/ArticleDetails.aspx?ID=5015
KEYWORDS: Proficiency; Performance Assessment; Aircrew; Human Factors; Training; Debrief
N181-027
TITLE: Free Space Optical (FSO) Communications in a Radio Frequency (RF) Denied Environment
TECHNOLOGY AREA(S): Air Platform
ACQUISITION PROGRAM: PMA 265 F/A-18 Hornet/Super Hornet
The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with section 5.4.c.(8) of the Announcement. Offerors are advised foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws.
OBJECTIVE: Develop a low-cost, low space, weight, and power (SWaP) Free Space Optical (FSO) communications capability for tactical fighter aircraft operation in a radio frequency (RF)-denied environment.
DESCRIPTION: RF Interference (RFI) generated by either adversaries or fratricide (friendly jamming) has significantly degraded aircraft tactical communications. Recent advancements in FSO communications technologies can be used to provide an anti-jam, low probability of interception and detection (LPI/LPD) communication alternative to RF. The primary advantages of FSO communications for military applications are covertness, freedom from RFI from any RF sources, immunity to jamming, lack of frequency allocation requirements, and high bandwidth. An FSO communication solution is needed and should consider coherent detection of weak signals for improved detection and processing; compensation for atmospheric effects such as absorption, scattering, and scintillation; a preferred transmission bandwidth; atmospheric modeling (e.g., CLEAR1, Hufnagel-Valley); a conformal design; and low cost/low SWaP. An effective range greater than 100 nm, if achievable as part of a low-cost/low-SWaP solution, would be a future goal of the development. The goal of this SBIR topic is a digital data link, operating at Electro-Optical/Infra-Red (EO/IR) frequencies, that supports encryption and two-way communications.
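As background on the atmospheric modeling mentioned above, the sketch below evaluates the widely published Hufnagel-Valley refractive-index structure parameter profile Cn²(h), using the standard HV 5/7 parameters (rms wind speed 21 m/s, ground-level Cn² of 1.7e-14 m^(-2/3)). This is illustrative background physics only, not a statement of any program requirement.

```cpp
// Hedged illustration of atmospheric turbulence modeling: the published
// Hufnagel-Valley Cn^2(h) profile. Parameter defaults correspond to the common
// HV 5/7 variant; they are textbook values, not requirements from this topic.
#include <cmath>
#include <cstdio>

// Refractive-index structure parameter Cn^2 [m^(-2/3)] at altitude h [m],
// for rms upper-atmosphere wind speed v [m/s] and ground-level value A.
double hufnagelValleyCn2(double h, double v = 21.0, double A = 1.7e-14) {
    return 0.00594 * std::pow(v / 27.0, 2.0) * std::pow(1.0e-5 * h, 10.0) * std::exp(-h / 1000.0)
         + 2.7e-16 * std::exp(-h / 1500.0)
         + A * std::exp(-h / 100.0);
}

int main() {
    // Print the profile at a few altitudes to show how turbulence strength
    // falls off with height, which drives scintillation along the optical path.
    for (double h : {0.0, 1000.0, 5000.0, 10000.0, 20000.0})
        std::printf("h = %7.0f m  Cn^2 = %.3e m^(-2/3)\n", h, hufnagelValleyCn2(h));
    return 0;
}
```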
Work produced in Phase II may become classified. Note: The prospective contractor(s) must be U.S. owned and operated with no foreign influence as defined by DoD 5220.22-M, National Industrial Security Program Operating Manual, unless acceptable mitigating procedures can and have been implemented and approved by the Defense Security Service (DSS). The selected contractor and/or subcontractor must be able to acquire and maintain a secret level facility and Personnel Security Clearances, in order to perform on advanced phases of this project as set forth by DSS and NAVAIR in order to gain access to classified information pertaining to the national defense of the United States and its allies; this will be an inherent requirement. The selected company will be required to safeguard classified material IAW DoD 5220.22-M during the advanced phases of this contract.
PHASE I: Define and develop a concept for FSO communication capability in a tactical war fighter environment. Detail the key design considerations and trade-offs associated with the approach. Prioritize technology risk areas going forward and potential mitigation procedures/alternatives. Analyze implementation issues and determine the feasibility of effectively implementing a low-cost/low-SWaP FSO communications solution. Develop prototype plans for Phase II.
PHASE II: Demonstrate functionality and achievable performance using modeling and simulation. Prototype critical elements and demonstrate the technology in a controlled environment. Quantify benefits of the innovative techniques compared to existing techniques in similar environments. Develop an approach to air vehicle integration and identify any remaining technology challenges.
It is probable that the work under this effort will be classified under Phase II (see Description section for details).
PHASE III DUAL USE APPLICATIONS: Further refine the design from Phase II for transition to an operational test asset. Issues related to test platform integration will be addressed in cooperation with the Government. Risk management and mitigation against the test plan and schedule will be a focus of the Phase III effort. Operational assets will be tested on an F/A-18 test bed for ground and air functionality. Other DoD components (USAF, Army, Marine Corps, SOCOM, etc.) could benefit from similar application aboard air and ground assets. Other Government applications within the Drug Enforcement Administration and the Intelligence Community, for non-RF, covert communication, are also a consideration. Private-sector use in telecommunications and local urban communication (line-of-sight communication nodes) would benefit from this technology due to its high bandwidth.