PHASE III DUAL USE APPLICATIONS: Implement the cyber simulator capability for each of the identified Air Force Cyber Weapons Systems. Further refine each part of the simulator in relation to the scenario authoring tool, learning management system, scenario execution engine, and performance assessment.
REFERENCES:
1. USAF Core Functions Support Plan: Cyberspace Superiority, 2016.
2. USAF Strategic Master Plan, May 2015.
KEYWORDS: Cyber, cyber weapon system, cyberspace superiority, immersive training, interactive training, live, virtual, constructive, distributed, multi-unit training
AF171-030
TITLE: Battlefield Airmen Augmented Reality System (BAARS)
TECHNOLOGY AREA(S): Human Systems
The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with section 5.4.c.(8) of the solicitation and within the AF Component-specific instructions. Offerors are advised foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws. Please direct questions to the AF SBIR/STTR Contracting Officer, Ms. Gail Nyikon, gail.nyikon@us.af.mil.
OBJECTIVE: Develop a low-latency multispectral digital helmet-mountable near-to-eye augmented reality system for use by Battlefield Airmen. System must be capable of use to aid vision in night, day, and all-weather operations.
DESCRIPTION: The Battlefield Airmen Augmented Reality System (BAARS) sought is a multispectral device intended primarily for night ground operations or for use in helicopters with open doors. Architectures of interest include: mounting (on helmet, on head, or hand-held); sensors (in line with, or above, the eyes); displays (bi-, bin-, mon-, or dig-ocular); and bands (multiple reflective and emissive). Most cameras operate by sensing reflected illumination in the 400-700 nm visible (VIS) band. Current analog night vision goggles (NVG) operate by sensing ambient illumination reflected off of scenes in the 625-930 nm near infra-red (NIR) band, which encompasses part of the visible and near infra-red (VNIR) electromagnetic spectrum. However, additional night/day sky illumination energy is available in the 0.9-3.5 um short-wave infra-red (SWIR) spectral band, and emissive energy is available in the 3-6 um mid-wave infra-red (MWIR) and 7-15 um long-wave infra-red (LWIR) spectral bands. The SWIR and MWIR bands offer a unique ability to see through atmospheric obscurants (e.g., fog, haze), improved detection of VNIR camouflage, and detection of out-of-VNIR-band lasers. Terrestrial thermal sources (e.g., people, engines, cars, animals) emit energy in the MWIR and LWIR bands. However, these longer wavelengths are usable only in the absence of transparencies (lenses in dust goggles, canopies/windows in aircraft) made of materials such as BPA-based polycarbonates, whose transmission cuts off beyond SWIR.
The BAARS device shall be battery powered and capable of displaying symbology/imagery from an external source. The size, mass, mass distribution, and power consumption should be minimized sufficiently to achieve user acceptance, and the device should be comfortable to wear under combat conditions for hours. Power and connectivity trade-off considerations include (a) helmet- versus body-mounted batteries and (b) wired versus wireless options for the transmission of both signal and power.
Performance metric thresholds (objectives) sought include: sensor bands VNIR+LWIR (VNIR+SWIR+LWIR or MWIR); image resolution 640x512 px (8 Mpx); field-of-view 40x30 deg. (60x40 deg.); frame rate 60 Hz (200 Hz); latency from objective to eye < 20 ms (< 5 ms); head-borne mass 1 kg (0.5 kg); head-borne moment arm 0.1 kg-m (0.05 kg-m); power 6 W (2 W); volume 1000 cc (500 cc); and head-mounted battery time 1 hr (4 hr).
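To make the threshold/objective structure of these metrics concrete, the following sketch (purely illustrative, not part of the requirement) grades a hypothetical candidate design against a subset of the values listed above; the candidate numbers are invented for demonstration only.

```python
# Illustrative sketch: grade a hypothetical BAARS design against the
# threshold/objective metrics quoted in the topic text above.
METRICS = {
    # name: (threshold, objective, lower_is_better)
    "latency_ms":         (20.0, 5.0,  True),
    "head_borne_mass_kg": (1.0,  0.5,  True),
    "moment_arm_kg_m":    (0.10, 0.05, True),
    "power_w":            (6.0,  2.0,  True),
    "volume_cc":          (1000, 500,  True),
    "frame_rate_hz":      (60,   200,  False),
    "battery_time_hr":    (1.0,  4.0,  False),
}

def assess(candidate: dict) -> dict:
    """Classify each metric as 'objective', 'threshold', or 'fail'."""
    result = {}
    for name, (thr, obj, lower_better) in METRICS.items():
        value = candidate[name]
        if lower_better:
            meets_obj, meets_thr = value <= obj, value <= thr
        else:
            meets_obj, meets_thr = value >= obj, value >= thr
        result[name] = "objective" if meets_obj else "threshold" if meets_thr else "fail"
    return result

if __name__ == "__main__":
    # Hypothetical candidate design values, invented for this example.
    hypothetical_design = {
        "latency_ms": 12.0, "head_borne_mass_kg": 0.8, "moment_arm_kg_m": 0.07,
        "power_w": 4.5, "volume_cc": 700, "frame_rate_hz": 90, "battery_time_hr": 2.0,
    }
    for metric, grade in assess(hypothetical_design).items():
        print(f"{metric:20s} -> {grade}")
```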
PHASE I: Design a BAARS with size, weight, and power (SWaP) consistent with head-worn implementation. Estimate all performance metrics via laboratory experiments and analyses. Develop a system architecture for BAARS integration into the dismounted BAO Kit. Develop a System Implementation Plan for evaluating BAARS operating performance in combat environments, including producibility and supportability.
PHASE II: Fabricate a prototype BAARS. Develop a test plan. Evaluate the prototype in a laboratory environment. Demonstrate BAARS mechanical and electrical interfaces for integration into the BAO Kit. Provide special test equipment, support operator testing, and refine prototype performance based on feedback. Deliver prototype BAARS optimized for SWaP performance, reliability, and ruggedization consistent with dismounted warfighter operations. Create a roadmap to mature the technology.
PHASE III DUAL USE APPLICATIONS: Develop BAARS pre-production product and integrate with the BAO Kit. Provide a pre-production BAARS bill of materials. By the end of Phase III, the BAARS should be capable of all-weather operation worldwide. Develop commercial applications.
REFERENCES:
1. Fact Sheets: (a) Combat Controllers (18 Aug 2010); (b) Guardian Angel (18 Mar 2013). Available at http://www.af.mil/AboutUs/FactSheets/Display/tabid/224/Article/104592/combat-controllers.aspx.
2. Jeff Paul, Exploit new spectral band (SWIR) & multi-spectral fusion: MANTIS Program Update, Multispectral Adaptive Networked Tactical Imaging System (MANTIS), WBR Soldier Technology US 2008 Conference (Worldwide Business Res. Ltd.,) Arlington, VA, 15 Jan 2008.
3. Goodrich, http://www.sensorsinc.com/whyswir.html for SWIR sensor data.
4. Peter Burt, On Combining Color and Contrast-selective Methods for Fusion, IDGA Image Fusion Conference, Institute for Defense and Government Advancement, 2004.
5. Raytheon Vision Systems, http://www.raytheon.com/; Intevac Photonics Inc., http://www.intevac.com/.
KEYWORDS: Battlefield Airmen Augmented Reality System, BAARS, digital night vision, day vision, multispectral digital imaging, adaptive fusion, visible, short-/mid-/long- wave infra-red
AF171-031
TITLE: Wearable, Broadband, EM Field Exposure Detection System with Data Sharing Capability
TECHNOLOGY AREA(S): Human Systems
OBJECTIVE: Develop a real-time, wearable battlefield radio frequency field detection system to monitor human exposure to harmful EM fields, notify the wearer, broadcast exposure information for command and control, and improve battlespace situational awareness.
DESCRIPTION: With the ongoing development of radio-frequency (RF) directed energy weapons (DEWs), it can be expected that warfighters will encounter such weapons on the battlefield. In some circumstances, exposure to a DEW is not detectable, and injury may occur before individuals have an opportunity to react. Experience has shown that the best way to avoid injury from DEWs is to notify individuals when they are exposed so they can leave the area. Therefore, it is necessary to equip warfighters with safety equipment that can give an indication when exposure occurs. Additionally, it is important to share this information with command and control (C2) and with other warfighters on the same battlefield to improve leadership's ability to command and the warfighter's situational awareness.
Off-the-shelf RF safety equipment exists. Technicians in the telecommunication industry, for example, routinely employ hand-held devices that measure the electric field strength or power density of ambient RF fields. Such equipment, however, is not suitable for military personnel because it is bulky and fragile. Furthermore, existing safety devices provide an indication of exposure but do not give a broadband indication of risk to the operator in real time, and they offer no capability to aid situational awareness for C2 and the warfighter.
To mitigate these shortfalls, a new RF detection system must be developed. The system will need to be hands-free, wearable, and weigh a maximum of 5 lbs. It will need to determine the frequency, intensity, and duration of exposure to RF fields between 10 kHz and 300 GHz. It must determine the location of the wearer during an exposure incident. It will need to compare the exposing RF field strength with the frequency-dependent maximum permissible exposure (MPE) determined by IEEE C95.1 and AFI 6055.11. It will need the capability to give multi-step audio and visual warnings to the warfighter depending on the field intensity with respect to the MPE. The system will need to share data with other units within 100 meters to display all troop exposures on the battlefield on a common map to aid C2 and warfighter situational awareness. Finally, each unit must have the capability to relay data between other devices to enable communication beyond the 100-meter range.
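As a rough illustration of the multi-step warning behavior described above, the sketch below compares a measured power density against a frequency-dependent MPE lookup. The band edges, MPE values, and warning fractions are placeholders for illustration only; the actual limits must come from the governing standards cited above.

```python
# Minimal sketch of frequency-dependent MPE comparison and multi-step warnings.
# All numeric limits below are placeholders, NOT real standard values.
import bisect

# (upper frequency bound in Hz, placeholder MPE in W/m^2)
MPE_TABLE = [
    (30e6,  100.0),   # placeholder band and limit
    (300e6, 10.0),    # placeholder band and limit
    (300e9, 50.0),    # placeholder band and limit
]

def mpe_for_frequency(freq_hz: float) -> float:
    """Return the placeholder MPE for the band containing freq_hz."""
    bounds = [upper for upper, _ in MPE_TABLE]
    idx = bisect.bisect_left(bounds, freq_hz)
    if idx >= len(MPE_TABLE):
        raise ValueError("frequency outside the monitored range")
    return MPE_TABLE[idx][1]

def warning_level(freq_hz: float, power_density_w_m2: float) -> str:
    """Map a measured power density to a multi-step warning (fractions illustrative)."""
    fraction = power_density_w_m2 / mpe_for_frequency(freq_hz)
    if fraction >= 1.0:
        return "ALARM: exposure exceeds MPE -- leave the area"
    if fraction >= 0.5:
        return "WARNING: exposure above 50% of MPE"
    if fraction >= 0.1:
        return "CAUTION: elevated RF field detected"
    return "OK"

print(warning_level(2.4e9, 12.0))   # hypothetical 2.4 GHz measurement
```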
PHASE I: Design a system to meet the defined specifications. Demonstrate through laboratory testing that the system design will meet specifications and identify areas where the specifications cannot be met with new technology. Identify promising technology development that will allow improvement beyond the scope of the Phase I SBIR effort.
PHASE II: Develop a prototype system that implements the system as designed in Phase I. Collaborate with government personnel to test the prototype in a simulated operational environment.
PHASE III DUAL USE APPLICATIONS: Develop a commercial system that can be used by industries to improve RF detection and safety. Identify potential customers beyond the sponsor of this project and create a commercialization plan for the system.
REFERENCES:
1. DoD Instruction 6055.11, Protection of DoD Personnel from Exposure to Radiofrequency Radiation. 2009.
2. IEEE Standard for Safety Levels with Respect to Human Exposure to Radio Frequency Electromagnetic Fields, 3 kHz to 300 GHz. Santa Ana, Calif.: Global Engineering Documents, 2006.
KEYWORDS: Wearable, Detection, Safety, Command & Control, Warfighter, Battlefield
AF171-032
TITLE: Wearable Device to Characterize Chemical Hazards for Total Exposure Health
TECHNOLOGY AREA(S): Human Systems
OBJECTIVE: Develop a modular, wearable instrument that incorporates real-time sensor and collection technologies for volatiles and aerosols to monitor the frequency, magnitude, and chemical make-up of contaminants and to understand risks to Total Exposure Health (TEH).
DESCRIPTION: The quality of breathing air has a profound influence on overall health and on the ability of individuals to perform their duties at an optimal level. Consequently, the presence of contaminants such as environmental gases, volatile organic compounds (VOCs), or aerosols can severely compromise health, primarily through capture within the respiratory tract or the lungs. These pollutants can cause inflammation of the airway or circulatory system, increasing the potential for stroke or heart attack, or cause headache, dizziness, cancer, or neurotoxic effects [1]. Environmental sampling of air quality has demonstrated the presence of airborne contaminants in a variety of settings beyond both anticipated and unexpected exposures in occupational scenarios. The recreational and residential activities of individuals also have unique exposure profiles that contribute to overall health, including exposures caused by vehicular pollution, cleaning products, or cooking [2]. When specific lifestyle choices are taken into consideration, this type of study is referred to as Total Exposure Health (TEH): a complete picture of the exposures a given individual is subjected to, used to perform risk assessment on groups or individuals. In the broad occupational setting of the USAF, an individualized sensor package could be designed specific to the hazards of different careers and TEH choices, tracking exposures over time and providing toxicologists and Big Data teams with data linked to health outcomes. The long-term goal is to use these data to recommend changes to safety procedures and to ensure personnel are healthy and able to perform at the highest level required by the mission.
Wearable sensors provide unique opportunities for TEH because small, battery-operated devices are portable enough to travel with an individual and produce a real-time exposure assessment without limiting regular activity over the course of an entire day, week, or month. One specific application in the USAF environment is for workers such as aircraft fuel tank maintainers, who work in a confined, hazardous environment with strict air quality monitoring standards. However, this type of technology could be critical for any commercial or USAF occupation requiring hands-free operation, such as maintenance workers, landscapers, laboratory or manufacturing professionals, or machinists. For these reasons, and because the wearable sensing industry has a multi-billion-dollar market value that is only expected to increase, a wearable air quality sensing device represents a compelling platform for TEH sensing.
Airborne contaminants in most environments include both volatile and aerosol constituents, which vary over time [3,4]. Therefore, it is important that airborne contaminants in high-risk environments be characterized on a regular basis to guide the development and maintenance of engineering controls, personal protective equipment, and exposure standards. Currently, a suite of real-time data logging and capturing devices is available for air quality assessments [5]. However, each device requires an independent set of supporting components, including pumps, data loggers, and communication ports, making the goal of wearable sensing extremely challenging. Moreover, each device must be calibrated and operated individually by skilled field technicians, and data are pulled from each device independently and are often in different formats. This work seeks to develop a modular, wearable air quality device that incorporates real-time sensor and collection technologies for volatiles and aerosols into a single device containing the minimum amount of hardware required for successful operation.
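One way to picture the "single device, minimum hardware" goal is a unified data record that all sensor modules write to, replacing the per-instrument formats described above. The sketch below shows such a record; the field names, units, location, and readings are illustrative assumptions only.

```python
# Sketch of a unified log record combining gas, aerosol, time, and location data.
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json

@dataclass
class ExposureSample:
    timestamp_utc: str                                   # ISO 8601 timestamp
    latitude: float                                      # decimal degrees
    longitude: float
    gases_ppm: dict = field(default_factory=dict)        # e.g. {"CO": 2.1}
    aerosol_counts: dict = field(default_factory=dict)   # counts per size cut-off
    temperature_f: float = None
    relative_humidity_pct: float = None

    def to_json(self) -> str:
        """Serialize the sample to a single shared log format."""
        return json.dumps(asdict(self))

# Hypothetical readings and location, for demonstration only.
sample = ExposureSample(
    timestamp_utc=datetime.now(timezone.utc).isoformat(),
    latitude=39.78, longitude=-84.11,
    gases_ppm={"CO": 2.1, "CO2": 450.0, "SO2": 0.01},
    aerosol_counts={"PM1.0": 1200, "PM2.5": 800, "PM4.0": 300, "PM10": 90},
)
print(sample.to_json())
```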
PHASE I: Gas-specific chemicals sensed in real time must include VOCs, CO2, CO, O2, CH2O, NH3, HCN, NOx, H2S, and SO2 at levels below exposure limits, validated by determining cross-sensitivity, limits of detection, and operating range. Aerosol counts and mass must be reported for size cut-offs of 1.0 (including ultrafine), 2.5, 4.0, and 10.0 microns. A "bread-boarded" benchtop prototype device is acceptable, with internal or external data storage, dimensions no larger than 6x6x12 in., and weight under 10 lbs.
PHASE II: The final prototype should weigh less than 1 lb., with dimensions no larger than 4x4x4 in., and include real-time gas-specific sensing and particle counts characterized under temperature (-20 to 120 °F), humidity (0-100%), and pressure (1-15 psi) conditions in the laboratory and validated in the operational environment. Emphasis is on sensor modularity and a graphical user interface (GUI) that allows calibration, modification of data collection parameters (e.g., logging intervals), data visualization and diagnostics, an extensible library, and geographical and spatial location reporting.
PHASE III DUAL USE APPLICATIONS: The vision is a wearable device that revolutionizes exposure assessments by providing a dynamic description of exposure hazards, enabling assessment of an individual's TEH profile and mitigation of risk scenarios such as overall exposure for security forces or aircraft maintainers. Commercial applications include the construction and mining industries.
REFERENCES:
1. Martin, J. A., Kwak, J., Harshman, S., Chan, K., Fan, M., Geier, B., Grigsby, C., Ott, D. International Journal of Environmental Analytical Chemistry, 2016, DOI:10.1080/03067319.2016.1160384.
2. C. Walgraeve, K. Demeestere, J. Dewulf, K. Van Huffel and H. Van Langenhove, Atmos. Environ. 45, 5828 (2011). doi:10.1016/j.atmosenv.2011.07.007.
3. National Institute for Occupational Safety and Health (NIOSH). NIOSH Pocket Guide to Chemical Hazards. http://www.cdc.gov/niosh/npg/. Last updated Feb 13, 2015. Accessed Feb 29, 2016.
4. Environmental Protection Agency (EPA). NAAQS Table. https://www.epa.gov/criteria-air-pollutants/naaqs-table. Last updated Feb 29, 2016. Accessed Feb 29, 2016.
5. National Institute for Occupational Safety and Health (NIOSH). NIOSH Manual of Analytical Methods. http://www.cdc.gov/niosh/docs/2003-154/method-2000.html. Last updated May 19, 2015. Accessed Feb 29, 2016.
KEYWORDS: Air quality, particulate matter, aerosols, volatile organic compounds, Environmental Health and Safety, exposure limits
AF171-033
TITLE: Mobile Activity Tracking System for Field Training and Exercise Assessment
TECHNOLOGY AREA(S): Human Systems
OBJECTIVE: Develop a precise system to track personnel location and activity during indoor and outdoor ground-based training.
DESCRIPTION: Live, virtual, and constructive (LVC) training has brought significant capabilities and cost savings to the fast-jet and Joint Terminal Attack Controller (JTAC) domains. LVC training also has great potential to provide more effective training and cost savings to ground-based exercises. While fast-jet LVC training mainly occurs in a virtual environment, ground-based exercises must occur in the live environment. Thus, a computer system must virtualize the live environment to allow for virtual and constructive injects. Currently, this is difficult to implement due to a lack of precise position tracking capability.
Previous efforts have identified partial solutions to the problem of position tracking within indoor environments, using ultra-wideband RFID or close-range sonic sensors, and within outdoor environments, using GPS with smoothing algorithms. These may work in some cases, but to provide ground-truth data for a simulation it is imperative that the tracking have a small margin of error. Further, currently available solutions can function either indoors or outdoors, but a common solution that can be used during both indoor and outdoor training is desired. Additionally, any wearable tracking devices must be small, lightweight, and long-range. Many current solutions provide extremely accurate information with close-range sensors, but a future solution that uses minimal equipment at long ranges to track an entire exercise is ideal. One last consideration is ease of installation: the system should ideally be installed quickly with simple calibration and be movable as necessary.
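As one sketch of how a common indoor/outdoor capability might combine sources, the example below selects whichever position source currently reports the lower error (e.g., ultra-wideband indoors, GPS outdoors) and applies simple exponential smoothing. The field names, error estimates, and smoothing constant are assumptions for illustration, not a proposed design.

```python
# Illustrative fusion of two position sources into one smoothed track.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Fix:
    x_m: float            # local east coordinate, meters
    y_m: float            # local north coordinate, meters
    est_error_m: float    # reported 1-sigma error of this source

class TrackSmoother:
    """Pick the lower-error source each update and apply exponential smoothing."""
    def __init__(self, alpha: float = 0.4):
        self.alpha = alpha
        self.x: Optional[float] = None
        self.y: Optional[float] = None

    def update(self, uwb: Optional[Fix], gps: Optional[Fix]) -> tuple:
        candidates = [f for f in (uwb, gps) if f is not None]
        best = min(candidates, key=lambda f: f.est_error_m)
        if self.x is None:
            self.x, self.y = best.x_m, best.y_m
        else:
            self.x += self.alpha * (best.x_m - self.x)
            self.y += self.alpha * (best.y_m - self.y)
        return self.x, self.y

smoother = TrackSmoother()
print(smoother.update(Fix(10.2, 5.1, 0.3), Fix(11.0, 4.0, 2.5)))  # indoors: UWB wins
print(smoother.update(None, Fix(12.0, 6.0, 2.5)))                 # outdoors: GPS only
```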
Future development requirements involve using this technology to create extremely accurate positional data for ground entities. The system would therefore need to interface with a Distributed Interactive Simulation (DIS) wrapper to create the entities being tracked within the system and provide their information to the rest of the simulation. This would likely require a method of adding additional data to certain entities, such as firing a weapon to create a munition Protocol Data Unit (PDU) or handling an entity entering a vehicle. DIS interoperability would enable this technology to be used to its fullest and would provide a common framework for managing the data from a system of this type to create meaningful results. The system should also be capable of using other networks already in place to pass the data.
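To illustrate one piece of the DIS interface described above: IEEE 1278.1 entity state PDUs carry position in earth-centered (geocentric) coordinates, so a wrapper must convert tracker latitude/longitude/altitude into ECEF meters. The sketch below shows that standard WGS-84 conversion plus a simplified stand-in record; the record class is illustrative only and is not the actual PDU wire format.

```python
# Geodetic-to-geocentric conversion a DIS wrapper would need, plus a
# simplified (non-wire-format) entity record for a tracked ground entity.
import math
from dataclasses import dataclass

WGS84_A = 6378137.0                 # semi-major axis, m
WGS84_F = 1.0 / 298.257223563       # flattening
WGS84_E2 = WGS84_F * (2.0 - WGS84_F)

def geodetic_to_ecef(lat_deg: float, lon_deg: float, alt_m: float) -> tuple:
    """Convert WGS-84 geodetic coordinates to earth-centered, earth-fixed meters."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = WGS84_A / math.sqrt(1.0 - WGS84_E2 * math.sin(lat) ** 2)
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - WGS84_E2) + alt_m) * math.sin(lat)
    return x, y, z

@dataclass
class SimplifiedEntityState:        # placeholder fields, not the IEEE 1278.1 layout
    entity_id: int
    ecef_position_m: tuple
    appearance: str = "dismounted"

tracked = SimplifiedEntityState(
    entity_id=42,                                   # hypothetical tracked airman
    ecef_position_m=geodetic_to_ecef(34.65, -99.27, 420.0),
)
print(tracked)
```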
PHASE I: Identify and define the maximum error/tolerance level in personnel tracking that is allowable under the DIS protocol. Also identify the accuracy needed to realistically interact with virtual or constructive entities. Design a concept for the method(s) to achieve this accuracy and tolerance in position tracking that is usable in both indoor and outdoor training.
PHASE II: Develop, test, and demonstrate a prototype personnel tracking method based on Phase I work. The prototype should track personnel within the defined error level and should be capable of indoor/outdoor use. If wearable tracking solutions are included, they should be small, lightweight, and untethered.
PHASE III DUAL USE APPLICATIONS: Create DIS interoperability software to make the technology accessible to many users. Demonstrate the ability to inject the live players into a DIS simulation for interaction with other entities. Transition the prototype and complete market analysis.
REFERENCES:
1. Claire Heininger Schwerin (2011). Army fields next-generation blue force tracking system. http://www.army.mil/article/61624/. Accessed 2016.
2. IEEE Standard for Distributed Interactive Simulation Applications Protocols, IEEE Standard 1278.1-2012.
3. Reitz, E. A., & Seavey, K. (2014). Distributed Live/Virtual Environments to Improve Joint Fires Performance. Interservice/Industry Training, Simulation, and Education Conference (IITSEC), 2014.
4. Schreiber, B. T., Schroeder, M., & Bennett Jr., W. (2011). Distributed Mission Operations Within-Simulator Training Effectiveness. The International Journal of Aviation Psychology, 21(3), 254-268.
KEYWORDS: immersive training, battlefield airmen, airmen tracking, dismounted tracking, blue force tracking
AF171-034
TITLE: Simulator Common Architecture Requirements and Standards (SCARS)
TECHNOLOGY AREA(S): Information Systems
The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with section 5.4.c.(8) of the solicitation and within the AF Component-specific instructions. Offerors are advised foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws. Please direct questions to the AF SBIR/STTR Contracting Officer, Ms. Gail Nyikon, gail.nyikon@us.af.mil.
OBJECTIVE: Develop a benchmark simulator training system architecture that is sustainable in a dynamic cyber security environment.
DESCRIPTION: USAF simulators and training systems managed by the Simulators Division, AFLCMC/WNS, are acquired and sustained to meet user training requirements while also meeting performance standards for concurrency, fidelity, and availability. The ability of a simulator to meet these performance standards, in order to provide realistic and reliable training, has historically been the primary consideration when awarding a contract to acquire and/or sustain USAF training systems. Emerging, more stringent cyber security requirements are introducing new considerations that must now also be taken into account.