Air Force 17.1 Small Business Innovation Research (SBIR) Phase I Proposal Submission Instructions




The ARM should control the positions of the UAVs and their corresponding sensor look angles based on pre-defined mission objectives and real-time threat information. The scenario for this topic is an HVA clearing mission: the UAV swarm should reveal, surveil, and detect the IADS and return Category 2 targeting coordinates (7-15 meter CEP90) to the HVA so it can target the IADS and proceed into the A2/AD environment. Accomplishing this mission requires a composition of platforms, each hosting a single sensor type, including electro-optic/infrared (EO/IR), synthetic aperture radar (SAR), radio frequency direction finding (RFDF), electronic support (ES), electronic attack (EA), and long-range communications. The ARM should coordinate the positions of each of the sensors to effectively complete the mission objective both with the intact swarm and after attrition of any of the functional sensors. The ARM should work effectively through no-threat environments, jamming environments, and active engagement from the threat resulting in swarm attrition. The ARM should be distributed across the swarm and be able to coordinate the swarm for resilience against attrition and sensor failures. The ARM should account for sensor type when determining the actions of each sensor, since different sensors will require different tasking to accomplish seemingly similar tasks.
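One way the attrition-resilience requirement above can be made concrete is a task-reassignment loop in which sensing tasks are re-allocated to surviving platforms of the required sensor type. The sketch below is a minimal, centralized greedy stand-in for such logic; the class names, task names, and scenario data are illustrative assumptions, not part of the solicitation.

```python
# Illustrative greedy task reassignment for a heterogeneous sensor swarm.
# All names and data here are hypothetical, for demonstration only.
from dataclasses import dataclass
from math import hypot

@dataclass
class Platform:
    pid: int
    sensor: str   # e.g. "EO/IR", "SAR", "RFDF", "ES", "EA", "COMM"
    x: float
    y: float
    alive: bool = True

def assign_tasks(platforms, tasks):
    """Greedy auction stand-in: each task goes to the nearest surviving,
    still-unassigned platform carrying the required sensor type.
    Returns {task_name: platform id, or None if the sensor type is attrited}."""
    assignment, taken = {}, set()
    for name, (sensor, tx, ty) in tasks.items():
        candidates = [p for p in platforms
                      if p.alive and p.sensor == sensor and p.pid not in taken]
        if not candidates:
            assignment[name] = None     # no surviving platform can cover it
            continue
        best = min(candidates, key=lambda p: hypot(p.x - tx, p.y - ty))
        taken.add(best.pid)
        assignment[name] = best.pid
    return assignment

swarm = [Platform(0, "EO/IR", 0, 0), Platform(1, "SAR", 5, 5),
         Platform(2, "EO/IR", 9, 1), Platform(3, "RFDF", 2, 8)]
tasks = {"image_site_A": ("EO/IR", 1, 1),
         "map_site_B": ("SAR", 4, 4),
         "geolocate_emitter": ("RFDF", 3, 7)}

print(assign_tasks(swarm, tasks))       # platform 0 covers image_site_A
swarm[0].alive = False                  # simulate attrition of one EO/IR node
print(assign_tasks(swarm, tasks))       # image_site_A falls to platform 2
```

A fielded ARM would run distributed (e.g., consensus-based auctions) rather than centralized, but the same re-bid-on-attrition behavior is the core of the resilience requirement.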


The ARM should be scalable to operate on Group 1 through Group 5 UAVs and any size swarm, with an emphasis on smaller (Groups 1 and 2) platforms and moderate-size swarms (10-20 platforms). If restricted to Group 1 and 2 UAVs, the payload, including sensor, data product processing, and ARM processing, should be 2 to 15 pounds. A survey of COTS sensors and an understanding of processing requirements for various data product outputs should be conducted as part of this topic, but is not the focus. The focus is to develop the ARM algorithms in a low Size, Weight, and Power (SWaP) configuration on appropriate processing hardware (Field Programmable Gate Array (FPGA), in situ with a commercial autopilot, Central Processing Unit (CPU), etc.).

PHASE I: Develop ARM algorithms to coordinate sensing across multiple platforms as outlined in the description. Propose a mix of sensor types for a swarm of 10 UAVs. Develop a processing architecture for the ARM and sensor data products. Create a laboratory test plan for execution in Phase II to demonstrate performance in the “focusing mission” by being able to defeat a generic air defense system. Present an architecture for two sensor types that meets a 15-pound payload requirement.

PHASE II: Build two prototype processors to test in a laboratory configuration. Demonstrate that the ARM is capable of accomplishing the focusing mission in a laboratory environment by showing coordination across the systems and an appropriate level of abstraction of sensor inputs and outputs for integration with a UAV and COTS sensors in Phase III. Develop a flight test plan to test the ARM using 3-4 UAVs. Explore other extensions of the ARM processor for distributed coherent sensing applications or other interest areas as coordinated with the customer.

PHASE III DUAL USE APPLICATIONS: Develop a transition strategy for the ARM processing capability. Conduct a flight test with three (3) to four (4) UAVs to verify the functionality of the ARM for the focusing mission and other missions as determined by the sponsor or performer. If applicable, implement other extensions of the ARM processor as documented in Phase II.

REFERENCES:

1.  Small Unmanned Aircraft Systems (SUAS) Flight Plan: 2016-2036; http://www.af.mil/Portals/1/documents/isr/Small_UAS_Flight_Plan_2016_to_2036.pdf

2.  L. Geng, Y. F. Zhang, P. F. Wang, J. J. Wang, J. Y. H. Fuh, and S. H. Teo, "UAV surveillance mission planning with gimbaled sensors," 11th IEEE International Conference on Control & Automation (ICCA), Taichung, 2014, pp. 320-325. URL: http://ieeexplore.ieee.org.wrs.idm.oclc.org/stamp/stamp.jsp?tp=&arnumber=6870939

3.  F. Barbaresco, J. C. Deltour, G. Desodt, B. Durand, T. Guineas, and C. Labreuche, "Intelligent M3R Radar Time Resources management: Advanced cognition, agility & autonomy capabilities," 2009 International Radar Conference "Surveillance for a Safer World."

4.  P. E. Berry and D. A. B. Fogg, "On the use of entropy for optimal radar resource management and control," Proceedings of the International Radar Conference, 2003, pp. 572-577. doi: 10.1109/RADAR.2003.1278804

5.  T. Hanselmann, M. Morelande, B. Moran, and P. Sarunic, "Sensor scheduling for multiple target tracking and detection using passive measurements," 2008 11th International Conference on Information Fusion, Cologne, 2008, pp. 1-8.

KEYWORDS: Sensor resource management, radar, EO/IR, Unmanned Aerial Systems (UAS)


AF171-118

TITLE: Develop a Small Pitch LADAR Receiver for Low SWAP Sensing

TECHNOLOGY AREA(S): Battlespace, Electronics, Sensors

The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with section 5.4.c.(8) of the solicitation and within the AF Component-specific instructions. Offerors are advised foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws. Please direct questions to the AF SBIR/STTR Contracting Officer, Ms. Gail Nyikon, gail.nyikon@us.af.mil.

OBJECTIVE: Design, fabricate, and test laser detection and ranging (LADAR) sensor laser and detector components for 3-D imaging. The technologies should be designed to support operations from a platform with tight size, weight, power, and cost (SWaP-C) constraints.

DESCRIPTION: The AF has a need to develop sensors to support Globally Integrated Intelligence, Surveillance, and Reconnaissance (GIISR) and Global Precision Attack (GPA) capabilities. This includes the development of high-sensitivity, very narrow field of view optical sensors for target identification. Typical sensing scenarios cue the LADAR to an object or location to collect a sequence of return pulse measurements. This measured data may be processed for immediate display in the cockpit, downlinked to an image analyst, and/or sent to an automatic or aided target recognition system. Current linear mode and Geiger mode receivers in production or under development provide good sensitivity and clocking rates in large format arrays, as indicated in References 1 and 3. Recent linear mode receivers are designed to provide clock rates on the order of nanoseconds, while Geiger mode receivers are an order of magnitude faster. Pixel pitch of production receivers is < 100 um, while newer development nears 30 um. Sensitivity of large array linear mode receivers has not reached the capability of counting photons. Ranging performance and sensitivity are primary drivers in the receiver.

In an effort to improve LADAR focal plane arrays (FPA), this topic seeks to reduce the size of the unit cell in the readout integrated circuit (ROIC) of a linear mode or Geiger mode LADAR and to improve the detector sensitivity of avalanche photodiodes (APD), minimizing detector pitch while improving sensitivity and range precision. Of interest are new design ideas for timing circuits and noise suppression. The goal of the unit cell design is to develop a high-bandwidth (BW) detector with single photon detection sensitivity, capable of providing 2-D imagery, with high dynamic range for generation of 3-D point clouds with improved range estimation performance. It is envisioned that innovation in the detection logic and timing circuitry may address this challenge; however, all approaches are of interest. Performance goals include noise equivalent input < 10 photons, single photon sensitivity, detection BW of GHz to THz at short scene depths, capability of imaging scene depths from 30 m to over 500 m, data rates supporting video frame rates of > 20 Hz, and a 1024 x 1024 or larger format array with pixel pitch < 30 um operating in the 1 to 2 um wavelength region. Low crosstalk, large dynamic range (> 10 bit), cooling, and damage threshold are considerations. A focus on the development of APDs or the ROIC is acceptable, with a requirement of developing a path toward integration and completion of a full LADAR receiver. The initial design should be 16 x 16 format or larger.
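The timing goals above can be sanity-checked with the round-trip time-of-flight relation ΔR = c·Δt/2, which ties a receiver's timing resolution to its range precision. A brief back-of-envelope sketch (the specific clock values are illustrative, not solicitation requirements):

```python
# Back-of-envelope LADAR timing numbers; clock values are illustrative.
C = 299_792_458.0            # speed of light, m/s

def range_bin(dt_seconds):
    """Round-trip time of flight: one timing bin dt maps to c*dt/2 in range."""
    return C * dt_seconds / 2.0

print(range_bin(1e-9))       # 1 ns clock   -> ~0.15 m range bin
print(range_bin(1e-10))      # 100 ps clock -> ~0.015 m range bin
# A 500 m scene depth corresponds to a round-trip return gate of:
print(2 * 500 / C)           # ~3.3 microseconds
```

This is why nanosecond-class linear mode clocks sit at ~15 cm range bins, and why GHz-class detection bandwidth is needed to push range precision toward the centimeter level.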

The detection circuit logic for each unit cell is expected to be the same for the entire array, but could be implemented to provide various detection methods such as first surface, first and last surface, multiple surfaces, etc., supporting imaging through obscurants such as camouflage netting. The LADAR receiver would be capable of being triggered from a pulsed laser, potentially digitizing the outgoing pulse. Cost and SWaP for the full receiver would need to be considered for the final design, where low-cost insertion into small unmanned aerial vehicles (SUAV) is a goal. As a threshold, a receiver with supporting electronics should fit into a 4” x 4” x 4” volume, and the cost of a complete LADAR system should be < $250k.

No required use of Government materials, equipment, facilities or data is envisioned.

PHASE I: Develop design ideas for unit cells, detection logic, detectors, and laboratory test configurations for fabrication of a direct detection LADAR receiver and provide circuit simulations. Develop a program plan, Statement of Work (SOW), and performance expectations for a small format 16 x 16 or larger receiver, scalable to 1024 x 1024 or larger. Develop a commercialization plan.

PHASE II: Design, develop, and test a 16 x 16 format or larger LADAR receiver capable of photon counting. Provide laboratory test results and details of test methods. Deliverables include a small format LADAR receiver with supporting electronics for laboratory testing. A focus on the development of APDs or the ROIC is acceptable. Develop a program plan, SOW, and performance expectations for a large format receiver capable of insertion into an SUAV, targeting pod, or turret.

PHASE III DUAL USE APPLICATIONS: Design, develop, and test a large format receiver as a brassboard system capable of flight testing in an airborne laboratory type environment. Develop a program plan to integrate into a targeting pod or turret and perform flight testing. Explore integration into SUAVs and fielded systems.

REFERENCES:

1. Michael Jack, George Chapman, John Edwards, William McKeag, Tricia Veeder, Justin Wehner, Tom Roberts, Tom Robinson, James Neisz, Cliff Andressen, Robert Rinker, Donald N. B. Hall, Shane M. Jacobson, Farzin Amzajerdian, and T. Dean Cook, "Advances in LADAR Components and Subsystems at Raytheon," Proc. SPIE 8353, Infrared Technology and Applications XXXVIII, 83532F (May 1, 2012).

2. Mark A. Itzler, Mark Entwistle, Mark Owens, Ketan Patel, Xudong Jiang, Krystyna Slomkowski, Sabbir Rangwala, Peter F. Zalud, Tom Senko, John Tower, and Joseph Ferraro, "Design and performance of single photon APD focal plane arrays for 3-D LADAR imaging," Proc. SPIE 7780, Detectors and Imaging Devices: Infrared, Focal Plane, Single Photon, 77801M (August 17, 2010).

3. Xiaogang Bai, Ping Yuan, James Chang, Rengarajan Sudharsanan, Michael Krainak, Guangning Yang, Xiaoli Sun, and Wei Lu, "Development of high-sensitivity SWIR APD receivers," Proc. SPIE 8704, Infrared Technology and Applications XXXIX, 87042H

4. J. D. Beck, R. Scritchfield, P. Mitra, et al., "Linear mode photon counting with the noiseless gain HgCdTe e-avalanche photodiode," Opt. Eng. 53(8), 081905 (Apr 25, 2014).

KEYWORDS: LADAR, photon counting, 3-D imaging


AF171-119

TITLE: NIIRS 9 Small Unmanned Aircraft System (SUAS) 5-Inch Gimbal

TECHNOLOGY AREA(S): Air Platform

The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with section 5.4.c.(8) of the solicitation and within the AF Component-specific instructions. Offerors are advised foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws. Please direct questions to the AF SBIR/STTR Contracting Officer, Ms. Gail Nyikon, gail.nyikon@us.af.mil.

OBJECTIVE: Develop a National Imagery Interpretability Rating Scale (NIIRS) 9 capable 5 inch gimbal suitable for Common Launch Tube (CLT)-compatible SUAS platforms.

DESCRIPTION: Small Unmanned Aircraft Systems (SUAS) are proliferating across the USAF. They fit into the USAF Intelligence, Surveillance, and Reconnaissance (ISR) strategy by providing local persistence, operation in contested environments, and low probability of detection. As the utility of SUAS is demonstrated, demands on their performance increase.

Small SUAS gimbals currently achieve NIIRS 6-7 but not NIIRS 9. Small gimbals are strongly affected by environmental conditions and aircraft vibration, and traditional stabilization techniques are difficult to implement in them. Improved imaging performance depends on the camera and optics that make up the sensor system. All of these are constrained by Size, Weight, Power, and Cost (SWaP-C) limitations.

This effort seeks a 5” diameter gimbal suitable for Common Launch Tube (CLT)-deployed SUAS of less than 50 pounds operating at a typical slant range of 1500 feet and altitude of 200 feet and higher. The gimbal shall provide full-color visible-band images and full-color motion video (T). It shall provide the operator with situational awareness, target detection and tracking, and an on-gimbal multi-mode video autotracker (T). This might be accomplished by providing simultaneous, sequential, or selectable NIIRS 9 chip-outs and a motion video stream.

The visible-band images/chip-outs shall achieve NIIRS 9 (T). NIIRS defines and measures the quality of images and the performance of imaging systems. Although primarily applied in the evaluation of satellite imagery, it provides a systematic approach to measuring image quality. NIIRS 9 would allow an analyst to “Identify (read) vehicle registration (license plate) numbers (VRN)”.

The NIIRS 9 images shall: be visually lossless after transmission (T); have a minimum frame rate of 0.5 (T) to 5 (O) hertz; and have ground coverage sufficient to fully encompass the rear aspect of a vehicle (pickup truck) (T) to 20 x 20 feet (O) at the range/altitude above.
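Published visible-band NIIRS tables commonly associate NIIRS 9 with a ground sample distance (GSD) of roughly 0.1 m or finer, and reading license plate characters in practice demands a few centimeters. As a rough optics sizing check for the stated geometry, the pinhole relation GSD = range x pitch / focal_length can be inverted; the pixel pitch and GSD targets below are illustrative assumptions, not solicitation requirements:

```python
# Rough optics sizing for the stated geometry; pitch and GSD targets are
# illustrative assumptions, not requirements from the topic.
SLANT_RANGE_M = 1500 * 0.3048        # 1500 ft slant range from the topic
PIXEL_PITCH_M = 2.2e-6               # assumed small-format visible FPA pitch

def focal_length_for_gsd(gsd_m):
    """Pinhole approximation: GSD = range * pitch / focal_length."""
    return SLANT_RANGE_M * PIXEL_PITCH_M / gsd_m

print(focal_length_for_gsd(0.10))    # ~0.010 m (10 mm) for 10 cm GSD
print(focal_length_for_gsd(0.02))    # ~0.050 m (50 mm) for 2 cm GSD
```

Focal lengths of this order suggest the aperture, not the optics length, is the binding constraint in a 5” gimbal, which is consistent with the topic's emphasis on stabilization and processing rather than exotic optics.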

The motion video field-of-view (FOV), resolution, image latency, and frame rate shall allow the operator to maintain scene situational awareness, search for, detect, and manually and automatically track vehicle targets and dispersed personnel (T).

Motion video and NIIRS 9 capability should be provided simultaneously (O). The FOVs shall be operator-steerable over a large part of a lower hemisphere field-of-regard (T). The imaging system(s) shall have focus, non-uniformity correction, color balance, gamma, aperture, and exposure control with no or limited operator intervention (T). The resulting data stream(s) shall be transmittable over existing SUAS-capable encrypted digital datalinks (T). The NIIRS 9 data stream shall not interrupt or degrade pointing accuracy, autotracking capability, or image stability (T), or measurably degrade operator situational awareness (O).

This effort will not develop new gimbal structures; it will develop a payload and processing capability. An off-the-shelf gimbal or mature prototype is the expected starting point. The gimbal shall support typical SUAS maneuvering (orbits, racetracks, fly-bys) and fly-ins, and shall compensate for disturbances due to gusts and air turbulence. The gimbal should provide accurate line-of-sight pointing data, on-gimbal inertial measurement, and an interface to the platform GPS to allow for 10-15 meter Circular Error (CE90) geo-coordinate accuracy at range (T).
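A first-order way to see how the 10-15 meter CE90 budget above decomposes is to root-sum-square the independent error contributions, e.g. line-of-sight angular error scaled by range plus platform GPS error. The contribution values below are illustrative assumptions for sizing, not figures from the topic:

```python
# First-order geolocation error budget; contribution values are
# illustrative assumptions, not from the solicitation.
from math import sqrt

def geo_error_rss(range_m, los_err_mrad, gps_err_m):
    """RSS of cross-range pointing error (range * angle) and GPS error."""
    pointing = range_m * los_err_mrad * 1e-3   # mrad -> rad, small-angle
    return sqrt(pointing**2 + gps_err_m**2)

# 1500 ft slant range, 10 mrad total LOS error, 3 m GPS error:
print(geo_error_rss(1500 * 0.3048, 10.0, 3.0))   # ~5.5 m
```

Under these assumptions, roughly 10 mrad of total attitude/boresight error at the stated slant range still fits inside the 10-15 m CE90 budget, which indicates how much stabilization and pointing-knowledge accuracy the on-gimbal inertial measurement must deliver.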

This effort will include both hardware and software approaches to providing a complete 5” gimbal package. Hardware solutions may include improved stabilization and pointing accuracy of the gimbal. This effort should not require focal plane array development, but should take advantage of the current state of the art.

PHASE I: Identify hardware requirements for the gimbal, including stabilization, optics, processor, and imager. Consider software approaches and algorithms as appropriate. Prepare a preliminary design of the gimbal and payload. Use modeling and simulation to justify performance. Conduct requirements and design reviews (SRR, PDR). Demonstrate critical subsystems using representative hardware if possible.

PHASE II: Perform detailed design of the gimbal and payload. Conduct a Critical Design Review. Continue modeling and simulation of system performance. Build a prototype gimbal and payload. Provide all safety of flight data required by USAF and/or FAA authorities. Conduct a Test Readiness Review. Evaluate performance in laboratory (T), tower (T), motion base (O), and flight test (O) environments. Flight testing on a contractor surrogate is preferred. A tower test location will be provided by the Government.

PHASE III DUAL USE APPLICATIONS: Refine design based on outcomes of tests and customer feedback in Phase II; develop a manufacturing plan and/or select a partner for production of 5” gimbals.

REFERENCES:

1.  Air Force Unmanned Aerial System (UAS) Flight Plan 2009-2047, http://www.govexec.com/pdfs/072309kp1.pdf

2.  USAF Intelligence Targeting Guide, http://fas.org/irp/doddir/usaf/afpam14-210/part13.htm

3.  Common Launch Tube Technology, http://www.systima.com/

4.  Design of a Tube-Launched UAV, https://www.researchgate.net/publication/267377594_Design_of_a_Tube-Launched_UAV

KEYWORDS: SUAS, Imaging, NIIRS




AF171-120

TITLE: Direct Measurement of Targeting Coordinates from Small Unmanned Aerial Systems (SUAS) EO/IR Gimbal Systems

TECHNOLOGY AREA(S): Air Platform

OBJECTIVE: Develop software and hardware solutions to enable direct measurement of targeting coordinates from gimbals mounted on SUAS.

DESCRIPTION: Accurate, timely, and reliable targeting is critical for Air Force strike operations. Fighter aircraft and reconnaissance platforms use high performance targeting pods (Sniper and LITENING) and gimbals (Multispectral Targeting System) to provide this capability. However, new USAF strategies are increasingly relying on Small Unmanned Aircraft Systems (SUAS) to provide situational awareness and targeting information. See for example the Loyal Wingman concept envisioned in the AF SUAS Flight Plan, whereby SUAS provide lethal strike support to their lead aircraft with “…real-time precision targeting.”

The high performance targeting pods and gimbals on larger aircraft take advantage of the Direct Method for obtaining targeting coordinates. This approach relies on the accurate knowledge that these sensor systems have of their three-dimensional state data (altitude, position, and orientation) acquired from their Inertial Measurement Units (IMU) and Global Positioning Systems (GPS) sensors, combined with a laser rangefinder for information about the object of interest. While this approach is appropriate for large gimbals that have the available Size, Weight, Power, and Cost (SWaP-C) to host the required high-quality sensors, these large and expensive sensors are not a possibility for SUAS.

The objective of this topic is to demonstrate a <5” gimbal generating own-camera metadata and range estimates sufficient to enable targeting. The goal is to achieve Cat-1 level targeting coordinates as specified in Reference 1 at slant ranges of less than 2 km (threshold; 5 km objective), together with the uncertainty information for any geo-location estimate to verify its accuracy. This effort will include both hardware and software approaches to meeting the targeting requirements for a SUAS. Hardware solutions may include improved stabilization and pointing accuracy of the gimbal and sensors. Software solutions may include using video information provided by the sensor system to reduce errors in the extrinsic (location and attitude) and intrinsic (focal length) calibration values of the camera. This approach is made possible by recent advances in processing with multi-core processors developed for the cell phone industry. Video outputs from this gimbal should be standards compliant, including MISB 1107 data as required for targeting.
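In its simplest flat-earth form, the Direct Method described above reduces to projecting the measured slant range along the gimbal line of sight from the known sensor position. The sketch below shows that computation in a local east-north-up (ENU) frame; the frame convention, function name, and geometry values are illustrative simplifying assumptions, not the full targeting chain (which must also propagate IMU/GPS uncertainty):

```python
# Minimal flat-earth Direct Method sketch in a local ENU frame.
# Names, conventions, and values are illustrative assumptions.
from math import sin, cos, radians

def direct_target_enu(sensor_enu, az_deg, el_deg, slant_range_m):
    """Azimuth is measured clockwise from north; elevation is negative
    below horizontal. Returns the target position in the same ENU frame."""
    az, el = radians(az_deg), radians(el_deg)
    e = slant_range_m * cos(el) * sin(az)
    n = slant_range_m * cos(el) * cos(az)
    u = slant_range_m * sin(el)
    x, y, z = sensor_enu
    return (x + e, y + n, z + u)

# Sensor at ~200 ft (61 m) AGL, looking 30 deg below horizontal toward
# the north-east, with a 122 m measured range:
print(direct_target_enu((0.0, 0.0, 61.0), 45.0, -30.0, 122.0))
# -> target on the ground (up-component ~0), ~75 m east and north of the sensor
```

Because the target estimate is linear in the range and (to first order) in the angles, small-gimbal attitude errors map directly into coordinate error, which is exactly why the software calibration-refinement approaches mentioned above matter for SUAS-class hardware.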

While the specific targeting application will not be as applicable outside of the military, approaches to provide better geo-positioning and uncertainty estimates will be of use in non-military applications.

PHASE I: Identify the hardware requirements (including sensor performance) needed to achieve the desired gimbal performance. Develop software approaches and algorithms for reducing errors in three-dimensional state data. Design a prototype 5” gimbal. Perform modeling and simulation to justify targeting coordinate and uncertainty accuracy.

PHASE II: Continue modeling and simulation to improve software approaches, algorithms and hardware performance. Based on these results, build or modify a prototype 5” gimbal. Evaluate gimbal performance in laboratory and flight test environments.

PHASE III DUAL USE APPLICATIONS: Refine design based on outcomes of tests and customer feedback in Phase II; develop a manufacturing plan and/or select a partner for production of 5” gimbals.

REFERENCES:

1. Joint Publication 3-09.3, "Close Air Support", http://www.public.navy.mil/fltfor/ewtglant/Documents/courses/cin/TACP%20Docs%20K-2G-3615/JP%203-09.3%20CAS%20Jul%2009.pdf

2. Andersen, ED and Taylor, CN. “Improving MAV pose estimation using visual information,” in Proceedings, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2007.

3. Li, M and Mourikis, A. “Vision-aided inertial navigation for resource-constrained systems,” in Proceedings, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2012.

KEYWORDS: Targeting Coordinates; SUAS; Geolocation


AF171-121

TITLE: Integrated Circuit (IC) Die Extraction and Reassembly (DER) for More Complicated ICs

TECHNOLOGY AREA(S): Battlespace, Electronics, Sensors

The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with section 5.4.c.(8) of the solicitation and within the AF Component-specific instructions. Offerors are advised foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws. Please direct questions to the AF SBIR/STTR Contracting Officer, Ms. Gail Nyikon, gail.nyikon@us.af.mil.

