About
Advances in low-cost, low-power silicon radio frequency (RF) integrated circuits (ICs) over the last two decades have opened up commercial applications for millimeter wave (mmWave) frequencies, which are an order of magnitude higher than those used in WiFi and cellular today. Large-scale deployment of mmWave communication networks, such as NextG cellular infrastructure outdoors and NextG WiFi infrastructure indoors, implies that these resources can be leveraged for RF imaging at scales that are not otherwise possible. This project seeks to lay the intellectual foundations for Joint Communication and Imaging (JCAI) at city scales using this emerging mmWave infrastructure. Each sensor in such a system provides 4D measurements (range, Doppler, azimuth angle and elevation angle) whose resolution improves at higher frequencies.
This three-year project (2022-25) is funded by the National Science Foundation under grant CNS-2215646. It is a cross-disciplinary collaboration between research leaders in communications & MIMO radar and imaging, robotics for communication and sensing, RFIC design and packaging for massive mmWave MIMO arrays, and large-scale programmable networks for communications and sensing.
Technical Approach
We take advantage of 4D measurements (range, Doppler, azimuth and elevation angles) at unprecedented resolution: higher carrier frequencies enhance Doppler resolution; larger bandwidths enhance range resolution; and tiny carrier wavelengths make it possible to compactly realize 2D antenna arrays with a large number of elements, enhancing azimuth and elevation angular resolution (a back-of-the-envelope calculation of these resolutions is sketched after the list below). The key aspects of our technical plan are as follows:
(1) Significantly increasing imaging resolution by creating large effective apertures with networked collaboration, using the scale provided by fixed wireless infrastructure along with strategic use of unmanned vehicles;
(2) Developing a control plane for multi-function network operation for JCAI, including a resource management framework based on concepts such as imaging demand and imaging capacity, and protocols supporting collaborative imaging;
(3) Developing platforms for experimentation in JCAI based on off-the-shelf mmWave hardware at 60 and 77 GHz, as well as hardware beyond 100 GHz developed under other programs.
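To make the resolution relations stated before this list concrete, the short sketch below computes first-order range, velocity, and angular resolutions from standard FMCW/array formulas. The parameter values are illustrative examples only, not the project's operating specifications.

```python
# Illustrative first-order resolution estimates for a mmWave MIMO radar.
# All parameter values below are examples, not the project's specifications.
C = 3e8  # speed of light (m/s)

fc = 77e9          # carrier frequency (Hz)
bandwidth = 4e9    # chirp sweep bandwidth (Hz)
cpi = 20e-3        # coherent processing interval (s)
n_elems = 64       # antenna elements along one array dimension
wavelength = C / fc
spacing = wavelength / 2  # half-wavelength element spacing

range_res = C / (2 * bandwidth)                 # larger bandwidth -> finer range
velocity_res = wavelength / (2 * cpi)           # higher carrier -> finer Doppler
angular_res = wavelength / (n_elems * spacing)  # larger aperture -> finer angle (rad)

print(f"range resolution    : {range_res * 100:.1f} cm")
print(f"velocity resolution : {velocity_res * 100:.1f} cm/s")
print(f"angular resolution  : {angular_res * 57.3:.2f} deg (approx. beamwidth)")
```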
Project Team
PIs
Graduate students and postdocs (Current)
Graduate students and postdocs (Previous)
Research Activities
Our research activities include demonstrating novel approaches to utilizing RF sensing, developing architectures and signal processing algorithms for networked sensing, and developing hardware for accessing bands beyond 100 GHz.
Imaging edges: When there is no motion, imaging still objects with radio waves is challenging because Doppler information is no longer available for segmenting objects. While most prior work has focused on imaging object surfaces, we approach the RF imaging problem from a different perspective by focusing on object edges. When an incoming wave is incident on an edge, it produces a cone of outgoing rays, referred to as the Keller cone, dictated by the Geometrical Theory of Diffraction. Depending on the location and orientation of the edge, this cone leaves different footprints on a receiver array, which can act as a signature for imaging purposes. We have proposed a new processing pipeline around this principle, which can image still objects by tracing their edges. Analysis and experimental results for RF-based edge tracing have been published in IEEE RadarConf 2023.
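The geometry behind the Keller cone is simple to state: diffracted rays make the same angle with the edge as the incident ray, so the component of the propagation direction along the edge is preserved. The sketch below samples outgoing directions on such a cone; it is a geometric illustration only, not the project's edge-tracing pipeline, and the function name is ours.

```python
import numpy as np

def keller_cone_rays(edge_dir, incident_dir, n_rays=16):
    """Sample outgoing ray directions on the Keller cone of a straight edge.

    Per the Geometrical Theory of Diffraction, diffracted rays make the same
    angle with the edge as the incident ray, so the component of the
    propagation direction along the edge is preserved. Illustrative only.
    """
    e = edge_dir / np.linalg.norm(edge_dir)
    d = incident_dir / np.linalg.norm(incident_dir)
    axial = np.dot(d, e)                              # preserved edge-parallel component
    radial = np.sqrt(max(1.0 - axial**2, 0.0))
    # Orthonormal basis {u, v} in the plane perpendicular to the edge.
    tmp = np.array([1.0, 0.0, 0.0]) if abs(e[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(e, tmp); u /= np.linalg.norm(u)
    v = np.cross(e, u)
    phis = np.linspace(0.0, 2 * np.pi, n_rays, endpoint=False)
    return np.array([axial * e + radial * (np.cos(p) * u + np.sin(p) * v) for p in phis])

# Example: a vertical edge illuminated by a ray arriving at 45 degrees.
rays = keller_cone_rays(np.array([0.0, 0.0, 1.0]), np.array([1.0, 0.0, 1.0]))
```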
Beyond field-of-view target localization: mmWave radar systems have a limited imaging field of view due to their high directionality and reliance on single-bounce paths that scatter once off objects in the environment before being received at the system. We have proposed a method that exploits natural multi-bounce scattering in the environment to enable millimeter-wave radar imaging of objects beyond the single-bounce field of view, for instance objects around corners and behind the radar. Our method exploits multiple orders of multi-bounce (from single-bounce to triple-bounce) and requires no additional hardware or prior knowledge about the environment. There are two core innovations: (i) a matched filtering algorithm that directly localizes objects at their ground-truth locations along specific multi-bounce paths, and (ii) a sequential iterative pipeline that performs matched filtering and object detection separately and sequentially along single-, double- and triple-bounce paths, using object detections from previous iterations to compensate for the radar’s lack of prior environment knowledge. Our implementation on a commercial millimeter-wave MIMO radar testbed demonstrates a 2×-10× improvement in median localization error for humans standing outside the radar’s field of view in a variety of indoor and outdoor scenarios. Our results are to be presented at ACM MobiCom 2024.
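The core intuition behind matched filtering along a multi-bounce path is that the total path length from transmitter to receiver via known bounce points determines which delay (range) bin an out-of-view target appears in. The sketch below scores candidate locations along a hypothesized bounce via a single reflector; the reflector, grid, and indexing convention are our simplifying assumptions, not the paper's actual pipeline.

```python
import numpy as np

def multibounce_matched_filter(range_profile, range_res, tx, rx, reflector, grid):
    """Score candidate target locations along a hypothesized bounce path.

    range_profile : complex range profile, assumed sampled on a grid of total
                    (two-way) path length with spacing range_res (m).
    tx, rx, reflector : 2D positions of transmitter, receiver, and a known
                    reflector (e.g., detected in an earlier iteration).
    grid          : (K, 2) array of candidate target locations.
    Simplified illustration of the matched-filtering idea for the path
    tx -> reflector -> candidate -> rx; not the paper's actual algorithm.
    """
    scores = np.zeros(len(grid))
    for i, p in enumerate(grid):
        path_len = (np.linalg.norm(reflector - tx)
                    + np.linalg.norm(p - reflector)
                    + np.linalg.norm(rx - p))
        bin_idx = int(round(path_len / range_res))
        if 0 <= bin_idx < len(range_profile):
            scores[i] = np.abs(range_profile[bin_idx])
    return scores
```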
Autocalibration for networked sensing: We are developing the core components of an architecture for networked sensing with high-resolution mmWave MIMO radar nodes. Each node is itself highly capable, and can use its range, Doppler, and angle information to track targets within its field of view (FOV), but collaborative networked sensing with multiple such nodes provides several new capabilities for multi-target tracking, including “cellular-style” coverage of large areas, enhanced resolution due to the increased effective aperture, and robust performance under FOV limitations and line-of-sight (LoS) obstructions for individual nodes. However, the first step for collaborative target tracking and track-level fusion is calibration: knowledge of the relative poses (i.e., positions and orientations) of the sensor nodes. We have proposed an autocalibration approach based on joint target tracking and pose estimation, fusing measurements corresponding to a moving target seen by multiple radars at a centralized data fusion center. For 2D scenes, we have derived an optimal algorithm with a closed-form solution that enables any two nodes tracking a common target to determine their relative poses by matching their estimated tracks. The initial work is to appear at the Asilomar 2024 conference, and provides a building block for ongoing work on algorithms and experimental validation for multi-node calibration and enhanced target track association.
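To convey the idea of recovering relative pose from matched tracks, the sketch below performs a standard least-squares 2D rigid alignment (Procrustes/Kabsch) between the same target's track as estimated in two radars' local frames. This is a generic closed-form alignment for illustration; the project's optimal estimator may weight and fuse measurements differently.

```python
import numpy as np

def relative_pose_from_tracks(track_a, track_b):
    """Estimate rotation R and translation t such that track_b ~ R @ a + t.

    track_a, track_b : (N, 2) positions of the same moving target at the same
    time instants, expressed in the local frames of two radar nodes.
    Standard least-squares rigid alignment (Procrustes/Kabsch); illustrative,
    not necessarily the exact estimator derived in the paper.
    """
    a_mean, b_mean = track_a.mean(axis=0), track_b.mean(axis=0)
    A, B = track_a - a_mean, track_b - b_mean
    U, _, Vt = np.linalg.svd(A.T @ B)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = b_mean - R @ a_mean
    return R, t
```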
Velocity estimation using MIMO radar: MIMO radars can estimate radial velocities of moving objects, but not their tangential velocities. We have proposed a method that exploits multi-bounce scattering in the environment to estimate a moving object’s entire velocity vector, i.e., both its tangential and radial components. Classical approaches to velocity vector estimation involve tracking targets over multiple frames. Our method enables instantaneous velocity vector estimation with a single MIMO radar, without additional sensors or assumptions about the object size. The only requirement is the existence of at least one angle-resolvable multi-bounce path to/from the object via static landmarks in the environment. We tested the approach using simulations and experiments with TI’s AWR2243 cascaded mmWave MIMO radar. Initial results have been published in IEEE CISA 2024.
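The geometric intuition is that each path's Doppler measures the projection of the target's velocity onto a known direction: the direct two-way path projects onto the radar-to-target direction, while a bounce path via a static landmark projects onto a different direction, giving a solvable linear system. The sketch below illustrates this for 2D; the function name and sign conventions are ours, not the paper's exact formulation.

```python
import numpy as np

def velocity_vector(p_target, p_radar, p_landmark, rate_direct, rate_bounce):
    """Recover a target's 2D velocity from two path-length rates (m/s).

    rate_direct : rate of change of the direct two-way path radar->target->radar.
    rate_bounce : rate of change of the bounce path radar->target->landmark->radar.
    Only the target moves, so each rate is the projection of its velocity onto
    a known direction; the bounce path must be angle-resolvable (u_r and u_l
    not parallel). Illustrative geometric sketch only.
    """
    u_r = p_target - p_radar; u_r = u_r / np.linalg.norm(u_r)
    u_l = p_target - p_landmark; u_l = u_l / np.linalg.norm(u_l)
    A = np.vstack([2 * u_r, u_r + u_l])   # each row dotted with v gives one rate
    b = np.array([rate_direct, rate_bounce])
    return np.linalg.solve(A, b)
```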
Crowd analytics using mmWave MIMO radar: We have developed a new mathematical modeling and processing pipeline for crowd analytics (e.g., crowd counting and anomaly detection) using mmWave radar. A major bottleneck when sensing crowds with mmWave signals is the blockage caused by the first layer of people, since mmWave signals are significantly attenuated after passing through the human body. This crowd shadowing can significantly degrade sensing quality in crowded areas. To address this, we have derived a novel closed-form mathematical expression that characterizes the statistical dynamics of undercounting due to crowd shadowing. This new approach enables estimation of large crowds with mmWave signals, and may have significant impact on crowd management and urban planning. Our initial results, which include extensive experiments using an off-the-shelf mmWave radar (TI AWR2243BOOST), are to be presented at ACM MobiCom 2024.
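The closed-form undercounting expression itself is in the paper; the Monte Carlo sketch below only illustrates the shadowing effect it models, by counting how many people in a random crowd keep an unobstructed line of sight to a radar at the origin when each person is approximated as a disk. All parameters and the blocking model are our illustrative assumptions.

```python
import numpy as np

def visible_count(positions, body_radius=0.2):
    """Count people with an unobstructed line of sight to a radar at the origin.

    positions : (N, 2) crowd locations (m); each person is a disk of radius
    body_radius that can shadow people behind it. Simple Monte Carlo
    illustration of crowd shadowing, not the project's closed-form model.
    """
    order = np.argsort(np.linalg.norm(positions, axis=1))
    n_visible = 0
    for rank, idx in enumerate(order):
        p = positions[idx]
        d2 = np.dot(p, p)
        blocked = False
        for jdx in order[:rank]:                       # only closer people can block
            q = positions[jdx]
            t = np.clip(np.dot(q, p) / d2, 0.0, 1.0)   # closest point on the LoS segment
            if np.linalg.norm(q - t * p) < body_radius:
                blocked = True
                break
        if not blocked:
            n_visible += 1
    return n_visible

# A 30-person crowd placed 2-10 m in front of the radar: the visible fraction
# drops below 1 as density grows, illustrating the undercounting effect.
rng = np.random.default_rng(0)
crowd = rng.uniform(-4, 4, size=(30, 2)) + np.array([6.0, 0.0])
print(visible_count(crowd), "of", len(crowd), "people visible")
```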
100+ GHz radar testbed: We are setting up a 140 GHz radar testbed at UCSB, leveraging hardware developed under previous programs as well as new hardware being developed under 4D100 and related programs. Under previous programs (the DARPA/SRC JUMP1.0 center ComSenTer), we developed 140 GHz CMOS single-channel transmitter and receiver ICs, and integrated these into 8-channel MIMO arrays using LTCC packages carrying 8 such ICs and 8-element antenna arrays. To support this development, under ComSenTer we also made, as test structures, LTCC packages with a single antenna to accommodate a single transmitter or receiver IC. Under 4D100, we have mounted the transmitter and receiver ICs onto these single-antenna LTCC substrates, thereby forming single-element transmitter and receiver modules. We are presently working to assemble an 8-transmitter, 8-receiver MIMO radar testbed using these modules. Testbed demonstrations of 140 GHz MIMO radar will involve close collaboration between mmWave hardware experts (Buckwalter and Rodwell) and modeling/algorithms experts (Madhow, Mostofi and Sabharwal).
Hardware design at 100+ GHz: We are building highly power-efficient wideband transmit arrays at 100+ GHz in low-cost semiconductor processes. Our current effort focuses on a 4-channel MIMO tile in 22-nm CMOS SOI. The low-resolution transmitter uses only 2 bits to control the output phase across the I and Q planes while supporting an extremely wide bandwidth covering 110-160 GHz, which eases tuning for future applications and demonstrations. Further, we have completed an LTCC package using substrate-integrated waveguides (SIWs) that transition the 140 GHz signals through C4 bumps into the SIW for routing to lambda/2-spaced Vivaldi antennas. Finally, we have fabricated a test board that provides RF and DC connections to the LTCC package. We anticipate completing and testing the entire assembly soon.
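To give a feel for what 2-bit phase control implies for beamforming, the sketch below snaps ideal steering phases to the nearest of the four quadrant states (0, 90, 180, 270 degrees) and estimates the resulting coherent-combining loss. The array size, steering angle, and function are our illustrative assumptions, not the chip's control logic.

```python
import numpy as np

def quantize_phase_2bit(phase):
    """Snap desired phases (rad) to the nearest 2-bit state: 0, 90, 180, or 270 deg."""
    return (np.round(np.asarray(phase) / (np.pi / 2)) % 4) * (np.pi / 2)

# Example: ideal steering phases for a 4-element, half-wavelength-spaced array
# pointed 20 degrees off broadside, then quantized to 2-bit resolution.
n = np.arange(4)
ideal = np.pi * n * np.sin(np.deg2rad(20.0))       # progressive phase shift
coarse = quantize_phase_2bit(ideal)
loss_db = 20 * np.log10(np.abs(np.exp(1j * (ideal - coarse)).mean()))
print(f"coherent-combining loss from 2-bit phases: {loss_db:.2f} dB")
```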
Publications
- A. Banik, Y. Mostofi, A. Sabharwal and U. Madhow, "Optimal Self-Calibration for Collaborative Sensing in mmWave Radar Networks", to appear in the 57th Asilomar Conference on Signals, Systems and Computers.
- N. Mehrotra, D. Pandey, U. Madhow, Y. Mostofi and A. Sabharwal, "Instantaneous Velocity Vector Estimation Using a Single MIMO Radar Via Multi-Bounce Scattering," 2024 IEEE Conference on Computational Imaging Using Synthetic Apertures (CISA), Boulder, CO, USA, 2024, pp. 1-5, doi: 10.1109/CISA60639.2024.10576593.
- N. Mehrotra, D. Pandey, A. Prabhakara, Y. Liu, S. Kumar and A. Sabharwal, "Hydra: Beyond-Field-of-View mmWave Radar via Multi-Bounce Scattering," to appear in ACM MobiCom, 2024.
- A. Pallaprolu, B. Korany and Y. Mostofi, "Analysis of Keller Cones for RF Imaging," 2023 IEEE Radar Conference (RadarConf23), San Antonio, TX, USA, 2023, pp. 1-6, doi: 10.1109/RadarConf2351548.2023.10149785.
- S. M. Farrell, V. Boominathan, N. Raymondi, A. Sabharwal and A. Veeraraghavan, "CoIR: Compressive Implicit Radar," IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI). (to be presented at ICCP 2023)
Code repositories
- FusionSense: This repository contains the source code for the FusionSense team as a part of UC Santa Barbara's Electrical Engineering Senior Design Capstone project.
- Compressive MIMO for Extended Targets: This repository contains the source code for extended target modeling for compressive MIMO radar.
Broader Impact
The increasingly used term Joint Communication and Sensing (JCAS) reflects an emerging consensus that next-generation wireless networks should be multi-function, supporting both communication and sensing at scale. Our vision of JCAI provides a concrete shape to this trend, viewing imaging as a layered network service analogous to data communication, and pushing the limits of resolution and energy efficiency so as to make it attractive to deploy this service at scale. The concepts and methods we develop have potential impact in a vast array of applications, including vehicular autonomy and road safety, manufacturing automation, indoor and outdoor security, eldercare, and healthcare. The PIs will work closely with industry partners, building on their strong track record in transitioning mmWave research, to maximize the impact of this research. UCSB is a minority-serving institution, and we will leverage the diversity of our undergraduate population for recruitment of REU researchers. The proposed research is synergistic with ongoing curriculum reform at UCSB aimed at increasing the flexibility of undergraduates to specialize in sub-disciplines of interest, and will be incorporated into the undergraduate curriculum through courses, capstone projects, and REU projects.
Educational Resources
Software lab on mmWave radar developed for the UCSB undergraduate communications sequence
Experimental Data used in the lab (Thanks to Prof. Yasamin Mostofi's group for collecting the data):
System Setup - Two mmWave radars tracking human targets.
A brief description of the experimental data format and the chirp parameters can be found here.
(a) Single moving human target seen from the perspective of two radars (Radar 1 Data, Radar 2 Data).
(b) Multiple moving human targets seen from the perspective of two radars (Radar 1 Data, Radar 2 Data).
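As a starting point for working with data of this kind, the sketch below computes a generic range-Doppler map from one FMCW radar frame, assuming the raw ADC samples for a frame are arranged as a complex array of shape (chirps, samples per chirp). The actual file format and chirp parameters for the lab data are documented in the description linked above, so this is only a hedged, generic processing sketch.

```python
import numpy as np

def range_doppler_map(adc_frame, window=True):
    """Compute a range-Doppler map (dB) from one FMCW radar frame.

    adc_frame : complex array of shape (num_chirps, samples_per_chirp).
    Standard FMCW processing: range FFT along fast time (per chirp), then
    Doppler FFT along slow time (across chirps). Generic sketch only; consult
    the lab handout for the actual data format and chirp parameters.
    """
    x = np.asarray(adc_frame, dtype=complex)
    if window:
        x = x * np.hanning(x.shape[1])[None, :]   # range (fast-time) window
        x = x * np.hanning(x.shape[0])[:, None]   # Doppler (slow-time) window
    range_fft = np.fft.fft(x, axis=1)                              # range bins
    rd = np.fft.fftshift(np.fft.fft(range_fft, axis=0), axes=0)    # Doppler bins
    return 20 * np.log10(np.abs(rd) + 1e-12)
```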