Gulnaaz Afzal
Senior Autonomy Systems Integration Engineer
Rivian

Interview with Gulnaaz Afzal, Senior Autonomy Systems Integration Engineer at Rivian
Q: What are the key trends and developments you're seeing in the autonomous vehicle space right now, and which innovations and technologies do you think will have the biggest impact over the next few years?
The most significant shift in the autonomous vehicle space is the move from demonstrating capability in controlled environments to sustaining performance in real-world deployment. While advances in perception models and compute have been substantial, the focus is increasingly on reliability, consistency, and scalability across diverse operating conditions.
In the near term, the biggest impact will likely come from improvements in system-level integration rather than individual sensor or model performance. This includes better synchronization across sensors, more robust handling of degraded inputs, and improved observability into system behavior in the field. As autonomy systems mature, differentiation will come less from peak performance metrics and more from how well systems maintain performance over time and across varying environmental and operational conditions.
Q: What opportunities does the evolution of automotive sensor suites present to automakers in enhancing vehicle safety, autonomy, and overall user experience?
The evolution of automotive sensor suites presents an opportunity to move beyond perception accuracy as the primary objective and toward building systems that are more robust, diagnosable, and resilient in real-world operation.
A well-integrated sensor suite allows automakers to design systems that can better handle edge cases, adapt to environmental variability, and provide redundancy where needed. It also enables improved diagnostics and monitoring, which are critical for maintaining safety over the vehicle lifecycle. From a user experience perspective, this translates into more consistent system behavior, fewer unexpected disengagements, and greater confidence in advanced driver assistance features.
Ultimately, the opportunity lies not just in adding sensing capability, but in how effectively that capability is integrated, validated, and maintained at scale.
Q: What excites you most about contributing to the future of autonomous transportation, and what do you hope could be achievable in the near future?
What I find most compelling is the opportunity to build systems that operate reliably in complex, real-world environments. Autonomy is often framed in terms of technological breakthroughs, but in practice it is equally about engineering discipline, system integration, and long-term reliability.
From a broader perspective, I’m also motivated by the potential for autonomy to improve road safety. Traffic incidents remain a significant challenge, and while autonomy is not a single solution, it represents a meaningful tool for reducing variability introduced by human factors. The opportunity to build systems that can operate more consistently and predictably in complex environments is what makes this space particularly impactful.
In the near future, I’m particularly interested in progress around making these systems more observable and diagnosable in deployment. As autonomy systems become more widely adopted, understanding how they behave over time — and being able to detect and respond to degradation — will be critical.
Q: What are the broader risks and challenges you've encountered with integrating advanced autonomy sensors like LiDAR and radar into modern vehicles?
One of the primary challenges in integrating advanced sensors such as LiDAR and radar is managing variability introduced at the system level. While individual sensors may perform well under controlled conditions, real-world deployment introduces factors such as mechanical tolerances, thermal effects, electromagnetic interference, and long-term wear that can affect performance.
Another critical challenge is maintaining consistent time synchronization across sensors. Even small amounts of timestamp misalignment can lead to incorrect spatial and temporal associations during sensor fusion, which may not present as a clear fault but can degrade perception quality in subtle ways. Ensuring robust time synchronization in real-world conditions — where network jitter, clock drift, and system load vary — is essential for maintaining reliable multi-sensor performance.
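The timestamp-association problem described above can be made concrete with a small sketch. This is an illustrative example, not Rivian's implementation: the function, sensor names, and the 5 ms tolerance are all assumptions chosen for the demonstration. It pairs each camera frame with the nearest lidar sweep by timestamp and flags pairs whose residual skew exceeds the fusion tolerance — the kind of silent misalignment that degrades perception without raising a fault.

```python
from bisect import bisect_left


def pair_nearest(cam_ts, lidar_ts, max_skew_s=0.005):
    """Pair each camera timestamp with the nearest lidar timestamp
    (both lists in seconds, lidar_ts sorted ascending), and flag
    whether the residual skew is within the fusion tolerance.

    Returns a list of (cam_t, lidar_t, within_tolerance) tuples.
    """
    pairs = []
    for t in cam_ts:
        # Binary-search the insertion point, then compare the two
        # neighbouring lidar timestamps to find the closest one.
        i = bisect_left(lidar_ts, t)
        candidates = lidar_ts[max(0, i - 1):i + 1]
        nearest = min(candidates, key=lambda s: abs(s - t))
        pairs.append((t, nearest, abs(nearest - t) <= max_skew_s))
    return pairs


# Hypothetical timestamps: the third pair drifts beyond tolerance.
cam = [0.000, 0.033, 0.066]
lidar = [0.001, 0.034, 0.080]
for t_cam, t_lidar, ok in pair_nearest(cam, lidar):
    print(f"cam={t_cam:.3f}s lidar={t_lidar:.3f}s within_tol={ok}")
```

Note that the misaligned pair still produces a plausible-looking association; only the explicit skew check reveals the problem, which is why this kind of validation has to be designed in rather than relied on to surface as a fault.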
In addition, ensuring sufficient observability into sensor behavior once deployed remains difficult. Many failure modes, such as intermittent signal degradation or calibration drift, do not present as explicit faults but can still impact overall system performance. Addressing these challenges requires validation beyond nominal conditions and designing systems that can detect, diagnose, and adapt to changes over time in real-world operation.
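One simple way to catch the gradual degradation described above is to smooth a health signal over time rather than thresholding individual samples. The sketch below is a generic illustration under stated assumptions — the class name, the residual values, and the thresholds are all hypothetical, not drawn from any specific autonomy stack. It applies an exponentially weighted moving average (EWMA) to a calibration residual and flags drift once the smoothed value crosses a threshold, even though no single reading would trigger an explicit fault.

```python
class DriftMonitor:
    """EWMA-based monitor for slow calibration drift.

    Flags when the smoothed residual exceeds a threshold, catching
    gradual degradation that per-sample fault checks would miss.
    """

    def __init__(self, alpha=0.2, threshold=0.3):
        self.alpha = alpha          # smoothing factor (0 < alpha <= 1)
        self.threshold = threshold  # smoothed-residual alarm level
        self.ewma = 0.0

    def update(self, residual):
        # Blend the new sample into the running average, then compare
        # the smoothed value (not the raw sample) to the threshold.
        self.ewma = self.alpha * residual + (1 - self.alpha) * self.ewma
        return self.ewma > self.threshold


mon = DriftMonitor(alpha=0.2, threshold=0.3)
# Hypothetical residuals creeping upward: individually benign,
# collectively a drift the monitor eventually flags.
readings = [0.05, 0.1, 0.2, 0.35, 0.5, 0.6, 0.7]
flags = [mon.update(r) for r in readings]
print(flags)
```

The design choice worth noting is that the alarm condition lives on the smoothed statistic: this trades detection latency for robustness to single-sample noise, which matches the field reality that drift accumulates over days, not frames.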
Q: AutoTech is a key event for showcasing innovation in the automotive industry. What are you looking forward to most about participating in AutoTech this year, and what do you hope attendees take away from your presentation?
I’m looking forward to engaging with others across the industry who are working on similar challenges around deploying and validating autonomous systems at scale. I believe that conferences like AutoTech provide a valuable opportunity to share practical insights and learn from different perspectives across OEMs, suppliers, and technology providers.
Through my keynote session, I hope attendees take away a clearer understanding of how sensing systems behave beyond the lab, and why validation and observability are critical to long-term system performance. The goal is to contribute to a more grounded conversation around autonomy — one that focuses not just on capability, but on reliability and real-world operation.
