Visual SLAM / Sensor Fusion Engineer – Autonomous Systems | San Jose, CA
We are recruiting for an innovative company at the forefront of autonomous technology in agriculture. This organization is revolutionizing farming by developing advanced robotic machinery capable of navigating complex, GPS-denied environments. Their solutions integrate cutting-edge Visual SLAM and sensor fusion technologies to deliver precise autonomous operation in challenging conditions.
This is a fully onsite role based in San Jose, CA, offering a competitive salary in the range of $180,000–$230,000 per year, depending on experience.
About the Role
As a Visual SLAM / Sensor Fusion Engineer, you will develop and optimize state-of-the-art algorithms to enable precise localization and mapping for autonomous vehicles. You’ll work with multidisciplinary teams to integrate perception, navigation, and control systems, ensuring seamless operation in dynamic, real-world environments.
Key Responsibilities
• Design and implement Visual SLAM algorithms for real-time localization and mapping.
• Develop and integrate sensor fusion solutions combining data from cameras, IMUs, LiDAR, and other sensors.
• Optimize algorithms for real-time performance and robust operation in GPS-denied environments.
• Create scalable systems capable of handling large, dynamic outdoor environments with variable terrain and conditions.
• Collaborate with hardware and software teams to ensure smooth end-to-end system integration.
• Conduct testing and validation in simulated and real-world field environments.
• Debug and optimize system performance, with attention to edge cases and long-term reliability.
• Document methodologies, findings, and performance metrics for internal use and external stakeholders.
What We’re Looking For
• 5+ years of experience in Visual SLAM, sensor fusion, or related fields, with a focus on autonomous systems.
• Proficiency in C++ and Python for developing real-time robotics applications.
• Strong understanding of state estimation, multi-view geometry, and sensor calibration.
• Hands-on experience with SLAM frameworks (e.g., ORB-SLAM or RTAB-Map) and sensor fusion/optimization libraries (e.g., GTSAM).
• Expertise in integrating and optimizing data from cameras, LiDAR, and IMUs.
• Familiarity with ROS or ROS2, and experience deploying solutions on robotics platforms.
• Proven ability to debug and optimize real-time systems for performance and reliability.
Preferred Skills
• Background in GPS-denied navigation and mapping.
• Experience with agricultural, off-road, or other autonomous vehicle platforms.
• Familiarity with machine learning techniques applied to perception and localization.
• Knowledge of functional safety standards (e.g., ISO 26262).
Why Join Us?
• Work on cutting-edge autonomous systems that directly impact a critical industry.
• Collaborate with a passionate, innovative team solving real-world challenges.
• Competitive salary and growth opportunities in a high-energy, execution-oriented environment.
If you’re excited about leveraging your expertise in Visual SLAM and sensor fusion to drive the future of autonomous systems, apply now!
#VisualSLAM #SensorFusion #Robotics #AutonomousVehicles #SLAM #ROS2 #Engineering #SanJose #AgTech #NowHiring