- Lidar data representation
- Visualize Lidar data
- Work with a simulator to create PCD
Quick facts

Particulars | Details |
---|---|
Mode | Online |
Duration | 4 months |
Collaborators | Mercedes-Benz |
Medium of instruction | English |
Mode of learning | Self study, Virtual classroom |
Mode of delivery | Video based |
Learning effort | 10 hours per week |
Course overview
The supply of professionals with advanced sensor fusion skills has lagged behind industry demand. An investment in this Nanodegree program therefore places you in a pool of candidates that is in short supply, and so well positioned to find economically and professionally rewarding roles in the global value chain.
The program revolves around 4 uniquely curated industrial robotics projects that test the most relevant problem statements of this industry. Working through these exciting, complex scenarios, you will come out on the other side having answered many of the questions prospective recruiters ask of their candidates.
Studying for 10 hours a week, a typical student can graduate from this program in 4 months. Candidates with a limited background in any of the prerequisites should first consider the Intro to Self-Driving Cars Nanodegree program; it lays the foundation needed to enroll in this program thereafter.
The highlights
- 4 months program
- 10 hours per week
- Personalized feedback
- Nanodegree program
- Job offer negotiation
- Dedicated technical support
- Job search assistance
- Real-life projects
Program offerings
- Personal career coach
- Technical mentor support
- Real-world projects
- Student community
- Flexible learning curriculum
- Interview preparation assistance
- LinkedIn profile review
- Expert project feedback
Course and certificate fees
The tuition fee for the Sensor Fusion Engineer Nanodegree program is as follows:
Particulars | Amount in INR |
---|---|
Promotional offer fee for the 4-month duration | 77,676 |
Monthly fee (pay as you go) | 22,849 |
Certificate availability
Certificate providing authority
Eligibility criteria
Work experience
The program requires no prior work experience and is open to all.
Education
For the best learning experience, it is advisable to have working knowledge of C++ or an equivalent object-oriented programming language. A grounding in probability, calculus, and linear algebra is also highly desirable.
Certification qualifying details
To graduate, students must clear the projects associated with all 4 modules. Each submission is evaluated by the designated faculty, who provide relevant feedback. Submissions that do not pass on the first attempt must be reworked until they meet the minimum standards.
What you will learn
By the end of the Sensor Fusion Engineer Nanodegree program, participants will be able to:
- Design intelligent systems that detect vehicles on the road by segmenting, filtering, and clustering raw lidar data.
- Apply Euclidean clustering to distinguish vehicles from other obstacles.
- Work with radar signatures to calculate velocity and orientation while accounting for radial velocity distortions.
- Master the thresholding techniques used to identify and eliminate false positives.
- Fuse lidar point cloud data with information extracted from camera images to accurately determine an object's motion and direction of travel.
- Use 3D image projection techniques together with lidar data to set up a motion model.
- Gain a deep understanding of fusing data from multiple sources with Kalman filters.
- Implement a feedback loop that dynamically updates a Kalman filter, and build on it to construct unscented Kalman filters.
Who it is for
The Sensor Fusion Engineer Nanodegree program is apt for professionals aspiring to learn about radar, lidar, and camera data, to subsequently fuse it together for real-world application. Professionals with the following capabilities are likely to gain the most from this course:
- Practical coding experience in an object-oriented programming language such as Python
- Applications of high-school probability
- Familiarity with the Linux command line
- Expertise in linear algebra
- Comfort with calculus
Admission details
The enrolment steps for the Sensor Fusion Engineer Nanodegree program are:
Enrolment
This Nanodegree program comes with a free trial available to all users. You can start it from the course page by signing in with your Google or Facebook account, or by signing up for a new account.
Payment
The tuition fee of $1,356 USD covers the standard 4-month access. It is charged to your preferred payment method only after the first 7 days of the free trial. If a candidate has not cleared all the requisite projects within the 4-month period, each subsequent month costs $399 USD, applicable until the student clears all the projects to the required standard.
LMS
Access to the LMS is the final step in the chain and is made available within a few minutes of our receiving payment.
The syllabus
Lidar
Introduction to Lidar & Point Clouds
Point Cloud Segmentation
- Use the RANSAC algorithm for planar model fitting
- Use PCL to segment point clouds
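The RANSAC planar fit named above can be sketched in plain C++ without PCL: repeatedly fit a plane through three random points and keep the plane with the most inliers (typically the road surface). The point type, distance tolerance, and iteration count below are illustrative assumptions, not the course's reference implementation:

```cpp
#include <cmath>
#include <cstdlib>
#include <vector>

struct Point { double x, y, z; };

// Fit a plane ax + by + cz + d = 0 through three random points, count how
// many cloud points lie within `tol` of it, and keep the best fit.
std::vector<int> ransacPlane(const std::vector<Point>& cloud,
                             int iterations, double tol) {
    std::vector<int> bestInliers;
    for (int it = 0; it < iterations; ++it) {
        // Pick three distinct random points as the plane hypothesis.
        int i = std::rand() % (int)cloud.size();
        int j = std::rand() % (int)cloud.size();
        int k = std::rand() % (int)cloud.size();
        if (i == j || j == k || i == k) continue;
        const Point &p1 = cloud[i], &p2 = cloud[j], &p3 = cloud[k];
        // Plane normal = cross product of the two edge vectors.
        double a = (p2.y - p1.y) * (p3.z - p1.z) - (p2.z - p1.z) * (p3.y - p1.y);
        double b = (p2.z - p1.z) * (p3.x - p1.x) - (p2.x - p1.x) * (p3.z - p1.z);
        double c = (p2.x - p1.x) * (p3.y - p1.y) - (p2.y - p1.y) * (p3.x - p1.x);
        double d = -(a * p1.x + b * p1.y + c * p1.z);
        double norm = std::sqrt(a * a + b * b + c * c);
        if (norm < 1e-9) continue;  // degenerate (collinear) sample
        std::vector<int> inliers;
        for (int n = 0; n < (int)cloud.size(); ++n) {
            double dist = std::fabs(a * cloud[n].x + b * cloud[n].y +
                                    c * cloud[n].z + d) / norm;
            if (dist <= tol) inliers.push_back(n);
        }
        if (inliers.size() > bestInliers.size()) bestInliers = inliers;
    }
    return bestInliers;  // indices of points on the dominant plane
}
```

In PCL the same job is done by its segmentation API; separating the returned inlier indices from the rest splits the cloud into road and obstacles.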
Clustering Obstacles
- Use a KD-tree to store point cloud data
- Use PCL to cluster obstacles
- Apply bounding boxes around clusters
- Implement Euclidean Clustering to find clusters
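The Euclidean clustering idea behind the bullets above can be sketched without PCL: flood-fill over points that lie within a distance tolerance of each other. The naive neighbor scan below is O(n²) per point; this linear search is exactly what a KD-tree accelerates. The point type and tolerance are illustrative:

```cpp
#include <cmath>
#include <vector>

struct P3 { double x, y, z; };

// Group points into clusters: two points share a cluster if they are within
// `tol` of each other, directly or through a chain of neighbors.
std::vector<std::vector<int>> euclideanCluster(const std::vector<P3>& pts,
                                               double tol) {
    std::vector<bool> processed(pts.size(), false);
    std::vector<std::vector<int>> clusters;
    for (int i = 0; i < (int)pts.size(); ++i) {
        if (processed[i]) continue;
        std::vector<int> cluster, stack{i};
        processed[i] = true;
        while (!stack.empty()) {          // flood-fill over nearby points
            int cur = stack.back();
            stack.pop_back();
            cluster.push_back(cur);
            for (int j = 0; j < (int)pts.size(); ++j) {
                if (processed[j]) continue;
                double dx = pts[cur].x - pts[j].x;
                double dy = pts[cur].y - pts[j].y;
                double dz = pts[cur].z - pts[j].z;
                if (std::sqrt(dx * dx + dy * dy + dz * dz) <= tol) {
                    processed[j] = true;
                    stack.push_back(j);
                }
            }
        }
        clusters.push_back(cluster);
    }
    return clusters;
}
```

Each resulting cluster is one obstacle candidate; drawing a bounding box around a cluster is then just taking the min/max of its points along each axis.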
Working with Real Point Cloud Data
- Work with real self-driving car PCD
- Play back multiple PCD files
- Filter PCD
- Apply Point cloud processing to detect obstacles
Course Project – Lidar Obstacle Detection
- Filter, segment, and cluster real point cloud data to detect obstacles in a driving environment
Radar
Introduction to Radar
- Handle real radar data
- Determine the appropriate sensor specifications for a task
- Calculate object headings and velocities
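Two textbook FMCW radar relations underpin the range and velocity bullets above: range is recovered from the beat frequency and the chirp slope, and radial velocity from the Doppler shift and the wavelength. A minimal sketch; the function names and the numbers in the usage example are illustrative, not sensor specifications from the course:

```cpp
// FMCW radar basics: the beat frequency of the returned chirp encodes range,
// and the Doppler shift encodes radial velocity.

// R = c * f_beat / (2 * S), where S is the chirp slope in Hz/s.
double rangeFromBeat(double fBeat, double chirpSlope) {
    const double c = 3.0e8;                 // speed of light, m/s
    return c * fBeat / (2.0 * chirpSlope);
}

// v_r = f_doppler * lambda / 2 (positive = moving away from the radar).
double radialVelocity(double dopplerShift, double wavelength) {
    return dopplerShift * wavelength / 2.0;
}
```

For example, a 2 MHz beat frequency with a chirp slope of 3e13 Hz/s corresponds to a range of 10 m.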
Radar Calibration
- Filter noise from real radar sensors
- Correct radar data to account for radial velocity
Radar Detection
- Predict the location of occluded objects
- Threshold radar signatures to eliminate false positives
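One standard way to threshold radar signatures against a varying noise floor is cell-averaging CFAR (constant false alarm rate): estimate the local noise from training cells around each cell under test, skip the guard cells next to it, and declare a detection only above that adaptive threshold. A 1D sketch with illustrative window sizes and offset (the exact CFAR variant used in the course is an assumption here):

```cpp
#include <vector>

// 1D cell-averaging CFAR. `train` and `guard` are the number of training
// and guard cells on each side of the cell under test; `offset` is added
// to the averaged noise level to form the detection threshold.
std::vector<int> caCfar(const std::vector<double>& signal,
                        int train, int guard, double offset) {
    std::vector<int> detections;
    for (int i = train + guard; i + train + guard < (int)signal.size(); ++i) {
        double noise = 0.0;
        int count = 0;
        for (int j = i - train - guard; j <= i + train + guard; ++j) {
            int d = (j > i) ? j - i : i - j;  // distance from cell under test
            if (d > guard) { noise += signal[j]; ++count; }
        }
        double threshold = noise / count + offset;
        if (signal[i] > threshold) detections.push_back(i);
    }
    return detections;
}
```

Because the threshold tracks the local noise level, isolated noise spikes that would fool a fixed threshold are rejected, which is how the false positives above get eliminated.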
Course Project – Radar Obstacle Detection
- Calibrate, threshold, and filter radar data to detect obstacles in real radar data
Camera
Sensor Fusion and Autonomous Driving
- Understand the SAE levels of autonomy
- Compare typical autonomous vehicle sensor sets including Tesla and Mercedes
- Compare camera, lidar and radar using a set of industry-grade performance criteria
Camera Technology and Collision Detection
- Understand how light forms digital images and which properties of the camera (e.g. aperture, focal length) affect this formation
- Manipulate images using the OpenCV computer vision library
- Design a collision detection system based on motion models, lidar and camera measurements
Feature Tracking
- Match features between images to track objects over time using state-of-the-art binary descriptors
- Detect features from objects in a camera image using state-of-the-art detectors and standard methods
Camera and Lidar Fusion
- Project 3D lidar points into a camera sensor
- Use deep-learning to detect vehicles (and other objects) in camera images
- Create a three-dimensional object from lidar and camera data
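Projecting 3D lidar points into a camera sensor combines the extrinsic transform (lidar frame to camera frame) with the pinhole intrinsics. A minimal sketch; in any real setup the rotation, translation, focal lengths, and principal point come from sensor calibration (in the course project, the KITTI calibration files), not the placeholders shown in the usage below:

```cpp
#include <array>

// Project a 3D point from lidar coordinates into pixel coordinates:
//   x_cam = R * x_lidar + t                       (extrinsic transform)
//   u = fx * (x/z) + cx,  v = fy * (y/z) + cy     (pinhole intrinsics)
struct Pixel { double u, v; };

Pixel projectToImage(const std::array<double, 3>& pLidar,
                     const std::array<std::array<double, 3>, 3>& R,
                     const std::array<double, 3>& t,
                     double fx, double fy, double cx, double cy) {
    std::array<double, 3> pCam{};
    for (int r = 0; r < 3; ++r)
        pCam[r] = R[r][0] * pLidar[0] + R[r][1] * pLidar[1] +
                  R[r][2] * pLidar[2] + t[r];
    // Perspective divide by depth, then shift to the principal point.
    return { fx * pCam[0] / pCam[2] + cx,
             fy * pCam[1] / pCam[2] + cy };
}
```

Once lidar points land on the image plane, they can be associated with camera detections, which is the basis for building 3D objects from both sensors.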
Course Project – Camera and Lidar Fusion
- Detect and track objects in 3D space from the benchmark KITTI dataset based on camera and lidar measurements. Compute time-to-collision based on both sensors and compare the results. Identify the best combination of key point detectors and descriptors for object tracking
Kalman Filters
Kalman Filters
- Merge data from multiple sources
- Construct Kalman filters
- Improve tracking accuracy
- Reduce sensor noise
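The predict/update cycle these lessons build can be sketched in one dimension, without matrices. This is an illustrative toy, not the course implementation; the state meaning and noise variances are assumptions:

```cpp
// Minimal 1D Kalman filter tracking a single scalar state (e.g. position
// along a lane) from noisy measurements.
struct Kalman1D {
    double x;   // state estimate
    double p;   // estimate variance
    double q;   // process noise variance
    double r;   // measurement noise variance

    void predict() { p += q; }      // motion adds uncertainty
    void update(double z) {
        double k = p / (p + r);     // Kalman gain: trust in the measurement
        x += k * (z - x);           // blend prediction and measurement
        p *= (1.0 - k);             // measurement reduces uncertainty
    }
};
```

Fed repeated measurements of the same value, the estimate converges to it while the variance settles well below the raw measurement noise, which is the "reduce sensor noise" effect listed above.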
Lidar and Radar fusion with Kalman filters
- Handle both radar and lidar data
- Build a Kalman Filter in C++
Extended Kalman Filters
- Predict when non-linear motion will cause errors in a Kalman filter
- Construct Jacobian matrices to support EKFs
- Program an extended Kalman filter to cope with non-linear motion
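The Jacobian construction step can be illustrated with the standard 3x4 radar Jacobian from EKF tutorials: it linearizes the non-linear mapping from the state (px, py, vx, vy) to the radar measurement (range, bearing, range rate). A sketch in plain C++ without Eigen:

```cpp
#include <array>
#include <cmath>

using Mat34 = std::array<std::array<double, 4>, 3>;

// Hj = dh/dx for h(x) = (rho, phi, rho_dot):
//   rho = sqrt(px^2 + py^2), phi = atan2(py, px),
//   rho_dot = (px*vx + py*vy) / rho.
Mat34 radarJacobian(double px, double py, double vx, double vy) {
    double c1 = px * px + py * py;      // rho^2
    if (c1 < 1e-9) return Mat34{};      // avoid division by zero at origin
    double c2 = std::sqrt(c1);          // rho
    double c3 = c1 * c2;                // rho^3
    double cross = vx * py - vy * px;   // appears in d(rho_dot)/d(px, py)
    Mat34 Hj = {{
        {{  px / c2,           py / c2,          0.0,     0.0     }},
        {{ -py / c1,           px / c1,          0.0,     0.0     }},
        {{  py * cross / c3,  -px * cross / c3,  px / c2, py / c2 }}
    }};
    return Hj;
}
```

In the EKF update step, Hj replaces the linear measurement matrix H when computing the Kalman gain, while the non-linear h(x) itself is still used to form the measurement residual.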
Unscented Kalman Filters
- Estimate when highly nonlinear motion might break even an extended Kalman Filter
- Create an unscented Kalman Filter to accurately track non-linear motion
Course Project – Unscented Kalman Filters Project
- Put your skills to the test! Code an Unscented Kalman Filter in C++ in order to track highly non-linear pedestrian and bicycle motion
Scholarship Details
As of now, there is no provision for a scholarship. However, paying upfront for the full 4 months rather than month by month earns a 15% discount on the total price.
Evaluation process
The Nanodegree is awarded once a student clears the project at the end of every module; clearing each module's project unlocks access to the next.
How it helps
The program's foundations are built on thorough primary market research: the elements offered, and the depth in which they are covered, reflect the most employable skills in demand in today's job market.
Real-world projects are ingrained throughout the Sensor Fusion Engineer Nanodegree program. All 4 modules have been curated by industry experts so that participants graduate with the confidence of having been immersed in them.
Instructors
Instructor | Designation | Organisation | Qualification |
---|---|---|---|
Mr David Silver | Lead | Udacity | Other Bachelors, MBA |
Mr Stephen Welch | Instructor | Udacity | Ph.D |
Mr Andreas Haja | Professor | Freelancer | Ph.D |
Mr Abdullah Zaidi | Instructor | UMD | M.S |
Mr Aaron Brown | Instructor | Freelancer | |
FAQs
Following is the course faculty:
- Dr. Stephen Welch (Instructor, PhD in Mathematics)
- Dr. Andreas Haja (Instructor, ex Volkswagen and Bosch)
- Mr. Abdullah Zaidi (Instructor, M.S. – University of Maryland)
- Mr. David Silver (Curriculum Lead, ex Ford, MBA Stanford)
No, the objectives of the program have been laid out; please evaluate whether they match your professional goals before enrolling.
Access to the Learning Management System is granted on receipt of the course fee, so the commencement date depends on the candidate.
Over the 4-month access, it is envisaged that the students should be able to graduate from this Nanodegree program. Should there be a need to extend, the same can be done by paying a monthly fee.
Absolutely. A team of technical experts will be assigned to your batch for any and all product support you may need. Their contact mechanism will be explained after onboarding is completed.
Measures are in place to widen our students' professional networks through strategic networking events with global leaders. In-depth interview preparation, both in person and over the phone, is also available.
Following are some common job profiles: Sensor Fusion Engineer, Automated Vehicle Engineer, Self-driving car engineer, System integration engineer, Perception engineer, Imaging engineer, and Object tracking engineer.
Yes, the course preview can be accessed from the homepage of the Nanodegree program.
As the coding is done on our virtual machines through the browser, you do not need to install anything on your own computer. However, if you would like to complete projects on your own machine, please install a C++11-compatible compiler and Point Cloud Library 1.7; guidance on how and where to obtain these will be provided.