Learn Camera Calibration in OpenCV with Python Script



You will also get access to all the technical courses inside the program, including the ones I plan to make in the future! Check out the technical courses below 👇

_____________________________________________________________

In this video 📝 we'll talk about camera calibration and geometry. We'll first cover the basics of camera geometry and how it is used to calibrate cameras, then look at the different types of distortion in cameras and images. At the end of the video, I'll show you in a Python script how to apply what we've learned and calibrate a camera in a practical computer vision setup.

If you enjoyed this video, be sure to press the 👍 button so that I know what content you guys like to see.

_____________________________________________________________

📷 Calibrator 📷
My camera calibration software, ChArUco boards, and checkerboards

_____________________________________________________________

📞 Connect with Me:

_____________________________________________________________

🎮 My Gear (Affiliate links):
🖥️ Desktop PC:

_____________________________________________________________

Tags:
#CameraCalibration #CameraCalibrationOpenCV #CameraDistortion #ComputerVision #OpenCV #python
Comments

NGL, this is the best calibration tutorial video ever. SPLENDID!!!

candysugarsosweet

Thanks, it was really helpful. At first I thought I had the playback speed set to 1.25x.

elmanku

Hi Nicolai. Thank you so, so much for your videos. Everything about the image processing field is amazing. I only have one suggestion about your video lists and channel structure: when you mention topics you explained before, you could add a link to the earlier video, because it's easier for newcomers to follow everything having seen the previous videos.

Again, your channel and codes are so helpful. Thanks!

ronaldmiguelzafraurrea

Thanks for the video series, it's very helpful. Although I find images like those in the video are commonly used in calibration examples, I don't understand why the calibration pattern, i.e., the chessboard, isn't placed against a black background with carefully controlled lighting. What is the motivation? Is it just to keep photo session setup simple or do the pseudo-random image compositions serve some purpose?

rustyboltmusic

Hello, very good video. I have two questions, btw. 1) Why does objpoints on line 18 (or line 15) have to be 3D points? 2) Why is there no scaling needed on the objpoints to match the imgPoints in pixel space?

ProfFrankify
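On the two questions above: calibrateCamera estimates the mapping from 3D world coordinates to 2D pixels, so the object points must be 3D; the board simply defines its own world frame, with every corner in the Z = 0 plane. No scaling to pixel space is needed because the focal lengths in the camera matrix absorb the world-to-pixel scale; the physical square size only changes the units of the translation vectors. A sketch (the board dimensions and square size are assumptions):

```python
import numpy as np

rows, cols = 6, 9        # inner corners of the board (assumed)
square_size = 25.0       # mm; any unit works, it only rescales the tvecs

# Each corner gets a 3D coordinate; the board defines the world frame,
# so all points lie in the Z = 0 plane.
objp = np.zeros((rows * cols, 3), np.float32)
objp[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * square_size
```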

I got this error. I did exactly the same. Can you help me please?
File "Webcam_calib_pics.py", line 43, in <module>
ret, cameraMatrix, dist, rvecs, tvecs = cv2.calibrateCamera(
cv2.error: OpenCV(4.5.3) error: (-215:Assertion failed) nimages > 0 in function 'calibrateCameraRO'

hernansepu

Tnx for the tutorials man. These videos are great.

beef_nerd

I am using images of a dot pattern, not a checkerboard or chessboard, and I'm stuck at the calibration step. How can I get the XYZ points of the 3D real world and all the parameters (camera matrix, translation and rotation vectors)? Thank you

eugenegatete

Really helpful video! You earned a sub.

I have a question regarding camera calibration though:
While running this script on images taken from a pretty bad drone-mounted camera, after cropping I receive a cropped image that is 40x10 pixels instead of the original 324x224 pixels. When I removed the cropping part of the script, my "calibrated" image was even more distorted, as if it were taken with a fisheye lens 🤣 I got a total error of 0.58, which is pretty bad compared to your 0.04 👀

ObliviousBanana

IDK why, but I found this video so confusing. It could've been delineated in a step-wise manner. Of course, this video is for folks with background knowledge of computer vision, but for a beginner like me, it is not easy to follow along.

iamshakeelsindho

Hi, I wanted to ask how I can get the 3D coordinates of a moving point, given that I have two webcams set up orthogonally and both can track a point (in this case an LED at the tip of a pen). Now I want to plot or log the 3D coordinates of the LED in real time as I move it. Please give me an idea. Thanks a lot!

raihankhan

At 17:44, shouldn't corners2 be appended to imgPoints instead of corners?
corners2 is more accurate, right?

hamzamohiuddin

Hi buddy, great video! Quick question: I must have missed the explanation, but I don't understand what you do with the total error at the end of the program. Is that value used to correct the image, or is it some kind of focal distance? Thanks!

gabrielscopeldelima

I'm trying to use this to get the orientation of an object for a pick-and-place application. How can I get the orientation value of the object in real time? Also, is it possible to get the coordinates of the object being tracked? These two values (orientation and coordinates of the object) are required for the robot to pick up the object.

atharvamadiwale

How can I scale this to 3D pose estimation of an object? What is the correct approach to understanding it?

enzoflores

Thanks for the nice tutorial about camera calibration. Could you please share the images you used during this tutorial?

lalitmohankandpal

What is considered a low or good mean_error? Mine is ~2.4 and I don't know how good that is. Thanks!

seanyamamoto

Is the code fast enough to do undistortion on drone footage in real time, like 15-30 fps or at least 5 fps? I'm going to detect ArUco markers with my drone and if the camera distortion is too much, I reckon undistorting the images will be a great help, but I don't know if the drone can do it fast enough to be practical real-time. Great video!

AlexanderHL

Hey! First of all, congrats on the quality of the videos, you make it so easy to understand this complex world of computer vision!

I have a question regarding the distance at which to calibrate the stereo system. Does it matter how far you place the chessboard from the cameras when taking the pictures? Or does it depend on the depth you want to calculate afterwards? I'm building a setup that should work at a minimum distance of 10 meters.

Thanks in advance for your help!

hectormarcos

After I corrected the fisheye images with the calibrated matrix, I lost some field of view... How much do I lose exactly? Or, better asked: which section of the image is getting corrected?

bransen