The Cheapest and Worst DIY 3D-Scanner in the World [ESP8266, ToF, WiFi, WebGL]

This mini project shows an attempt to build a cheap area-scanning 3D scanner. The two attempts use an ultrasonic transducer and a time-of-flight sensor.

Part 2:

Code and 3D printer models:

Project page:

Links to the cheap parts including shipping (affiliate):

Wemos d1 mini ESP8266:

Servo SG90:

Ultrasonic Sensor JSN-SR04T:

Time-of-flight sensor VL53L0X:

Tripod nuts:

Laser Distance Meter:

plz share :-)

Consider supporting our work on Patreon for some extras:

We are also thankful for any donation on PayPal:

Twitter: @bitluni
reddit: r/bitluni
Comments

I totally respect that you posted the video anyways after the project failed.

AlbertScoot

IDEA: use two axes, one rotational (X; polar) and the other linear (Y; Cartesian). Put the object on the rotational axis and move the sensor up/down for each complete rotation (or sweep the full Y scale, then increment the object's rotation on X). This is the more traditional approach IF you want to scan a (small) object, not an open space. You'll be limited by the scanner's design dimensions. Also, you probably can't use servos unless you mod them to allow full rotation - maybe you can use salvaged steppers from 2D printers/scanners? To speed things up, you can have an array of equally spaced sensors on the Y axis (costly... or... try the two different sensors you already have at the same time, if the ESP can manage it...).

nunolourenco
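The rotation-plus-height scheme in the comment above maps naturally onto cylindrical coordinates. A minimal host-side sketch (function names, step sizes, and the `read_radius_mm` callback are all illustrative, not from the project code):

```python
import math

def cylindrical_to_cartesian(theta_deg, height_mm, radius_mm):
    """Convert one (turntable angle, sensor height, measured radius)
    sample into a Cartesian point on the object's surface."""
    theta = math.radians(theta_deg)
    return (radius_mm * math.cos(theta), radius_mm * math.sin(theta), height_mm)

def scan(read_radius_mm, theta_step_deg=5, z_step_mm=10, z_max_mm=100):
    """Sweep a full rotation at each height, as the comment suggests."""
    points = []
    for z in range(0, z_max_mm, z_step_mm):
        for theta in range(0, 360, theta_step_deg):
            points.append(cylindrical_to_cartesian(theta, z, read_radius_mm(theta, z)))
    return points
```

With 5° steps and 10 height levels that is already 720 samples, which is why the servo step size dominates both resolution and scan time.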

using blender as CAD, like a true legend

Ziar

Andreas Spiess did some videos about distance sensors, might be interesting.

bardenegri

It is scary how smart you are sir. Mad scientist smart.

emjoneshouseDIY

Uh, the ToF rangefinder VL53L0X isn't very good either. I mean, it fires its laser at a 35° spread and then collects the reflected signal over a 25° angle... looks like it needs a collimating lens on the laser. The detected distance is the distance of the whitish-grey objects covering the 25° field of view.

Andreas Spiess made a sensor comparison not too long ago that might be valuable to your goal. See "#203 Best LIDAR Sensors for Makers" here on the Tubes.

David Cambridge made a LIDAR scanner a few days ago, see also here on YouTube, that will make you soil your pants, perhaps there's something to be learned from that - well not from the state of your pants, but otherwise. He used a Garmin LIDAR which is kinda not cheap at all.

SianaGearz
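For scale, the beam-spread figures quoted above translate into surprisingly large spot sizes (a quick check assuming a simple cone model; the 35° number is taken from the comment, not from a datasheet):

```python
import math

def spot_diameter_mm(distance_mm, spread_deg):
    """Diameter of the illuminated spot for a cone of full angle spread_deg."""
    return 2 * distance_mm * math.tan(math.radians(spread_deg) / 2)

# With a 35 deg cone, the spot at 1 m is already over 60 cm wide,
# so each 'distance' reading averages over a large patch of the scene.
```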

I would take two ToF modules, preferably better ones, and mount them so that you get a denser cloud at far distances and some parallax effect when closer. I am not sure if using two gimbals or just mounting both sensors on a stick would be better. I think either would help quite a bit, as would smaller steps with the servos. I think the ultrasonic sensor is getting swamped by noise. To use it, I think you will either need to take a lot of readings at each position or use a much more powerful transmit transducer and/or a receive transducer with better sensitivity and overload characteristics. You also need some logic to weight each reading by its signal-to-noise ratio. It can be done, but I suspect that the laser will work better for most situations.

jdrissel
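The SNR-weighting idea above is only a few lines. A sketch (hypothetical names; the SNR estimate would come from the echo amplitude, which is simply passed in here):

```python
def snr_weighted_distance(readings, noise_floor=1.0):
    """readings: list of (distance_mm, snr) pairs taken at one servo
    position. Returns the SNR-weighted mean distance, dropping readings
    below an arbitrary noise floor; None if nothing usable remains."""
    usable = [(d, s) for d, s in readings if s > noise_floor]
    if not usable:
        return None
    total_weight = sum(s for _, s in usable)
    return sum(d * s for d, s in usable) / total_weight
```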

I think the ultrasonic mapped results should be viewed from the bottom inside. Another option is to add another statically connected secondary camera a specified distance from the primary, and run each one taking turns capturing as the whole thing moves (doubling the amount of time it takes, but at least giving double the 3D data). I think stereo vision is a smart way to go. On another note, maybe laser holds more promise, because different materials absorb different sounds better... but for light-stealth operations, ultrasonic is not a bad idea. Great contribution!

mrbiznessguy

I absolutely 100% feel you at 5:50 when you stay perfectly still for 10 minutes so you don't vibrate your scanner. Cool build either way; I bet this could be made a ton better in post with the right 'algorithms', whatever those are!

WaynesStrangeBrain

Actually good. Keen to see follow-up project. Thanks for doing what you do.

LatinDanceVideos

Nice job! Always good seeing problems as well as solutions. The usual way to get those nuts in is to touch them with a soldering iron so they melt-fit into the plastic.

GaryMcKinnonUFO

I think you're on the right track with the laser module, but need to tweak the calibration and resolution. I think the ESP might be a bit underpowered to get decent results in a reasonable amount of time though!

himselfe

Hi there. This is a very nice project. I think having a calibration procedure prior to a full 3D scan will increase your precision. You may want to do it at 3 points within your ToF range (min, med, max), and also for each RGB colour, then apply the found correction factors. Also, based on your sample's colour you can determine some patterns to optimise the measuring result.

gabrielm
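The three-point calibration proposed above can be reduced to a least-squares linear correction per target colour. A sketch under that assumption (a real sensor may need a non-linear model; all names are illustrative):

```python
def fit_linear_correction(measured, actual):
    """Least-squares fit of actual ~ gain * measured + offset
    from calibration pairs, e.g. readings at min/med/max range."""
    n = len(measured)
    mx = sum(measured) / n
    my = sum(actual) / n
    var = sum((x - mx) ** 2 for x in measured)
    gain = sum((x - mx) * (y - my) for x, y in zip(measured, actual)) / var
    offset = my - gain * mx
    return gain, offset

def correct(raw_mm, gain, offset):
    """Apply the fitted correction to a raw sensor reading."""
    return gain * raw_mm + offset
```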

Possibly the internal position control of the servos influences the accuracy of the measurement. Try to decouple the measuring circuit from the control as far as possible.
If the servos have screwed up the 5V power supply, a badly decoupled measuring circuit is immensely affected.

Brausepaul

Watching your videos is always great fun, especially during a weekend after a busy week. Thanks for the great videos.
BTW, I think I will do 3D reconstruction with a stereo camera using 2 ESP-CAM modules. Computer vision methods are known to provide better resolution, and if you don't want to do real-time reconstruction, the processing power of those 2 ESP-CAM modules is definitely sufficient.

sullivanzheng
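The stereo-camera idea above rests on the standard rectified-stereo disparity relation Z = f·B/d. A toy sketch with made-up camera numbers (the focal length and baseline here are assumptions, not ESP-CAM specs):

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Depth of a matched feature in a rectified stereo pair:
    Z = focal_length * baseline / disparity."""
    if disparity_px <= 0:
        return float("inf")  # feature at infinity, or a bad match
    return focal_px * baseline_mm / disparity_px
```

Note the resolution falls off with distance squared: a one-pixel disparity error matters far more for distant points than near ones, which is why a wider baseline helps.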

Great project. I would think the time-of-flight sensor would have been great for this purpose. That's why I was interested in getting one as a cheap lidar, but maybe it's too low-res? I was reading that the time-of-flight sensor has a delay and can only take a measurement every so often, so it might be worth looking at the data sheet and making sure you are not plotting data when the sensor is not ready, thus reducing your resolution. Perhaps a slower scan is needed? Maybe multiple passes? Maybe a second sensor scanning on an offset timeline, with the rig spinning 360°: one sensor covers hemisphere 1 while the other scans hemisphere 2, and then they swap hemispheres. This way you get a 360° view and double the resolution, because the offset timing fills in the points the other time-of-flight module missed. Resolution could be doubled again by mounting 4 sensors, one per quadrant.

ytfp
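The "don't plot when the sensor isn't ready" point above boils down to pacing the scan by the sensor's measurement period rather than the servo's step rate. A timing sketch (the sensor period is a parameter here; the real VL53L0X timing budget should be taken from its datasheet):

```python
def valid_samples(servo_step_ms, sensor_period_ms, n_steps):
    """Return the step indices at which a fresh measurement is available,
    assuming the sensor completes one ranging every sensor_period_ms.
    Steps in between would only re-read stale data."""
    valid, next_ready = [], sensor_period_ms
    for i in range(n_steps):
        t = (i + 1) * servo_step_ms  # time when step i's reading is taken
        if t >= next_ready:
            valid.append(i)
            next_ready = t + sensor_period_ms
    return valid
```

If the servo steps faster than the sensor ranges, most steps yield no new data, so slowing the sweep to the sensor period costs nothing in real resolution.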

Perhaps more arc in rotation and closer to the object in this scenario? It may provide more detail.

Enviros

Try to focus the ultrasonic scanner so that the cone is as narrow as possible, and send a sequence of pulses, only counting received pulses that match the correct sequence. This should filter out noise from echoes and early reflections.
In both cases, take multiple samples (8-16 maybe) for each position, to help reduce bad data. Each sensor may have a particular distance it is good for, so it may need to be closer to or further from its target to work. Also... you might need much smaller step sizes on the servo. This could be done by adding a gear; a 20:1 ratio would increase resolution (and also scanning time). Good luck!

stephenborntrager
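The multi-sample suggestion above is cheap to implement. A sketch using a median, which is more robust than a mean against the occasional wild echo (`read_once` and the 8000 mm out-of-range cutoff are illustrative assumptions):

```python
from statistics import median

def robust_reading(read_once, n=8):
    """Take n raw readings at one servo position and return the median,
    dropping sentinel values (0 or out-of-range) the sensor may emit."""
    samples = [r for r in (read_once() for _ in range(n)) if 0 < r < 8000]
    return median(samples) if samples else None
```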

I have a few suggestions:
Let the servo tilt the sensors much more slowly.
Add a black paper background behind the object, in a darker room.
Or put the object in a 5-faced cube.

moechano

You could perhaps do this:
Make two sets of these 2-axis holders. One gets a laser, the other a webcam. Now you point the laser at the point you want to scan, and set up the camera holder to move so that the laser spot is in the center of the image. Now you can calculate the distance to the point from the distance between the two holders and the four servo angles. Of course the resulting resolution will be limited by the angular resolution of the servos, but you could scan a few times and take the average or something. I assume the MCU you are using can handle basic image processing.

radnaskelars
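In the simple planar case, the two-gimbal triangulation described above is just the law of sines over the baseline triangle. A sketch (angles measured from the baseline; names illustrative):

```python
import math

def triangulate_mm(baseline_mm, angle_a_deg, angle_b_deg):
    """Distance from gimbal A to the target, given each gimbal's pan
    angle measured from the baseline connecting the two holders."""
    a = math.radians(angle_a_deg)
    b = math.radians(angle_b_deg)
    gamma = math.pi - a - b  # angle at the target
    # Law of sines: range_a / sin(b) = baseline / sin(gamma)
    return baseline_mm * math.sin(b) / math.sin(gamma)
```

As the comment notes, range error grows with the servos' angular quantisation: near-parallel sight lines make sin(gamma) tiny, so a wider baseline or averaged repeat scans help most at long range.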