Optical Calibration Demo and Three Hours Working on a Bug

Just a suggestion, in case no one else has considered it: have you thought about using QR codes or AR fiducials for your square markers?

Because the codes are square you can still measure the centre and any rotation, but also, with a bit of software wizardry, you can determine which square you are looking at. This allows you to command a move to a specified point, verify that the specified point was reached (or which point was actually reached) and measure the actual location accurately.
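
I was thinking of something like OpenCV's ArUco markers, since the detector hands back the marker corners and the marker ID together. A minimal sketch, assuming the opencv-contrib-python build (I haven't tried it against your calibration code):

```python
# Rough sketch: detect ArUco fiducials so each square identifies itself.
# Assumes the opencv-contrib-python package (older functional API shown;
# newer builds also offer cv2.aruco.ArucoDetector).
import cv2
import numpy as np

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

frame = cv2.imread("snapshot.png")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)

for marker_corners, marker_id in zip(corners, ids.flatten() if ids is not None else []):
    pts = marker_corners.reshape(4, 2)
    center = pts.mean(axis=0)              # centre of the square in pixels
    top_edge = pts[1] - pts[0]             # corners are returned TL, TR, BR, BL
    angle = np.degrees(np.arctan2(top_edge[1], top_edge[0]))  # rotation of the marker
    print(f"marker {marker_id}: centre {center}, rotation {angle:.1f} deg")
```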

1 Like

I’m now confident that the issue with the oscillations was due to friction between the paper blueprint and the sled. The paper was also not perfectly flat (I live in Florida and it’s humid, so it had some ever-so-slight “buckleness” to it). I just reran the calibration and test using the laminated pattern and got some remarkable results. It moved so smoothly that I changed the homing tolerance to 0.125 mm (halved it from 0.250 mm). I also got it centered a bit better this time around, so the starting error magnitudes weren’t as bad… Here are the calibration offsets it calculated:

For those not able to see the picture, it seems much more consistent. In fact, the charts show this:

[chart: Y error (normalized to the 0,0 offset)]

[chart: X error (normalized to the 0,0 offset)]

So the oscillations are gone, and all I did was change the registration pattern from plain paper to laminated paper. The RMS errors for X and Y are 0.778 mm and 0.708 mm, respectively. But the spread between the max and min errors was close to 3 mm for the X axis and 2.4 mm for the Y axis. Not very good.

Now, for the test results (keeping in mind this is only 6 inches around the center):

So here, the worst error was 0.330 mm and many were less than 0.125 mm. The RMS errors for X and Y are 0.069 mm and 0.170 mm, respectively… That’s a remarkable improvement over stock calibration. Just remember my homing goal (how close I tried to get to dead center) was 0.125 mm, so the error is really 0.125 +/- 0.069 mm and 0.125 +/- 0.170 mm… so practically an accuracy of 0.300 mm… within 6 inches of center. Here the spread between max and min error was 0.27 mm for the X axis and 0.52 mm for the Y axis. Pretty darn good.
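
(For anyone wondering how I’m boiling the tables down to those two numbers, this is all the math is; a trivial sketch with my own function name:)

```python
import numpy as np

def error_stats(errors_mm):
    """RMS error and max-to-min spread for the per-square errors (in mm)."""
    e = np.asarray(errors_mm, dtype=float)
    rms = np.sqrt(np.mean(e ** 2))
    spread = e.max() - e.min()
    return rms, spread
```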

I’ll work on it this weekend to see if it translates across the whole board. I don’t see why it can’t. I’m pretty sure I can shave another 0.1 mm or so off the error… we’ll see if I’m right :wink: Edit: nope… I had already taken the step I was thinking of into account.

3 Likes

Yes, that was the plan from the start, but I switched to simple squares because I readily found Python code to do what was needed, whereas the QR code software, iirc, wasn’t available for Python or something? I don’t really recall precisely. However, I’m not really sure it’s needed, because no one’s machine should be so far out of whack (over 1.5 inches of error) that it doesn’t know which square it’s looking at. If that’s the case, they have some defects that need fixing.
The pattern would look cooler though! :smiley:

But maybe, if we get a camera that looks through the opening, you could use QR code stickers on your workpiece to map out positions of cuts or something (i.e., change the home location).

So, are the errors you are reporting accuracy errors or precision errors?

At this point, I think we should be focused on finding how precise/repeatable we can be, and then look at what we can do in terms of accuracy.

I reran the calibration, and for the X errors, the worst difference between my previous offsets and the new offsets was 0.06 mm… for the Y errors, it was a 0.25 mm difference. So the X axis is more precise than the Y axis, but both are well within our goal of 0.5 mm accuracy. I think you’re correct about the precision of the calibration pattern (I’ll try to measure one of the squares this weekend with calipers… need to buy a new coin battery for it since I left the darn thing on), and I think we are pushing the limits of the optical recognition with the endoscope as well. I get about 12 pixels per mm where I have the camera placed, and the process to recognize the edges of the squares is not perfect (which is the enemy of the good). Nevertheless, since the error isn’t too far off, I could lower the camera some to improve the resolution…
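
For context, the recognition step is conceptually along these lines; this is a simplified sketch of the approach, not the exact code in my repo:

```python
# Simplified sketch of measuring one 1/2-inch square: find the biggest
# square-ish contour, take its centre/rotation from the minimum-area box,
# and derive pixels-per-mm from the known 12.7 mm square size.
import cv2
import numpy as np

SQUARE_MM = 12.7  # 1/2-inch calibration square

frame = cv2.imread("snapshot.png")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
_, thresh = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x signature

# Assume the calibration square is the largest dark blob in view.
square = max(contours, key=cv2.contourArea)
(cx, cy), (w, h), angle = cv2.minAreaRect(square)

pixels_per_mm = ((w + h) / 2.0) / SQUARE_MM
print(f"centre: ({cx:.1f}, {cy:.1f}) px, rotation: {angle:.1f} deg, "
      f"{pixels_per_mm:.2f} px/mm")
```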

I think what I’d like to do is try to measure a significant portion of the board this weekend and see how it performs.

4 Likes

I’d be happy to try this out if you want a second data point. I have an endoscope and a medium-size 3D printer (200 x 200 mm). I could print a small-ish version of the pattern at work and see if I get similar results. Is the code online somewhere?

1 Like

Yes, it’s on my GitHub repo:

It takes a new firmware and Ground Control. Back up your current groundcontrol.ini file somewhere, because this adds new settings as well.

There are lots of quality-of-life improvements to make to the software, but this is the process:

  1. Go to settings and enable “Use Calibration Error Matrix (Experimental)”
  2. Go to Actions and “Return to Center” (always start in the center)
  3. Go to Actions->Advanced->Optical Calibration
  4. Make sure you have a good image of a 1/2-inch square in the video on the left. If it’s low or high, you can manually adjust your motor Y-axis offset in settings. If it’s off left/right, then likely your beam isn’t centered on the frame or, possibly, you skipped a chain link…
  5. Press Calibrate
  6. If it worked, you’ll get a pixel/mm value in the bottom-left part of the screen along with a calibration matrix (probably all zeros initially).
  7. Press Auto Measure and it’ll work its way around. Right now it’s hard-coded for a 9x9 matrix (a 24-inch x 24-inch area… just increased it).
  8. VERY IMPORTANT: Hit the [0,0] button to send the router back to the center. If you don’t, you’ll get a “sled not keeping up with position” error after the next step… still working on that.
  9. When done, press “Save and Send” and it will save the values to Ground Control’s settings and send them to the controller. It takes a little time, but there’s no message that pops up yet. If you look at your Windows cmd window, you’ll see the settings stream.
  10. When that’s done, press Auto Measure again. This time around, it will again measure the error, but its position will be adjusted by the values in the error matrix. Therefore, what you are measuring is the accuracy of those adjustments (see the sketch after this list for the general idea).
  11. DO NOT SAVE THOSE FINAL RESULTS. Those results are accuracy measurements and if you hit “Save and Send” the controller will try to use those values as the error matrix.
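
For what it’s worth, here is a simplified illustration of the idea behind step 10: interpolate the measured error at the commanded position and shift the target to compensate. This is not a copy of what the controller actually runs; the bilinear interpolation and the sign convention (error = measured minus commanded) are just for illustration.

```python
# Conceptual sketch of applying a calibration error matrix: bilinearly
# interpolate the measured error at the commanded position and compensate.
# Grid spacing/size taken from the description above; everything else is
# illustrative, not the firmware implementation.
import numpy as np

GRID_SPACING_MM = 76.2          # 3-inch spacing between calibration points
GRID_SIZE = 9                   # 9x9 matrix centred on (0, 0)

def corrected_target(x_mm, y_mm, x_err, y_err):
    """Shift a commanded (x, y) by the interpolated error so the sled lands on it."""
    half = (GRID_SIZE - 1) / 2.0
    # Continuous grid coordinates of the commanded point (column i, row j).
    gi = np.clip(x_mm / GRID_SPACING_MM + half, 0, GRID_SIZE - 1)
    gj = np.clip(y_mm / GRID_SPACING_MM + half, 0, GRID_SIZE - 1)
    i0, j0 = int(gi), int(gj)
    i1, j1 = min(i0 + 1, GRID_SIZE - 1), min(j0 + 1, GRID_SIZE - 1)
    fi, fj = gi - i0, gj - j0

    def interp(err):
        return ((1 - fi) * (1 - fj) * err[j0, i0] + fi * (1 - fj) * err[j0, i1]
                + (1 - fi) * fj * err[j1, i0] + fi * fj * err[j1, i1])

    # Error convention assumed here: error = measured - commanded.
    return x_mm - interp(x_err), y_mm - interp(y_err)
```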

When you exit Ground Control and come back in, the controller will have lost all the calibration settings (they are not saved to flash memory at this time, and I’m not sure if we have the space for that). So, move the sled to center (0,0), go to optical calibration, press Calibrate, and the previous error settings should appear in the lower-left portion. If so, press “Save and Send” and wait for them to be sent over.

You likely will have to install some Python modules… I don’t recall how I did it, but if you look at the code, it imports from:

  • scipy.spatial
  • imutils
  • numpy
  • cv2

I think these need to get installed.
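
If you’re using pip, something like `pip install scipy imutils numpy opencv-python` should cover them (the cv2 module comes from the opencv-python package).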
2 Likes

I did the test over a larger area (+/- 12 inches)…


Errors after calibration:

I see a 0.45 mm error in there, but most are really small.

I’ll go bigger tomorrow night and then try a full board this weekend. I need to figure out how to put a 1/4-inch hole into the laminated sheets. I might need to get a single-hole punch (I just have a two-hole one without a very deep throat) and then cut a groove in the side so I can get the punch over the hole.

1 Like

Alright, I’ve got your fork running; now to 3D print a holder for my endoscope! Your test pattern is 1/2-inch squares spaced 3 inches apart on center, right (not 3 inches of white space between the squares)?

There’s a joke here about assessing errors in my face or something

4 Likes

I’m just going to 3D print a cylinder the diameter of my router with a hole the diameter of my endoscope and wedge it all together. Any tips for improvements? I wonder how much not having the weight of the router will affect the measurements.

Yes, 1/2-inch squares spaced 3 inches on center. I’m calling it a night. Will check in tomorrow morning.

Account for shrinkage…

I’ve had pretty good luck with the dimensional accuracy of PLA on my printer. I’m going to try the measured dimensions and adjust from there. I’ve got a few more hours before this thing is done printing, so I won’t be able to try it tonight, but hopefully tomorrow night!

I did notice that my endoscope isn’t centered that well, though. Possibly something I could adjust for in future versions of my 3D model.


Also, here’s the (untested) model if someone else wants to give it a try:
router-endoscope-adapter.stl (83.0 KB)

1 Like

@madgrizzle how did you generate your test pattern file? I’m poking around with export options from Fusion 360 and also wrestling with Inkscape, but it all seems very tedious.

I use Visio at work, so it’s a piece of cake to do. I’ll upload a PDF file for you to print. What’s the biggest size paper you can print?

Edit: they are now up on the GitHub repo… I made an 8.5x11 and an 11x17. I’ll work on updating the code to allow you to select the dimensions of the test. With one sheet of 8.5x11, you really can only do 9 test points (a 3x3 grid at 3-inch spacing).
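
If anyone wants to script a custom size instead of fighting Inkscape, a plain SVG is simple enough to generate and prints at real scale. A rough sketch, assuming 1/2-inch squares on 3-inch centers (this is not how I made the PDFs, those came from Visio; the grid size and margin below are placeholders):

```python
# Quick sketch: write the calibration pattern (1/2-inch black squares on
# 3-inch centers) as an SVG you can print at 100% scale.
SQUARE_IN = 0.5
SPACING_IN = 3.0
COLS, ROWS = 3, 3            # e.g. a 3x3 grid fits on 8.5x11
MARGIN_IN = 1.0

width = MARGIN_IN * 2 + SPACING_IN * (COLS - 1) + SQUARE_IN
height = MARGIN_IN * 2 + SPACING_IN * (ROWS - 1) + SQUARE_IN

rects = []
for r in range(ROWS):
    for c in range(COLS):
        x = MARGIN_IN + c * SPACING_IN
        y = MARGIN_IN + r * SPACING_IN
        rects.append(f'<rect x="{x}in" y="{y}in" width="{SQUARE_IN}in" '
                     f'height="{SQUARE_IN}in" fill="black"/>')

svg = (f'<svg xmlns="http://www.w3.org/2000/svg" '
       f'width="{width}in" height="{height}in">\n' + "\n".join(rects) + "\n</svg>")

with open("calibration_pattern.svg", "w") as f:
    f.write(svg)
```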

1 Like

Yeah, that will cause a slight error when the sled rotates.

When you install the camera, make sure it’s oriented so that up is up. The program will adjust for rotations of +/- 30 degrees, but if it’s way off, the program gets confused. That’s something that could be fixed by using QR codes or something, but it shouldn’t be too hard for people to make sure up is up.
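
For what it’s worth, the reason a limit like that exists is that a plain square only reveals its rotation modulo 90 degrees, so anything much beyond +/- 45 degrees is inherently ambiguous. A small sketch of the normalization idea, assuming the angle comes from something like cv2.minAreaRect (not the code from the repo):

```python
# Sketch: fold a detected box angle into a signed camera rotation in (-45, +45].
# A square is 90-degree symmetric, so only rotation modulo 90 is observable;
# that's why the camera needs to be mounted roughly "up is up".
def camera_rotation_deg(box_angle_deg):
    a = box_angle_deg % 90.0     # minAreaRect's angle convention varies by OpenCV version
    if a > 45.0:
        a -= 90.0
    return a

# e.g. camera_rotation_deg(-78.0) == 12.0; values beyond +/- 30 could be rejected
```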

1 Like

I’ve printed 11x17 in the past, but I work in a large office with lots of printers, so it’s possible we have a larger-format one somewhere that I haven’t found. I’ll ask around. Thanks for putting up the PDFs!

The 3-inch spacing is not ideal because you can fit only 4 squares across the 11-inch dimension… so there isn’t a square exactly in the center. You’ll see what I mean when you look at the file. I updated it a few minutes ago to mark the square that should be used as the center square.

1 Like

Do you have some theory for how to turn the results of the test pattern scan into calibration values (rotation radius, sag compensation, motor spacing, etc.)? I’d be interested in getting up to speed on that so I could think about other test patterns that might be easier to produce (such as following a straight line drawn with a meter stick, or even following the edge of the work material).
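
The way I picture it is a least-squares fit: the scan gives you pairs of commanded vs. measured positions, and you search for the frame parameters that best explain the difference. A deliberately over-simplified sketch of that idea (chains straight to the bit, no rotation radius or sag term; the parameter names and starting values are placeholders, not the real firmware model):

```python
# Over-simplified sketch of fitting frame parameters from (commanded, measured)
# pairs: the chains were paid out assuming the nominal geometry, so the measured
# point should reproduce those same chain lengths under the *true* geometry.
# Ignores rotation radius and chain sag; names and values are placeholders.
import numpy as np
from scipy.optimize import least_squares

def chain_lengths(xy, motor_spacing, motor_height):
    """Straight-line chain lengths from each motor to point(s) xy (origin at sheet center)."""
    left = np.array([-motor_spacing / 2.0, motor_height])
    right = np.array([motor_spacing / 2.0, motor_height])
    return (np.linalg.norm(xy - left, axis=-1),
            np.linalg.norm(xy - right, axis=-1))

def residuals(params, commanded, measured, nominal_spacing, nominal_height):
    true_spacing, true_height = params
    paid_left, paid_right = chain_lengths(commanded, nominal_spacing, nominal_height)
    act_left, act_right = chain_lengths(measured, true_spacing, true_height)
    return np.concatenate([act_left - paid_left, act_right - paid_right])

# commanded, measured: N x 2 arrays in mm from the optical scan, e.g.
# fit = least_squares(residuals, x0=[3000.0, 1000.0],
#                     args=(commanded, measured, 3000.0, 1000.0))
# fit.x would then hold the best-fit motor spacing and height above sheet center.
```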