
Optical Calibration Demo and Three Hours Working on a Bug


#401

What’s happening is that the camera isn’t detecting the square. The software, at the moment, needs to be able to see the full square (i.e., none of it can be cut off by the edge of the frame). I was thinking that we might be able to add a white border around the image once captured to facilitate edge detection. Either that, or we add manual controls so people can move the square within the camera image. The advantage of the larger square is that we are more accurate about where the center is; the disadvantage is that this problem can crop up more readily.
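For the white-border idea, here’s a minimal sketch of the padding step (using numpy; `cv2.copyMakeBorder` with `BORDER_CONSTANT` would do the same thing in OpenCV — the array and pad width here are just illustrative, not from the actual calibration code):

```python
# Sketch: pad a captured frame with a white border so a square touching
# the frame edge can still produce a closed contour for detection.
import numpy as np

frame = np.zeros((100, 100), dtype=np.uint8)  # stand-in for a captured grayscale image
pad = 20                                      # border width in pixels (illustrative)

# Surround the image with white (255) on all four sides.
padded = np.pad(frame, pad, mode="constant", constant_values=255)

# Note: any square center detected in `padded` has to be shifted back by
# `pad` pixels to map into the original image's coordinates.
```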


#402

I’m not really a python programmer… anything you can do to improve it is welcome.


#403

Yeah, that was definitely the case. But the weird part to me was that the calibration routine moved the machine several inches away from where it needed to go (the camera was on the 2x4 below my work area). The prior calibration point had reasonable values, so I’m at a loss as to what could have caused this.

Me neither, but I understand concurrency in some other languages. I’ll do some reading.


#404

Well, the problem was that the contour had to have an area of at least 1000 to be recognized… I’m sure I put it in there for a reason (maybe to avoid bad detections), but when a square is only partially seen by the camera, its area is likely to drop below the 1000 threshold. Then it just keeps taking picture after picture, locking up the system.

I had an error tracker (falseCounter) that would escape if it couldn’t get 10 good images, but I disabled the escape to try to fix something, iirc. I’m going to drop the threshold considerably and put the escape back in.
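The retry/escape logic described above might look something like this — a sketch only: `capture_and_find_square` is a hypothetical stand-in for the camera-plus-contour-detection step, and the threshold values are illustrative, not the actual ones:

```python
MIN_AREA = 100        # lowered from 1000 so partially framed squares still pass (illustrative)
MAX_BAD_IMAGES = 10   # escape instead of looping and locking up the system

def measure_square(capture_and_find_square):
    """Retry detection up to MAX_BAD_IMAGES times; return the center or None."""
    false_counter = 0
    while false_counter < MAX_BAD_IMAGES:
        contour_area, center = capture_and_find_square()
        if contour_area >= MIN_AREA:
            return center          # good detection
        false_counter += 1         # bad image: count it and retry
    return None                    # escape: too many failures, give up
```

For example, a camera that only ever sees a too-small contour now escapes with `None` instead of hanging.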


#405

That all makes sense. Any idea what could have caused the sled to go so far in the wrong direction? It was at the edge of the work piece, so at least 3 inches away from where it should have been to find the square.


#406

Is it possible that it landed in the correct spot first, then had a problem with finding a good square and got a bad value for it and moved away?


#407

Totally possible. I don’t have any other hypothesis.

I was able to run three successful calibrations yesterday (out of 5 attempts). The data is mostly consistent / repeatable. Except for the 2nd run, where the entire Y-error-by-row graph is shifted by 2mm.

Here’s the third run (the first run is also very similar to this)

But then here’s the second run

Looks like things across the board are shifted by ~2mm. Any hypothesis about what could cause that? Other than the offset, the data looks consistent, so it probably isn’t really a problem unless you’re shooting for accuracy relative to the edge or center of the work piece. I’m mostly just curious.

Edit: full data is in the 3 sheets marked 10/6 in my spreadsheet.


#408

Are the correction values all offset relative to the offset of the center value? Or are the correction values absolute?


#409

The values in the calibration matrix that you see and are sent to the controller are all absolute values. In the firmware, the positions of the motors are adjusted based upon the center (0,0) value and then, in kinematics, the amount of correction is “reduced” by the center value.
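As I read the description above, the “reduced by the center value” step amounts to subtracting the (0,0) entry from each absolute correction, so the corrections become relative to the center calibration point. A minimal sketch (illustrative names, not the actual firmware code):

```python
def relative_correction(matrix, row, col, center_row, center_col):
    """Re-center an absolute correction: subtract the center (0,0) entry.

    `matrix` holds the absolute correction values sent to the controller;
    the indices of the center calibration point are passed explicitly here
    for clarity (hypothetical interface).
    """
    return matrix[row][col] - matrix[center_row][center_col]
```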


#410

Assuming the controller wasn’t using a calibration model (i.e., it was disabled), I’m not sure what could cause such a shift. Did something change in the camera setup or the optical center value?


#411

Ok cool, this is what I understood from past discussions. Just wanted to make sure I understood it right.

I didn’t intentionally change anything, but this seems like the best explanation.


#412

I’ve been thinking about how the calibration curves I’m seeing on my machine represent various inaccuracies in my frame or my measurements. @madgrizzle have you or anyone else built a kinematics simulator that, given chain lengths and motor spacing (and maybe sled weight / chain sag), can work backwards to find the expected x,y coordinate? It’d be basically turning the kinematics code in the firmware on its head and doing the opposite (finding the x,y coordinate from chain lengths, instead of finding the chain lengths from x,y coordinate).
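For the simplest triangle case (ignoring chain sag, rotation radius, and sled weight), the forward direction can be sketched by intersecting two circles centered on the motors, with radii equal to the chain lengths. This is a simplified triangulation for illustration, not the firmware’s kinematics:

```python
import math

def sled_position(left_chain, right_chain, motor_spacing):
    """Return (x, y) of the sled, origin midway between the motors.

    Circle intersection: the sled lies on a circle of radius `left_chain`
    around the left motor and radius `right_chain` around the right motor.
    y is negative because the sled hangs below the motor line.
    """
    # Distance from the left motor to the sled, projected onto the motor axis.
    a = (motor_spacing**2 + left_chain**2 - right_chain**2) / (2 * motor_spacing)
    y = -math.sqrt(left_chain**2 - a**2)   # drop below the motor line
    x = a - motor_spacing / 2              # shift origin to the midpoint
    return x, y
```

With something like this, you could sweep hypothetical chain-length or motor-spacing errors over the test points and see how the resulting (x, y) error map compares to the measured calibration curves.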

If we had something like that, we could try different chain error / motor measurement / rotational radius (chain length), and sag values and see how they could potentially affect x,y error across the work area. We could then graph that error for our test points and compare with error curves that we’re seeing, and maybe better understand the underlying most significant causes for miscalibrated Maslows and potentially work to address those.

If we wanted to get fancy, we could even grab the chain-length values from a special build of the firmware (using the same eeprom values already in the machine), then feed those into the kinematics simulator for validation. Ardupilot does something like this for hardware-in-the-loop testing. This would also double as potentially a useful integration test for the Maslow firmware.

What do you think?


#413

> I’ve been thinking about how the calibration curves I’m seeing on my machine represent various inaccuracies in my frame or my measurements. @madgrizzle have you or anyone else built a kinematics simulator that, given chain lengths and motor spacing (and maybe sled weight / chain sag), can work backwards to find the expected x,y coordinate?

the firmware already has that. It’s used to figure out where it is, to compare with where it should be, and adjust the power to the motors.

> If we had something like that, we could try different chain error / motor measurement / rotational radius (chain length), and sag values and see how they could potentially affect x,y error across the work area.

take a look at the simulator in Ground Control; it lets you tweak these errors and will show you a graph of how the coordinates are distorted from what they should be.

> If we wanted to get fancy, we could even grab the chain-length values from a special build of the firmware (using the same eeprom values already in the machine), then feed those into the kinematics simulator for validation.
>
> What do you think?

more of this is already there than you think.

The problem is that our model is not correct. We know that our chain sag calculation is incorrect, for example, but we have not worked out what the correct math is yet.

David Lang


#414

Oh I did not know about this! Maybe I’ll rig it up to output calibration charts like the ones @madgrizzle and I have been generating from the test points, so it’s easy to compare between the two.

As you’ve mentioned in the past I think this optical calibration routine will be useful to help validate any new calculations. We just need a way to compare the data computed from the simulation with the data generated from the optical calibration.


#415

I’m out of town until the hurricane blows through so I won’t be able to do any testing… hopefully my shed survives.


#416

What would you think of using the QR-code principle (inside the square, three contrasting defined squares in the corners)? That would also account for the rotation of the camera or sled.


#417

Praying for safety for you and your family (and also your shed!)


#418

It would be a neat idea, since then the routine wouldn’t need to start at the center calibration point (we could encode the calibration point’s coordinate in the code).

Other than not needing to start with the center calibration point visible to the camera, I’m not sure we gain a ton from it, though. @madgrizzle’s code can already adjust for rotation as long as it’s <45° from level. We don’t really get that far out of level from sled rotation, so that hasn’t been an issue in practice.
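As a side note on the <45° limit: a square’s orientation is only defined modulo 90°, so any detected angle gets folded into the (−45°, 45°] range, and larger true rotations become ambiguous. A tiny sketch of that folding (illustrative, not @madgrizzle’s actual code):

```python
def fold_square_angle(measured_deg):
    """Fold any angle into the (-45, 45] range a square detection can report.

    Because a square looks identical every 90 degrees, a true rotation
    beyond 45 degrees is indistinguishable from a smaller one.
    """
    a = measured_deg % 90.0
    if a > 45.0:
        a -= 90.0
    return a
```

A QR code’s asymmetric finder pattern removes that ambiguity, which is what would let a flipped or heavily rotated camera still work.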

I guess one possible benefit of using QR codes is that you could flip the camera upside down (if that were more convenient for mounting) and everything would still work.

All that said, why not? If you want to write some code that can generate the QR-code banner, that would be awesome.


#419

Is there an echo in here?

:slight_smile:


#420

I don’t think you actually need to start at the center of the pattern. You just need a square in view to calibrate the dimensions of the square. Honestly, now, I’m not really sure we even need that step, and we can probably modify the code to eliminate it.