Question Regarding Controller Accuracy and Feedrate

Am I correct in assuming that I can increase the “accuracy” of a very small move command by reducing the feed rate of that command? For example, if I tell the controller to move up 0.1 mm without specifying a feedrate, it will use the maximum value of 800 mm/min (or whatever is set in GC’s settings). But will my accuracy improve if I tell the controller to use, say, 100 mm/min for that particular small movement?

I ask because I’m finding that when I calculate that I’m off my target by a small amount and tell the controller to move by that amount, I often end up still off target by a significant fraction of that small amount.


I think that friction and sag might be limiting factors here. To test for friction, try moving away by, say, 10 times the error amount and then back to the desired coordinates. To test for sag, do the same, but choose a direction where the final motion gives the most balance between the forces from the motors: generally up and toward the center. Does either of these make a difference to arriving at ‘the spot’?


I thought about that (move away and try again), but I’m concerned about how well it will work. It’s like repeatedly hitting a golf ball from the tee until you sink it, rather than putting from the green. But it’s certainly something to try, because hitting the target from a distance is ideally what you want. I’d also be interested in testing whether I get the same target coordinates depending on which direction the sled moves. For example, will moving from left to right give the same values as moving from right to left, or top to bottom the same as bottom to top?


I don’t think so. These functions in Motion.cpp seem to me to be the controlling factors, and you would get a message if the firmware thought it couldn’t get to its destination.

calculateFeedrate(const float& stepSizeMM, const float& usPerStep) - Calculates the time delay between each step for a given feedrate.

computeStepSize(const float& MMPerMin) - Determines the minimum step size which can be taken for the given feedrate, based on the loop interval frequency. Converts to mm per microsecond first, then multiplies by the number of microseconds in each loop.

movementUpdate() - Prints an alarm if the step did not complete in the allotted time.


So regardless of how slowly I tell it to move, the same number of steps will be taken… the motor will just turn more slowly… correct? Then I tend to think faster is better, if something like friction is an issue.


Probably. You are correct that you may end up in slightly different positions
depending on the direction you are moving, but this is something you should
experiment with to see what the errors look like.

One other factor is that PID loops have to have a dead area (deadband) around
the target where they stop trying to correct; otherwise they would always
overshoot and then overcorrect.