Agreed. This is why I was thinking more of a “class”, with machine documentation guidelines (like an Airtable survey into a base) that would help document all the factors. Maybe I’ll play with this idea and bring it back here.
I meant development of the project overall, not necessarily the software. My observation is that mechanical issues far outrank software-driven ones, but I admittedly don’t grasp the details of current calibration efforts. I would think that getting to a maximally accurate (for a given class) machine would be a baseline.
Please check out:
Agree wholeheartedly. It is not at all clear to me what frame should be chosen (let alone at a given time commitment, complexity, or price point) in order to yield the most accurate Maslow. If I have more time and am willing to spend more money, can I be relatively assured of a better cut than with Bee’s cheap and cheerful (wonderfully documented) frame? I still don’t have a clear answer.
I’m intrigued by the laser-cut top beam emerging from the MetalMaslow iteration of the design. Lots of conversations suggest that finely tuning the motor spacing and vertical position should lead to optimal results and greater consistency (it makes logical sense), but I haven’t yet seen a formalized, repeatable test that confirms this. If it’s true, it leads one to ask whether there is a shipping/mail-friendly way to make something like laser-cut motor mount templates that slide over imperial- or metric-sized square tube that can be bought at local home centers around the globe… After all, we don’t need the whole bar shipped, only templates that register to the ends and provide mounting holes (at least if I understand correctly…) to achieve this.
Building on what @cmullins70 said, even without stratifying into various “small/medium/large” budget tiers, I’d be happy if there was a 3- or 5-item “if you do X you’ll get Y benefit” list that properly documented the “upgrade” or change and talked about the reasonable expected improvement in accuracy or repeatability (or even reliability, if applicable). There is a great sense of urgency and engineering ingenuity in the air here, which is something I think we all enjoy partaking in, but it does sometimes feel like we skip a cog every now and then and the best we can do is point someone to a dense and rolling 400+ entry post that “kind of has the answer, you’ll get the idea…” but doesn’t quite answer the question completely.
I’m not sure what the answer is, but it does seem like marshalling all the resources and getting a vetted, updated and comprehensive set of baseline docs and specs and accuracy readings in place is necessary to determine if all the other work is materially moving things forward in ways that can be easily measured. How to do that, well I don’t have any big thoughts on that one yet.
I will say that the one problem with open-source designs and “everyone can do it their own way, huzzah, it’s open source” is that you can end up losing the very constraints and challenges that make engineering both more enjoyable and more useful. Constraints are actually fun and helpful, and they make it easier for newcomers to engage…
PS - Said a different way, I’m still hoping for a clear 1-2-3 for the what/why of the new controller design. I was super stoked reading Bar’s musings in the thread @WoodCutter4 linked, but even for a subcomponent it shouldn’t be too hard to clearly state the key objectives. Brainstorming is all well and good, especially for clean-sheet ideas, but when you’re trying to improve an existing design it helps to know whether we’re shooting for pure innovation or for improving and rounding off the sharp edges of what exists…
I’m leaning towards the Texas Instruments DRV8873. When I first glanced at the datasheet I assumed that “SPI or Hardware Interface Options” meant that I could interface fully through SPI to command something like “set power 50%” and it would stay there until a new command was sent. On reading the datasheet more closely I believe you are correct that it requires a regular PWM interface, with SPI used to read fault flags and the current draw. I still think it’s a nice chip, but you are right that more pins will be needed than just the SPI bus I was hoping for.
Has anyone seen a cheap motor driver chip which can be controlled over just SPI?
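For reference, the controller-side pattern these PWM-interface drivers need is simple enough. Here is a minimal sketch of sign-magnitude drive with an illustrative 0–255 duty scale; the pin roles are generic assumptions, not the DRV8873’s exact mode names:

```cpp
#include <cassert>
#include <cstdint>
#include <cstdlib>

// Sign-magnitude H-bridge drive: PWM one input while holding the other
// low. Speed and direction arrive as one signed command; SPI (not shown)
// would only be used for fault flags and current readback.
struct BridgeDuty {
    uint8_t in1; // PWM duty (0-255) on the IN1 pin
    uint8_t in2; // PWM duty (0-255) on the IN2 pin
};

// speed: -100 (full reverse) .. +100 (full forward)
BridgeDuty speedToDuty(int speed) {
    if (speed > 100) speed = 100;
    if (speed < -100) speed = -100;
    uint8_t duty = static_cast<uint8_t>((std::abs(speed) * 255) / 100);
    if (speed >= 0) return {duty, 0}; // forward: PWM IN1, IN2 low
    return {0, duty};                 // reverse: IN1 low, PWM IN2
}
```

Locked-antiphase (complementary PWM on both inputs) is the other common scheme; which one applies depends on the chip’s mode configuration.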
After all, we don’t need the whole bar shipped, only templates that register to the ends and provide mounting holes (at least if I understand correctly…) to achieve this.
this mount registers to the end, designed to fit unistrut or wood
Building on what @cmullins70 said, even without stratifying into various “small/medium/large” budget tiers, I’d be happy if there was a 3- or 5-item “if you do X you’ll get Y benefit” list that properly documented the “upgrade” or change and talked about the reasonable expected improvement in accuracy or repeatability (or even reliability, if applicable). There is a great sense of urgency and engineering ingenuity in the air here, which is something I think we all enjoy partaking in, but it does sometimes feel like we skip a cog every now and then and the best we can do is point someone to a dense and rolling 400+ entry post that “kind of has the answer, you’ll get the idea…” but doesn’t quite answer the question completely.
part of the problem we have is that some people are getting good accuracy, others are not, and we don’t know why.
My current working theory is that there are three main reasons for this:
1. the frame (mostly the top beam) is too flexible
2. the calibration routine is not working well enough and is producing calculated measurements that are obviously wrong (I am thinking that holey triangulation will help this a lot, but I’m struggling to run it)
3. the tape measure being used is inaccurate enough to cause grief. We recently had an interesting discussion on class 1, class 2, and unclassified tape measures:
In search of accurate measurements
we also don’t have anyone who has gotten a good, accurate machine to then go through and test other configurations to see what effect they have.
PS - Said a different way, I’m still hoping for a clear 1-2-3 for the what/why
of the new controller design.
We currently have three controllers:
- Arduino Mega and the stock Maslow controller
- Arduino Due and a modified Maslow controller
- Arduino Mega and the TLE controller
Pairing one of these with a Raspberry Pi and using WebControl seems like it is a
huge win (I’ll say that being able to use my phone to bump things to get a
sprocket to 12 o’clock is a HUGE win compared to having to get to the computer
to move things and the machine to see if it’s in the right position yet)
So one goal of the new controller is to see if we can combine these: make the controller itself run WebControl (or an equivalent).
Additional features that we want to see if we can support (in no particular order):
- support a stepper for a Z axis
- support other encoder types (SPI and RS422/RS485 communications)
- support 4 motor designs
- support higher current for the motors
- support higher voltages for the motors
- eliminate the ability to plug the power supply into the wrong plug
- current limiting on the motors to avoid frying chips
- current measurements on the motor drive
- wifi connectivity
- faster processor to handle the calculations a little better
- double precision floating point math for more accurate calculations
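On that last point, a toy illustration (not the actual kinematics code) of why single precision can hurt when small increments get accumulated over a long cut:

```cpp
#include <cassert>
#include <cmath>

// Accumulate 0.001-unit increments a million times, the kind of repeated
// small update a position loop performs. Single precision drifts by a
// visible amount; double precision stays essentially exact.
float accumulateFloat(int n, float step) {
    float sum = 0.0f;
    for (int i = 0; i < n; ++i) sum += step;
    return sum;
}

double accumulateDouble(int n, double step) {
    double sum = 0.0;
    for (int i = 0; i < n; ++i) sum += step;
    return sum;
}
```

In real firmware the fix is either doubles throughout or compensated (Kahan) summation in the hot loop.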
I haven’t yet seen one, but the beast might exist. I still go back to the idea that if you are going to build a custom motor controller, go ahead and add a simple, low cost microprocessor to the board whose function is to just perform motor controls and read the encoders and let it talk to the “master” device (RPi, whatever) through SPI, serial, etc. You could use a PIC controller (multiple PWM outputs and quadrature encoder interfaces) but I don’t know how easy they are to program or update in the event there’s a need to.
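To sketch what that co-processor’s inner loop would look like, here is a generic quadrature decoder (a lookup table over the previous and current A/B states). This is not PIC-specific code, just the logic any of these parts would run:

```cpp
#include <cassert>
#include <cstdint>

// Quadrature decoding: pack channels A and B into a 2-bit state, then a
// 16-entry table maps (previous state, current state) to a count delta.
// Illegal double-transitions decode as 0 and are simply dropped here.
class QuadDecoder {
    uint8_t prev_ = 0;
    long count_ = 0;
public:
    void sample(uint8_t a, uint8_t b) {
        static const int8_t delta[16] = {
             0, +1, -1,  0,
            -1,  0,  0, +1,
            +1,  0,  0, -1,
             0, -1, +1,  0,
        };
        uint8_t curr = static_cast<uint8_t>((a << 1) | b);
        count_ += delta[(prev_ << 2) | curr];
        prev_ = curr;
    }
    long count() const { return count_; }
};
```

A PIC (or STM32, etc.) with hardware quadrature interfaces does the same thing in silicon, freeing the CPU for the PID loop.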
Would it make sense for organized efforts to focus on Epics (themes) to move work forward in a particular area? For example, a theme might be ‘Eliminate Bad Experiences’. (A $500 CNC busts its target price if I have to reorder a controller or a motor - not that I’m speaking from personal experience or anything.) Your #6 and #7 address this. I also see “Better Accuracy”, “Faster Cutting”, and “Broader Software Support”. I think.
Another Epic I would like to see is “reduce or eliminate recalibration”.
we need to fix the ‘manually set chain length’ in the stock GC. This is one
thing webcontrol does much better.
in webcontrol you have the motor controls to set the sprockets to 12 o’clock in
the manually set chain length screen. So the process is, go to that screen,
tweak the sprockets to 12 o’clock, set the marked links on the sprocket, hit the
button, return to cutting
in Ground Control you have to go to automatic calibration, tweak the sprockets, quit, answer the scary warning telling you that things may not work if you quit, go to an advanced tab to find the ‘manually set chain lengths’ button, and then go back to cutting.
if you don’t mark your chains, or you change something else on the machine, you
are going to have to re-calibrate.
better accuracy is a software/hardware issue, not a controller issue.
faster cutting requires different motors (plus acceleration planning in the firmware)
what are you thinking of when you say “Broader Software Support”?
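for reference, the acceleration planning usually starts with a trapezoidal velocity profile. a rough sketch of the move-time math (units assumed to be mm, mm/s, and mm/s^2; this is not code from any of the three controllers):

```cpp
#include <cassert>
#include <cmath>

// Time to travel distance d with a trapezoidal velocity profile:
// accelerate at `accel` up to `vmax`, cruise, then decelerate. Short
// moves that never reach vmax degenerate to a triangular profile.
double moveTime(double d, double vmax, double accel) {
    double dFull = vmax * vmax / accel;   // distance spent accelerating up and back down
    if (d >= dFull)
        return d / vmax + vmax / accel;   // trapezoid: a cruise phase exists
    return 2.0 * std::sqrt(d / accel);    // triangle: peak speed is sqrt(d * accel)
}
```

a real planner also has to look ahead across G-code segments so it doesn't stop dead at every corner, but the per-move math starts here.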
So does this bring us back around to the importance of unsticking the holey-calibration branch and WebControl updates and getting those merges approved and new builds out so this can become mainline? I was very pleased to hear from @WoodCutter4 that there is some movement in that direction, but it still sounds like it’s not quite clear yet how to get it all over the finish line. Are there any blocking items? Certainly having better usability around ensuring you stay calibrated will go a long way toward accurate cutting.
not enough people with time/expertise in firmware and GC programming
I’m afraid that isn’t my area of expertise but I guess this is a good opportunity to put out the call. Maybe we should sticky a “job opening” at the top of the relevant forum and catch a few casual eyeballs who don’t know there is a need?
So does this bring us back around to the importance of unsticking the holey-calibration branch and WebControl updates
no, this is a fix to GC independent of holey-calibration
Are there any blocking items?
testing: we need people to test the git master branch, both with the holey calibration and (in some ways more importantly) with the traditional calibration, to make sure it didn’t break anything.
Maybe someone can create a new post asking for testers (this thread has digressed a bit). I can’t test it because I don’t even have a functional Maslow at this point (all my motors are used in the four-motor test rig).
If someone was going to hypothetically design a controller for testing but include a stepper motor controller for z-axis, is there any ‘preferred’ stepper motor module? I see them readily available all over the internet.
To clarify further, my thought was perhaps a controller with components for five DC motors and headers installed if someone wants to install an optional stepper motor module and use it for the Z-axis.
I would use the standard pluggable module
see the wide variety in the first list at
These are all pin compatible, and most new driver chips get packaged in this format so that they can be used with a bunch of 3D printer boards.
Some of them have options for serial or SPI interfaces to the board for enhanced features, such as detecting when the axis has hit its limit without switches.
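since all of these modules share the same step/dir/enable interface, the controller side mostly reduces to a mm-to-steps conversion for the Z axis. the 8mm leadscrew pitch and 16x microstepping below are just example numbers, not a maslow spec:

```cpp
#include <cassert>
#include <cmath>

// Convert a Z move in mm into STEP pulses for a StepStick-style driver.
// 200 full steps/rev, 16x microstepping, and an 8 mm leadscrew pitch
// are illustrative defaults; real values come from the hardware used.
long mmToSteps(double mm, int fullStepsPerRev = 200,
               int microsteps = 16, double pitchMm = 8.0) {
    double stepsPerMm = (fullStepsPerRev * microsteps) / pitchMm;
    return std::lround(mm * stepsPerMm);
}
```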
I am leaning toward a stepper for the Z axis too, especially since linear slides can be had cheaply with a stepper motor already attached. I don’t have any recommendations, but I am excited to see the results.
One possibility is that the same chip could be used to power a DC motor option if needed. A stepper motor is basically two DC motors in one.
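To make the “two DC motors in one” point concrete: full stepping is just cycling the two H-bridge polarities 90 degrees out of phase, flipping exactly one coil per step. A sketch (the +1/-1 polarity encoding is illustrative):

```cpp
#include <cassert>

// A bipolar stepper is two coils (A and B), each on its own H-bridge,
// exactly like two DC motors. The classic full-step sequence cycles
// their polarities a quarter cycle apart; each step reverses one coil.
struct CoilDrive {
    int a; // +1 or -1: H-bridge polarity for coil A
    int b; // +1 or -1: H-bridge polarity for coil B
};

CoilDrive fullStep(int step) {
    static const CoilDrive seq[4] = {
        {+1, +1}, {-1, +1}, {-1, -1}, {+1, -1},
    };
    return seq[((step % 4) + 4) % 4]; // wraps correctly for negative steps
}
```

Holding either pattern fixed (or PWMing both coils, DC-motor style) is why the same H-bridge chip could plausibly serve either role.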