WebControl Camera

I am playing around with webcontrol and I love it so far!

Is there a list of supported or recommended cameras to use? I used an old Microsoft webcam that I had lying around. I had previously used it on my 3d printer before getting an actual Raspberry Pi camera (it connects to the camera slot on the Pi board with a ribbon cable). The Microsoft camera worked fine on that system, but I needed to mount it differently and the Raspberry Pi camera was more flexible.

When I plugged the Microsoft cam into this Raspberry Pi and viewed it, the entire system slowed to a crawl! The sled would slow down by at least 75%. As soon as I clicked the camera off, the speed returned to normal. The indicated CPU% at the top of the screen is around 75%-80% with the cam on and 30%-40% with it off while cutting.

I took my Pi camera off my printer and tried that. When I click on the camera button now, I am just shown a “broken image” icon. I do not have a video feed and the sled stays at normal speed. The processor % stays the same as well.

Both my router and printer have identical Pis on them (3 B+). I rebooted the system every time I made a change to the cameras. I checked all of my camera connections and they were good. I also put the Pi camera back on the printer and it worked without issue. I am going to get another Pi camera to try, but I didn't know if anyone had any other recommendations to try as well.

Thanks in advance!


The only thing I ever tried was a USB endoscope. You have to make sure the fps is really low (like 3-4 fps) because of the way it sends the pictures to the client.

I might be able to figure out the rpi camera… I think it requires different programming… I just have to find the rpi camera I bought a while back.

I use the “Spaghetti Detective” app on OctoPrint on my printer. It allows using a regular USB camera for basic functions, but if you want higher fps you have to use the Pi camera. Beyond that, I have no clue whatsoever how it works.

Ideally, the web browser would get the image directly from the camera, or through some other software that runs on a separate core of the Pi (i.e., not part of webcontrol). As it is, webcontrol has to stop what it’s doing, grab a frame from the camera, encode it, and send it over the network to the browsers. Webcontrol is a Python application that runs in a single thread/core, so if one task consumes a lot of time to complete, the other tasks don’t get much time to do their jobs.
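To illustrate the difference, here is a minimal, hypothetical sketch (not WebControl’s actual code; all names are illustrative) of moving the frame-grab-and-encode work into a separate process with Python’s `multiprocessing`, so it can land on another core and leave the main loop free to feed g-code:

```python
import time
from multiprocessing import Process, Queue

def grab_and_encode_frame() -> bytes:
    """Stand-in for a real camera grab + JPEG encode (hypothetical)."""
    time.sleep(0.05)              # pretend encoding costs 50 ms per frame
    return b"\xff\xd8fake-jpeg"   # JPEG SOI marker + dummy payload

def camera_worker(frames: Queue, n: int) -> None:
    # Runs in its own process, so the per-frame cost lands on a
    # separate core and never stalls the g-code feeding loop.
    for _ in range(n):
        frames.put(grab_and_encode_frame())

def run_demo(n: int = 3) -> int:
    frames: Queue = Queue()
    cam = Process(target=camera_worker, args=(frames, n), daemon=True)
    cam.start()
    # The main loop stays responsive: it only pulls finished frames.
    received = [frames.get(timeout=5) for _ in range(n)]
    cam.join(timeout=5)
    return len(received)

if __name__ == "__main__":
    print(run_demo())  # 3
```

In a single-threaded design, those 50 ms per frame come straight out of the time available to keep the controller fed, which matches the slowdown described above.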

Pretty neat. I’m sure it could be trained to detect :fire: as well.


That is what the Nest smoke alarm is for… :rofl:

Quick question:

Where is all of the processing for the actual sled movements done? Is it done in the Arduino or the Pi? Is the Pi taking the Gcode, determining what steps to turn each motor and then sending the request over to the Arduino? Or is the Arduino just being sent the Gcode and figuring out how to distribute the movements and come up with the commands itself?

The rpi just feeds the code to the controller a line at a time. All the real processing occurs on the Arduino… but if you have lots of really short moves, then the rpi needs to be relatively unencumbered to keep feeding the controller. If the rpi is tied up processing video images, etc., then it won’t be able to keep up with the arduino.
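As a hedged sketch of that feeding model (assuming a Grbl-style “ok” handshake for flow control; the Maslow protocol differs in its details, and these names are illustrative), the Pi-side loop amounts to:

```python
def feed_gcode(lines, send, receive) -> int:
    """Send one g-code line at a time, blocking until the controller
    acks it. With many short moves, this loop becomes the bottleneck
    if the Pi is busy doing other work (e.g. encoding video)."""
    sent = 0
    for line in lines:
        line = line.strip()
        if not line or line.startswith(";"):
            continue                # skip blanks and comments
        send(line + "\n")
        while receive() != "ok":    # wait for the controller's ack
            pass
        sent += 1
    return sent

# usage with a simulated controller that acks every line immediately
outbox = []
acks = iter(["ok"] * 3)
n = feed_gcode(["G0 X0 Y0", "; a comment", "G1 X10", "G1 Y10"],
               send=outbox.append, receive=lambda: next(acks))
print(n)  # 3 lines actually sent
```

The point is that the Pi does no motion math at all here; it just has to keep this loop turning fast enough that the controller’s buffer never runs dry.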

There’s a settings page for the camera… I really can’t guarantee it works properly, since it was primarily put in for optical calibration…

First and foremost, thank you for everything and do not take my questions as being critical in any way! I am just a very curious person and love to learn.

I am trying to figure out where the bottleneck is and whether there is anything that can be done to minimize it. I am trying to understand and help (if possible) but will admit this is all still new to me.

I have been giving myself a crash course in Arduino and Raspberry these past few weeks. I opened up the Arduino compiler and I did see (after asking) that the Arduino is the one performing the math functions on the gcode to figure out where (and how) to move the sled. The gcode is coming from the Pi.

I have begun digging around on the 3d printing side of things to see how they handle this situation. I have mentioned before that I am running OctoPrint on a printer I have. It functions in a very similar way to webcontrol. It turns the Pi into a web server for interacting with the printer. It also stores your gcode and then feeds it to the printer as it is printing. Basic OctoPrint allows you to monitor and control your printer remotely. You are also able to view a video feed in a browser window with minimal delay. There are a bunch of plugins that sit on top of OctoPrint to enable additional features.

When running, the printer EATS gcode FAST! I can still use almost all functions of the Octoprint without it slowing or hanging while printing.

At the end of the day, I am super happy and grateful for webcontrol! I love it and am glad you have put in all this time to get it to where it is today! I wish I knew how to contribute more but I have been motivated to start learning!

Thanks again and please keep up the amazing work!!

the arduino is being sent the g-code and is figuring out what to do from there.

David Lang

Yes, WebControl is the same idea as OctoPrint; you may even be able to get
OctoPrint to feed g-code to a Maslow (it couldn’t do the calibration,
single-motor controls, etc. needed for the Maslow)

David Lang

I am asking a ton of questions to try and understand how this works. Not because I have a better idea or think something should be different! I really appreciate everyone’s time with answering!!

If webcontrol can’t feed the stream without slowing down the Pi, I don’t think Octoprint is some magical program that can… lol… I think webcontrol is a very clean program from what I have seen; it doesn’t seem to have anything unnecessary. I can’t say the same about Octoprint… lol. I am just wondering what the bottleneck in our system is.

How is the calibration data handled?

Is the calibration data sent to the Arduino as a set of data points? The Arduino then uses this to modify its commands? Or does the Pi modify the gcode based on the calibration before sending over the gcode?

Does the calibration data change much once it is initially determined?

Would it be feasible to build a “tool” that could push calibration data (and other data that is not changed once initially setup) into the Arduino compiler automatically? You could then flash it onto the Arduino.

Build your Maslow. Make your initial test cuts, take measurements, enter your changes into a “tool” then flash Arduino. If something is “off”, make adjustments and re-flash.

Change something on your machine? Then make a change in the “tool” and flash.

Need to reset sprockets? Bump L/R until at 12:00, have the “tool” adjust the new “home”, and re-flash.

I don’t think it would be easy to do this but I am willing to put in some time to dig into it (I am still VERY fresh at this). I am just wondering if there are any roadblocks that have already been found before I fall too far down this “rabbit hole”… lol…

So is the calibration data sent to the Arduino as a data point the Arduino then uses to modify its commands or does the Pi modify the gcode based on the calibration before sending over the gcode?

mostly no, the Pi/GroundControl does not modify the g-code based on calibration

GroundControl/WebControl do modify the g-code when resetting home. This is
something that should not be happening, but it was done as the easy way out rather
than implementing G54 and family in the firmware.

Would it be feasible to build a “tool” that could push calibration data (and
other data that is not changed once initially setup) into the Arduino compiler
automatically? You could then flash it onto the Arduino.

you should not have to recompile to change settings and calibration data.

There is actually good justification for having multiple sets of calibration
available for you to switch between

Build your Maslow. Make your initial test cuts, take measurements, enter your changes into a “tool” then flash Arduino. If something is “off”, make adjustments and re-flash.

Change something on your machine? Then make a change in the “tool” and flash.

Need to reset sprockets? bump L/R until at 12:00 have the “tool” adjust the new “home” and re-flash.

flashing is extremely slow compared to just sending the new parameters and
having them saved to EEPROM. Why do you want to slow things down so much?

I dont think it would be easy to do this but I am willing to put in some time
to dig into it (I am still VERY fresh at this). I am just wondering if there
are any roadblocks that have already been found before I fall too far down
this “rabbit hole”… lol…

you are talking about taking things that are relatively simple variables today
and making it harder to change them, why?

We should be looking at pushing more of the calibration calculations into the
firmware, so that we are less dependent on ‘special tools’ that have to run on
some other system, not adding new dependencies that have to be maintained across
many different operating systems.

David Lang

I am just trying to understand why we are using “two” computers when the Arduino should be more than capable of handling it all alone.

Compared to your typical “delta” style 3d printer, I would think the calculations on a Maslow would be simpler for the computer (once calibrated). Most delta style printers run off an Arduino with no additional computer feeding them data; they run off an SD card holding only the job’s gcode. That is why I was asking about moving the calibration data out of groundcontrol into the Arduino. From what I have seen in the Arduino code so far, “most” of the building blocks for this are already there. It should not need much of a change to make this happen. We just need to figure out an easy way for “normal” users to do this with a simple interface. I know this is the path that the software took LONG before me.

There is a lot of variable calibration data on a 3d printer that is modified from time to time. The reason I was suggesting a firmware flash was the lack of need for additional hardware. With most printers, you have an additional display and knob for changing settings on the device (as well as jogging the head around).

It would be cool in the future to be able to stick an SD card in my Maslow and click cut… lol…

Thank you as always. I know it is hard to convey inflection via text. I do appreciate all your input!

James

most CNC machines are two parts, with the device running the firmware just
handling the g-code. It’s a bit of a different world than 3d printers

if you look at laser cutters you see a third world, where the machine can do pretty
much nothing except define where home is. Everything else is done on a remote
machine.

with a 3d printer, you don’t use the front panel to position the print on the
bed; you do all that on a remote machine.

with a mill/router, you need to position the start of the cut based on where the
material is

also, with the maslow, it started with using a PC to manage the machine, do the
CAD/CAM work, etc. The Arduino doesn’t have any display, and it is pretty
heavily loaded without any UI on it.

If you talk on the Marlin list about trying to run a mill with their firmware,
you will hear a lot of people explain why it’s not a good fit.

Marlin has you modify the source code and compile the variables in because they
didn’t have EEPROM available to store the settings. If you think about it,
wouldn’t it be much nicer if you could set the various parameters without having
to recompile?

No worries what so ever.

There are multiple reasons why this might be the case. First, and quite possibly the most likely, is that whoever programmed octoprint is a real programmer and not a self-taught hack (albeit self-taught over the last 40+ years since I had a TI-99/4A). People might get tired of me saying that, but it’s true. :slight_smile:

But specifically for the camera, it’s possible that octoprint doesn’t actually handle the video and it runs using a separate streamer application and the browser just connects to it. This is the preferred way to do it, in my opinion, as it can run on a separate core.

I just wanted to spread the idea of using an old iPhone 4 as a webcam. There are several apps in the App Store to make this work.

I just made it work like this and am able to leave the room now :nerd_face:

I saw the camera function. Since I used a Pi zero W my CPU sits at 85-90%. Pretty sure it won’t support video. :slight_smile: I may change to a Pi 3b+ but everything fits in the enclosure I have with a fan.

Can someone post video of their camera working? Would be nice to see. TIA

I had the idea to use a Raspberry Pi Zero with a micro cam and attach it to the vacuum adapter. The only drawback is that it needs either a lithium-polymer battery or a USB cable for power; a battery is probably the better solution.

With the Raspberry Pi Zero you just install one of the available CCTV/cam solutions. You can then configure the frame rate and resolution you would like.