Why Webcams Are Slow? - morfizm

Dec. 24th, 2010 11:54 am

I've finally tracked down the cause of a common webcam problem: cameras advertise 30 fps at VGA (640x480), but in reality all you get is 2-3 fps, sometimes up to 10, never 30, and lowering the resolution doesn't seem to help much.

I saw people in forums saying that webcams are just slow, and that getting to 8 fps is a good goal. Someone recommended closing unnecessary applications, killing processes, disconnecting other USB devices, plugging the camera into a dedicated USB port, and optimizing your computer, claiming he could improve the frame rate from 8 to 12 fps. Well, if that worked for him, his computer was probably too slow or really screwed up in the first place. But if you think about it, the CPU and USB load from a webcam is really low; it shouldn't be a problem.

Let's do some simple math. If we assume frames are delivered uncompressed, one true-color pixel is 24 bits. 640x480 = 300K pixels = 7.03 Mbit per frame. At 30 fps, that makes 210.9375 Mbit/second. The USB 2.0 standard is still more than twice as fast: 480 Mbit/second. Moreover, if you go to grayscale (8 bits per pixel), you reduce the traffic 3 times. If you go to QVGA (320x240), you reduce it another 4 times. If the camera itself does even a tiny bit of compression, that reduces it further.
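
The arithmetic above can be double-checked in a few lines of Python (using the same 1024x1024-bit megabit convention as the figures in the post):

```python
# Back-of-the-envelope bandwidth check for uncompressed video over USB.
def video_bandwidth_mbit(width, height, bits_per_pixel, fps):
    """Raw video bandwidth in Mbit/s (1 Mbit = 1024*1024 bits)."""
    return width * height * bits_per_pixel * fps / (1024 * 1024)

vga_rgb = video_bandwidth_mbit(640, 480, 24, 30)   # true-color VGA @30fps
vga_gray = video_bandwidth_mbit(640, 480, 8, 30)   # grayscale: 1/3 the traffic
qvga_gray = video_bandwidth_mbit(320, 240, 8, 30)  # QVGA: another 1/4

USB2_MBIT = 480  # USB 2.0 nominal signaling rate
print(f"VGA 24-bit @30fps: {vga_rgb:.4f} Mbit/s")      # 210.9375
print(f"USB 2.0 headroom:  {USB2_MBIT / vga_rgb:.1f}x")  # ~2.3x
```

So even the worst case (uncompressed true color) fits in USB 2.0 with room to spare, which is the whole point of the paragraph above.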

So, 30 fps at VGA shouldn't be a problem, unless there's a very high load on USB (e.g. an external hard drive or scanner attached that you're actively using at the same time as the camera - note that USB bandwidth is shared), or you have a really old computer, like a Pentium III, or your USB version is not 2.0 (all modern computers have 2.0 and have had it for years).

The real problem turns out to be exposure time. The camera itself adds a delay to each frame, because otherwise it doesn't gather enough light to produce a good-quality picture. This can be solved in two ways. One is to add light - especially daylight; fps will go up. The other is to tweak camera settings to reduce image quality and allow a shorter exposure. It can be an "exposure" setting or a "light" setting. You can compensate for low light by adding contrast or brightness, either in the camera settings or in post-processing (in computer software), but the main idea is that the camera should know the user isn't asking it for high-quality pictures.
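
A quick sketch of why exposure dominates: the sensor can't finish a frame faster than its exposure time, so exposure alone caps the frame rate regardless of USB or CPU. The numbers below are illustrative, not from any particular camera, but note how they land right on the rates people actually report:

```python
def max_fps_for_exposure(exposure_ms):
    """An exposure of N milliseconds caps the sensor at 1000/N frames
    per second, no matter what the interface could otherwise deliver."""
    return 1000.0 / exposure_ms

for exposure_ms in (33, 100, 333):
    fps = max_fps_for_exposure(exposure_ms)
    print(f"{exposure_ms:>3} ms exposure -> at most {fps:.1f} fps")
```

A 33 ms exposure allows the advertised ~30 fps; a dim room that forces 100 ms caps you at 10 fps; 333 ms gives the familiar 3 fps - which is why adding light or shortening the exposure is the actual fix.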

Here is a wonderful video that demonstrates the problem and a solution. The title is specific to a certain webcam model, but the idea applies to any webcam:



I am curious why camera manufacturers decided that image quality is so much more important than frame rate. I would rather have an option to set a fixed frame rate and then adjust the other settings for the best possible quality while preserving it, instead of fixing quality and letting the frame rate vary with the lighting. This is weird. Okay.

Another nice conclusion you can draw is that the most important feature of a webcam is the size of its lens. That's why video recorded on a cheap photo camera is so substantially better than video made by an expensive webcam: the photo camera just has better lenses. Knowing that lens size is what matters, if you have a fixed budget you should look for a minimal set of other features: no "super software included", no auto-rotation, auto-tracking, auto-focus, and so on. Any "auto anything" adds cost to the camera, thus reducing the budget the manufacturer can spend on better lenses.

Enjoy! :)

Keywords: slow, web cam, webcam, camera, pc, desktop, laptop, bad webcam drivers, webcam low fps, webcam low frame rate, low frames per second, fix frame rate, fix fps, speed up webcam, make it faster, optimize webcam, poor webcam quality, improve webcam quality.

8 comments

Comments:

From:archaicos
Date:December 24th, 2010 09:55 pm (UTC)
Reminds me of my experience in embedded, where people (engineers included) thought that just plugging in a new complex component would work out of the box w/o any tuning. :)
And, of course, "premature optimization is the root of all evil". :)
From:morfizm
Date:December 25th, 2010 02:57 am (UTC)
I was surprised to find that some people *believe* that 3-5 fps is normal speed for webcams, that 8 fps is good, and that 12 fps is excellent and achievable with various tricks. They were probably pissed off at the beginning (looking at the "VGA @30fps" advertisements), but then they couldn't find a solution in reasonable time and just gave up, deciding that it's normal!
From:archaicos
Date:December 25th, 2010 09:29 am (UTC)
Case in point... There's a TV and a cable modem (for it) in my room. Both are controlled from the same Comcast remote. To turn on both you typically press the red "All On" button. The problem is, the two units don't always both get the signal from the remote, leaving you at times with one on and the other off. Pressing the button again just reverses both states. :) There's also a "Power" button on the remote that controls only the modem. It took me a while to figure out how this shit works and how/why it doesn't.
Now, in the living room there's another modem, a DVD/cassette player, a TV, and 3 remotes. And there I still haven't figured out how to turn on exactly what I want. Often, when I want to watch something there, I spend some 10-15 minutes pressing all those buttons on all those remotes without knowing what's going to happen. If it takes me that much time and effort, with all my knowledge and experience, to do something that simple, it's no surprise that ordinary people fail at similar things with little or no technical knowledge.
From:vorber
Date:December 25th, 2010 12:25 am (UTC)
afaik (and i'm quite close to the topic) most common webcams indeed produce uncompressed image and 200mbps is quite good estimate for VGA@30fps required bw there. We should also take in mind that most software that works with video also has to encode/decode it using some (usually pretty shitty) codec implementation.
And video is pretty good btw ;)
From:morfizm
Date:December 25th, 2010 02:44 am (UTC)
Disagree re "we should also take in mind".

Post-processing works independently of the USB transfer, because the I/O is non-blocking (I hope there is a special place in hell for programmers who don't use async I/O in webcam implementations...).
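
The decoupling described above can be sketched as a toy producer/consumer pipeline. This is a minimal sketch with synthetic "frames" and a trivial stand-in for encoding, not a real capture API - the point is only that a bounded queue lets capture and post-processing run concurrently without blocking each other:

```python
import queue
import threading

frame_q = queue.Queue(maxsize=64)  # bounded buffer between capture and processing
processed = []

def capture(n_frames):
    # Stand-in for the camera driver: in real code this would be an async read.
    for i in range(n_frames):
        frame_q.put(f"frame-{i}")
    frame_q.put(None)  # sentinel: end of stream

def postprocess():
    # Runs concurrently; the capture loop never waits for this work.
    while True:
        frame = frame_q.get()
        if frame is None:
            break
        processed.append(frame.upper())  # stand-in for encoding/filtering

t_cap = threading.Thread(target=capture, args=(5,))
t_proc = threading.Thread(target=postprocess)
t_cap.start(); t_proc.start()
t_cap.join(); t_proc.join()
print(processed)  # ['FRAME-0', 'FRAME-1', 'FRAME-2', 'FRAME-3', 'FRAME-4']
```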

Now you have 2 options:

1. Save to disk uncompressed and post-process later. Good for non-realtime applications. Any modern hard drive will give 20 MBytes/sec, which is 160 Mbit/sec - just slightly below VGA @30fps. You can easily catch up by cutting the resolution slightly, going down to some 24 fps, or applying some kind of compression (anything super lightweight). Another thing: any modern PC has 1 GB of memory available to applications. That's 38 seconds of raw VGA @30fps, which will still let you capture a smooth 3-4 minute clip under the conditions above. And note that these conditions are worst case: in practice, drive speed is more likely 60 MB/sec than 20, you typically have 2+ GB of RAM, etc., and then you can always combine this with #2:
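
The buffering argument in #1 can be put into numbers: RAM only has to absorb the *deficit* between the raw stream and what the disk swallows, so 1 GB lasts far longer than the 38 seconds it would hold on its own. A sketch, using the same Mbit convention as the post (trimming to ~24 fps or adding light compression stretches the result further, toward the 3-4 minutes mentioned above):

```python
def raw_rate_mbit(width=640, height=480, bpp=24, fps=30):
    """Uncompressed video rate in Mbit/s (1 Mbit = 1024*1024 bits)."""
    return width * height * bpp * fps / (1024 * 1024)

def buffered_seconds(ram_gb, disk_mbyte_per_s, rate_mbit=None):
    """How long RAM can absorb the deficit between the raw stream
    and the disk's sustained write speed."""
    rate = raw_rate_mbit() if rate_mbit is None else rate_mbit
    disk_mbit = disk_mbyte_per_s * 8
    deficit = rate - disk_mbit
    if deficit <= 0:
        return float('inf')  # the disk keeps up indefinitely
    return ram_gb * 1024 * 8 / deficit

# Worst case from the comment: 1 GB of RAM, a 20 MB/s drive, full VGA@30fps.
print(f"{buffered_seconds(1, 20):.0f} s")  # ~161 s before RAM runs out

# Realistic case: 60 MB/s drive outruns the stream, so no limit at all.
print(buffered_seconds(2, 60))  # inf
```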

2. For real-time applications you can compress. Again, any modern machine can do real-time compression of VGA @30fps at a certain quality level. The quality downgrade only affects sharpness and color precision; it does not affect brightness/contrast, smoothness of motion, etc. Often this quality is very high, unless you want to grab still images in post-processing and print them. And you can always downsample to QVGA (downsampling is extremely cheap) and do 4x less work.

On the other hand, the problem I described won't get better with lower resolution. With insufficient light, the camera will make the same long exposures and spit out frames at a low rate.

I was surprised to find that some people *believe* that 3-5 fps is normal speed for webcams, that 8 fps is good, and that 12 fps is excellent and achievable with various tricks. They were probably pissed off at the beginning (looking at the "VGA @30fps" advertisements), but then they couldn't find a solution in reasonable time and just gave up, deciding that it's normal!
From:morfizm
Date:December 25th, 2010 02:45 am (UTC)
RE using shitty codec implementation - see my notes about places in hell ;)
From:morfizm
Date:December 25th, 2010 02:54 am (UTC)
Btw!

I wonder how Skype shows device-specific property pages?
I wanted a simple app, something like "webcam configuration", which would just display that - what Skype does in Tools/Options/Video Settings/Webcam Settings. I was surprised that I couldn't find such a tool!

On my computer Skype displays a 3-tab control with different options, and, apparently, the control is device-specific.

I started looking for simple APIs to bring it up, and wasn't very successful. First, I searched for something like "Configuring a Video Capture Device" and found this:

(MSDN) Display VFW Capture Dialog Boxes

It had an example of using the IAMVfwCaptureDialogs interface of the capture filter.

I had no idea what a capture filter was, but I kept looking. I found the interface definition (1, 2), and a very nice .NET wrapper library with samples: DirectShowNET.

The library included an example that displays various property pages. I was able to get the same dialog I wanted, but with *only* 1 tab instead of 3. Here is a person on a forum complaining about the same problem (no solution). Basically, the approach is to use ISpecifyPropertyPages of the device filter and then call OleCreatePropertyFrame to bring up the dialog with tabs.

Do you know if the right way is to use this interface, or to get device-specific property pages somehow else?

Also, if you know any good (& free or cheap) video/audio capture and/or configuration software, please let me know :)
From:morfizm
Date:December 25th, 2010 02:56 am (UTC)
(Re VGA@30fps - my point was that while 5-7 years ago capturing it in realtime might have been a stretch, now it's not. It's still an intensive task, but it's pretty far from using 100% of the machine's resources.)

Edited at 2010-12-25 02:56 am (UTC)