
Why Are Webcams Slow?

I've finally tracked down a common webcam problem: they advertise 30 fps at VGA (640x480), but in reality all you get is 2-3 fps, sometimes up to 10, never 30, and lowering the resolution doesn't seem to help much.
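
If you want to see what your camera actually delivers, here is a minimal sketch that measures the effective frame rate. It assumes Python with OpenCV installed and the camera at device index 0 - both are my assumptions, adjust for your setup:

import time
import cv2

cap = cv2.VideoCapture(0)  # device index 0 is an assumption
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

n_frames = 60
start = time.time()
for _ in range(n_frames):
    ok, _frame = cap.read()  # blocks until the camera hands over a frame
    if not ok:
        break
elapsed = time.time() - start
cap.release()

print("Effective frame rate: %.1f fps" % (n_frames / elapsed))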

I saw people in forums saying that webcams are just slow and that getting 8 fps is a good goal. Someone recommended closing unnecessary applications, killing processes, disconnecting other USB devices, plugging the camera into a dedicated USB port, and generally optimizing your computer, claiming he improved his frame rate from 8 to 12 fps. Well, if that worked for him, his computer was probably too slow or really messed up to begin with. If you think about it, the CPU and USB load from a webcam is really low; it shouldn't be a problem.

Let's do some simple math. If we assume frames are delivered uncompressed, one true-color pixel is 24 bits. 640x480 = 300K pixels = 7.03 Mbit per frame. At 30 fps, that makes 210.9375 Mbit/second. The USB 2.0 standard is still more than twice as fast: 480 Mbit/second. Moreover, if you go to grayscale (8 bits per pixel), you reduce the traffic 3 times. If you go to QVGA (320x240), you reduce it another 4 times. And if the camera itself does even a tiny bit of compression, that reduces it further.
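
Here is the same arithmetic in a few lines of Python, so you can plug in your own resolution and bit depth (just a sanity-check sketch, nothing camera-specific):

# Back-of-the-envelope bandwidth for uncompressed video.
# 1 Mbit = 2**20 bits, matching the numbers in the text.
def video_bandwidth_mbit(width, height, bits_per_pixel, fps):
    return width * height * bits_per_pixel * fps / 2.0**20

vga = video_bandwidth_mbit(640, 480, 24, 30)
print("VGA, 24-bit color, 30 fps: %.4f Mbit/s" % vga)  # 210.9375
print("Fits in USB 2.0 (480 Mbit/s):", vga < 480)      # True
print("QVGA grayscale, 30 fps: %.2f Mbit/s"
      % video_bandwidth_mbit(320, 240, 8, 30))         # 17.58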

So, 30 fps at VGA shouldn't be a problem, unless there's a super-high load on the USB bus (e.g. you have an external hard drive or a scanner attached and you're actively using it at the same time as the camera - note that USB bandwidth is shared), or you have a really old computer, like a Pentium III, or your USB version is not 2.0 (all modern computers have had 2.0 for years).

The real problem turns out to be exposure length. The camera itself adds a delay to each frame, because otherwise it doesn't get enough light to produce a good-quality picture. This can be solved in two ways. One is to add light, especially daylight: the fps will go up. The other is to tweak the camera settings to reduce image quality and allow a shorter exposure; the setting may be called "exposure" or "light". You can compensate for the low light by adding contrast or brightness, either in the camera settings or in post-processing (in computer software), but the main idea is that the camera should know the user isn't asking it for good-quality pictures.
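
And here is what the software side of that tweak might look like, again sketched with Python and OpenCV. The property constants are real OpenCV ones, but the values are driver-dependent placeholders of mine - you'll have to experiment with your own camera:

import cv2

cap = cv2.VideoCapture(0)

# Switch to manual exposure. On the V4L2 backend 0.25 usually means
# "manual mode"; other backends use different conventions.
cap.set(cv2.CAP_PROP_AUTO_EXPOSURE, 0.25)

# Shorter exposure = darker but faster frames. The unit and scale
# vary by driver, so treat -6 as a placeholder to experiment with.
cap.set(cv2.CAP_PROP_EXPOSURE, -6)

# Compensate the darker image with gain/brightness, as described above.
cap.set(cv2.CAP_PROP_GAIN, 50)         # placeholder value
cap.set(cv2.CAP_PROP_BRIGHTNESS, 150)  # placeholder value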

Here is a wonderful video that demonstrates the problem and a solution. The title is specific to a certain webcam model, but the idea applies to any webcam:

[embedded video]

I am curious why camera manufacturers decided that image quality is so much more important than frame rate. I would rather have an option to set a fixed frame rate and have the camera adjust its other settings to get the best possible quality while preserving that rate, instead of fixing the quality and letting the frame rate vary with the lighting. This is weird. Okay.

Another nice conclusion you can draw is that the most important feature of a webcam is the size of its lens. That's why video recorded on a cheap photo camera is so substantially better than video made by an expensive webcam: the photo camera simply has better lenses. Knowing that lens size is what matters, if you have a fixed budget you should look for a minimal set of other features: no "super software" included, no auto-rotation, auto-tracking, auto-focus, and so on. Any "auto-anything" adds cost to the camera, reducing the budget the manufacturer can spend on better lenses.

Enjoy! :)

Keywords: slow, web cam, webcam, camera, pc, desktop, laptop, bad webcam drivers, webcam low fps, webcam low frame rate, low frames per second, fix frame rate, fix fps, speed up webcam, make it faster, optimize webcam, poor webcam quality, improve webcam quality.
Tags: devices, in english