VirtualDub Plugin SDK 1.2
Processing video frames (runProc)
The runProc method of a video filter is the main method that actually processes image data.
The source frame buffers are specified by the src field of the
VDXFilterActivation structure, and the output frame buffer is
specified by the dst field. The task of runProc is to process the
source image and write the result into the output image. The host has
already allocated both frame buffers and fetched the source frame by the
time the filter is invoked, so the only thing that needs to happen is to
process the image data.
Note that if paramProc specified that the filter is operating in
in-place mode, the source and output frame buffers will point to the
same memory.
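Because each pixel is read in full before the result is written back, a read-then-write per-pixel loop remains correct even when the source and output buffers alias. The following is a minimal stand-alone sketch of that pattern, using a plain row of packed 8-bit-per-channel pixels instead of the SDK's frame-buffer structures (the function name halveRow is purely illustrative):

```cpp
#include <cassert>
#include <cstdint>

// Halve the brightness of one row of packed XRGB pixels. Each pixel is
// fully read before the result is stored, so the same loop works whether
// dst is a separate buffer or aliases src (in-place mode).
void halveRow(const uint32_t *src, uint32_t *dst, int w) {
    for (int x = 0; x < w; ++x) {
        uint32_t pixelValue = src[x];
        // Clear the low bit of each channel, then shift right: this
        // divides R, G, and B by two without carry between channels.
        dst[x] = (pixelValue & 0xfefefe) >> 1;
    }
}
```

Calling halveRow(row, row, w) processes the row in place; calling it with two distinct buffers gives the normal copying behavior.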
runProc is always single-threaded — there is never a case where it is
invoked twice on the same filter instance at the same time. However, it
is possible that two different instances of the same filter may execute
concurrently. Therefore, it is safe to use a buffer that is attached to
the filter instance, but two instances should generally not share a
buffer that both read and write to.
When possible, memory allocation should not occur within runProc, and
it is better to preallocate buffers in startProc instead. There are
two reasons for this: it's faster, and it means that allocation failures
are caught before the render begins instead of during the render itself.
This also goes for other external resources, such as reading images off
disk. The exception is if there are enough of them that preloading them
is impractical; secondary video files, for instance, would be better off
read on the fly.
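One common way to follow this advice is to keep a scratch buffer in the per-instance filter data, size it in startProc, and release it in endProc. The sketch below uses simplified stand-in types and signatures (MyFilterData and the plain-argument startProc/runProc/endProc trio are hypothetical, not the real SDK structures) purely to illustrate the allocate-early, free-late pattern:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Hypothetical per-instance state; in a real filter this would live in
// the structure referenced by the filter instance's data pointer.
struct MyFilterData {
    std::vector<uint32_t> scratch;   // reusable row buffer
};

// Allocate once, before rendering starts. An allocation failure here
// surfaces before the render begins, not in the middle of it.
int startProc(MyFilterData &mfd, int width) {
    mfd.scratch.resize(width);
    return 0;
}

// Per-frame work only reuses the preallocated buffer -- no allocation.
int runProc(MyFilterData &mfd, const uint32_t *srcRow, uint32_t *dstRow, int w) {
    for (int x = 0; x < w; ++x)
        mfd.scratch[x] = (srcRow[x] & 0xfefefe) >> 1;   // half brightness
    for (int x = 0; x < w; ++x)
        dstRow[x] = mfd.scratch[x];
    return 0;
}

// Release the buffer after the render completes.
int endProc(MyFilterData &mfd) {
    mfd.scratch.clear();
    mfd.scratch.shrink_to_fit();
    return 0;
}
```

Because the buffer belongs to the instance, two concurrently running instances of the filter never touch each other's scratch memory.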
This is an example runProc implementation which simply divides all
pixel values by two, thus reducing the image to half brightness. Note
that runProc in a production filter is usually considerably more complex,
both in functionality and in implementation.
int runProc(const VDXFilterActivation *fa, const VDXFilterFunctions *ff) {
    int w = fa->dst.w;
    int h = fa->dst.h;
    uint32 *dst = (uint32 *)fa->dst.data;
    ptrdiff_t dstpitch = fa->dst.pitch;
    const uint32 *src = (const uint32 *)fa->src.data;
    ptrdiff_t srcpitch = fa->src.pitch;

    // loop over all rows
    for(int y=0; y<h; ++y) {
        // loop over all pixels in current row
        for(int x=0; x<w; ++x) {
            uint32 pixelValue = src[x];

            // divide all channels by 2
            dst[x] = (pixelValue & 0xfefefe) >> 1;
        }

        // advance to next row in destination and source pixmaps
        dst = (uint32 *)((char *)dst + dstpitch);
        src = (const uint32 *)((const char *)src + srcpitch);
    }

    return 0;
}
Copyright (C) 2007-2012 Avery Lee.