
Commit e01a8a4

docs: documentation for VideoBufferConverter
1 parent e273065

5 files changed (+156, -91 lines)


docs/_sidebar.md

Lines changed: 4 additions & 3 deletions
```diff
@@ -28,9 +28,10 @@
 - [Logging](guide/logging.md)
 - Utilities
 - [Audio Converter](guide/audio_converter.md)
-- [Video Capture](guide/video_capture.md)
-- [Screen Capturer](guide/screen_capturer.md)
-- [Window Capturer](guide/window_capturer.md)
+- [Video Buffer Converter](guide/utilities/video_buffer_converter.md)
+- [Video Capture](guide/video_capturer.md)
+- [Screen Capture](guide/screen_capturer.md)
+- [Window Capture](guide/window_capturer.md)
 - [Voice Activity Detector](guide/voice_activity_detector.md)
 - [Power Management](guide/power_management.md)
 
```

docs/guide/camera_capture.md

Lines changed: 3 additions & 85 deletions
````diff
@@ -114,43 +114,6 @@ videoSource.stop();
 videoSource.dispose();
 ```
 
-### Handling Device Changes
-
-If cameras might change during your application's lifecycle (e.g., cameras connecting or disconnecting), you should implement a device change listener:
-
-```java
-import dev.onvoid.webrtc.media.Device;
-import dev.onvoid.webrtc.media.DeviceChangeListener;
-import dev.onvoid.webrtc.media.MediaDevices;
-import dev.onvoid.webrtc.media.video.VideoDevice;
-
-// Create a device change listener
-DeviceChangeListener listener = new DeviceChangeListener() {
-    @Override
-    public void deviceConnected(Device device) {
-        if (device instanceof VideoDevice) {
-            System.out.println("Camera connected: " + device.getName());
-            // You might want to update your UI or offer the new camera as an option
-        }
-    }
-
-    @Override
-    public void deviceDisconnected(Device device) {
-        if (device instanceof VideoDevice) {
-            System.out.println("Camera disconnected: " + device.getName());
-            // Handle the case where the current camera was disconnected
-        }
-    }
-};
-
-// Register the listener
-MediaDevices.addDeviceChangeListener(listener);
-
-// ... later, when you're done listening for events
-// Unregister the listener
-MediaDevices.removeDeviceChangeListener(listener);
-```
-
 ## Receiving Video Frames
 
 Once you have set up your camera video source and peer connection, you'll likely want to receive and process the video frames. This section explains how to receive both local and remote video frames.
@@ -243,56 +206,11 @@ When processing video frames, consider these important points:
 - Using a frame queue with a dedicated processing thread
 - Skipping frames if processing can't keep up with the frame rate
 
-### Converting VideoFrame to BufferedImage
-
-To display or process video frames in Java applications, you often need to convert the `VideoFrame` to a `BufferedImage`. Here's how to do it:
-
-```java
-import dev.onvoid.webrtc.media.FourCC;
-import dev.onvoid.webrtc.media.video.VideoBufferConverter;
-import dev.onvoid.webrtc.media.video.VideoFrame;
-import dev.onvoid.webrtc.media.video.VideoFrameBuffer;
-import java.awt.image.BufferedImage;
-import java.awt.image.DataBufferByte;
-
-public void onVideoFrame(VideoFrame frame) {
-    try {
-        // Get frame dimensions
-        VideoFrameBuffer frameBuffer = frame.buffer;
-        int frameWidth = frameBuffer.getWidth();
-        int frameHeight = frameBuffer.getHeight();
-
-        // Create a BufferedImage with ABGR format (compatible with RGBA conversion)
-        BufferedImage image = new BufferedImage(frameWidth, frameHeight, BufferedImage.TYPE_4BYTE_ABGR);
-
-        // Get the underlying byte array from the BufferedImage
-        byte[] imageBuffer = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
-
-        // Convert the frame buffer from I420 format to RGBA format
-        VideoBufferConverter.convertFromI420(frameBuffer, imageBuffer, FourCC.RGBA);
-
-        // Now you can use the BufferedImage for display or further processing
-        // For example, you could display it in a Swing component:
-        // myJLabel.setIcon(new ImageIcon(image));
-
-    } catch (Exception e) {
-        e.printStackTrace();
-    } finally {
-        // Always release the frame when done
-        frame.release();
-    }
-}
-```
-
-This conversion process works as follows:
-
-1. Create a `BufferedImage` with the same dimensions as the video frame, using `BufferedImage.TYPE_4BYTE_ABGR` format which is compatible with the RGBA format we'll convert to.
-
-2. Get the underlying byte array from the BufferedImage using `((DataBufferByte) image.getRaster().getDataBuffer()).getData()`.
+### Converting VideoFrame to other pixel formats
 
-3. Use `VideoBufferConverter.convertFromI420()` to convert the frame buffer from I420 format (which is the internal format used by WebRTC) to RGBA format, storing the result directly in the BufferedImage's byte array.
+For converting I420 frames to UI-friendly pixel formats (e.g., RGBA) and other pixel format conversions, use the `VideoBufferConverter` utility.
 
-4. The resulting BufferedImage can be used for display in Swing/JavaFX components or for further image processing.
+- See: [Video Buffer Converter](guide/utilities/video_buffer_converter.md)
 
 ### Scaling Video Frames
 
````

docs/guide/overview.md

Lines changed: 4 additions & 3 deletions
```diff
@@ -38,9 +38,10 @@ This section provides detailed guides for various features of the webrtc-java library
 ## Utilities
 
 - [Audio Converter](guide/audio_converter.md) - Resample and remix 10 ms PCM frames between different rates and channel layouts
-- [Video Capture](guide/video_capture.md) - Control a camera device, configure capabilities, and deliver frames to a sink
-- [Screen Capturer](guide/screen_capturer.md) - Enumerate and capture full desktop screens/monitors
-- [Window Capturer](guide/window_capturer.md) - Enumerate and capture individual application windows
+- [Video Buffer Converter](guide/utilities/video_buffer_converter.md) - Convert between I420 and common FourCC pixel formats (e.g., RGBA, NV12)
+- [Video Capture](guide/video_capturer.md) - Control a camera device, configure capabilities, and deliver frames to a sink
+- [Screen Capture](guide/screen_capturer.md) - Enumerate and capture full desktop screens/monitors
+- [Window Capture](guide/window_capturer.md) - Enumerate and capture individual application windows
 - [Voice Activity Detector](guide/voice_activity_detector.md) - Detect speech activity in PCM audio streams
 - [Power Management](guide/power_management.md) - Prevent the display from sleeping during desktop capture or presentations
 
```

docs/guide/utilities/video_buffer_converter.md (new file)

Lines changed: 145 additions & 0 deletions
# Video Buffer Converter

The `VideoBufferConverter` provides fast pixel format conversions between WebRTC's internal I420 video frame buffers and other formats identified by a FourCC. Conversions are delegated to optimized native routines.

API: `dev.onvoid.webrtc.media.video.VideoBufferConverter`

## When to use it

- Rendering frames in UI toolkits that expect interleaved RGB(A) byte layouts.
- Preparing frames for encoders/decoders that require specific pixel formats.
- Importing external pixel data (e.g., RGBA, NV12) into the WebRTC pipeline as I420.

See also: [Video Capture](../video_capture.md), [Custom Video Source](../custom_video_source.md).

## Supported operations

1) From I420 to other pixel formats
   - `convertFromI420(VideoFrameBuffer src, byte[] dst, FourCC fourCC)`
   - `convertFromI420(VideoFrameBuffer src, ByteBuffer dst, FourCC fourCC)`

2) From other pixel formats to I420
   - `convertToI420(byte[] src, I420Buffer dst, FourCC fourCC)`
   - `convertToI420(ByteBuffer src, I420Buffer dst, FourCC fourCC)`

Notes:
- The `VideoFrameBuffer` is internally converted to I420 if necessary using `VideoFrameBuffer#toI420()` before transformation.
- When using `ByteBuffer` destinations/sources, direct buffers use a zero-copy native path for best performance; otherwise, the method will use the backing array or a temporary array.

## FourCC formats

The target/source pixel layout is selected with `dev.onvoid.webrtc.media.FourCC`. Common values include:
- `FourCC.RGBA` – 4 bytes per pixel, RGBA order (commonly used with BufferedImage `TYPE_4BYTE_ABGR` interop, see example below)
- `FourCC.ARGB`, `FourCC.ABGR`, `FourCC.BGRA` – other 32-bit packed variants
- `FourCC.NV12`, `FourCC.NV21` – 4:2:0 semi-planar YUV formats

Consult the FourCC enum in your version for the complete list.

## Buffer sizing

You must allocate destination buffers large enough for the chosen format:
- For 32-bit RGBA-like formats: `width * height * 4` bytes
- For NV12/NV21: `width * height * 3 / 2` bytes
- For other layouts, compute according to their specification

Attempting to convert into undersized buffers will result in an error.

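As a quick illustration of these rules, the following sketch allocates destination arrays for the two common cases (the 1280x720 dimensions are placeholder values; in practice take the width and height from the frame buffer):

```java
// Illustrative sizing only.
int width = 1280;
int height = 720;

// 32-bit packed formats (RGBA, ARGB, ABGR, BGRA): 4 bytes per pixel.
byte[] rgbaBuffer = new byte[width * height * 4];

// 4:2:0 semi-planar formats (NV12, NV21): a full-resolution Y plane
// plus an interleaved chroma plane at quarter resolution.
byte[] nv12Buffer = new byte[width * height * 3 / 2];
```
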
## Example: Convert VideoFrame to BufferedImage

This example demonstrates converting a WebRTC VideoFrame to a Java BufferedImage using RGBA output.

```java
import dev.onvoid.webrtc.media.FourCC;
import dev.onvoid.webrtc.media.video.VideoBufferConverter;
import dev.onvoid.webrtc.media.video.VideoFrame;
import dev.onvoid.webrtc.media.video.VideoFrameBuffer;

import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;

public void onVideoFrame(VideoFrame frame) {
    try {
        // Get frame dimensions
        VideoFrameBuffer frameBuffer = frame.buffer;
        int frameWidth = frameBuffer.getWidth();
        int frameHeight = frameBuffer.getHeight();

        // Create a BufferedImage with ABGR format (compatible with RGBA conversion)
        BufferedImage image = new BufferedImage(frameWidth, frameHeight, BufferedImage.TYPE_4BYTE_ABGR);

        // Get the underlying byte array from the BufferedImage
        byte[] imageBuffer = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();

        // Convert the frame buffer from I420 format to RGBA format
        VideoBufferConverter.convertFromI420(frameBuffer, imageBuffer, FourCC.RGBA);

        // Now you can use the BufferedImage for display or further processing,
        // e.g., display it in a Swing or JavaFX component.
    }
    catch (Exception e) {
        // Handle conversion errors
        e.printStackTrace();
    }
    finally {
        // Always release the frame when done
        frame.release();
    }
}
```

How it works:
1. Create a BufferedImage sized to the frame.
2. Access its backing byte[] via DataBufferByte.
3. Convert the VideoFrameBuffer from I420 to RGBA into the image buffer.

Tip: If you have a direct NIO ByteBuffer (e.g., for native interop), use the ByteBuffer overload to keep a direct native path.

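As a sketch of that direct path (the sink handling and the per-frame allocation are illustrative only), the ByteBuffer overload can be used like this:

```java
import dev.onvoid.webrtc.media.FourCC;
import dev.onvoid.webrtc.media.video.VideoBufferConverter;
import dev.onvoid.webrtc.media.video.VideoFrame;
import dev.onvoid.webrtc.media.video.VideoFrameBuffer;

import java.nio.ByteBuffer;

public void onVideoFrame(VideoFrame frame) {
    VideoFrameBuffer frameBuffer = frame.buffer;
    int width = frameBuffer.getWidth();
    int height = frameBuffer.getHeight();

    // A direct buffer keeps the conversion on the zero-copy native path.
    ByteBuffer rgba = ByteBuffer.allocateDirect(width * height * 4);

    try {
        // Convert I420 to RGBA straight into the direct buffer.
        VideoBufferConverter.convertFromI420(frameBuffer, rgba, FourCC.RGBA);

        // Hand the buffer to your renderer or native consumer here.
    }
    catch (Exception e) {
        e.printStackTrace();
    }
    finally {
        frame.release();
    }
}
```

In a real application you would typically allocate the direct buffer once and reuse it across frames, resizing it only when the frame dimensions change.
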
## Example: Import RGBA data into I420

```java
import dev.onvoid.webrtc.media.FourCC;
import dev.onvoid.webrtc.media.video.NativeI420Buffer;
import dev.onvoid.webrtc.media.video.VideoBufferConverter;

import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;

public void onImage(BufferedImage image) {
    // Get image dimensions
    int imageWidth = image.getWidth();
    int imageHeight = image.getHeight();

    // Create an I420Buffer
    NativeI420Buffer i420 = NativeI420Buffer.allocate(imageWidth, imageHeight);

    // Get the underlying byte array from the BufferedImage
    byte[] imageBuffer = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();

    try {
        // In this example, we assume the BufferedImage uses the TYPE_4BYTE_ABGR layout
        VideoBufferConverter.convertToI420(imageBuffer, i420, FourCC.RGBA);

        // Now you can use the I420Buffer, e.g., wrap it in a VideoFrame
    }
    catch (Exception e) {
        // Handle conversion errors
        e.printStackTrace();
    }
    finally {
        // Always release the buffer when done
        i420.release();
    }
}
```

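If the converted buffer should go back into the WebRTC pipeline, one option is to wrap it in a `VideoFrame`. This is a minimal sketch, assuming the `VideoFrame(VideoFrameBuffer, int, long)` constructor and that releasing the frame also releases the wrapped buffer; if you wrap the buffer like this, release the frame instead of calling `i420.release()` directly as in the example above.

```java
import dev.onvoid.webrtc.media.video.NativeI420Buffer;
import dev.onvoid.webrtc.media.video.VideoFrame;

public VideoFrame wrapAsFrame(NativeI420Buffer i420) {
    // Rotation 0 and a nanosecond timestamp; adapt both to your pipeline.
    return new VideoFrame(i420, 0, System.nanoTime());
}
```
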
## Error handling and edge cases

- All methods throw `NullPointerException` if src/dst is null; ensure proper checks.
- ByteBuffer destinations must be writable (not read-only) for `convertFromI420`.
- Ensure the correct FourCC is used for the actual memory layout you pass/expect.
- Beware of frame rotation metadata; conversions do not rotate pixels. Handle `VideoFrame.rotation` separately if your renderer requires upright images (see the sketch after this list).

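To illustrate the rotation point from the list above, here is a hypothetical Java2D helper (not part of webrtc-java) that rotates an already converted image so it displays upright, assuming the frame reports its rotation in clockwise degrees:

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

// Hypothetical helper: rotate a converted image by the frame's clockwise rotation
// (0, 90, 180 or 270 degrees).
static BufferedImage rotateUpright(BufferedImage src, int rotationDegrees) {
    if (rotationDegrees == 0) {
        return src;
    }

    boolean swapDimensions = (rotationDegrees == 90 || rotationDegrees == 270);
    int width = swapDimensions ? src.getHeight() : src.getWidth();
    int height = swapDimensions ? src.getWidth() : src.getHeight();

    BufferedImage dst = new BufferedImage(width, height, src.getType());
    Graphics2D g = dst.createGraphics();
    // Rotate around the destination center and draw the source centered on it.
    g.translate(width / 2.0, height / 2.0);
    g.rotate(Math.toRadians(rotationDegrees));
    g.drawImage(src, -src.getWidth() / 2, -src.getHeight() / 2, null);
    g.dispose();

    return dst;
}
```
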
## Related

- [Video Capture](../video_capture.md)
- [Custom Video Source](../custom_video_source.md)
- [Screen Capturer](../screen_capturer.md)
- [Window Capturer](../window_capturer.md)

File renamed without changes.
