[Gnash-dev] NetStream design (episode 1)
From: strk
Subject: [Gnash-dev] NetStream design (episode 1)
Date: Thu, 15 May 2008 19:58:20 +0200
I've been trying to model the new NetStream architecture.
Questions arise very early in the process, so here they are.
Consider the following model:
   +------------+
   |   input    |
   +------------+
   |   bytes    |
   +------------+
         |
         v
     ( parser )
         |
   +-----+-----+
   |           |
   v           v
+----------------+  +----------------+    +------------+
|  video_buffer  |  |  audio_buffer  |    |  playhead  |
+----------------+  +----------------+    +------------+
| encoded_frames |  | encoded_frames |    |  cur_time  |
+----------------+  +----------------+    | play_state |
                                          +------------+
In this model, the (parser) process would run in its own thread and
fully fill the video/audio buffers. The buffers would need a
concept of 'timestamp' to answer ActionScript-exposed queries like
"how many seconds of media do we have in the buffer?" and to accept
requests from ActionScript like "buffer this many seconds of frames
before starting playback".
I can see that the FLV format allows this, in that timestamps are
associated with encoded frames. But is this also true for other kinds
of format? What about MPEG, for instance? And Ogg?
I'll add more elements to the diagram as feedback flows....
--strk;
() ASCII Ribbon Campaign
/\ Keep it simple!