
Re: [Gnash-dev] NetStream design (episode 1)


From: Bastiaan Jacques
Subject: Re: [Gnash-dev] NetStream design (episode 1)
Date: Fri, 16 May 2008 18:48:41 +0200 (CEST)



On Thu, 15 May 2008, strk wrote:

> I've been trying to model the new NetStream architecture.
> Doing so, questions arise very early in the process, so
> here they are.

> Consider the following model:
>
>
>                 +------------+
>                 | input      |
>                 +------------+
>                 | bytes      |
>                 +------------+
>                       |
>                       v
>                  ( parser )
>                       |
>            +----------+---------+
>            |                    |
>            v                    v
>    +----------------+    +----------------+      +------------+
>    | video_buffer   |    | audio_buffer   |      | playhead   |
>    +----------------+    +----------------+      +------------+
>    | encoded_frames |    | encoded_frames |      | cur_time   |
>    +----------------+    +----------------+      | play_state |
>                                                  +------------+
>
> In this model, the (parser) process would run in its thread and
> fully fill the video/audio buffers.

I like the idea of having a lot of things in one thread: downloading,
parsing, and decoding. In theory that should be enough, so we'd only
have two threads (including "main"). I think we should also consider,
in our final design, a definition of the way we'll do communication
across threads.
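
The two-thread split above (a parser thread filling the buffers, the main
thread consuming them) could be sketched as a mutex/condition-variable
protected queue. This is only an illustration of the idea; the class and
field names below are made up, not Gnash's actual API:

```cpp
#include <condition_variable>
#include <cstdint>
#include <deque>
#include <mutex>

// Hypothetical encoded-frame record; in reality it would carry the
// encoded payload as well as the container timestamp.
struct EncodedFrame {
    uint32_t timestamp_ms;  // presentation time from the container
};

// Minimal thread-safe buffer: the parser thread pushes, main pops.
class FrameBuffer {
public:
    void push(const EncodedFrame& f) {
        std::lock_guard<std::mutex> lock(mutex_);
        frames_.push_back(f);
        cond_.notify_one();
    }

    // Blocks until a frame is available.
    EncodedFrame pop() {
        std::unique_lock<std::mutex> lock(mutex_);
        cond_.wait(lock, [this] { return !frames_.empty(); });
        EncodedFrame f = frames_.front();
        frames_.pop_front();
        return f;
    }

    // Milliseconds of media buffered (newest minus oldest timestamp) --
    // the kind of query the ActionScript side would need.
    uint32_t bufferedMs() const {
        std::lock_guard<std::mutex> lock(mutex_);
        if (frames_.empty()) return 0;
        return frames_.back().timestamp_ms - frames_.front().timestamp_ms;
    }

private:
    mutable std::mutex mutex_;
    std::condition_variable cond_;
    std::deque<EncodedFrame> frames_;
};
```

One such buffer per elementary stream (video_buffer, audio_buffer) would
match the diagram, with the mutex/condvar pair serving as the agreed-on
cross-thread communication mechanism.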

> The buffers would need to have a concept of 'timestamp' to respond to
> ActionScript-exposed queries like "how many seconds of media do we have
> in the buffer?" and accept requests from ActionScript like "buffer this
> many seconds of frames before starting playback".

> I can see that the FLV format allows this, in that timestamps are
> associated with encoded frames. But is this also true for other kinds
> of formats? What about MPEG, for instance? And Ogg?

AFAIK, most container formats have some timekeeping mechanism. Usually
you'll get some extra information like duration through metadata. So by
the time you've parsed the first kilobyte of the stream you'll usually
know the size of the stream in bytes, the duration in seconds, the total
number of frames and perhaps the bitrates of audio and video,
respectively.
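
When the metadata really does carry total size and duration, some useful
derived values fall out directly. A small sketch, with an assumed (not
real) metadata struct, under the usual constant-bitrate caveat:

```cpp
#include <cstdint>

// Hypothetical parsed stream metadata; field names are illustrative,
// not from any real container parser.
struct StreamMetadata {
    uint64_t size_bytes;   // total stream size, from metadata
    double   duration_s;   // total duration in seconds
    uint32_t frame_count;  // total number of video frames
};

// Average bitrate in bits per second, derived from size and duration.
inline double averageBitrate(const StreamMetadata& m) {
    return (m.size_bytes * 8.0) / m.duration_s;
}

// Rough byte offset for a seek target, assuming constant bitrate.
inline uint64_t seekOffset(const StreamMetadata& m, double seconds) {
    return static_cast<uint64_t>(m.size_bytes * (seconds / m.duration_s));
}
```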

I know that GStreamer has a mechanism for streams that don't carry much
bitrate information: it measures the speed of the data flowing to your
hardware and decides from that what the bitrates and FPS should be. We
could do something similar if we encounter such a situation.
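
A fallback estimator in that spirit could be as simple as dividing the
bytes observed so far by the elapsed time. This is a sketch of the idea
only, not GStreamer's actual implementation:

```cpp
#include <cstdint>

// Minimal sketch: estimate bitrate from the observed data flow when the
// container carries no bitrate metadata. Names are illustrative only.
class BitrateEstimator {
public:
    // Record that `bytes` have arrived in total, `elapsed_s` seconds
    // after the stream started.
    void addSample(uint64_t bytes, double elapsed_s) {
        total_bytes_ = bytes;
        elapsed_s_ = elapsed_s;
    }

    // Estimated bitrate in bits per second (0 before any data arrives).
    double bitsPerSecond() const {
        if (elapsed_s_ <= 0.0) return 0.0;
        return (total_bytes_ * 8.0) / elapsed_s_;
    }

private:
    uint64_t total_bytes_ = 0;
    double elapsed_s_ = 0.0;
};
```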




