From: Tomas Groth
Subject: [Gnash-commit] gnash ChangeLog backend/render_handler.h backen...
Date: Tue, 05 Dec 2006 14:26:10 +0000

CVSROOT:        /sources/gnash
Module name:    gnash
Changes by:     Tomas Groth <tgc>       06/12/05 14:26:10

Modified files:
        .              : ChangeLog 
        backend        : render_handler.h render_handler_agg.cpp 
                         render_handler_cairo.cpp render_handler_ogl.cpp 
        libbase        : image.cpp image.h 
        server         : Makefile.am gnash.h render.cpp render.h 
                         video_stream_instance.cpp 
        server/asobj   : Makefile.am NetStream.cpp NetStream.h 
Added files:
        server/asobj   : NetStreamFfmpeg.cpp NetStreamFfmpeg.h 

Log message:
                * backend/render_handler.h, backend/render_handler_agg.cpp,
                  backend/render_handler_cairo.cpp, server/render.h,
                  backend/render_handler_ogl.cpp, server/render.cpp: Removed 
                  YUV_video, and added videoFrameFormat() and drawVideoFrame().
                * libbase/image.cpp, libbase/image.h: Added yuv.
                * server/Makefile.am: Removed video_yuv.cpp.
                * server/gnash.h: Removed YUV_video.
                * server/video_stream_instance.cpp: Changed to use the new
                  video frame rendering method.
                * server/asobj/Makefile.am: Added NetStreamFfmpeg.{cpp|h}.
                * server/asobj/NetStream.{cpp|h}: Removed the ffmpeg code
                  and made a non-decoding class (NetStreamBase).
                * server/asobj/NetStreamFfmpeg.{cpp|h}: Created the files,
                  which contain the ffmpeg-specific code moved from
                  NetStream.{cpp|h}. Now uses StreamProvider to handle files
                  and connects in a separate thread to avoid blocking.
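
For context: instead of each renderer owning a YUV_video object, the decoder
now asks the renderer which pixel format it wants (videoFrameFormat()) and
hands it decoded frames to blit (drawVideoFrame()). A standalone sketch of
that contract, with simplified stand-in types and a hypothetical
example_handler (illustrative only, not code from this commit):

    #include <cstdio>

    struct image_base { };   // stand-in for gnash's image::image_base
    struct matrix { };       // stand-in for gnash::matrix
    struct rect { };         // stand-in for gnash::rect

    struct render_handler {
        enum video_frame_format { NONE, YUV, RGB };
        // Format the backend wants decoded frames delivered in.
        virtual int videoFrameFormat() = 0;
        // Draw one decoded frame into the region given by mat/bounds.
        virtual void drawVideoFrame(image_base* frame, const matrix* mat,
                const rect* bounds) = 0;
        virtual ~render_handler() {}
    };

    // Hypothetical backend; like AGG and Cairo in this commit, it asks for RGB.
    struct example_handler : render_handler {
        int videoFrameFormat() { return RGB; }
        void drawVideoFrame(image_base*, const matrix*, const rect*) {
            std::printf("blit one RGB frame\n");
        }
    };

    int main() {
        example_handler h;
        image_base frame; matrix m; rect b;
        if (h.videoFrameFormat() != render_handler::NONE)
            h.drawVideoFrame(&frame, &m, &b);  // decoder-side call path
        return 0;
    }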

CVSWeb URLs:
http://cvs.savannah.gnu.org/viewcvs/gnash/ChangeLog?cvsroot=gnash&r1=1.1853&r2=1.1854
http://cvs.savannah.gnu.org/viewcvs/gnash/backend/render_handler.h?cvsroot=gnash&r1=1.24&r2=1.25
http://cvs.savannah.gnu.org/viewcvs/gnash/backend/render_handler_agg.cpp?cvsroot=gnash&r1=1.51&r2=1.52
http://cvs.savannah.gnu.org/viewcvs/gnash/backend/render_handler_cairo.cpp?cvsroot=gnash&r1=1.17&r2=1.18
http://cvs.savannah.gnu.org/viewcvs/gnash/backend/render_handler_ogl.cpp?cvsroot=gnash&r1=1.59&r2=1.60
http://cvs.savannah.gnu.org/viewcvs/gnash/libbase/image.cpp?cvsroot=gnash&r1=1.16&r2=1.17
http://cvs.savannah.gnu.org/viewcvs/gnash/libbase/image.h?cvsroot=gnash&r1=1.10&r2=1.11
http://cvs.savannah.gnu.org/viewcvs/gnash/server/Makefile.am?cvsroot=gnash&r1=1.94&r2=1.95
http://cvs.savannah.gnu.org/viewcvs/gnash/server/gnash.h?cvsroot=gnash&r1=1.78&r2=1.79
http://cvs.savannah.gnu.org/viewcvs/gnash/server/render.cpp?cvsroot=gnash&r1=1.11&r2=1.12
http://cvs.savannah.gnu.org/viewcvs/gnash/server/render.h?cvsroot=gnash&r1=1.11&r2=1.12
http://cvs.savannah.gnu.org/viewcvs/gnash/server/video_stream_instance.cpp?cvsroot=gnash&r1=1.4&r2=1.5
http://cvs.savannah.gnu.org/viewcvs/gnash/server/asobj/Makefile.am?cvsroot=gnash&r1=1.25&r2=1.26
http://cvs.savannah.gnu.org/viewcvs/gnash/server/asobj/NetStream.cpp?cvsroot=gnash&r1=1.18&r2=1.19
http://cvs.savannah.gnu.org/viewcvs/gnash/server/asobj/NetStream.h?cvsroot=gnash&r1=1.14&r2=1.15
http://cvs.savannah.gnu.org/viewcvs/gnash/server/asobj/NetStreamFfmpeg.cpp?cvsroot=gnash&rev=1.1
http://cvs.savannah.gnu.org/viewcvs/gnash/server/asobj/NetStreamFfmpeg.h?cvsroot=gnash&rev=1.1

Patches:
Index: ChangeLog
===================================================================
RCS file: /sources/gnash/gnash/ChangeLog,v
retrieving revision 1.1853
retrieving revision 1.1854
diff -u -b -r1.1853 -r1.1854
--- ChangeLog   5 Dec 2006 10:39:54 -0000       1.1853
+++ ChangeLog   5 Dec 2006 14:26:09 -0000       1.1854
@@ -1,3 +1,22 @@
+2006-12-05 Tomas Groth Christensen <address@hidden>
+
+       * backend/render_handler.h, backend/render_handler_agg.cpp,
+         backend/render_handler_cairo.cpp, server/render.h,
+         backend/render_handler_ogl.cpp, server/render.cpp: Removed 
+         YUV_video, and added videoFrameFormat() and drawVideoFrame().
+       * libbase/image.cpp, libbase/image.h: Added yuv.
+       * server/Makefile.am: Removed video_yuv.cpp.
+       * server/gnash.h: Removed YUV_video.
+       * server/video_stream_instance.cpp: Changed to use the new
+         video frame rendering method.
+       * server/asobj/Makefile.am: Added NetStreamFfmpeg.{cpp|h}.
+       * server/asobj/NetStream.{cpp|h}: Removed the ffmpeg code
+         and made a non-decoding class (NetStreamBase).
+       * server/asobj/NetStreamFfmpeg.{cpp|h}: Created the files,
+         which contain the ffmpeg-specific code moved from
+         NetStream.{cpp|h}. Now uses StreamProvider to handle files
+         and connects in a separate thread to avoid blocking.
+
 2006-12-05 Sandro Santilli <address@hidden>
 
        * server/movie_root.h: Added more information for the

Index: backend/render_handler.h
===================================================================
RCS file: /sources/gnash/gnash/backend/render_handler.h,v
retrieving revision 1.24
retrieving revision 1.25
diff -u -b -r1.24 -r1.25
--- backend/render_handler.h    2 Dec 2006 21:03:50 -0000       1.24
+++ backend/render_handler.h    5 Dec 2006 14:26:09 -0000       1.25
@@ -17,7 +17,7 @@
 // 
 //
 
-/* $Id: render_handler.h,v 1.24 2006/12/02 21:03:50 strk Exp $ */
+/* $Id: render_handler.h,v 1.25 2006/12/05 14:26:09 tgc Exp $ */
 
 #ifndef RENDER_HANDLER_H
 #define RENDER_HANDLER_H
@@ -245,8 +245,19 @@
        /// Delete the given bitmap info class.
        virtual void    delete_bitmap_info(bitmap_info* bi) = 0;
 
-       virtual YUV_video* create_YUV_video(int width, int height) = 0; 
-       virtual void delete_YUV_video(YUV_video* yuv) = 0;
+       /// The different video frame formats
+       enum video_frame_format
+       {
+               NONE,
+               YUV,
+               RGB
+       };
+
+       /// Returns the format the current renderer wants videoframes in.
+       virtual int videoFrameFormat() = 0;
+       
+       /// Draws the video frames
+       virtual void drawVideoFrame(image::image_base* frame, const matrix* mat, const rect* bounds) = 0;
 
        /// Sets the update region (called prior to begin_display). The renderer 
        /// might do clipping and leave the region outside these bounds unchanged,

Index: backend/render_handler_agg.cpp
===================================================================
RCS file: /sources/gnash/gnash/backend/render_handler_agg.cpp,v
retrieving revision 1.51
retrieving revision 1.52
diff -u -b -r1.51 -r1.52
--- backend/render_handler_agg.cpp      2 Dec 2006 23:13:38 -0000       1.51
+++ backend/render_handler_agg.cpp      5 Dec 2006 14:26:09 -0000       1.52
@@ -16,7 +16,7 @@
 
  
 
-/* $Id: render_handler_agg.cpp,v 1.51 2006/12/02 23:13:38 strk Exp $ */
+/* $Id: render_handler_agg.cpp,v 1.52 2006/12/05 14:26:09 tgc Exp $ */
 
 // Original version by Udo Giacomozzi and Hannes Mayr, 
 // INDUNET GmbH (www.indunet.it)
@@ -204,26 +204,6 @@
 };
 
 
-
-// --- YUV VIDEO ---------------------------------------------------------------
-// Currently not implemented.
-
-class agg_YUV_video : public gnash::YUV_video
-{
-public:
-
-  agg_YUV_video(int width, int height): YUV_video(width, height)
-  {
-    log_msg("warning: YUV_video not supported by AGG renderer");
-  }
-  
-  ~agg_YUV_video() {
-  }
-  
-}; // class agg_YUV_video
-
-
-
 // --- ALPHA MASK BUFFER CONTAINER ---------------------------------------
 // How masks are implemented: A mask is basically a full alpha buffer. Each 
 // pixel in the alpha buffer defines the fraction of color values that are
@@ -426,16 +406,15 @@
        }
        
        
-       gnash::YUV_video*       create_YUV_video(int w, int h)
-       {         
-         return new agg_YUV_video(w, h);
+       // Returns the format the current renderer wants videoframes in.
+       int videoFrameFormat() {
+               return RGB;
   }
   
-  void delete_YUV_video(gnash::YUV_video* yuv)
-       {
-         // don't need to pointer != null before deletion
-         //if (yuv)
-             delete yuv;
+       /// Draws the video frames
+       void drawVideoFrame(image::image_base* baseframe, const matrix* mat, const rect* bounds){
+               image::rgb* frame = static_cast<image::rgb*>(baseframe);
+               //TODO: implement!
        }
 
 

Index: backend/render_handler_cairo.cpp
===================================================================
RCS file: /sources/gnash/gnash/backend/render_handler_cairo.cpp,v
retrieving revision 1.17
retrieving revision 1.18
diff -u -b -r1.17 -r1.18
--- backend/render_handler_cairo.cpp    9 Nov 2006 08:30:56 -0000       1.17
+++ backend/render_handler_cairo.cpp    5 Dec 2006 14:26:09 -0000       1.18
@@ -50,22 +50,6 @@
     }
 };
 
-// --- YUV VIDEO ---------------------------------------------------------------
-// Currently not implemented.
-
-class cairo_YUV_video : public gnash::YUV_video
-{
-public:
-
-    cairo_YUV_video(int width, int height): YUV_video(width, height)
-    {
-       log_msg("warning: YUV_video not supported by Cairo renderer");
-    }
-
-    ~cairo_YUV_video() {
-    }
-  
-}; // class agg_YUV_video
 
 class render_handler_cairo : public gnash::triangulating_render_handler
 {
@@ -635,15 +619,16 @@
            }
        }
        
-    gnash::YUV_video*  create_YUV_video(int w, int h)
-       {         
-           return new cairo_YUV_video(w, h);
+       // Returns the format the current renderer wants videoframes in.
+       int videoFrameFormat() {
+               return RGB;
        }
   
-    void       delete_YUV_video(gnash::YUV_video* yuv)
-       {
-           if (yuv) delete yuv;
+       /// Draws the video frames
+       void drawVideoFrame(image::image_base* frame, const matrix* mat, const rect* bounds){
+       //TODO: implement!
        }
+
 };     // end class render_handler_cairo
 
 

Index: backend/render_handler_ogl.cpp
===================================================================
RCS file: /sources/gnash/gnash/backend/render_handler_ogl.cpp,v
retrieving revision 1.59
retrieving revision 1.60
diff -u -b -r1.59 -r1.60
--- backend/render_handler_ogl.cpp      1 Dec 2006 10:21:26 -0000       1.59
+++ backend/render_handler_ogl.cpp      5 Dec 2006 14:26:09 -0000       1.60
@@ -6,7 +6,7 @@
 // A render_handler that uses SDL & OpenGL
 
 
-/* $Id: render_handler_ogl.cpp,v 1.59 2006/12/01 10:21:26 alexeev Exp $ */
+/* $Id: render_handler_ogl.cpp,v 1.60 2006/12/05 14:26:09 tgc Exp $ */
 
 //#include "gnash.h"
 #include "render_handler.h"
@@ -58,89 +58,9 @@
 
        virtual void layout_image(image::image_base* im);
 };
-// YUV_video_ogl declaration
 
-// TODO: Implement this usiging glMatrix*().
-
-static GLfloat yuv2rgb[2][4] = {{0.500000f, 0.413650f, 0.944700f, 0.f}, {0.851850f, 0.320550f, 0.500000f, 1.f}};
-static GLfloat yuv2rgbmatrix[16] = {
-       1, 1, 1, 0,
-       0, -0.344136, 1.773, 0,
-       1.402, -0.714136, 0, 0,
-       0, 0, 0, 0
-};
 static GLint iquad[] = {-1, 1, 1, 1, 1, -1, -1, -1};
 
-class YUV_video_ogl : public gnash::YUV_video
-{
-       public:
-
-               YUV_video_ogl(int width, int height): YUV_video(width, height)
-               {
-               };
-
-               ~YUV_video_ogl()
-               {
-               }
-
-               void display(const matrix* mat, const rect* bounds)
-               {
-                       glPushAttrib(GL_ENABLE_BIT | GL_COLOR_BUFFER_BIT);
-
-                       static GLfloat yuv_rgb[16] = {
-                               1, 1, 1, 0,
-                               0, -0.3946517043589703515f, 2.032110091743119266f, 0,
-                               1.139837398373983740f, -0.5805986066674976801f, 0, 0,
-                               0, 0, 0, 1
-                       };
-
-                       glMatrixMode(GL_COLOR);
-                       glPushMatrix();
-                       glLoadMatrixf(yuv_rgb);
-               glPixelTransferf(GL_GREEN_BIAS, -0.5f);
-                       glPixelTransferf(GL_BLUE_BIAS, -0.5f);
-
-                       m = mat;
-                       m_bounds = bounds;
-               
-                       gnash::point a, b, c, d;
-                       m->transform(&a, gnash::point(m_bounds->get_x_min(), m_bounds->get_y_min()));
-                       m->transform(&b, gnash::point(m_bounds->get_x_max(), m_bounds->get_y_min()));
-                       m->transform(&c, gnash::point(m_bounds->get_x_min(), m_bounds->get_y_max()));
-                       d.m_x = b.m_x + c.m_x - a.m_x;
-                       d.m_y = b.m_y + c.m_y - a.m_y;
-
-                       float w_bounds = TWIPS_TO_PIXELS(b.m_x - a.m_x);
-                       float h_bounds = TWIPS_TO_PIXELS(c.m_y - a.m_y);
-                       GLenum rgb[3] = {GL_RED, GL_GREEN, GL_BLUE}; 
-
-                       unsigned char*   ptr = m_data;
-                       float xpos = a.m_x < 0 ? 0.0f : a.m_x;  //hack
-                       float ypos = a.m_y < 0 ? 0.0f : a.m_y;  //hack
-                       glRasterPos2f(xpos, ypos);      //hack
-                       for (int i = 0; i < 3; ++i)
-                       {
-                               float zx = w_bounds / (float) planes[i].w;
-                               float zy = h_bounds / (float) planes[i].h;
-                               glPixelZoom(zx, - zy);  // flip & zoom image
-
-                               if (i > 0)
-                               {
-                                       glEnable(GL_BLEND);
-                                       glBlendFunc(GL_ONE, GL_ONE);
-                               }
-
-                               glDrawPixels(planes[i].w, planes[i].h, rgb[i], GL_UNSIGNED_BYTE, ptr);
-                               ptr += planes[i].size;
-                       }
-
-                       glMatrixMode(GL_COLOR);
-                       glPopMatrix();
-
-                       glPopAttrib();
-               }
-};
-
 class render_handler_ogl : public gnash::triangulating_render_handler
 {
 public:
@@ -423,24 +343,75 @@
            delete bi;
        }
     
-       // Ctor stub.
-       render_handler_ogl()
-       {
+       // Returns the format the current renderer wants videoframes in.
+       int videoFrameFormat() {
+               return YUV;
        }
 
-       // Dtor stub.
-       ~render_handler_ogl()
+       /// Draws the video frames
+       void drawVideoFrame(image::image_base* baseframe, const matrix* m, const rect* bounds){
+               image::yuv* frame = static_cast<image::yuv*>(baseframe);
+               glPushAttrib(GL_ENABLE_BIT | GL_COLOR_BUFFER_BIT);
+
+               static GLfloat yuv_rgb[16] = {
+                       1, 1, 1, 0,
+                       0, -0.3946517043589703515f, 2.032110091743119266f, 0,
+                       1.139837398373983740f, -0.5805986066674976801f, 0, 0,
+                       0, 0, 0, 1
+               };
+
+               glMatrixMode(GL_COLOR);
+               glPushMatrix();
+               glLoadMatrixf(yuv_rgb);
+               glPixelTransferf(GL_GREEN_BIAS, -0.5f);
+               glPixelTransferf(GL_BLUE_BIAS, -0.5f);
+
+               gnash::point a, b, c, d;
+               m->transform(&a, gnash::point(bounds->get_x_min(), bounds->get_y_min()));
+               m->transform(&b, gnash::point(bounds->get_x_max(), bounds->get_y_min()));
+               m->transform(&c, gnash::point(bounds->get_x_min(), bounds->get_y_max()));
+               d.m_x = b.m_x + c.m_x - a.m_x;
+               d.m_y = b.m_y + c.m_y - a.m_y;
+
+               float w_bounds = TWIPS_TO_PIXELS(b.m_x - a.m_x);
+               float h_bounds = TWIPS_TO_PIXELS(c.m_y - a.m_y);
+               GLenum rgb[3] = {GL_RED, GL_GREEN, GL_BLUE}; 
+
+               unsigned char*   ptr = frame->m_data;
+               float xpos = a.m_x < 0 ? 0.0f : a.m_x;  //hack
+               float ypos = a.m_y < 0 ? 0.0f : a.m_y;  //hack
+               glRasterPos2f(xpos, ypos);      //hack
+               for (int i = 0; i < 3; ++i)
        {
+                       float zx = w_bounds / (float) frame->planes[i].w;
+                       float zy = h_bounds / (float) frame->planes[i].h;
+                       glPixelZoom(zx, - zy);  // flip & zoom image
+
+                       if (i > 0)
+                       {
+                               glEnable(GL_BLEND);
+                               glBlendFunc(GL_ONE, GL_ONE);
        }
 
-       gnash::YUV_video*       create_YUV_video(int w, int h)
+                       glDrawPixels(frame->planes[i].w, frame->planes[i].h, rgb[i], GL_UNSIGNED_BYTE, ptr);
+                       ptr += frame->planes[i].size;
+               }
+
+               glMatrixMode(GL_COLOR);
+               glPopMatrix();
+
+               glPopAttrib();
+       
+       }
+   
+       // Ctor stub.
+       render_handler_ogl()
        {
-               return new YUV_video_ogl(w, h);
        }
 
-       void    delete_YUV_video(gnash::YUV_video* yuv)
+       // Dtor stub.
+       ~render_handler_ogl()
        {
-           if (yuv) delete yuv;
        }
 
     void       begin_display(

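For reference, the yuv_rgb matrix that drawVideoFrame() loads into GL_COLOR,
combined with the -0.5 GL_GREEN_BIAS/GL_BLUE_BIAS, encodes the classic analog
YUV->RGB relations: R = Y + 1.140*V, G = Y - 0.395*U - 0.581*V, B = Y + 2.032*U
(with U and V re-centred around zero). A standalone CPU-side sketch of the
same per-pixel arithmetic (illustrative only, not part of this commit; all
values normalized to [0,1]):

    #include <algorithm>

    struct rgbf { float r, g, b; };

    rgbf yuv_to_rgb(float y, float u, float v)
    {
        u -= 0.5f;  // what GL_GREEN_BIAS does in the GL path
        v -= 0.5f;  // what GL_BLUE_BIAS does in the GL path
        rgbf out;
        out.r = y + 1.139837f * v;
        out.g = y - 0.394652f * u - 0.580599f * v;
        out.b = y + 2.032110f * u;
        // clamp to the displayable range
        out.r = std::min(1.0f, std::max(0.0f, out.r));
        out.g = std::min(1.0f, std::max(0.0f, out.g));
        out.b = std::min(1.0f, std::max(0.0f, out.b));
        return out;
    }
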
Index: libbase/image.cpp
===================================================================
RCS file: /sources/gnash/gnash/libbase/image.cpp,v
retrieving revision 1.16
retrieving revision 1.17
diff -u -b -r1.16 -r1.17
--- libbase/image.cpp   28 Aug 2006 11:07:14 -0000      1.16
+++ libbase/image.cpp   5 Dec 2006 14:26:09 -0000       1.17
@@ -29,6 +29,10 @@
        {
        }
 
+       void image_base::update(uint8_t* data)
+       {
+               memcpy(m_data, data, m_pitch * m_height);
+       }
 
        uint8_t*        scanline(image_base* surf, int y)
        {
@@ -213,6 +217,78 @@
                return h;
        }
 
+       //
+       // yuv
+       //
+       yuv::yuv(int w, int h) :
+               image_base(0, w, h, w, YUV)
+
+       {
+               planes[Y].w = m_width;
+               planes[Y].h = m_height;
+               planes[Y].size = m_width * m_height;
+               planes[Y].offset = 0;
+
+               planes[U] = planes[Y];
+               planes[U].w >>= 1;
+               planes[U].h >>= 1;
+               planes[U].size >>= 2;
+               planes[U].offset = planes[Y].size;
+
+               planes[V] = planes[U];
+               planes[V].offset += planes[U].size;
+
+               m_size = planes[Y].size + (planes[U].size << 1);
+
+               for (int i = 0; i < 3; ++i)
+               {
+                       planes[i].id = 0;       //texids[i];
+
+                       unsigned int ww = planes[i].w;
+                       unsigned int hh = planes[i].h;
+                       planes[i].unit = 0; // i[units];
+                       planes[i].p2w = (ww & (ww - 1)) ? video_nlpo2(ww) : ww;
+                       planes[i].p2h = (hh & (hh - 1)) ? video_nlpo2(hh) : hh;
+                       float tw = (double) ww / planes[i].p2w;
+                       float th = (double) hh / planes[i].p2h;
+
+                       planes[i].coords[0][0] = 0.0;
+                       planes[i].coords[0][1] = 0.0;
+                       planes[i].coords[1][0] = tw;
+                       planes[i].coords[1][1] = 0.0;
+                       planes[i].coords[2][0] = tw; 
+                       planes[i].coords[2][1] = th;
+                       planes[i].coords[3][0] = 0.0;
+                       planes[i].coords[3][1] = th;
+               }
+
+               m_data = new uint8_t[m_size];
+
+       //              m_bounds->m_x_min = 0.0f;
+       //              m_bounds->m_x_max = 1.0f;
+       //              m_bounds->m_y_min = 0.0f;
+       //              m_bounds->m_y_max = 1.0f;
+       }       
+
+       unsigned int yuv::video_nlpo2(unsigned int x) const
+       {
+               x |= (x >> 1);
+               x |= (x >> 2);
+               x |= (x >> 4);
+               x |= (x >> 8);
+               x |= (x >> 16);
+               return x + 1;
+       }
+
+       int yuv::size() const
+       {
+               return m_size;
+       }
+
+       void yuv::update(uint8_t* data)
+       {
+               memcpy(m_data, data, m_size);
+       }
 
        //
        // utility

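To make the plane bookkeeping above concrete: a yuv image holds one
full-resolution Y plane plus quarter-size U and V planes (4:2:0), i.e.
1.5 bytes per pixel overall, and video_nlpo2() rounds dimensions up to the
next power of two for texture-friendly sizes (skipped when the size already
is one). A standalone check of that arithmetic for a hypothetical 320x240
clip (illustrative only, not part of this commit):

    #include <cassert>

    // next larger power of two, as in yuv::video_nlpo2()
    unsigned int nlpo2(unsigned int x)
    {
        x |= (x >> 1); x |= (x >> 2); x |= (x >> 4);
        x |= (x >> 8); x |= (x >> 16);
        return x + 1;
    }

    int main()
    {
        const int w = 320, h = 240;
        const int y_size = w * h;          // 76800 bytes
        const int uv_size = y_size >> 2;   // 19200 bytes each for U and V
        assert(y_size + 2 * uv_size == w * h * 3 / 2);  // 115200 in total
        assert(nlpo2(320) == 512);         // 320 is not a power of two
        return 0;
    }
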
Index: libbase/image.h
===================================================================
RCS file: /sources/gnash/gnash/libbase/image.h,v
retrieving revision 1.10
retrieving revision 1.11
diff -u -b -r1.10 -r1.11
--- libbase/image.h     14 Sep 2006 23:54:22 -0000      1.10
+++ libbase/image.h     5 Dec 2006 14:26:09 -0000       1.11
@@ -29,7 +29,8 @@
                        RGB,
                        RGBA,
                        ALPHA,
-                       ROW
+                       ROW,
+                       YUV
                };
 
                id_image m_type;
@@ -40,6 +41,7 @@
                int     m_pitch;        // byte offset from one row to the next
 
                image_base(uint8_t* data, int width, int height, int pitch, id_image type);
+               void update(uint8_t* data);
        };
 
        /// 24-bit RGB image.  Packed data, red byte first (RGBRGB...)
@@ -76,7 +78,29 @@
                unsigned int    compute_hash() const;
        };
 
+class DSOEXPORT yuv : public image_base
+{
 
+public:
+
+       enum {Y, U, V, T, NB_TEXS};
+
+       yuv(int w, int h);
+       ~yuv();
+       void update(uint8_t* data);
+       unsigned int video_nlpo2(unsigned int x) const;
+       int size() const;
+
+       struct plane {
+               unsigned int w, h, p2w, p2h, offset, size;
+               int unit;
+               int id;
+               float coords[4][2];
+       } planes[4];
+
+       int m_size;
+
+};
        /// Make a system-memory 24-bit bitmap surface.  24-bit packed
        /// data, red byte first.
        DSOEXPORT rgb*  create_rgb(int width, int height);

Index: server/Makefile.am
===================================================================
RCS file: /sources/gnash/gnash/server/Makefile.am,v
retrieving revision 1.94
retrieving revision 1.95
diff -u -b -r1.94 -r1.95
--- server/Makefile.am  1 Dec 2006 16:35:38 -0000       1.94
+++ server/Makefile.am  5 Dec 2006 14:26:09 -0000       1.95
@@ -18,7 +18,7 @@
 # 
 #
 
-# $Id: Makefile.am,v 1.94 2006/12/01 16:35:38 strk Exp $
+# $Id: Makefile.am,v 1.95 2006/12/05 14:26:09 tgc Exp $
 
 AUTOMAKE_OPTIONS = 
 
@@ -65,7 +65,6 @@
        swf/tag_loaders.cpp     \
        swf_function.cpp        \
        video_stream_instance.cpp \
-       video_yuv.cpp   \
         StreamProvider.cpp \
         array.cpp \
         button_character_instance.cpp \

Index: server/gnash.h
===================================================================
RCS file: /sources/gnash/gnash/server/gnash.h,v
retrieving revision 1.78
retrieving revision 1.79
diff -u -b -r1.78 -r1.79
--- server/gnash.h      27 Nov 2006 15:57:51 -0000      1.78
+++ server/gnash.h      5 Dec 2006 14:26:09 -0000       1.79
@@ -17,7 +17,7 @@
 // 
 //
 
-/* $Id: gnash.h,v 1.78 2006/11/27 15:57:51 strk Exp $ */
+/* $Id: gnash.h,v 1.79 2006/12/05 14:26:09 tgc Exp $ */
 
 /// \mainpage
 ///
@@ -606,39 +606,6 @@
                }
 };
 
-class DSOEXPORT YUV_video : public ref_counted
-{
-
-public:
-
-       enum {Y, U, V, T, NB_TEXS};
-
-       YUV_video(int w, int h);
-       ~YUV_video();
-       unsigned int video_nlpo2(unsigned int x) const;
-       void update(uint8_t* data);
-       virtual void display(const matrix* m, const rect* bounds);
-       int size() const;
-
-protected:
-
-       uint8_t* m_data;
-       int m_width;
-       int m_height;
-       int m_size;
-
-       struct plane {
-               unsigned int w, h, p2w, p2h, offset, size;
-               int unit;
-               int id;
-               float coords[4][2];
-       } planes[4];    
-
-       const matrix* m;
-       const rect* m_bounds;
-
-};
-       
 /// Keyboard handling
 namespace key {
 enum code

Index: server/render.cpp
===================================================================
RCS file: /sources/gnash/gnash/server/render.cpp,v
retrieving revision 1.11
retrieving revision 1.12
diff -u -b -r1.11 -r1.12
--- server/render.cpp   25 Nov 2006 16:20:45 -0000      1.11
+++ server/render.cpp   5 Dec 2006 14:26:09 -0000       1.12
@@ -48,12 +48,6 @@
                        bogus_bi() {}
                };
 
-               class bogus_yuv : public YUV_video
-               {
-               public:
-                       bogus_yuv(): YUV_video(0, 0) { assert(0); }
-               };
-
                bitmap_info*    create_bitmap_info_alpha(int w, int h, unsigned char* data)
                {
 #ifdef DEBUG_RENDER_CALLS
@@ -92,15 +86,15 @@
                        if (s_render_handler) s_render_handler->delete_bitmap_info(bi);
                }
 
-               YUV_video*      create_YUV_video(int width, int height)
-               {
-                       if (s_render_handler) return s_render_handler->create_YUV_video(width, height);
-                       else return new bogus_yuv;
+               // Returns the format the current renderer wants videoframes in.
+               int videoFrameFormat() {
+                       if (s_render_handler) return s_render_handler->videoFrameFormat();
+                       else return NONE;
                }
 
-               void    delete_YUV_video(YUV_video* yuv)
-               {
-                       if (s_render_handler) s_render_handler->delete_YUV_video(yuv);
+               /// Draws the video frames
+               void drawVideoFrame(image::image_base* frame, const matrix* mat, const rect* bounds){
+                       if (s_render_handler) return s_render_handler->drawVideoFrame(frame, mat, bounds);
                }
 
 

Index: server/render.h
===================================================================
RCS file: /sources/gnash/gnash/server/render.h,v
retrieving revision 1.11
retrieving revision 1.12
diff -u -b -r1.11 -r1.12
--- server/render.h     1 Dec 2006 15:52:17 -0000       1.11
+++ server/render.h     5 Dec 2006 14:26:09 -0000       1.12
@@ -55,8 +55,19 @@
                /// Delete the given bitmap info struct.
                void    delete_bitmap_info(bitmap_info* bi);
 
-               YUV_video*      create_YUV_video(int width, int height);
-               void    delete_YUV_video(YUV_video* yuv);
+               /// The different video frame formats
+               enum video_frame_format
+               {
+                       NONE,
+                       YUV,
+                       RGB
+               };
+
+               /// Returns the format the current renderer wants videoframes in.
+               int videoFrameFormat();
+
+               /// Draws the video frames
+               void drawVideoFrame(image::image_base* frame, const matrix* mat, const rect* bounds);
 
                /// \brief
                /// Bracket the displaying of a frame from a movie.

Index: server/video_stream_instance.cpp
===================================================================
RCS file: /sources/gnash/gnash/server/video_stream_instance.cpp,v
retrieving revision 1.4
retrieving revision 1.5
diff -u -b -r1.4 -r1.5
--- server/video_stream_instance.cpp    20 Nov 2006 15:06:53 -0000      1.4
+++ server/video_stream_instance.cpp    5 Dec 2006 14:26:09 -0000       1.5
@@ -56,22 +56,22 @@
 
                if (nso->obj.playing())
                {
-                       YUV_video* v = nso->obj.get_video();
-                       if (v)
+                       image::image_base* i = nso->obj.get_video();
+                       if (i)
                        {
-                               v->display(&m, &bounds);
+                               gnash::render::drawVideoFrame(i, &m, &bounds);
                        }
                }
        }
 }
 
 void
-video_stream_instance::advance(float delta_time)
+video_stream_instance::advance(float /*delta_time*/)
 {
 }
 
 void
-video_stream_instance::get_invalidated_bounds(rect* bounds, bool force)
+video_stream_instance::get_invalidated_bounds(rect* bounds, bool /*force*/)
 {
        bounds->expand_to_point(-1e10f, -1e10f);
        bounds->expand_to_point(1e10f, 1e10f);

Index: server/asobj/Makefile.am
===================================================================
RCS file: /sources/gnash/gnash/server/asobj/Makefile.am,v
retrieving revision 1.25
retrieving revision 1.26
diff -u -b -r1.25 -r1.26
--- server/asobj/Makefile.am    1 Dec 2006 16:35:38 -0000       1.25
+++ server/asobj/Makefile.am    5 Dec 2006 14:26:10 -0000       1.26
@@ -18,7 +18,7 @@
 # 
 #
 
-# $Id: Makefile.am,v 1.25 2006/12/01 16:35:38 strk Exp $
+# $Id: Makefile.am,v 1.26 2006/12/05 14:26:10 tgc Exp $
 
 AUTOMAKE_OPTIONS = 
 
@@ -59,6 +59,7 @@
        Mouse.cpp       \
        NetConnection.cpp\
        NetStream.cpp   \
+       NetStreamFfmpeg.cpp \
        Number.cpp      \
        Object.cpp      \
        Selection.cpp   \
@@ -94,6 +95,7 @@
        MovieClipLoader.h \
        NetConnection.h \
        NetStream.h     \
+       NetStreamFfmpeg.h \
        Number.h        \
        Object.h        \
        Selection.h     \

Index: server/asobj/NetStream.cpp
===================================================================
RCS file: /sources/gnash/gnash/server/asobj/NetStream.cpp,v
retrieving revision 1.18
retrieving revision 1.19
diff -u -b -r1.18 -r1.19
--- server/asobj/NetStream.cpp  24 Nov 2006 10:38:26 -0000      1.18
+++ server/asobj/NetStream.cpp  5 Dec 2006 14:26:10 -0000       1.19
@@ -18,7 +18,7 @@
 //
 //
 
-/* $Id: NetStream.cpp,v 1.18 2006/11/24 10:38:26 alexeev Exp $ */
+/* $Id: NetStream.cpp,v 1.19 2006/12/05 14:26:10 tgc Exp $ */
 
 #ifdef HAVE_CONFIG_H
 #include "config.h"
@@ -28,8 +28,6 @@
 #include "NetStream.h"
 #include "fn_call.h"
 
-#include "StreamProvider.h"    
-#include "render.h"    
 #include "movie_root.h"
 
 #if defined(_WIN32) || defined(WIN32)
@@ -41,503 +39,6 @@
 
 namespace gnash {
  
-namespace globals { // gnash::globals
-
-       /// global StreamProvider
-       extern StreamProvider streamProvider;
-
-} // namespace gnash::global
-
-
-NetStream::NetStream():
-       m_video_index(-1),
-       m_audio_index(-1),
-#ifdef USE_FFMPEG
-       m_VCodecCtx(NULL),
-       m_ACodecCtx(NULL),
-       m_FormatCtx(NULL),
-       m_Frame(NULL),
-#endif
-       m_go(false),
-       m_yuv(NULL),
-       m_video_clock(0),
-       m_pause(false),
-       m_unqueued_data(NULL)
-{
-}
-
-NetStream::~NetStream()
-{
-       close();
-}
-
-// called from avstreamer thread
-void NetStream::set_status(const char* code)
-{
-       if (m_netstream_object)
-       {
-//             m_netstream_object->set_member("onStatus_Code", code);
-//             push_video_event(m_netstream_object);
-       }
-}
-
-void NetStream::pause(int mode)
-{
-       if (mode == -1)
-       {
-               m_pause = ! m_pause;
-       }
-       else
-       {
-               m_pause = (mode == 0) ? true : false;
-       }
-}
-
-void NetStream::close()
-{
-       if (m_go)
-       {
-               // terminate thread
-               m_go = false;
-
-               // wait till thread is complete before main continues
-               pthread_join(m_thread, NULL);
-       }
-
-       sound_handler* s = get_sound_handler();
-       if (s)
-       {
-               s->detach_aux_streamer((void*) this);
-       }
-
-#ifdef USE_FFMPEG
-
-       if (m_Frame) av_free(m_Frame);
-       m_Frame = NULL;
-
-       if (m_VCodecCtx) avcodec_close(m_VCodecCtx);
-       m_VCodecCtx = NULL;
-
-       if (m_ACodecCtx) avcodec_close(m_ACodecCtx);
-       m_ACodecCtx = NULL;
-
-       if (m_FormatCtx) av_close_input_file(m_FormatCtx);
-       m_FormatCtx = NULL;
-
-#endif
-
-       render::delete_YUV_video(m_yuv);
-       m_yuv = NULL;
-
-       while (m_qvideo.size() > 0)
-       {
-               delete m_qvideo.front();
-               m_qvideo.pop();
-       }
-
-       while (m_qaudio.size() > 0)
-       {
-               delete m_qaudio.front();
-               m_qaudio.pop();
-       }
-
-}
-
-int
-NetStream::play(const char* c_url)
-{
-
-/*     URL url(c_url);
-
-       tu_file* in = globals::streamProvider.getStream(url);
-       if (in == NULL)
-       {
-           log_error("failed to open '%s'; can't create movie.\n", c_url);
-           return;
-       }
-       else if (in->get_error())
-       {
-           log_error("streamProvider opener can't open '%s'\n", c_url);
-           return;
-       }
-*/
-
-       // This registers all available file formats and codecs 
-       // with the library so they will be used automatically when
-       // a file with the corresponding format/codec is opened
-
-#ifdef USE_FFMPEG
-
-       // Is it already playing ?
-       if (m_go)
-       {
-               return 0;
-       }
-
-       av_register_all();
-
-       // Open video file
-       // The last three parameters specify the file format, buffer size and format parameters;
-       // by simply specifying NULL or 0 we ask libavformat to auto-detect the format 
-       // and use a default buffer size
-
-       if (av_open_input_file(&m_FormatCtx, c_url, NULL, 0, NULL) != 0)
-       {
-         log_error("Couldn't open file '%s'", c_url);
-               set_status("NetStream.Play.StreamNotFound");
-               return -1;
-       }
-
-       // Next, we need to retrieve information about the streams contained in the file
-       // This fills the streams field of the AVFormatContext with valid information
-       if (av_find_stream_info(m_FormatCtx) < 0)
-       {
-    log_error("Couldn't find stream information from '%s'", c_url);
-               return -1;
-       }
-
-//     m_FormatCtx->pb.eof_reached = 0;
-//     av_read_play(m_FormatCtx);
-
-       // Find the first video & audio stream
-       m_video_index = -1;
-       m_audio_index = -1;
-       for (int i = 0; i < m_FormatCtx->nb_streams; i++)
-       {
-               AVCodecContext* enc = m_FormatCtx->streams[i]->codec; 
-
-               switch (enc->codec_type)
-               {
-                       case CODEC_TYPE_AUDIO:
-                               if (m_audio_index < 0)
-                               {
-                                       m_audio_index = i;
-                                       m_audio_stream = m_FormatCtx->streams[i];
-                               }
-                               break;
-
-                       case CODEC_TYPE_VIDEO:
-                               if (m_video_index < 0)
-                               {
-                                       m_video_index = i;
-                                       m_video_stream = m_FormatCtx->streams[i];
-                               }
-                               break;
-                       case CODEC_TYPE_DATA:
-                       case CODEC_TYPE_SUBTITLE:
-                       case CODEC_TYPE_UNKNOWN:
-                               break;
-    }
-       }
-
-       if (m_video_index < 0)
-       {
-               log_error("Didn't find a video stream from '%s'", c_url);
-               return -1;
-       }
-
-       // Get a pointer to the codec context for the video stream
-  m_VCodecCtx = m_FormatCtx->streams[m_video_index]->codec;
-
-       // Find the decoder for the video stream
-       AVCodec* pCodec = avcodec_find_decoder(m_VCodecCtx->codec_id);
-       if (pCodec == NULL)
-       {
-               m_VCodecCtx = NULL;
-               log_error("Decoder not found");
-               return -1;
-       }
-
-       // Open codec
-       if (avcodec_open(m_VCodecCtx, pCodec) < 0)
-       {
-               log_error("Could not open codec");
-       }
-
-       // Allocate a frame to store the decoded frame in
-       m_Frame = avcodec_alloc_frame();
-       
-       // Determine required buffer size and allocate buffer
-       m_yuv = render::create_YUV_video(m_VCodecCtx->width, m_VCodecCtx->height);
-
-       sound_handler* s = get_sound_handler();
-       if (m_audio_index >= 0 && s != NULL)
-       {
-               // Get a pointer to the audio codec context for the video stream
-               m_ACodecCtx = m_FormatCtx->streams[m_audio_index]->codec;
-    
-    // Find the decoder for the audio stream
-    AVCodec* pACodec = avcodec_find_decoder(m_ACodecCtx->codec_id);
-    if(pACodec == NULL)
-               {
-      log_error("No available AUDIO decoder to process MPEG file: '%s'", c_url);
-                       return -1;
-               }
-        
-    // Open codec
-    if (avcodec_open(m_ACodecCtx, pACodec) < 0)
-               {
-                       log_error("Could not open AUDIO codec");
-                       return -1;
-               }
-       
-               s->attach_aux_streamer(audio_streamer, (void*) this);
-
-       }
-
-       m_pause = false;
-
-       if (pthread_create(&m_thread, NULL, NetStream::av_streamer, this) != 0)
-       {
-               return -1;
-       };
-
-#else
-       log_error("FFMPEG is needed to play video");
-#endif
-
-       return 0;
-}
-
-// decoder thread
-void* NetStream::av_streamer(void* arg)
-{
-       NetStream* ns = static_cast<NetStream*>(arg);
-       ns->set_status("NetStream.Play.Start");
-
-       raw_videodata_t* video = NULL;
-
-       ns->m_video_clock = 0;
-
-       int delay = 0;
-       ns->m_start_clock = tu_timer::ticks_to_seconds(tu_timer::get_ticks());
-       ns->m_go = true;
-       ns->m_unqueued_data = NULL;
-       while (ns->m_go)
-       {
-               if (ns->m_pause)
-               {
-                       double t = tu_timer::ticks_to_seconds(tu_timer::get_ticks());
-                       usleep(100000);
-                       ns->m_start_clock += tu_timer::ticks_to_seconds(tu_timer::get_ticks()) - t;
-                       continue;
-               }
-
-               if (ns->read_frame() == false)
-               {
-                       if (ns->m_qvideo.size() == 0)
-                       {
-                               break;
-                       }
-               }
-
-               if (ns->m_qvideo.size() > 0)
-               {
-                       video = ns->m_qvideo.front();
-                       double clock = tu_timer::ticks_to_seconds(tu_timer::get_ticks()) - ns->m_start_clock;
-                       double video_clock = video->m_pts;
-
-                       if (clock >= video_clock)
-                       {
-                               ns->m_yuv->update(video->m_data);
-                               ns->m_qvideo.pop();
-                               delete video;
-                               delay = 0;
-                       }
-                       else
-                       {
-                               delay = int(video_clock - clock);
-                       }
-
-                       // Don't hog the CPU.
-                       // Queues have filled, video frame have shown
-                       // now it is possible and to have a rest
-                       if (ns->m_unqueued_data && delay > 0)
-                       {
-                               usleep(delay);
-                       }
-               }
-       }
-
-       ns->set_status("NetStream.Play.Stop");
-       return 0;
-}
-
-// audio callback is running in sound handler thread
-void NetStream::audio_streamer(void *owner, uint8 *stream, int len)
-{
-       NetStream* ns = static_cast<NetStream*>(owner);
-
-       while (len > 0 && ns->m_qaudio.size() > 0)
-       {
-               raw_videodata_t* samples = ns->m_qaudio.front();
-
-               int n = imin(samples->m_size, len);
-               memcpy(stream, samples->m_ptr, n);
-               stream += n;
-               samples->m_ptr += n;
-               samples->m_size -= n;
-               len -= n;
-
-               if (samples->m_size == 0)
-               {
-                       ns->m_qaudio.pop();
-                       delete samples;
-               }
-       }
-}
-
-bool NetStream::read_frame()
-{
-//     raw_videodata_t* ret = NULL;
-       if (m_unqueued_data)
-       {
-               if (m_unqueued_data->m_stream_index == m_audio_index)
-               {
-                       sound_handler* s = get_sound_handler();
-                       if (s)
-                       {
-                               m_unqueued_data = m_qaudio.push(m_unqueued_data) ? NULL : m_unqueued_data;
-                       }
-               }
-               else
-               if (m_unqueued_data->m_stream_index == m_video_index)
-               {
-                       m_unqueued_data = m_qvideo.push(m_unqueued_data) ? NULL : m_unqueued_data;
-               }
-               else
-               {
-                       printf("read_frame: not audio & video stream\n");
-               }
-
-               return true;
-       }
-
-#ifdef USE_FFMPEG
-
-       AVPacket packet;
-       int rc = av_read_frame(m_FormatCtx, &packet);
-       if (rc >= 0)
-       {
-               if (packet.stream_index == m_audio_index)
-               {
-                       sound_handler* s = get_sound_handler();
-                       if (s)
-                       {
-                               int frame_size;
-                               uint8_t* ptr = (uint8_t*) malloc((AVCODEC_MAX_AUDIO_FRAME_SIZE * 3) / 2);
-                               if (avcodec_decode_audio(m_ACodecCtx, (int16_t*) ptr, &frame_size, packet.data, packet.size) >= 0)
-                               {
-
-                                       int16_t*        adjusted_data = 0;
-                                       int n = 0;
-
-                                       bool stereo = m_ACodecCtx->channels > 1 ? true : false;
-                                       int samples = stereo ? frame_size >> 2 : frame_size >> 1;
-                                       s->convert_raw_data(&adjusted_data, &n, ptr, samples, 2, m_ACodecCtx->sample_rate, stereo);
-                           raw_videodata_t* raw = new raw_videodata_t;
-                                       raw->m_data = (uint8_t*) adjusted_data;
-                                       raw->m_ptr = raw->m_data;
-                                       raw->m_size = n;
-                                       raw->m_stream_index = m_audio_index;
-
-                                       m_unqueued_data = m_qaudio.push(raw) ? NULL : raw;
-                               }
-                               free(ptr);
-                       }
-               }
-               else
-               if (packet.stream_index == m_video_index)
-               {
-                       int got = 0;
-                 avcodec_decode_video(m_VCodecCtx, m_Frame, &got, packet.data, packet.size);
-                       if (got)
-                       {
-                               if (m_VCodecCtx->pix_fmt != PIX_FMT_YUV420P)
-                               {
-//                             img_convert((AVPicture*) pFrameYUV, PIX_FMT_YUV420P, (AVPicture*) pFrame, pCodecCtx->pix_fmt, pCodecCtx->width, pCodecCtx->height);
-                                       assert(0);      // TODO
-                               }
-
-                               raw_videodata_t* video = new raw_videodata_t;
-                               video->m_data = (uint8_t*) malloc(m_yuv->size());
-                               video->m_ptr = video->m_data;
-                               video->m_stream_index = m_video_index;
-
-                               // set presentation timestamp
-                               if (packet.dts != AV_NOPTS_VALUE)
-                               {
-                                       video->m_pts = as_double(m_video_stream->time_base) * packet.dts;
-                               }
-
-                               if (video->m_pts != 0)
-                               {       
-                                       // update video clock with pts, if present
-                                       m_video_clock = video->m_pts;
-                               }
-                               else
-                               {
-                                       video->m_pts = m_video_clock;
-                               }
-
-                               // update video clock for next frame
-                               double frame_delay = as_double(m_video_stream->codec->time_base);
-
-                               // for MPEG2, the frame can be repeated, so we update the clock accordingly
-                               frame_delay += m_Frame->repeat_pict * (frame_delay * 0.5);
-
-                               m_video_clock += frame_delay;
-
-                               int copied = 0;
-                               uint8_t* ptr = video->m_data;
-                               for (int i = 0; i < 3 ; i++)
-                               {
-                                       int shift = (i == 0 ? 0 : 1);
-                                       uint8_t* yuv_factor = m_Frame->data[i];
-                                       int h = m_VCodecCtx->height >> shift;
-                                       int w = m_VCodecCtx->width >> shift;
-                                       for (int j = 0; j < h; j++)
-                                       {
-                                               copied += w;
-                                               assert(copied <= m_yuv->size());
-                                               memcpy(ptr, yuv_factor, w);
-                                               yuv_factor += m_Frame->linesize[i];
-                                               ptr += w;
-                                       }
-                               }
-                               video->m_size = copied;
-                               m_unqueued_data = m_qvideo.push(video) ? NULL : video;
-                       }
-               }
-               av_free_packet(&packet);
-       }
-       else
-       {
-               return false;
-       }
-#endif
-
-       return true;
-}
-
-
-YUV_video* NetStream::get_video()
-{
-       return m_yuv;
-}
-
-void
-NetStream::seek()
-{
-    log_msg("%s:unimplemented \n", __FUNCTION__);
-}
-
-void
-NetStream::setBufferTime()
-{
-    log_msg("%s:unimplemented \n", __FUNCTION__);
-}
-
 void
 netstream_new(const fn_call& fn)
 {
@@ -550,6 +51,14 @@
     netstream_obj->set_member("setbuffertime", &netstream_setbuffertime);
 
     fn.result->set_as_object(netstream_obj);
+
+       if (fn.nargs > 0)
+       {
+               as_object* nc = static_cast<as_object*>(fn.arg(0).to_object());
+               assert(nc);
+               netstream_obj->obj.setNetCon(nc);
+       }
+
 }
 
 void netstream_close(const fn_call& fn)

Index: server/asobj/NetStream.h
===================================================================
RCS file: /sources/gnash/gnash/server/asobj/NetStream.h,v
retrieving revision 1.14
retrieving revision 1.15
diff -u -b -r1.14 -r1.15
--- server/asobj/NetStream.h    24 Nov 2006 10:38:26 -0000      1.14
+++ server/asobj/NetStream.h    5 Dec 2006 14:26:10 -0000       1.15
@@ -18,7 +18,7 @@
 //
 //
 
-/*  $Id: NetStream.h,v 1.14 2006/11/24 10:38:26 alexeev Exp $ */
+/*  $Id: NetStream.h,v 1.15 2006/12/05 14:26:10 tgc Exp $ */
 
 #ifndef __NETSTREAM_H__
 #define __NETSTREAM_H__
@@ -31,191 +31,44 @@
 #include <pthread.h>
 #include "impl.h"
 #include "video_stream_instance.h"
-#ifdef USE_FFMPEG
-#include <ffmpeg/avformat.h>
-#endif
-
-namespace gnash {
-  
-struct raw_videodata_t
-{
-       raw_videodata_t():
-       m_stream_index(-1),
-       m_size(0),
-       m_data(NULL),
-       m_ptr(NULL),
-       m_pts(0)
-       {
-       };
-
-       ~raw_videodata_t()
-       {
-               if (m_data)
-               {
-                       delete m_data;
-               }
-       };
-
-       int m_stream_index;
-       uint32_t m_size;
-       uint8_t* m_data;
-       uint8_t* m_ptr;
-       double m_pts;   // presentation timestamp in sec
-};
-
-template<class T>
-class multithread_queue
-{
-       public:
+#include "NetStreamFfmpeg.h"
 
-    multithread_queue()
-               {
-                       pthread_mutex_init(&m_mutex, NULL);
-               };
-
-    ~multithread_queue()
-               {
-                       lock();
-                       while (m_queue.size() > 0)
-                       {
-                               T x = m_queue.front();
-                               m_queue.pop();
-                               delete x;
-                       }
-                       unlock();
-                       
-                       pthread_mutex_destroy(&m_mutex);
-               }
-
-               size_t size()
-               {
-                       lock();
-                       size_t n = m_queue.size();
-                       unlock();
-                       return n;
-               }
-
-               bool push(T member)
-               {
-                       bool rc = false;
-                       lock();
-                       if (m_queue.size() < 20)        // hack
-                       {
-                               m_queue.push(member);
-                               rc = true;
-                       }
-                       unlock();
-                       return rc;
-               }
 
-               T front()
-               {
-                       lock();
-                       T member = NULL;
-                       if (m_queue.size() > 0)
-                       {
-                               member = m_queue.front();
-                       }
-                       unlock();
-                       return member;
-               }
-
-               void pop()
-               {
-                       lock();
-                       if (m_queue.size() > 0)
-                       {
-                               m_queue.pop();
-                       }
-                       unlock();
-               }
-
-       private:
-
-               inline void lock()
-               {
-                       pthread_mutex_lock(&m_mutex);
-               }
-
-               inline void unlock()
-               {
-                       pthread_mutex_unlock(&m_mutex);
-               }
-
-               pthread_mutex_t m_mutex;
-               std::queue < T > m_queue;
-};
+namespace gnash {
 
 class netstream_as_object;
 
-class NetStream {
+class NetStreamBase {
 public:
-       NetStream();
-       ~NetStream();
-       void close();
-       void pause(int mode);
-       int play(const char* source);
-       void seek();
-       void setBufferTime();
-       void set_status(const char* code);
+       NetStreamBase(){}
+       ~NetStreamBase(){}
+       void close(){}
+       void pause(int /*mode*/){}
+       int play(const char* /*source*/){ log_error("FFMPEG is needed to play video"); return 0; }
+       void seek(unsigned int /*pos*/){}
+       void setBufferTime(unsigned int /*pos*/){}
+       void set_status(const char* /*code*/){}
+       void setNetCon(as_object* /*nc*/) {}
+       image::image_base* get_video(){ return NULL; }
 
-       bool read_frame();
-       YUV_video* get_video();
-
-       inline bool playing()
+       inline void set_parent(netstream_as_object* /*ns*/)
        {
-        return m_go;
        }
 
-        inline void set_parent(netstream_as_object* ns)
+       inline bool playing()
         {
-                m_netstream_object = ns;
+               return false;
         }
 
-#ifdef USE_FFMPEG
-       inline double as_double(AVRational time)
-       {
-               return time.num / (double) time.den;
-       }
-#endif
-        static void* av_streamer(void* arg);
-        static void audio_streamer(void *udata, uint8 *stream, int len);
+};
 
-private:
-    bool _bufferLength;
-    bool _bufferTime;
-    bool _bytesLoaded;
-    bool _bytesTotal;
-    bool _currentFps;
-    bool _onStatus;
-    bool _time;
 #ifdef USE_FFMPEG
-               AVFormatContext *m_FormatCtx;
-
-               // video
-               AVCodecContext* m_VCodecCtx;
-               AVStream* m_video_stream;
-
-               // audio
-               AVCodecContext *m_ACodecCtx;
-               AVStream* m_audio_stream;
-
-               AVFrame* m_Frame;
+class NetStream : public NetStreamFfmpeg {
+#else
+class NetStream : public NetStreamBase {
 #endif
-               int m_video_index;
-               int m_audio_index;
-               volatile bool m_go;
-
-               YUV_video* m_yuv;
-               double m_video_clock;
-
-               pthread_t m_thread;
-               multithread_queue <raw_videodata_t*> m_qaudio;
-               multithread_queue <raw_videodata_t*> m_qvideo;
-               bool m_pause;
-               double m_start_clock;
-               netstream_as_object* m_netstream_object;
-               raw_videodata_t* m_unqueued_data;
+public:
+
 };
 
 class netstream_as_object : public as_object

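The new NetStreamFfmpeg (patch below) feeds libavformat from gnash's
StreamProvider through the readPacket()/seekMedia() callbacks and connects
from a separate thread so setup cannot block the player. For orientation, a
hedged standalone sketch of how such callbacks are typically registered with
the libavformat API of this era; init_put_byte(), av_open_input_stream() and
every name below are assumptions for illustration, not code from this commit:

    extern "C" {
    #include <ffmpeg/avformat.h>
    }
    #include <cstring>

    // Pulls up to buf_size bytes from the application's stream (cf. readPacket).
    static int my_read(void* /*opaque*/, uint8_t* /*buf*/, int /*buf_size*/)
    {
        return 0;  // stub: report end-of-stream immediately
    }

    // Repositions the application's stream (cf. seekMedia).
    static offset_t my_seek(void* /*opaque*/, offset_t offset, int /*whence*/)
    {
        return offset;  // stub
    }

    bool open_custom_stream(void* opaque, AVInputFormat* fmt,
                            AVFormatContext** ctx)
    {
        av_register_all();  // as in startPlayback() below

        // Both must outlive the opened stream; they are members in the patch.
        static ByteIOContext io;
        static unsigned char iobuf[4096];

        // write_flag is 0 and the write callback is NULL: read-only I/O.
        init_put_byte(&io, iobuf, sizeof iobuf, 0, opaque,
                      my_read, NULL, my_seek);

        AVFormatParameters ap;
        memset(&ap, 0, sizeof ap);

        return av_open_input_stream(ctx, &io, "", fmt, &ap) >= 0;
    }
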
Index: server/asobj/NetStreamFfmpeg.cpp
===================================================================
RCS file: server/asobj/NetStreamFfmpeg.cpp
diff -N server/asobj/NetStreamFfmpeg.cpp
--- /dev/null   1 Jan 1970 00:00:00 -0000
+++ server/asobj/NetStreamFfmpeg.cpp    5 Dec 2006 14:26:10 -0000       1.1
@@ -0,0 +1,658 @@
+// 
+//   Copyright (C) 2005, 2006 Free Software Foundation, Inc.
+// 
+// This program is free software; you can redistribute it and/or modify
+// it under the terms of the GNU General Public License as published by
+// the Free Software Foundation; either version 2 of the License, or
+// (at your option) any later version.
+// 
+// This program is distributed in the hope that it will be useful,
+// but WITHOUT ANY WARRANTY; without even the implied warranty of
+// MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+// GNU General Public License for more details.
+// You should have received a copy of the GNU General Public License
+// along with this program; if not, write to the Free Software
+// Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA  02110-1301  USA
+
+// 
+//
+//
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#ifdef USE_FFMPEG
+
+#include "log.h"
+#include "NetStreamFfmpeg.h"
+#include "fn_call.h"
+#include "NetStream.h"
+
+#include "render.h"    
+#include "movie_root.h"
+
+#include "URL.h"
+#include "tu_file.h"
+
+#if defined(_WIN32) || defined(WIN32)
+       #include <Windows.h>    // for sleep()
+       #define usleep(x) Sleep(x/1000)
+#else
+       #include "unistd.h" // for usleep()
+#endif
+
+
+
+namespace gnash {
+
+
+NetStreamFfmpeg::NetStreamFfmpeg():
+       m_video_index(-1),
+       m_audio_index(-1),
+
+       m_VCodecCtx(NULL),
+       m_ACodecCtx(NULL),
+       m_FormatCtx(NULL),
+       m_Frame(NULL),
+
+       m_go(false),
+       m_imageframe(NULL),
+       m_video_clock(0),
+       m_pause(false),
+       m_unqueued_data(NULL),
+       inputPos(0)
+{
+}
+
+NetStreamFfmpeg::~NetStreamFfmpeg()
+{
+       close();
+}
+
+// called from avstreamer thread
+void NetStreamFfmpeg::set_status(const char* /*code*/)
+{
+       if (m_netstream_object)
+       {
+               //m_netstream_object->set_member("onStatus_Code", code);
+               //push_video_event(m_netstream_object);
+       }
+}
+
+void NetStreamFfmpeg::pause(int mode)
+{
+       if (mode == -1)
+       {
+               m_pause = ! m_pause;
+       }
+       else
+       {
+               m_pause = (mode == 0) ? true : false;
+       }
+}
+
+void NetStreamFfmpeg::close()
+{
+       if (m_go)
+       {
+               // terminate thread
+               m_go = false;
+
+               // wait till thread is complete before main continues
+               pthread_join(m_thread, NULL);
+       }
+
+       // When closing gnash before playback is finished, the soundhandler 
+       // seems to be removed before netstream is destroyed.
+       /*sound_handler* s = get_sound_handler();
+       if (s != NULL)
+       {
+               s->detach_aux_streamer((void*) NULL);
+       }*/
+
+       if (m_Frame) av_free(m_Frame);
+       m_Frame = NULL;
+
+       if (m_VCodecCtx) avcodec_close(m_VCodecCtx);
+       m_VCodecCtx = NULL;
+
+       if (m_ACodecCtx) avcodec_close(m_ACodecCtx);
+       m_ACodecCtx = NULL;
+
+       if (m_FormatCtx) m_FormatCtx->iformat->flags = AVFMT_NOFILE;
+       if (m_FormatCtx) av_close_input_file(m_FormatCtx);
+       m_FormatCtx = NULL;
+
+       if (m_imageframe) delete m_imageframe;
+
+       while (m_qvideo.size() > 0)
+       {
+               delete m_qvideo.front();
+               m_qvideo.pop();
+       }
+
+       while (m_qaudio.size() > 0)
+       {
+               delete m_qaudio.front();
+               m_qaudio.pop();
+       }
+
+       delete input;
+       input = NULL;
+
+}
+
+// ffmpeg callback function
+int
+NetStreamFfmpeg::readPacket(void* opaque, uint8_t* buf, int buf_size){
+
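+       // Called by libavformat whenever it needs more input; pulls raw bytes
+       // from the tu_file that StreamProvider opened in startPlayback().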
+       NetStreamFfmpeg* ns = static_cast<NetStreamFfmpeg*>(opaque);
+
+       int ret = ns->input->read_bytes(static_cast<void*>(buf), buf_size);
+       ns->inputPos += ret;
+       return ret;
+}
+
+// ffmpeg callback function
+offset_t 
+NetStreamFfmpeg::seekMedia(void *opaque, offset_t offset, int whence){
+
+       NetStreamFfmpeg* ns = static_cast<NetStreamFfmpeg*>(opaque);
+
+       // Offset is absolute new position in the file
+       if (whence == SEEK_SET) {
+               ns->input->set_position(offset);
+               ns->inputPos = offset;
+
+       // New position is offset + old position
+       } else if (whence == SEEK_CUR) {
+               ns->input->set_position(ns->inputPos + offset);
+               ns->inputPos = ns->inputPos + offset;
+
+       // New position is offset from the end of the file
+       } else if (whence == SEEK_END) {
+               // This is (most likely) a streamed file, so we can't seek to the end!
+               // Instead we seek to 50,000 bytes... seems to work fine so far.
+               ns->input->set_position(50000);
+               ns->inputPos = 50000;
+               
+       }
+
+       return ns->inputPos;
+}
+
+int
+NetStreamFfmpeg::play(const char* c_url)
+{
+
+       // Is it already playing?
+       if (m_go)
+       {
+               if (m_pause) m_pause = false;
+               return 0;
+       }
+
+       url += c_url;
+       m_go = true;
+
+       // To avoid blocking while connecting, we use a thread.
+       if (pthread_create(&startThread, NULL, NetStreamFfmpeg::startPlayback, this) != 0)
+       {
+               return 0;
+       }
+       return 0;
+}
+
+void*
+NetStreamFfmpeg::startPlayback(void* arg)
+{
+       NetStreamFfmpeg* ns = static_cast<NetStreamFfmpeg*>(arg);
+
+       URL uri(ns->url);
+
+       ns->input = ns->streamProvider.getStream(uri);
+       if (ns->input == NULL)
+       {
+           log_error("failed to open '%s'; can't create movie.\n", 
ns->url.c_str());
+           return 0;
+       }
+       else if (ns->input->get_error())
+       {
+           log_error("streamProvider opener can't open '%s'\n", 
ns->url.c_str());
+           return 0;
+       }
+
+       ns->inputPos = 0;
+
+       // This registers all available file formats and codecs 
+       // with the library so they will be used automatically when
+       // a file with the corresponding format/codec is opened
+
+       av_register_all();
+
+       // Open the media file. The data arrives through our read/seek
+       // callbacks rather than a plain file, so we probe the stream first
+       // and let libavformat auto-detect the container format.
+
+       // Probe the file to detect the format
+       AVProbeData probe_data, *pd = &probe_data;
+       pd->filename = "";
+       pd->buf = new uint8_t[2048];
+       pd->buf_size = 2048;
+       readPacket(ns, pd->buf, pd->buf_size);
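+       // NB: probing consumes the first 2048 bytes via readPacket(), advancing
+       // inputPos; libavformat can seek back afterwards through seekMedia().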
+
+       AVInputFormat* inputFmt = av_probe_input_format(pd, 1);
+       delete [] pd->buf;      // the probe buffer is no longer needed
+
+       AVFormatParameters ap;
+       memset(&ap, 0, sizeof(AVFormatParameters));
+       ap.prealloced_context = 1;
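+       // prealloced_context tells av_open_input_stream() that we supply our own
+       // AVFormatContext (allocated below with av_alloc_format_context()).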
+
+       // Set up the filereader/seeker mechanism. The 7th argument (NULL) is the
+       // writer function, which isn't needed.
+       init_put_byte(&ns->ByteIOCxt, new uint8_t[500000], 500000, 0, ns, NetStreamFfmpeg::readPacket, NULL, NetStreamFfmpeg::seekMedia);
+       ns->ByteIOCxt.is_streamed = 0;
+
+       ns->m_FormatCtx = av_alloc_format_context();
+
+       // Open the stream; the filename argument ("") is ignored.
+       if (av_open_input_stream(&ns->m_FormatCtx, &ns->ByteIOCxt, "", inputFmt, &ap) < 0)
+       {
+               log_error("Couldn't open file '%s'", ns->url.c_str());
+               ns->set_status("NetStream.Play.StreamNotFound");
+               return 0;
+       }
+
+       // Next, we need to retrieve information about the streams contained in the file.
+       // This fills the streams field of the AVFormatContext with valid information.
+       int ret = av_find_stream_info(ns->m_FormatCtx);
+       if (ret < 0)
+       {
+               log_error("Couldn't find stream information from '%s', error 
code: %d", ns->url.c_str(), ret);
+               return 0;
+       }
+
+//     m_FormatCtx->pb.eof_reached = 0;
+//     av_read_play(m_FormatCtx);
+
+       // Find the first video & audio stream
+       ns->m_video_index = -1;
+       ns->m_audio_index = -1;
+       for (int i = 0; i < ns->m_FormatCtx->nb_streams; i++)
+       {
+               AVCodecContext* enc = ns->m_FormatCtx->streams[i]->codec; 
+
+               switch (enc->codec_type)
+               {
+                       case CODEC_TYPE_AUDIO:
+                               if (ns->m_audio_index < 0)
+                               {
+                                       ns->m_audio_index = i;
+                                       ns->m_audio_stream = ns->m_FormatCtx->streams[i];
+                               }
+                               break;
+
+                       case CODEC_TYPE_VIDEO:
+                               if (ns->m_video_index < 0)
+                               {
+                                       ns->m_video_index = i;
+                                       ns->m_video_stream = ns->m_FormatCtx->streams[i];
+                               }
+                               break;
+
+                       case CODEC_TYPE_DATA:
+                       case CODEC_TYPE_SUBTITLE:
+                       case CODEC_TYPE_UNKNOWN:
+                       default:
+                               break;
+               }
+       }
+
+       if (ns->m_video_index < 0)
+       {
+               log_error("Didn't find a video stream from '%s'", 
ns->url.c_str());
+               return 0;
+       }
+
+       // Get a pointer to the codec context for the video stream
+       ns->m_VCodecCtx = ns->m_FormatCtx->streams[ns->m_video_index]->codec;
+
+       // Find the decoder for the video stream
+       AVCodec* pCodec = avcodec_find_decoder(ns->m_VCodecCtx->codec_id);
+       if (pCodec == NULL)
+       {
+               ns->m_VCodecCtx = NULL;
+               log_error("Decoder not found");
+               return 0;
+       }
+
+       // Open codec
+       if (avcodec_open(ns->m_VCodecCtx, pCodec) < 0)
+       {
+               log_error("Could not open codec");
+               return 0;
+       }
+
+       // Allocate a frame to store the decoded frame in
+       ns->m_Frame = avcodec_alloc_frame();
+       
+       // Ask the renderer which frame format it wants, and allocate the
+       // backing image frame accordingly
+       int videoFrameFormat = gnash::render::videoFrameFormat();
+
+       if (videoFrameFormat == render::YUV) {
+               ns->m_imageframe = new image::yuv(ns->m_VCodecCtx->width, ns->m_VCodecCtx->height);
+       } else if (videoFrameFormat == render::RGB) {
+               ns->m_imageframe = new image::rgb(ns->m_VCodecCtx->width, ns->m_VCodecCtx->height);
+       }
+
+       sound_handler* s = get_sound_handler();
+       if (ns->m_audio_index >= 0 && s != NULL)
+       {
+               // Get a pointer to the codec context for the audio stream
+               ns->m_ACodecCtx = ns->m_FormatCtx->streams[ns->m_audio_index]->codec;
+
+               // Find the decoder for the audio stream
+               AVCodec* pACodec = avcodec_find_decoder(ns->m_ACodecCtx->codec_id);
+               if (pACodec == NULL)
+               {
+                       log_error("No available AUDIO decoder to process MPEG file: '%s'", ns->url.c_str());
+                       return 0;
+               }
+        
+               // Open codec
+               if (avcodec_open(ns->m_ACodecCtx, pACodec) < 0)
+               {
+                       log_error("Could not open AUDIO codec");
+                       return 0;
+               }
+
+               s->attach_aux_streamer(audio_streamer, (void*) ns);
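+               // From here on, the sound handler's mixer thread pulls PCM data
+               // by calling audio_streamer() (registered just above).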
+
+       }
+
+       ns->m_pause = false;
+
+       if (pthread_create(&ns->m_thread, NULL, NetStreamFfmpeg::av_streamer, ns) != 0)
+       {
+               return 0;
+       }
+       return 0;
+}
+
+void
+NetStreamFfmpeg::setNetCon(as_object* nc){
+       netCon = nc;
+}
+
+// decoder thread
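+// Runs in the thread created at the end of startPlayback(). Decodes
+// frames via read_frame() and presents each queued video frame once the
+// wall clock catches up with that frame's presentation timestamp.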
+void* NetStreamFfmpeg::av_streamer(void* arg)
+{
+       NetStreamFfmpeg* ns = static_cast<NetStreamFfmpeg*>(arg);
+       ns->set_status("NetStream.Play.Start");
+
+       raw_videodata_t* video = NULL;
+
+       ns->m_video_clock = 0;
+
+       int delay = 0;
+       ns->m_start_clock = tu_timer::ticks_to_seconds(tu_timer::get_ticks());
+       ns->m_go = true;
+       ns->m_unqueued_data = NULL;
+       while (ns->m_go)
+       {
+               if (ns->m_pause)
+               {
+                       double t = tu_timer::ticks_to_seconds(tu_timer::get_ticks());
+                       usleep(100000);
+                       ns->m_start_clock += tu_timer::ticks_to_seconds(tu_timer::get_ticks()) - t;
+                       continue;
+               }
+
+               if (ns->read_frame() == false)
+               {
+                       if (ns->m_qvideo.size() == 0)
+                       {
+                               break;
+                       }
+               }
+
+               if (ns->m_qvideo.size() > 0)
+               {
+                       video = ns->m_qvideo.front();
+                       double clock = tu_timer::ticks_to_seconds(tu_timer::get_ticks()) - ns->m_start_clock;
+                       double video_clock = video->m_pts;
+
+                       if (clock >= video_clock)
+                       {
+                               int videoFrameFormat = gnash::render::videoFrameFormat();
+                               if (videoFrameFormat == render::YUV) {
+                                       static_cast<image::yuv*>(ns->m_imageframe)->update(video->m_data);
+                               } else if (videoFrameFormat == render::RGB) {
+                                       ns->m_imageframe->update(video->m_data);
+                               }
+                               ns->m_qvideo.pop();
+                               delete video;
+                               delay = 0;
+                       }
+                       else
+                       {
+                               // time left until the frame is due, in
+                               // microseconds (usleep() expects microseconds)
+                               delay = int((video_clock - clock) * 1000000);
+                       }
+
+                       // Don't hog the CPU: the queues are full and the current
+                       // frame has been shown, so we can rest until the next
+                       // frame is due.
+                       if (ns->m_unqueued_data && delay > 0)
+                       {
+                               usleep(delay);
+                       }
+               }
+       }
+
+       ns->set_status("NetStream.Play.Stop");
+       return 0;
+}
+
+// audio callback is running in sound handler thread
+void NetStreamFfmpeg::audio_streamer(void *owner, uint8 *stream, int len)
+{
+       NetStreamFfmpeg* ns = static_cast<NetStreamFfmpeg*>(owner);
+
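+       // The sound handler pulls: it asks for len bytes and we fill the
+       // request from as many queued audio chunks as needed, keeping any
+       // partially consumed chunk at the front of the queue.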
+       while (len > 0 && ns->m_qaudio.size() > 0)
+       {
+               raw_videodata_t* samples = ns->m_qaudio.front();
+
+               int n = imin(samples->m_size, len);
+               memcpy(stream, samples->m_ptr, n);
+               stream += n;
+               samples->m_ptr += n;
+               samples->m_size -= n;
+               len -= n;
+
+               if (samples->m_size == 0)
+               {
+                       ns->m_qaudio.pop();
+                       delete samples;
+               }
+       }
+}
+
+bool NetStreamFfmpeg::read_frame()
+{
+//     raw_videodata_t* ret = NULL;
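+       // If the previous read produced a frame the full queue rejected,
+       // retry queueing it before decoding anything new, so no data is lost.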
+       if (m_unqueued_data)
+       {
+               if (m_unqueued_data->m_stream_index == m_audio_index)
+               {
+                       sound_handler* s = get_sound_handler();
+                       if (s)
+                       {
+                               m_unqueued_data = m_qaudio.push(m_unqueued_data) ? NULL : m_unqueued_data;
+                       }
+               }
+               else if (m_unqueued_data->m_stream_index == m_video_index)
+               {
+                       m_unqueued_data = m_qvideo.push(m_unqueued_data) ? NULL : m_unqueued_data;
+               }
+               else
+               {
+                       log_warning("read_frame: packet is from neither the audio nor the video stream\n");
+               }
+
+               return true;
+       }
+
+       AVPacket packet;
+       int rc = av_read_frame(m_FormatCtx, &packet);
+       if (rc >= 0)
+       {
+               if (packet.stream_index == m_audio_index)
+               {
+                       sound_handler* s = get_sound_handler();
+                       if (s)
+                       {
+                               int frame_size;
+                               uint8_t* ptr = (uint8_t*) malloc((AVCODEC_MAX_AUDIO_FRAME_SIZE * 3) / 2);
+                               if (avcodec_decode_audio(m_ACodecCtx, (int16_t*) ptr, &frame_size, packet.data, packet.size) >= 0)
+                               {
+                                       int16_t* adjusted_data = 0;
+                                       int n = 0;
+
+                                       // frame_size is in bytes; with 16-bit samples
+                                       // that is 4 bytes per stereo sample pair, 2
+                                       // per mono sample
+                                       bool stereo = m_ACodecCtx->channels > 1;
+                                       int samples = stereo ? frame_size >> 2 : frame_size >> 1;
+                                       s->convert_raw_data(&adjusted_data, &n, ptr, samples, 2, m_ACodecCtx->sample_rate, stereo);
+                                       raw_videodata_t* raw = new raw_videodata_t;
+                                       raw->m_data = (uint8_t*) adjusted_data;
+                                       raw->m_ptr = raw->m_data;
+                                       raw->m_size = n;
+                                       raw->m_stream_index = m_audio_index;
+
+                                       m_unqueued_data = m_qaudio.push(raw) ? NULL : raw;
+                               }
+                               free(ptr);
+                       }
+               }
+               else
+               if (packet.stream_index == m_video_index)
+               {
+                       int got = 0;
+                       avcodec_decode_video(m_VCodecCtx, m_Frame, &got, packet.data, packet.size);
+                       if (got) {
+                               int videoFrameFormat = gnash::render::videoFrameFormat();
+
+                               if (videoFrameFormat == render::NONE) { // NullGui?
+                                       av_free_packet(&packet);
+                                       return false;
+
+                               } else if (videoFrameFormat == render::YUV && m_VCodecCtx->pix_fmt != PIX_FMT_YUV420P) {
+                                       assert(0);      // TODO: convert to YUV420P, like the RGB branch below does
+                                       //img_convert((AVPicture*) pFrameYUV, PIX_FMT_YUV420P, (AVPicture*) pFrame, pCodecCtx->pix_fmt, pCodecCtx->width, pCodecCtx->height);
+
+                               } else if (videoFrameFormat == render::RGB && m_VCodecCtx->pix_fmt != PIX_FMT_RGB24) {
+                                       AVFrame* frameRGB = avcodec_alloc_frame();
+                                       unsigned int numBytes = avpicture_get_size(PIX_FMT_RGB24, m_VCodecCtx->width, m_VCodecCtx->height);
+                                       // Heap-allocate the pixel buffer: it is read via
+                                       // m_Frame after this block ends, so a stack array
+                                       // would go out of scope too early.
+                                       uint8_t* buffer = new uint8_t[numBytes];
+                                       avpicture_fill((AVPicture*) frameRGB, buffer, PIX_FMT_RGB24, m_VCodecCtx->width, m_VCodecCtx->height);
+                                       img_convert((AVPicture*) frameRGB, PIX_FMT_RGB24, (AVPicture*) m_Frame, m_VCodecCtx->pix_fmt, m_VCodecCtx->width, m_VCodecCtx->height);
+                                       av_free(m_Frame);
+                                       m_Frame = frameRGB;
+                               }
+
+                               raw_videodata_t* video = new raw_videodata_t;
+                               if (videoFrameFormat == render::YUV) {
+                                       video->m_data = new uint8_t[static_cast<image::yuv*>(m_imageframe)->size()];
+                               } else if (videoFrameFormat == render::RGB) {
+                                       image::rgb* tmp = static_cast<image::rgb*>(m_imageframe);
+                                       video->m_data = new uint8_t[tmp->m_pitch * tmp->m_height];
+                               }
+
+                               video->m_ptr = video->m_data;
+                               video->m_stream_index = m_video_index;
+
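+                               // Timestamp bookkeeping: prefer the container's
+                               // dts; if a packet carries none, fall back to the
+                               // running m_video_clock estimate below.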
+                               // set presentation timestamp
+                               if (packet.dts != AV_NOPTS_VALUE)
+                               {
+                                       video->m_pts = as_double(m_video_stream->time_base) * packet.dts;
+                               }
+
+                               if (video->m_pts != 0)
+                               {
+                                       // update video clock with pts, if present
+                                       m_video_clock = video->m_pts;
+                               }
+                               else
+                               {
+                                       video->m_pts = m_video_clock;
+                               }
+
+                               // update video clock for next frame
+                               double frame_delay = as_double(m_video_stream->codec->time_base);
+
+                               // for MPEG2, the frame can be repeated, so we update the clock accordingly
+                               frame_delay += m_Frame->repeat_pict * (frame_delay * 0.5);
+
+                               m_video_clock += frame_delay;
+
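+                               // Copy the three planes into one contiguous
+                               // buffer: Y at full resolution, U and V at half
+                               // (4:2:0), dropping any per-row padding that
+                               // linesize[] includes.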
+                               if (videoFrameFormat == render::YUV) {
+                                       image::yuv* yuvframe = static_cast<image::yuv*>(m_imageframe);
+                                       int copied = 0;
+                                       uint8_t* ptr = video->m_data;
+                                       for (int i = 0; i < 3; i++)
+                                       {
+                                               int shift = (i == 0 ? 0 : 1);
+                                               uint8_t* yuv_factor = m_Frame->data[i];
+                                               int h = m_VCodecCtx->height >> shift;
+                                               int w = m_VCodecCtx->width >> shift;
+                                               for (int j = 0; j < h; j++)
+                                               {
+                                                       copied += w;
+                                                       assert(copied <= yuvframe->size());
+                                                       memcpy(ptr, yuv_factor, w);
+                                                       yuv_factor += m_Frame->linesize[i];
+                                                       ptr += w;
+                                               }
+                                       }
+                                       video->m_size = copied;
+                               } else if (videoFrameFormat == render::RGB) {
+                                       for (int line = 0; line < m_VCodecCtx->height; line++)
+                                       {
+                                               // the picture is y-inverted, so we have to flip the lines
+                                               int lineInv = m_VCodecCtx->height - line - 1;
+                                               for (int byte = 0; byte < (m_VCodecCtx->width * 3); byte++)
+                                               {
+                                                       //printf("dst: %d, src: %d\n", byte + (lineInv*m_VCodecCtx->width*3), (line*m_Frame->linesize[0])+byte);
+                                                       video->m_data[byte + (lineInv * m_VCodecCtx->width * 3)] = (unsigned char) *(m_Frame->data[0] + (line * m_Frame->linesize[0]) + byte);
+                                               }
+                                       }
+                               }
+
+                               m_unqueued_data = m_qvideo.push(video) ? NULL : video;
+                       }
+               }
+               av_free_packet(&packet);
+       }
+       else
+       {
+               return false;
+       }
+
+       return true;
+}
+
+image::image_base* NetStreamFfmpeg::get_video()
+{
+       return m_imageframe;
+}
+
+void
+NetStreamFfmpeg::seek()
+{
+       log_msg("%s: unimplemented\n", __FUNCTION__);
+}
+
+void
+NetStreamFfmpeg::setBufferTime()
+{
+       log_msg("%s: unimplemented\n", __FUNCTION__);
+}
+
+} // gnash namespace
+
+#endif // USE_FFMPEG

Index: server/asobj/NetStreamFfmpeg.h
===================================================================
RCS file: server/asobj/NetStreamFfmpeg.h
diff -N server/asobj/NetStreamFfmpeg.h
--- /dev/null   1 Jan 1970 00:00:00 -0000
+++ server/asobj/NetStreamFfmpeg.h      5 Dec 2006 14:26:10 -0000       1.1
@@ -0,0 +1,243 @@
+// 
+//   Copyright (C) 2005, 2006 Free Software Foundation, Inc.
+// 
+// This program is free software; you can redistribute it and/or modify
+// it under the terms of the GNU General Public License as published by
+// the Free Software Foundation; either version 2 of the License, or
+// (at your option) any later version.
+// 
+// This program is distributed in the hope that it will be useful,
+// but WITHOUT ANY WARRANTY; without even the implied warranty of
+// MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+// GNU General Public License for more details.
+// You should have received a copy of the GNU General Public License
+// along with this program; if not, write to the Free Software
+// Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA  02110-1301  USA
+
+// 
+//
+//
+
+#ifndef __NETSTREAMFFMPEG_H__
+#define __NETSTREAMFFMPEG_H__
+
+#ifdef HAVE_CONFIG_H
+#include "config.h"
+#endif
+
+#ifdef USE_FFMPEG
+
+#include <queue>
+#include <pthread.h>
+#include "impl.h"
+#include "video_stream_instance.h"
+#include <ffmpeg/avformat.h>
+#include "image.h"
+#include "StreamProvider.h"    
+
+namespace gnash {
+  
+struct raw_videodata_t
+{
+       raw_videodata_t():
+       m_stream_index(-1),
+       m_size(0),
+       m_data(NULL),
+       m_ptr(NULL),
+       m_pts(0)
+       {
+       };
+
+       ~raw_videodata_t()
+       {
+               if (m_data)
+               {
+                       delete [] m_data;       // m_data is allocated with new uint8_t[]
+               }
+       };
+
+       int m_stream_index;
+       uint32_t m_size;
+       uint8_t* m_data;
+       uint8_t* m_ptr;
+       double m_pts;   // presentation timestamp in sec
+};
+
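+// A minimal mutex-guarded FIFO shared between the decoder thread and its
+// consumers. push() returns false once the queue holds 20 elements (see the
+// "hack" note below), so callers keep the rejected element and retry later.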
+template<class T>
+class multithread_queue
+{
+       public:
+
+       multithread_queue()
+               {
+                       pthread_mutex_init(&m_mutex, NULL);
+               };
+
+       ~multithread_queue()
+               {
+                       lock();
+                       while (m_queue.size() > 0)
+                       {
+                               T x = m_queue.front();
+                               m_queue.pop();
+                               delete x;
+                       }
+                       unlock();
+
+                       pthread_mutex_destroy(&m_mutex);
+               }
+
+               size_t size()
+               {
+                       lock();
+                       size_t n = m_queue.size();
+                       unlock();
+                       return n;
+               }
+
+               bool push(T member)
+               {
+                       bool rc = false;
+                       lock();
+                       if (m_queue.size() < 20)        // hack
+                       {
+                               m_queue.push(member);
+                               rc = true;
+                       }
+                       unlock();
+                       return rc;
+               }
+
+               T front()
+               {
+                       lock();
+                       T member = NULL;
+                       if (m_queue.size() > 0)
+                       {
+                               member = m_queue.front();
+                       }
+                       unlock();
+                       return member;
+               }
+
+               void pop()
+               {
+                       lock();
+                       if (m_queue.size() > 0)
+                       {
+                               m_queue.pop();
+                       }
+                       unlock();
+               }
+
+       private:
+
+               inline void lock()
+               {
+                       pthread_mutex_lock(&m_mutex);
+               }
+
+               inline void unlock()
+               {
+                       pthread_mutex_unlock(&m_mutex);
+               }
+
+               pthread_mutex_t m_mutex;
+               std::queue < T > m_queue;
+};
+
+class netstream_as_object;
+
+class NetStreamFfmpeg {
+public:
+       NetStreamFfmpeg();
+       ~NetStreamFfmpeg();
+       void close();
+       void pause(int mode);
+       int play(const char* source);
+       void seek();
+       void setBufferTime();
+       void set_status(const char* code);
+       void setNetCon(as_object* nc);
+
+       // Used for ffmpeg data read and seek callbacks
+       static int readPacket(void* opaque, uint8_t* buf, int buf_size);
+       static offset_t seekMedia(void *opaque, offset_t offset, int whence);
+
+       bool read_frame();
+
+       image::image_base* get_video();
+
+       inline bool playing()
+       {
+               return m_go;
+       }
+
+       inline void set_parent(netstream_as_object* ns)
+       {
+               m_netstream_object = ns;
+       }
+
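+       // Converts an AVRational such as a stream time_base of {1, 25} into
+       // seconds per tick: 1/25 = 0.04.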
+       inline double as_double(AVRational time)
+       {
+               return time.num / (double) time.den;
+       }
+
+       static void* startPlayback(void* arg);
+       static void* av_streamer(void* arg);
+       static void audio_streamer(void *udata, uint8 *stream, int len);
+
+private:
+
+       bool _bufferLength;
+       bool _bufferTime;
+       bool _bytesLoaded;
+       bool _bytesTotal;
+       bool _currentFps;
+       bool _onStatus;
+       bool _time;
+
+       int m_video_index;
+       int m_audio_index;
+       
+       // video
+       AVCodecContext* m_VCodecCtx;
+       AVStream* m_video_stream;
+
+       // audio
+       AVCodecContext *m_ACodecCtx;
+       AVStream* m_audio_stream;
+
+       AVFormatContext *m_FormatCtx;
+
+       AVFrame* m_Frame;
+
+       volatile bool m_go;
+       unsigned int runtime;
+
+       image::image_base* m_imageframe;
+
+       double m_video_clock;
+
+       pthread_t m_thread;
+       pthread_t startThread;
+       multithread_queue <raw_videodata_t*> m_qaudio;
+       multithread_queue <raw_videodata_t*> m_qvideo;
+       bool m_pause;
+       double m_start_clock;
+       netstream_as_object* m_netstream_object;
+       raw_videodata_t* m_unqueued_data;
+
+       as_object* netCon;
+       ByteIOContext ByteIOCxt;
+       tu_file* input;
+       long inputPos;
+       StreamProvider streamProvider;
+       std::string url;
+};
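+// Illustrative usage sketch (not part of this commit's API surface; the URL
+// is hypothetical):
+//
+//     NetStreamFfmpeg ns;
+//     ns.play("http://example.com/video.flv");   // connects and decodes in
+//                                                // background threads
+//     image::image_base* frame = ns.get_video(); // latest decoded frame
+//     ns.close();                                // stops and joins the thread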
+
+} // gnash namespace
+
+#endif // USE_FFMPEG
+
+#endif //  __NETSTREAMFFMPEG_H__



