Other articles (95)

  • Improving the base version

    13 September 2013

    A nicer multiple selection
    The Chosen plugin improves the ergonomics of multiple-selection fields. See the two images below for a comparison.
    To use it, activate the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

  • Support for all media types

    10 April 2011

    Unlike many modern document-sharing programs and platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other (OpenOffice, Microsoft Office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)

On other sites (8596)

  • FFmpeg does not decode h264 stream

    5 July 2012, by HAPPY_TIGER

    I am trying to decode an H.264 stream from an RTSP server and render it on the iPhone.

    I found some libraries and read some articles about it.

    The libraries, RTSPClient and DecoderWrapper, come from dropCam for iPhone.

    But I cannot decode the frame data with DecoderWrapper, which is built on ffmpeg.

    Here is my code.

    VideoViewer.m

    - (void)didReceiveFrame:(NSData*)frameData presentationTime:(NSDate*)presentationTime
    {
       [VideoDecoder staticInitialize];
       mConverter = [[VideoDecoder alloc] initWithCodec:kVCT_H264 colorSpace:kVCS_RGBA32 width:320 height:240 privateData:nil];


       [mConverter decodeFrame:frameData];

       if ([mConverter isFrameReady]) {
           UIImage *imageData =[mConverter getDecodedFrame];
           if (imageData) {
               [mVideoView setImage:imageData];
               NSLog(@"decoded!");
           }
       }
    }

    VideoDecoder.m
    - (id)initWithCodec:(enum VideoCodecType)codecType
            colorSpace:(enum VideoColorSpace)colorSpace
                 width:(int)width
                height:(int)height
           privateData:(NSData*)privateData {
       if(self = [super init]) {

           codec = avcodec_find_decoder(CODEC_ID_H264);
           codecCtx = avcodec_alloc_context();

           // Note: for H.264 RTSP streams, the width and height are usually not specified (width and height are 0).  
           // These fields will become filled in once the first frame is decoded and the SPS is processed.
           codecCtx->width = width;
           codecCtx->height = height;

           codecCtx->extradata = av_malloc([privateData length]);
           codecCtx->extradata_size = [privateData length];
           [privateData getBytes:codecCtx->extradata length:codecCtx->extradata_size];
           codecCtx->pix_fmt = PIX_FMT_RGBA;
    #ifdef SHOW_DEBUG_MV
           codecCtx->debug_mv = 0xFF;
    #endif

           srcFrame = avcodec_alloc_frame();
           dstFrame = avcodec_alloc_frame();

           int res = avcodec_open(codecCtx, codec);
           if (res < 0)
           {
               NSLog(@"Failed to initialize decoder");
           }

       }

       return self;    
    }

    - (void)decodeFrame:(NSData*)frameData {


       AVPacket packet = {0};
       packet.data = (uint8_t*)[frameData bytes];
       packet.size = [frameData length];

       int frameFinished=0;
       NSLog(@"Packet size===>%d",packet.size);
       // Is this a packet from the video stream?
       if(packet.stream_index==0)
       {
           int res = avcodec_decode_video2(codecCtx, srcFrame, &frameFinished, &packet);
           NSLog(@"Res value===>%d",res);
           NSLog(@"frame data===>%d",(int)srcFrame->data);
           if (res < 0)
           {
               NSLog(@"Failed to decode frame");
           }
       }
       else
       {
           NSLog(@"No video stream found");
       }


       // Need to delay initializing the output buffers because we don't know the dimensions until we decode the first frame.
       if (!outputInit) {
           if (codecCtx->width > 0 && codecCtx->height > 0) {
    #ifdef _DEBUG
               NSLog(@"Initializing decoder with frame size of: %dx%d", codecCtx->width, codecCtx->height);
    #endif

               outputBufLen = avpicture_get_size(PIX_FMT_RGBA, codecCtx->width, codecCtx->height);
               outputBuf = av_malloc(outputBufLen);

               avpicture_fill((AVPicture*)dstFrame, outputBuf, PIX_FMT_RGBA, codecCtx->width, codecCtx->height);

               convertCtx = sws_getContext(codecCtx->width, codecCtx->height, codecCtx->pix_fmt,  codecCtx->width,
                                           codecCtx->height, PIX_FMT_RGBA, SWS_FAST_BILINEAR, NULL, NULL, NULL);

               outputInit = YES;
               frameFinished=1;
           }
           else {
               NSLog(@"Could not get video output dimensions");
           }
       }

       if (frameFinished)
           frameReady = YES;

    }

    The console output is as follows.

    2011-05-16 20:16:04.223 RTSPTest1[41226:207] Packet size===>359
    [h264 @ 0x5815c00] no frame!
    2011-05-16 20:16:04.223 RTSPTest1[41226:207] Res value===>-1
    2011-05-16 20:16:04.224 RTSPTest1[41226:207] frame data===>101791200
    2011-05-16 20:16:04.224 RTSPTest1[41226:207] Failed to decode frame
    2011-05-16 20:16:04.225 RTSPTest1[41226:207] decoded!
    2011-05-16 20:16:04.226 RTSPTest1[41226:207] Packet size===>424
    [h264 @ 0x5017c00] no frame!
    2011-05-16 20:16:04.226 RTSPTest1[41226:207] Res value===>-1
    2011-05-16 20:16:04.227 RTSPTest1[41226:207] frame data===>81002704
    2011-05-16 20:16:04.227 RTSPTest1[41226:207] Failed to decode frame
    2011-05-16 20:16:04.228 RTSPTest1[41226:207] decoded!
    2011-05-16 20:16:04.229 RTSPTest1[41226:207] Packet size===>424
    [h264 @ 0x581d000] no frame!
    2011-05-16 20:16:04.229 RTSPTest1[41226:207] Res value===>-1
    2011-05-16 20:16:04.230 RTSPTest1[41226:207] frame data===>101791616
    2011-05-16 20:16:04.230 RTSPTest1[41226:207] Failed to decode frame
    2011-05-16 20:16:04.231 RTSPTest1[41226:207] decoded!
    . . . .  .

    But the simulator shows nothing.

    What's wrong with my code?

    Help me solve this problem.

    Thanks for your answers.
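
    For reference, two details stand out in the code above: didReceiveFrame: allocates a brand-new VideoDecoder for every incoming frame (with privateData:nil, so codecCtx->extradata stays empty), and avcodec_decode_video2 only reports a picture once it has been fed complete Annex-B NAL units including SPS/PPS; "[h264 ...] no frame!" is what the decoder logs when that is not the case. Below is a minimal sketch of the same legacy (2011-era) FFmpeg API with the decoder created once, assuming data/size hold one complete Annex-B NAL unit:

    // Sketch only: the legacy FFmpeg API the question targets.
    // Assumes `data`/`size` contain one complete Annex-B NAL unit
    // (leading 00 00 00 01) and that SPS/PPS have already been seen,
    // either in-band or via codecCtx->extradata.
    #include <libavcodec/avcodec.h>

    static AVCodecContext *ctx;
    static AVFrame *frame;

    void decoder_init(void)
    {
        avcodec_register_all();            // register codecs once
        AVCodec *codec = avcodec_find_decoder(CODEC_ID_H264);
        ctx = avcodec_alloc_context();     // create the context once, not per frame
        avcodec_open(ctx, codec);
        frame = avcodec_alloc_frame();
    }

    int decode_nalu(uint8_t *data, int size)
    {
        AVPacket pkt;
        av_init_packet(&pkt);
        pkt.data = data;
        pkt.size = size;
        int got_frame = 0;
        int res = avcodec_decode_video2(ctx, frame, &got_frame, &pkt);
        // got_frame stays 0 (and the decoder logs "no frame!") until it
        // has seen SPS/PPS and a complete coded picture.
        return (res >= 0 && got_frame) ? 0 : -1;
    }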

  • Live555 : X264 Stream Live source based on "testOnDemandRTSPServer"

    12 January 2017, by user2660369

    I am trying to create an RTSP server that streams the OpenGL output of my program. I had a look at How to write a Live555 FramedSource to allow me to stream H.264 live, but I need the stream to be unicast. So I had a look at testOnDemandRTSPServer. Using the same code fails. To my understanding, I need to provide memory in which I store my H.264 frames so the on-demand server can read them on demand.

    H264VideoStreamServerMediaSubsession.cpp

    H264VideoStreamServerMediaSubsession*
    H264VideoStreamServerMediaSubsession::createNew(UsageEnvironment& env,
                             Boolean reuseFirstSource) {
     return new H264VideoStreamServerMediaSubsession(env, reuseFirstSource);
    }

    H264VideoStreamServerMediaSubsession::H264VideoStreamServerMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource)
     : OnDemandServerMediaSubsession(env, reuseFirstSource), fAuxSDPLine(NULL), fDoneFlag(0), fDummyRTPSink(NULL) {
    }

    H264VideoStreamServerMediaSubsession::~H264VideoStreamServerMediaSubsession() {
     delete[] fAuxSDPLine;
    }

    static void afterPlayingDummy(void* clientData) {
     H264VideoStreamServerMediaSubsession* subsess = (H264VideoStreamServerMediaSubsession*)clientData;
     subsess->afterPlayingDummy1();
    }

    void H264VideoStreamServerMediaSubsession::afterPlayingDummy1() {
     // Unschedule any pending 'checking' task:
     envir().taskScheduler().unscheduleDelayedTask(nextTask());
     // Signal the event loop that we're done:
     setDoneFlag();
    }

    static void checkForAuxSDPLine(void* clientData) {
     H264VideoStreamServerMediaSubsession* subsess = (H264VideoStreamServerMediaSubsession*)clientData;
     subsess->checkForAuxSDPLine1();
    }

    void H264VideoStreamServerMediaSubsession::checkForAuxSDPLine1() {
     char const* dasl;

     if (fAuxSDPLine != NULL) {
       // Signal the event loop that we're done:
       setDoneFlag();
     } else if (fDummyRTPSink != NULL && (dasl = fDummyRTPSink->auxSDPLine()) != NULL) {
       fAuxSDPLine = strDup(dasl);
       fDummyRTPSink = NULL;

       // Signal the event loop that we're done:
       setDoneFlag();
     } else {
       // try again after a brief delay:
       int uSecsToDelay = 100000; // 100 ms
       nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecsToDelay,
                     (TaskFunc*)checkForAuxSDPLine, this);
     }
    }

    char const* H264VideoStreamServerMediaSubsession::getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource) {
     if (fAuxSDPLine != NULL) return fAuxSDPLine; // it's already been set up (for a previous client)

     if (fDummyRTPSink == NULL) { // we're not already setting it up for another, concurrent stream
       // Note: For H264 video files, the 'config' information ("profile-level-id" and "sprop-parameter-sets") isn't known
       // until we start reading the file.  This means that "rtpSink"s "auxSDPLine()" will be NULL initially,
       // and we need to start reading data from our file until this changes.
       fDummyRTPSink = rtpSink;

       // Start reading the file:
       fDummyRTPSink->startPlaying(*inputSource, afterPlayingDummy, this);

       // Check whether the sink's 'auxSDPLine()' is ready:
       checkForAuxSDPLine(this);
     }

     envir().taskScheduler().doEventLoop(&fDoneFlag);

     return fAuxSDPLine;
    }

    FramedSource* H264VideoStreamServerMediaSubsession::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
     estBitrate = 500; // kb
     megamol::remotecontrol::View3D_MRC *parent = (megamol::remotecontrol::View3D_MRC*)this->parent;
     return H264VideoStreamFramer::createNew(envir(), parent->h264FramedSource);
    }

    RTPSink* H264VideoStreamServerMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* /*inputSource*/) {
     return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
    }

    FramedSource.cpp

    H264FramedSource* H264FramedSource::createNew(UsageEnvironment& env,
                                             unsigned preferredFrameSize,
                                             unsigned playTimePerFrame)
    {
       return new H264FramedSource(env, preferredFrameSize, playTimePerFrame);
    }

    H264FramedSource::H264FramedSource(UsageEnvironment& env,
                                  unsigned preferredFrameSize,
                                  unsigned playTimePerFrame)
       : FramedSource(env),
       fPreferredFrameSize(fMaxSize),
       fPlayTimePerFrame(playTimePerFrame),
       fLastPlayTime(0),
       fCurIndex(0)
    {

       x264_param_default_preset(&param, "veryfast", "zerolatency");
       param.i_threads = 1;
       param.i_width = 1024;
       param.i_height = 768;
       param.i_fps_num = 30;
       param.i_fps_den = 1;
       // Intra refres:
       param.i_keyint_max = 60;
       param.b_intra_refresh = 1;
       //Rate control:
       param.rc.i_rc_method = X264_RC_CRF;
       param.rc.f_rf_constant = 25;
       param.rc.f_rf_constant_max = 35;
       param.i_sps_id = 7;
       //For streaming:
       param.b_repeat_headers = 1;
       param.b_annexb = 1;
       x264_param_apply_profile(&param, "baseline");

       param.i_log_level = X264_LOG_ERROR;

       encoder = x264_encoder_open(&param);
       pic_in.i_type            = X264_TYPE_AUTO;
       pic_in.i_qpplus1         = 0;
       pic_in.img.i_csp         = X264_CSP_I420;
       pic_in.img.i_plane       = 3;


       x264_picture_alloc(&pic_in, X264_CSP_I420, 1024, 768);

       convertCtx = sws_getContext(1024, 768, PIX_FMT_RGBA, 1024, 768, PIX_FMT_YUV420P, SWS_FAST_BILINEAR, NULL, NULL, NULL);
       eventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);
    }

    H264FramedSource::~H264FramedSource()
    {
       envir().taskScheduler().deleteEventTrigger(eventTriggerId);
       eventTriggerId = 0;
    }

    void H264FramedSource::AddToBuffer(uint8_t* buf, int surfaceSizeInBytes)
    {
       uint8_t* surfaceData = (new uint8_t[surfaceSizeInBytes]);

       memcpy(surfaceData, buf, surfaceSizeInBytes);

       int srcstride = 1024*4;
       sws_scale(convertCtx, &surfaceData, &srcstride,0, 768, pic_in.img.plane, pic_in.img.i_stride);
       x264_nal_t* nals = NULL;
       int i_nals = 0;
       int frame_size = -1;


       frame_size = x264_encoder_encode(encoder, &nals, &i_nals, &pic_in, &pic_out);

       static bool finished = false;

       if (frame_size >= 0)
       {
       static bool alreadydone = false;
       if(!alreadydone)
       {

           x264_encoder_headers(encoder, &nals, &i_nals);
           alreadydone = true;
       }
       for(int i = 0; i < i_nals; ++i)
       {
           m_queue.push(nals[i]);
       }
       }
       delete [] surfaceData;
       surfaceData = nullptr;

       envir().taskScheduler().triggerEvent(eventTriggerId, this);
    }

    void H264FramedSource::doGetNextFrame()
    {
       deliverFrame();
    }

    void H264FramedSource::deliverFrame0(void* clientData)
    {
       ((H264FramedSource*)clientData)->deliverFrame();
    }

    void H264FramedSource::deliverFrame()
    {
       x264_nal_t nalToDeliver;

       if (fPlayTimePerFrame > 0 && fPreferredFrameSize > 0) {
       if (fPresentationTime.tv_sec == 0 && fPresentationTime.tv_usec == 0) {
           // This is the first frame, so use the current time:
           gettimeofday(&fPresentationTime, NULL);
       } else {
           // Increment by the play time of the previous data:
           unsigned uSeconds   = fPresentationTime.tv_usec + fLastPlayTime;
           fPresentationTime.tv_sec += uSeconds/1000000;
           fPresentationTime.tv_usec = uSeconds%1000000;
       }

       // Remember the play time of this data:
       fLastPlayTime = (fPlayTimePerFrame*fFrameSize)/fPreferredFrameSize;
       fDurationInMicroseconds = fLastPlayTime;
       } else {
       // We don't know a specific play time duration for this data,
       // so just record the current time as being the 'presentation time':
       gettimeofday(&fPresentationTime, NULL);
       }

       if(!m_queue.empty())
       {
       m_queue.wait_and_pop(nalToDeliver);

       uint8_t* newFrameDataStart = (uint8_t*)0xD15EA5E;

       newFrameDataStart = (uint8_t*)(nalToDeliver.p_payload);
       unsigned newFrameSize = nalToDeliver.i_payload;

       // Deliver the data here:
       if (newFrameSize > fMaxSize) {
           fFrameSize = fMaxSize;
           fNumTruncatedBytes = newFrameSize - fMaxSize;
       }
       else {
           fFrameSize = newFrameSize;
       }

       memcpy(fTo, nalToDeliver.p_payload, nalToDeliver.i_payload);

       FramedSource::afterGetting(this);
       }
    }

    Relevant part of the RTSP server thread:

     RTSPServer* rtspServer = RTSPServer::createNew(*(parent->env), 8554, NULL);
     if (rtspServer == NULL) {
       *(parent->env) << "Failed to create RTSP server: " << (parent->env)->getResultMsg() << "\n";
       exit(1);
     }
     char const* streamName = "Stream";
     parent->h264FramedSource = H264FramedSource::createNew(*(parent->env), 0, 0);
     H264VideoStreamServerMediaSubsession *h264VideoStreamServerMediaSubsession = H264VideoStreamServerMediaSubsession::createNew(*(parent->env), true);
     h264VideoStreamServerMediaSubsession->parent = parent;
     sms->addSubsession(h264VideoStreamServerMediaSubsession);
     rtspServer->addServerMediaSession(sms);

     parent->env->taskScheduler().doEventLoop(); // does not return

    Once a connection exists, the render loop calls:

    h264FramedSource->AddToBuffer(videoData, 1024*768*4);
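
    A Live555 convention worth checking here (an assumption about the cause, not something the post confirms): H264VideoStreamFramer expects a continuous Annex-B byte stream, whereas H264VideoStreamDiscreteFramer expects exactly one NAL unit per delivery, without the 00 00 00 01 start code that x264 emits when b_annexb = 1. Since AddToBuffer queues discrete NALs, a sketch of the discrete pairing, keeping everything else as posted:

    // Hypothetical variant of createNewStreamSource using the discrete framer.
    FramedSource* H264VideoStreamServerMediaSubsession::createNewStreamSource(
            unsigned /*clientSessionId*/, unsigned& estBitrate) {
        estBitrate = 500; // kbps
        megamol::remotecontrol::View3D_MRC *parent = (megamol::remotecontrol::View3D_MRC*)this->parent;
        // One NAL unit per doGetNextFrame(), no start codes:
        return H264VideoStreamDiscreteFramer::createNew(envir(), parent->h264FramedSource);
    }

    with deliverFrame() stripping the start code (and clamping the copy to fMaxSize) before the memcpy:

    // Skip the Annex-B start code (00 00 00 01 or 00 00 01) on each NAL.
    uint8_t* p = nalToDeliver.p_payload;
    unsigned n = nalToDeliver.i_payload;
    if      (n >= 4 && p[0] == 0 && p[1] == 0 && p[2] == 0 && p[3] == 1) { p += 4; n -= 4; }
    else if (n >= 3 && p[0] == 0 && p[1] == 0 && p[2] == 1)              { p += 3; n -= 3; }
    fFrameSize = (n > fMaxSize) ? fMaxSize : n;
    fNumTruncatedBytes = (n > fMaxSize) ? n - fMaxSize : 0;
    memcpy(fTo, p, fFrameSize);
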
  • Unable to stream file onto localhost - ffmpeg

    18 October 2013, by trueblue

    I am new to ffmpeg/ffserver. I am trying to stream a local file named Trial to localhost using ffserver. I want to play the file in a browser as http://localhost:8090/feed1.ffm
    I am executing the command below in Ubuntu (Trial is an MPEG-TS file):

     ffmpeg -i Trial http://localhost:8090/feed1.ffm

    Upon executing the above command, I get the following error:

    FFmpeg version SVN-r0.5.9-4:0.5.9-0ubuntu0.10.04.3, Copyright (c) 2000-2009 Fabrice Bellard, et al.
     configuration: --extra-version=4:0.5.9-0ubuntu0.10.04.3 --prefix=/usr --enable-avfilter --enable-avfilter-lavf --enable-vdpau --enable-bzlib --enable-libgsm --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-pthreads --enable-zlib --disable-stripping --disable-vhook --enable-runtime-cpudetect --enable-gpl --enable-postproc --enable-swscale --enable-x11grab --enable-libdc1394 --enable-shared --disable-static
     libavutil     49.15. 0 / 49.15. 0
     libavcodec    52.20. 1 / 52.20. 1
     libavformat   52.31. 0 / 52.31. 0
     libavdevice   52. 1. 0 / 52. 1. 0
     libavfilter    0. 4. 0 /  0. 4. 0
     libswscale     0. 7. 1 /  0. 7. 1
     libpostproc   51. 2. 0 / 51. 2. 0
     built on Jan 24 2013 19:42:59, gcc: 4.4.3

    Seems stream 0 codec frame rate differs from container frame rate: 119.88 (120000/1001) -> 59.94 (60000/1001)
    Input #0, mpegts, from 'Trial':
     Duration: 00:00:04.22, start: 0.177633, bitrate: 40368 kb/s
     Program 2
       Stream #0.0[0x21]: Video: mpeg2video, yuv420p, 1280x720 [PAR 1:1 DAR 16:9], 45000 kb/s, 59.94 tbr, 90k tbn, 119.88 tbc
    Output #0, ffm, to 'http://localhost:8090/feed1.ffm':
       Stream #0.0: Video: flv, yuv420p, 352x288, q=1-5, 100 kb/s, 1000k tbn, 15 tbc
       Stream #0.1: Audio: mp2, 44100 Hz, mono, s16, 32 kb/s
       Stream #0.2: Video: mpeg1video, yuv420p, 160x128, q=3-31, 64 kb/s, 1000k tbn, 3 tbc
       Stream #0.3: Audio: mp2, 22050 Hz, mono, s16, 64 kb/s
       Stream #0.4: Video: msmpeg4, yuv420p, 352x240, q=3-31, 256 kb/s, 1000k tbn, 15 tbc
    Could not find input stream matching output stream #0.1

    My ffserver.conf file is as follows:

    # Port on which the server is listening. You must select a different
    # port from your standard HTTP web server if it is running on the same
    # computer.
    Port 8090

    # Address on which the server is bound. Only useful if you have
    # several network interfaces.
    BindAddress 0.0.0.0

    # Number of simultaneous HTTP connections that can be handled. It has
    # to be defined *before* the MaxClients parameter, since it defines the
    # MaxClients maximum limit.
    MaxHTTPConnections 2000

    # Number of simultaneous requests that can be handled. Since FFServer
    # is very fast, it is more likely that you will want to leave this high
    # and use MaxBandwidth, below.
    MaxClients 1000

    # This is the maximum amount of kbit/sec that you are prepared to
    # consume when streaming to clients.
    MaxBandwidth 1000

    # Access log file (uses standard Apache log file format)
    # '-' is the standard output.
    CustomLog -

    # Suppress that if you want to launch ffserver as a daemon.
    NoDaemon


    ##################################################################
    # Definition of the live feeds. Each live feed contains one video
    # and/or audio sequence coming from an ffmpeg encoder or another
    # ffserver. This sequence may be encoded simultaneously with several
    # codecs at several resolutions.

    <Feed feed1.ffm>

    # You must use 'ffmpeg' to send a live feed to ffserver. In this
    # example, you can type:
    #
    # ffmpeg http://localhost:8090/feed1.ffm

    # ffserver can also do time shifting. It means that it can stream any
    # previously recorded live stream. The request should contain:
    # "http://xxxx?date=[YYYY-MM-DDT][[HH:]MM:]SS[.m...]".You must specify
    # a path where the feed is stored on disk. You also specify the
    # maximum size of the feed, where zero means unlimited. Default:
    # File=/tmp/feed_name.ffm FileMaxSize=5M
    File /tmp/feed1.ffm
    FileMaxSize 5M

    # You could specify
    # ReadOnlyFile /saved/specialvideo.ffm
    # This marks the file as readonly and it will not be deleted or updated.

    # Specify launch in order to start ffmpeg automatically.
    # First ffmpeg must be defined with an appropriate path if needed,
    # after that options can follow, but avoid adding the http:// field
    #Launch ffmpeg

    # Only allow connections from localhost to the feed.
    ACL allow 127.0.0.1

    </Feed>



    <stream>
    Feed feed1.ffm
    Format swf
    VideoCodec flv
    VideoFrameRate 15
    VideoBufferSize 80000
    VideoBitRate 100
    VideoQMin 1
    VideoQMax 5
    VideoSize 352x288
    PreRoll 0
    Noaudio
    </stream>

    ##################################################################
    # Now you can define each stream which will be generated from the
    # original audio and video stream. Each format has a filename (here
    # 'test1.mpg'). FFServer will send this stream when answering a
    # request containing this filename.

    <stream>

    # coming from live feed 'feed1'
    Feed feed1.ffm

    # Format of the stream : you can choose among:
    # mpeg       : MPEG-1 multiplexed video and audio
    # mpegvideo  : only MPEG-1 video
    # mp2        : MPEG-2 audio (use AudioCodec to select layer 2 and 3 codec)
    # ogg        : Ogg format (Vorbis audio codec)
    # rm         : RealNetworks-compatible stream. Multiplexed audio and video.
    # ra         : RealNetworks-compatible stream. Audio only.
    # mpjpeg     : Multipart JPEG (works with Netscape without any plugin)
    # jpeg       : Generate a single JPEG image.
    # asf        : ASF compatible streaming (Windows Media Player format).
    # swf        : Macromedia Flash compatible stream
    # avi        : AVI format (MPEG-4 video, MPEG audio sound)
    Format mpeg

    # Bitrate for the audio stream. Codecs usually support only a few
    # different bitrates.
    AudioBitRate 32

    # Number of audio channels: 1 = mono, 2 = stereo
    AudioChannels 1

    # Sampling frequency for audio. When using low bitrates, you should
    # lower this frequency to 22050 or 11025. The supported frequencies
    # depend on the selected audio codec.
    AudioSampleRate 44100

    # Bitrate for the video stream
    VideoBitRate 64


    # Ratecontrol buffer size
    VideoBufferSize 40

    # Number of frames per second
    VideoFrameRate 3

    # Size of the video frame: WxH (default: 160x128)
    # The following abbreviations are defined: sqcif, qcif, cif, 4cif, qqvga,
    # qvga, vga, svga, xga, uxga, qxga, sxga, qsxga, hsxga, wvga, wxga, wsxga,
    # wuxga, woxga, wqsxga, wquxga, whsxga, whuxga, cga, ega, hd480, hd720,
    # hd1080
    VideoSize 160x128

    # Transmit only intra frames (useful for low bitrates, but kills frame rate).
    #VideoIntraOnly

    # If non-intra only, an intra frame is transmitted every VideoGopSize
    # frames. Video synchronization can only begin at an intra frame.
    VideoGopSize 12

    # More MPEG-4 parameters
    # VideoHighQuality
    # Video4MotionVector

    # Choose your codecs:
    #AudioCodec mp2
    #VideoCodec mpeg1video

    # Suppress audio
    #NoAudio

    # Suppress video
    #NoVideo

    #VideoQMin 3
    #VideoQMax 31

    # Set this to the number of seconds backwards in time to start. Note that
    # most players will buffer 5-10 seconds of video, and also you need to allow
    # for a keyframe to appear in the data stream.
    #Preroll 15

    # ACL:

    # You can allow ranges of addresses (or single addresses)
    #ACL ALLOW <first address> <last address>

    # You can deny ranges of addresses (or single addresses)
    #ACL DENY <first address> <last address>

    # You can repeat the ACL allow/deny as often as you like. It is on a per
    # stream basis. The first match defines the action. If there are no matches,
    # then the default is the inverse of the last ACL statement.
    #
    # Thus 'ACL allow localhost' only allows access from localhost.
    # 'ACL deny 1.0.0.0 1.255.255.255' would deny the whole of network 1 and
    # allow everybody else.

    </stream>


    ##################################################################
    # Example streams


    # Multipart JPEG

    #<stream>
    #Feed feed1.ffm
    #Format mpjpeg
    #VideoFrameRate 2
    #VideoIntraOnly
    #NoAudio
    #Strict -1
    #</stream>


    # Single JPEG

    #<stream>
    #Feed feed1.ffm
    #Format jpeg
    #VideoFrameRate 2
    #VideoIntraOnly
    ##VideoSize 352x240
    #NoAudio
    #Strict -1
    #</stream>



    # Flash

    #<stream>
    #Feed feed1.ffm
    #Format swf
    #VideoFrameRate 2
    #VideoIntraOnly
    #NoAudio
    #</stream>


    # ASF compatible

    <stream>
    Feed feed1.ffm
    Format asf
    VideoFrameRate 15
    VideoSize 352x240
    VideoBitRate 256
    VideoBufferSize 40
    VideoGopSize 30
    AudioBitRate 64
    StartSendOnKey
    </stream>


    # MP3 audio

    #<stream>
    #Feed feed1.ffm
    #Format mp2
    #AudioCodec mp3
    #AudioBitRate 64
    #AudioChannels 1
    #AudioSampleRate 44100
    #NoVideo
    #</stream>


    # Ogg Vorbis audio

    #<stream>
    #Feed feed1.ffm
    #Title "Stream title"
    #AudioBitRate 64
    #AudioChannels 2
    #AudioSampleRate 44100
    #NoVideo
    #</stream>


    # Real with audio only at 32 kbits

    #<stream>
    #Feed feed1.ffm
    #Format rm
    #AudioBitRate 32
    #NoVideo
    #NoAudio
    #</stream>


    # Real with audio and video at 64 kbits

    #<stream>
    #Feed feed1.ffm
    #Format rm
    #AudioBitRate 32
    #VideoBitRate 128
    #VideoFrameRate 25
    #VideoGopSize 25
    #NoAudio
    #</stream>


    ##################################################################
    # A stream coming from a file: you only need to set the input
    # filename and optionally a new format. Supported conversions:
    #    AVI -> ASF

    #<stream>
    #File "/usr/local/httpd/htdocs/tlive.rm"
    #NoAudio
    #</stream>

    #<stream>
    #File "/usr/local/httpd/htdocs/test.asf"
    #NoAudio
    #Author "Me"
    #Copyright "Super MegaCorp"
    #Title "Test stream from disk"
    #Comment "Test comment"
    #</stream>


    ##################################################################
    # RTSP examples
    #
    # You can access this stream with the RTSP URL:
    #   rtsp://localhost:5454/test1-rtsp.mpg
    #
    # A non-standard RTSP redirector is also created. Its URL is:
    #   http://localhost:8090/test1-rtsp.rtsp

    #<stream>
    #Format rtp
    #File "/usr/local/httpd/htdocs/test1.mpg"
    #</stream>


    ##################################################################
    # SDP/multicast examples
    #
    # If you want to send your stream in multicast, you must set the
    # multicast address with MulticastAddress. The port and the TTL can
    # also be set.
    #
    # An SDP file is automatically generated by ffserver by adding the
    # 'sdp' extension to the stream name (here
    # http://localhost:8090/test1-sdp.sdp). You should usually give this
    # file to your player to play the stream.
    #
    # The 'NoLoop' option can be used to avoid looping when the stream is
    # terminated.

    #<stream>
    #Format rtp
    #File "/usr/local/httpd/htdocs/test1.mpg"
    #MulticastAddress 224.124.0.1
    #MulticastPort 5000
    #MulticastTTL 16
    #NoLoop
    #</stream>


    ##################################################################
    # Special streams

    # Server status

    <stream>
    Format status

    # Only allow local people to get the status
    ACL allow localhost
    ACL allow 192.168.0.0 192.168.255.255

    #FaviconURL http://pond1.gladstonefamily.net:8080/favicon.ico
    </stream>


    # Redirect index.html to the appropriate site

    <Redirect index.html>
    URL http://www.ffmpeg.org/
    </Redirect>

    Could anyone please tell me whether I am missing something, or whether I need to change my ffserver.conf file? I have consulted many websites but I am still unable to fix it. Thanks in advance.
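
    For what it is worth, "Could not find input stream matching output stream #0.1" is the usual symptom of a feed that declares an audio stream (Stream #0.1: Audio: mp2 ...) while the input has none: the MPEG-TS file Trial contains only a video stream (Stream #0.0). A sketch of the kind of change, assuming video-only output is acceptable, is to add NoAudio to every <stream> fed from feed1.ffm (the swf stream above already does this):

    <stream>
    Feed feed1.ffm
    Format mpeg
    # the input 'Trial' has no audio stream, so do not request one
    NoAudio
    VideoBitRate 64
    VideoFrameRate 3
    VideoSize 160x128
    </stream>

    Alternatively, feed ffserver a source that actually contains audio.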