Advanced search

Media (0)


No media matching your criteria is available on the site.

Other articles (62)

  • The plugin: Podcasts.

    14 July 2010, by

    The problem of podcasting is, once again, a problem that reveals the state of standardization of data transport on the Internet.
    Two interesting formats exist: the one developed by Apple, strongly geared toward the use of iTunes, whose SPEC is here; and the "Media RSS Module" format, which is more "free" and is notably backed by Yahoo and the Miro software.
    File types supported in the feeds
    Apple's format only allows the following formats in its feeds: .mp3 audio/mpeg .m4a audio/x-m4a .mp4 (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MédiaSpip is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.

  • Retrieving information from the master site when installing an instance

    26 November 2010, by

    Purpose
    On the main site, a mutualisation instance is defined by several things: the data in the spip_mutus table; its logo; its main author (the id_admin in the spip_mutus table, corresponding to an id_auteur in the spip_auteurs table), who will be the only one able to definitively create the mutualisation instance.
    It can therefore make perfect sense to retrieve some of this information in order to complete the installation of an instance, for example to retrieve the (...)

On other sites (6298)

  • lavf/mp3dec: avoid printing useless message in default log level

    8 March 2016, by Moritz Barsnick
    lavf/mp3dec: avoid printing useless message in default log level
    

    "Skipping 0 bytes of junk" is useless to the user, and essentially
    indicates a NOP. At 0 bytes, this message is now pushed back to
    the verbose log level.

    Signed-off-by: Moritz Barsnick <barsnick@gmx.net>

    • [DH] libavformat/mp3dec.c
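
    A minimal sketch of the pattern this change describes, using FFmpeg's public av_log() API (illustrative only; the actual change lives in libavformat/mp3dec.c and may differ in detail):

    // Illustrative only: report skipped junk at the info level, but demote the
    // pointless "Skipping 0 bytes" case to the verbose level.
    extern "C" {
    #include <libavutil/log.h>   // av_log(), AV_LOG_INFO, AV_LOG_VERBOSE
    }

    static void report_junk(void *log_ctx, long long skipped, long long pos)
    {
        const int level = (skipped > 0) ? AV_LOG_INFO : AV_LOG_VERBOSE;
        av_log(log_ctx, level, "Skipping %lld bytes of junk at %lld.\n", skipped, pos);
    }

    int main()
    {
        report_junk(NULL, 0, 4096);    // only shown with a verbose log level
        report_junk(NULL, 512, 4096);  // shown at the default (info) level
        return 0;
    }
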
  • OpenCV VideoWriter using ffmpeg with "Could not open codec 'libx264'" Error

    19 April, by user2262504

    I am new to OpenCV, and I want to write Mat images into a video using VideoWriter on Ubuntu 12.04. But when constructing the VideoWriter, errors come up.


    It seems that OpenCV invokes the ffmpeg API with default parameters, and ffmpeg in turn invokes x264 with its default parameters. These settings are broken for libx264, hence the "Could not open codec 'libx264'" error.


    Does anyone have ideas on how to solve this problem?


    More specifically:


    1. Does anyone know where and how OpenCV invokes the ffmpeg API?

    2. How can the ffmpeg default settings be changed in code, hopefully in a way that can easily be embedded into OpenCV? (See the sketch after this list.)

    3. Will changes to the defaults in ffmpeg be carried through to libx264?

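    For question 2, a rough, hedged sketch of how an x264 preset can be supplied directly through the ffmpeg (libavcodec) API, outside of OpenCV; whether OpenCV 2.4 exposes such a hook is exactly what is being asked:

    // Sketch only: open libx264 through libavcodec with an explicit preset,
    // instead of relying on whatever defaults the OpenCV wrapper passes down.
    extern "C" {
    #include <libavcodec/avcodec.h>
    #include <libavutil/mem.h>
    #include <libavutil/opt.h>
    }

    int main()
    {
        avcodec_register_all();

        AVCodec *codec = avcodec_find_encoder_by_name("libx264");
        if (!codec)
            return 1;

        AVCodecContext *ctx = avcodec_alloc_context3(codec);
        ctx->width         = 320;
        ctx->height        = 240;
        ctx->time_base.num = 1;
        ctx->time_base.den = 25;
        ctx->pix_fmt       = AV_PIX_FMT_YUV420P;

        // The point of the sketch: hand x264 a preset via its private options.
        av_opt_set(ctx->priv_data, "preset", "medium", 0);

        int ret = avcodec_open2(ctx, codec, NULL);
        avcodec_close(ctx);
        av_free(ctx);
        return ret < 0 ? 1 : 0;
    }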

    Errors:


    1. Using CV_FOURCC('H', '2', '6', '4')

    [libx264 @ 0x255de40] broken ffmpeg default settings detected
    [libx264 @ 0x255de40] use an encoding preset (e.g. -vpre medium)
    [libx264 @ 0x255de40] preset usage: -vpre <speed> -vpre <profile>
    [libx264 @ 0x255de40] speed presets are listed in x264 --help
    [libx264 @ 0x255de40] profile is optional; x264 defaults to high
    Could not open codec 'libx264': Unspecified error

    2. Using FOURCC = -1 to invoke user customized codec

    OpenCV Error: Unsupported format or combination of formats (Gstreamer Opencv
    backend doesn't support this codec acutally.) in CvVideoWriter_GStreamer::open,
    file /home/XXX/Downloads/opencv-2.4.8/modules/highgui/src/cap_gstreamer.cpp,
    line 505 terminate called after throwing an instance of 'cv::Exception'
    what():  /home/XXX/Downloads/opencv-2.4.8/modules/highgui/src/cap_gstreamer.cpp:
    505: error: (-210) Gstreamer Opencv backend doesn't support this codec acutally.
    in function CvVideoWriter_GStreamer::open


    Code:


    int main(int argc, char *argv[])
    {
        VideoWriter outputVideo;
        bool fourcc_on = true; // switch on / off different error
        if (fourcc_on)
            outputVideo.open("outVideo.avi", CV_FOURCC('H', '2', '6', '4'), 25, Size(100, 100), true);
        else
            outputVideo.open("outVideo.avi", -1, 25, Size(100, 100), true);

        if (!outputVideo.isOpened())
        {
            cout << "Could not open the output video for write" << endl;
            return -1;
        }
        return 0;
    }


    OpenCV Configuration:


    -- Detected version of GNU GCC: 46 (406)
    -- Found OpenEXR: /usr/lib/libIlmImf.so
    -- Looking for linux/videodev.h
    -- Looking for linux/videodev.h - not found
    -- Looking for linux/videodev2.h
    -- Looking for linux/videodev2.h - found
    -- Looking for sys/videoio.h
    -- Looking for sys/videoio.h - not found
    -- Looking for libavformat/avformat.h
    -- Looking for libavformat/avformat.h - found
    -- Looking for ffmpeg/avformat.h
    -- Looking for ffmpeg/avformat.h - not found
    -- Could NOT find JNI (missing:  JAVA_INCLUDE_PATH JAVA_INCLUDE_PATH2 JAVA_AWT_INCLUDE_PATH)
    --
    -- General configuration for OpenCV 2.4.8 =====================================
    --   Version control:               unknown
    --
    --   Platform:
    --     Host:                        Linux 3.8.0-38-generic x86_64
    --     CMake:                       2.8.7
    --     CMake generator:             Unix Makefiles
    --     CMake build tool:            /usr/bin/make
    --     Configuration:               RELEASE
    --
    --   C/C++:
    --     Built as dynamic libs?:      YES
    --     C++ Compiler:                /usr/bin/c++  (ver 4.6)
    --     C++ flags (Release):         -fsigned-char -W -Wall -Werror=return-type -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Wmissing-declarations -Wundef -Winit-self -Wpointer-arith -Wshadow -Wsign-promo -fdiagnostics-show-option -Wno-long-long -pthread -fomit-frame-pointer -msse -msse2 -msse3 -ffunction-sections -O3 -DNDEBUG  -DNDEBUG
    --     C++ flags (Debug):           -fsigned-char -W -Wall -Werror=return-type -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Wmissing-declarations -Wundef -Winit-self -Wpointer-arith -Wshadow -Wsign-promo -fdiagnostics-show-option -Wno-long-long -pthread -fomit-frame-pointer -msse -msse2 -msse3 -ffunction-sections -g  -O0 -DDEBUG -D_DEBUG
    --     C Compiler:                  /usr/bin/gcc
    --     C flags (Release):           -fsigned-char -W -Wall -Werror=return-type -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Wmissing-declarations -Wmissing-prototypes -Wstrict-prototypes -Wundef -Winit-self -Wpointer-arith -Wshadow -fdiagnostics-show-option -Wno-long-long -pthread -fomit-frame-pointer -msse -msse2 -msse3 -ffunction-sections -O3 -DNDEBUG  -DNDEBUG
    --     C flags (Debug):             -fsigned-char -W -Wall -Werror=return-type -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Wmissing-declarations -Wmissing-prototypes -Wstrict-prototypes -Wundef -Winit-self -Wpointer-arith -Wshadow -fdiagnostics-show-option -Wno-long-long -pthread -fomit-frame-pointer -msse -msse2 -msse3 -ffunction-sections -g  -O0 -DDEBUG -D_DEBUG
    --     Linker flags (Release):
    --     Linker flags (Debug):
    --     Precompiled headers:         YES
    --
    --   OpenCV modules:
    --     To be built:                 core flann imgproc highgui features2d calib3d ml video legacy objdetect photo gpu ocl nonfree contrib python stitching superres ts videostab
    --     Disabled:                    world
    --     Disabled by dependency:      -
    --     Unavailable:                 androidcamera dynamicuda java
    --
    --   GUI:
    --     QT:                          NO
    --     GTK+ 2.x:                    YES (ver 2.24.10)
    --     GThread :                    YES (ver 2.32.4)
    --     GtkGlExt:                    NO
    --     OpenGL support:              NO
    --
    --   Media I/O:
    --     ZLib:                        /usr/lib/x86_64-linux-gnu/libz.so (ver 1.2.3.4)
    --     JPEG:                        /usr/lib/x86_64-linux-gnu/libjpeg.so (ver )
    --     PNG:                         /usr/lib/x86_64-linux-gnu/libpng.so (ver 1.2.46)
    --     TIFF:                        /usr/lib/x86_64-linux-gnu/libtiff.so (ver 42 - 3.9.5)
    --     JPEG 2000:                   /usr/lib/x86_64-linux-gnu/libjasper.so (ver 1.900.1)
    --     OpenEXR:                     /usr/lib/libImath.so /usr/lib/libIlmImf.so /usr/lib/libIex.so /usr/lib/libHalf.so /usr/lib/libIlmThread.so (ver 1.6.1)
    --
    --   Video I/O:
    --     DC1394 1.x:                  NO
    --     DC1394 2.x:                  YES (ver 2.2.0)
    --     FFMPEG:                      YES
    --       codec:                     YES (ver 55.58.105)
    --       format:                    YES (ver 55.37.101)
    --       util:                      YES (ver 52.78.100)
    --       swscale:                   YES (ver 2.6.100)
    --       gentoo-style:              YES
    --     GStreamer:
    --       base:                      YES (ver 0.10.36)
    --       app:                       YES (ver 0.10.36)
    --       video:                     YES (ver 0.10.36)
    --     OpenNI:                      NO
    --     OpenNI PrimeSensor Modules:  NO
    --     PvAPI:                       NO
    --     GigEVisionSDK:               NO
    --     UniCap:                      NO
    --     UniCap ucil:                 NO
    --     V4L/V4L2:                    Using libv4l (ver 1.0.1)
    --     XIMEA:                       NO
    --     Xine:                        NO
    --
    --   Other third-party libraries:
    --     Use IPP:                     NO
    --     Use Eigen:                   NO
    --     Use TBB:                     NO
    --     Use OpenMP:                  NO
    --     Use GCD                      NO
    --     Use Concurrency              NO
    --     Use C=:                      NO
    --     Use Cuda:                    NO
    --     Use OpenCL:                  YES
    --
    --   OpenCL:
    --     Version:                     dynamic
    --     Include path:                /home/shixudongleo/Downloads/opencv-2.4.8/3rdparty/include/opencl/1.2
    --     Use AMD FFT:                 NO
    --     Use AMD BLAS:                NO
    --
    --   Python:
    --     Interpreter:                 /usr/bin/python (ver 2.7.3)
    --     Libraries:                   /usr/lib/libpython2.7.so
    --     numpy:                       /usr/lib/python2.7/dist-packages/numpy/core/include (ver 1.6.1)
    --     packages path:               lib/python2.7/dist-packages
    --
    --   Java:
    --     ant:                         NO
    --     JNI:                         NO
    --     Java tests:                  NO
    --
    --   Documentation:
    --     Build Documentation:         NO
    --     Sphinx:                      NO
    --     PdfLaTeX compiler:           /usr/bin/pdflatex
    --
    --   Tests and samples:
    --     Tests:                       YES
    --     Performance tests:           YES
    --     C/C++ Examples:              NO
    --
    --   Install path:                  /usr/local
    --
    --   cvconfig.h is in:              /home/shixudongleo/Downloads/opencv-2.4.8/build
    -- -----------------------------------------------------------------
    --
    -- Configuring done
    -- Generating done
    -- Build files have been written to: /home/XXX/Downloads/opencv-2.4.8/build


    FFMPEG


    ffmpeg is enabled to support OpenCV, and libx264 is enabled when compiling ffmpeg. Using the ffmpeg command line, libx264 runs normally.


    $ ffmpeg -i test.avi -vcodec libx264 test.mp4
    ffmpeg -i test.avi -vcodec libx264 test.mp4 > ~/Downloads/ffmpeg_log.txt
    ffmpeg version 2.2.git Copyright (c) 2000-2014 the FFmpeg developers
      built on Apr 24 2014 16:39:51 with gcc 4.6 (Ubuntu/Linaro 4.6.3-1ubuntu5)
      configuration: --enable-gpl --enable-libfaac --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libtheora --enable-libvorbis --enable-libx264 --enable-libxvid --enable-nonfree --enable-postproc --enable-version3 --enable-x11grab --enable-shared --enable-pic
      libavutil      52. 78.100 / 52. 78.100
      libavcodec     55. 58.105 / 55. 58.105
      libavformat    55. 37.101 / 55. 37.101
      libavdevice    55. 13.100 / 55. 13.100
      libavfilter     4.  4.100 /  4.  4.100
      libswscale      2.  6.100 /  2.  6.100
      libswresample   0. 18.100 /  0. 18.100
      libpostproc    52.  3.100 / 52.  3.100
    Input #0, avi, from 'test.avi':
      Duration: 00:00:03.73, start: 0.000000, bitrate: 1757 kb/s
        Stream #0:0: Video: msvideo1 (CRAM / 0x4D415243), rgb555le, 320x240, 1781 kb/s, 15 tbr, 15 tbn, 15 tbc
        Metadata:
          title           : julius.avi Video #1
    File 'test.mp4' already exists. Overwrite ? [y/N] y
    No pixel format specified, yuv444p for H.264 encoding chosen.
    Use -pix_fmt yuv420p for compatibility with outdated media players.
    [libx264 @ 0x25d08e0] using cpu capabilities: none!
    [libx264 @ 0x25d08e0] profile High 4:4:4 Predictive, level 1.2, 4:4:4 8-bit
    [libx264 @ 0x25d08e0] 264 - core 142 - H.264/MPEG-4 AVC codec - Copyleft 2003-2014 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=4 threads=12 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=15 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
    Output #0, mp4, to 'test.mp4':
      Metadata:
        encoder         : Lavf55.37.101
        Stream #0:0: Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv444p, 320x240, q=-1--1, 15360 tbn, 15 tbc
        Metadata:
          title           : julius.avi Video #1
    Stream mapping:
      Stream #0:0 -> #0:0 (msvideo1 -> libx264)
    Press [q] to stop, [?] for help
    frame=   56 fps=0.0 q=-1.0 Lsize=     321kB time=00:00:03.60 bitrate= 731.0kbits/s
    video:320kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.409949%
    [libx264 @ 0x25d08e0] frame I:3     Avg QP:15.36  size:  7975
    [libx264 @ 0x25d08e0] frame P:38    Avg QP:26.05  size:  6230
    [libx264 @ 0x25d08e0] frame B:15    Avg QP:28.25  size:  4418
    [libx264 @ 0x25d08e0] consecutive B-frames: 46.4% 53.6%  0.0%  0.0%
    [libx264 @ 0x25d08e0] mb I  I16..4:  1.4% 72.8% 25.8%
    [libx264 @ 0x25d08e0] mb P  I16..4:  1.6%  5.7% 15.1%  P16..4:  7.6%  6.3%  7.4%  0.0%  0.0%    skip:56.3%
    [libx264 @ 0x25d08e0] mb B  I16..4:  0.2%  1.0%  2.0%  B16..8: 13.3%  7.8%  8.7%  direct: 8.3%  skip:58.8%  L0:34.9% L1:36.6% BI:28.5%
    [libx264 @ 0x25d08e0] 8x8 transform intra:37.7% inter:2.3%
    [libx264 @ 0x25d08e0] coded y,u,v intra: 52.1% 42.1% 30.1% inter: 19.6% 9.2% 5.2%
    [libx264 @ 0x25d08e0] i16 v,h,dc,p: 56% 17% 24%  2%
    [libx264 @ 0x25d08e0] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 10% 16% 68%  1%  1%  1%  1%  1%  1%
    [libx264 @ 0x25d08e0] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 19% 18% 28%  5%  6%  5%  7%  5%  6%
    [libx264 @ 0x25d08e0] Weighted P-Frames: Y:31.6% UV:21.1%
    [libx264 @ 0x25d08e0] ref P L0: 70.5%  9.0% 12.1%  6.5%  2.0%
    [libx264 @ 0x25d08e0] ref B L0: 91.3%  8.7%
    [libx264 @ 0x25d08e0] kb/s:700.56


  • Live555: X264 Stream Live source based on "testOnDemandRTSPServer"

    26 October 2017, by user2660369

    I am trying to create an RTSP server that streams the OpenGL output of my program. I had a look at How to write a Live555 FramedSource to allow me to stream H.264 live, but I need the stream to be unicast. So I had a look at testOnDemandRTSPServer. Using the same code fails. To my understanding, I need to provide memory in which I store my H.264 frames so the OnDemandServer can read them on demand.
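
    The FramedSource code below relies on an m_queue with push(), empty() and wait_and_pop(); the original post does not show it, but a minimal thread-safe NAL queue along those lines (an assumption, not the poster's actual class) could look like this:

    #include <condition_variable>
    #include <mutex>
    #include <queue>

    // Assumed helper, not from the original post: a minimal blocking queue in the
    // spirit of the m_queue used by H264FramedSource further down.
    template <typename T>
    class ConcurrentQueue {
    public:
        void push(const T& item) {
            {
                std::lock_guard<std::mutex> lock(m_mutex);
                m_data.push(item);
            }
            m_cond.notify_one();               // wake a consumer blocked in wait_and_pop()
        }

        void wait_and_pop(T& item) {
            std::unique_lock<std::mutex> lock(m_mutex);
            m_cond.wait(lock, [this] { return !m_data.empty(); });
            item = m_data.front();
            m_data.pop();
        }

        bool empty() const {
            std::lock_guard<std::mutex> lock(m_mutex);
            return m_data.empty();
        }

    private:
        mutable std::mutex m_mutex;
        std::condition_variable m_cond;
        std::queue<T> m_data;
    };

    Note that storing an x264_nal_t by value only keeps its p_payload pointer, which x264 may reuse on a later call, so a production implementation would typically copy the payload bytes as well.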

    H264VideoStreamServerMediaSubsession.cpp

    H264VideoStreamServerMediaSubsession*
    H264VideoStreamServerMediaSubsession::createNew(UsageEnvironment& env,
                             Boolean reuseFirstSource) {
     return new H264VideoStreamServerMediaSubsession(env, reuseFirstSource);
    }

    H264VideoStreamServerMediaSubsession::H264VideoStreamServerMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource)
     : OnDemandServerMediaSubsession(env, reuseFirstSource), fAuxSDPLine(NULL), fDoneFlag(0), fDummyRTPSink(NULL) {
    }

    H264VideoStreamServerMediaSubsession::~H264VideoStreamServerMediaSubsession() {
     delete[] fAuxSDPLine;
    }

    static void afterPlayingDummy(void* clientData) {
     H264VideoStreamServerMediaSubsession* subsess = (H264VideoStreamServerMediaSubsession*)clientData;
     subsess->afterPlayingDummy1();
    }

    void H264VideoStreamServerMediaSubsession::afterPlayingDummy1() {
     // Unschedule any pending 'checking' task:
     envir().taskScheduler().unscheduleDelayedTask(nextTask());
     // Signal the event loop that we're done:
     setDoneFlag();
    }

    static void checkForAuxSDPLine(void* clientData) {
     H264VideoStreamServerMediaSubsession* subsess = (H264VideoStreamServerMediaSubsession*)clientData;
     subsess->checkForAuxSDPLine1();
    }

    void H264VideoStreamServerMediaSubsession::checkForAuxSDPLine1() {
     char const* dasl;

     if (fAuxSDPLine != NULL) {
       // Signal the event loop that we're done:
       setDoneFlag();
     } else if (fDummyRTPSink != NULL && (dasl = fDummyRTPSink->auxSDPLine()) != NULL) {
       fAuxSDPLine = strDup(dasl);
       fDummyRTPSink = NULL;

       // Signal the event loop that we're done:
       setDoneFlag();
     } else {
       // try again after a brief delay:
       int uSecsToDelay = 100000; // 100 ms
       nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecsToDelay,
                     (TaskFunc*)checkForAuxSDPLine, this);
     }
    }

    char const* H264VideoStreamServerMediaSubsession::getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource) {
     if (fAuxSDPLine != NULL) return fAuxSDPLine; // it's already been set up (for a previous client)

     if (fDummyRTPSink == NULL) { // we're not already setting it up for another, concurrent stream
       // Note: For H264 video files, the 'config' information ("profile-level-id" and "sprop-parameter-sets") isn't known
       // until we start reading the file.  This means that "rtpSink"s "auxSDPLine()" will be NULL initially,
       // and we need to start reading data from our file until this changes.
       fDummyRTPSink = rtpSink;

       // Start reading the file:
       fDummyRTPSink->startPlaying(*inputSource, afterPlayingDummy, this);

       // Check whether the sink's 'auxSDPLine()' is ready:
       checkForAuxSDPLine(this);
     }

     envir().taskScheduler().doEventLoop(&fDoneFlag);

     return fAuxSDPLine;
    }

    FramedSource* H264VideoStreamServerMediaSubsession::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
     estBitrate = 500; // kb
     megamol::remotecontrol::View3D_MRC *parent = (megamol::remotecontrol::View3D_MRC*)this->parent;
     return H264VideoStreamFramer::createNew(envir(), parent->h264FramedSource);
    }

    RTPSink* H264VideoStreamServerMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* /*inputSource*/) {
     return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
    }

    FramedSource.cpp

    H264FramedSource* H264FramedSource::createNew(UsageEnvironment& env,
                                             unsigned preferredFrameSize,
                                             unsigned playTimePerFrame)
    {
       return new H264FramedSource(env, preferredFrameSize, playTimePerFrame);
    }

    H264FramedSource::H264FramedSource(UsageEnvironment& env,
                                  unsigned preferredFrameSize,
                                  unsigned playTimePerFrame)
       : FramedSource(env),
       fPreferredFrameSize(fMaxSize),
       fPlayTimePerFrame(playTimePerFrame),
       fLastPlayTime(0),
       fCurIndex(0)
    {

       x264_param_default_preset(&param, "veryfast", "zerolatency");
       param.i_threads = 1;
       param.i_width = 1024;
       param.i_height = 768;
       param.i_fps_num = 30;
       param.i_fps_den = 1;
       // Intra refres:
       param.i_keyint_max = 60;
       param.b_intra_refresh = 1;
       //Rate control:
       param.rc.i_rc_method = X264_RC_CRF;
       param.rc.f_rf_constant = 25;
       param.rc.f_rf_constant_max = 35;
       param.i_sps_id = 7;
       //For streaming:
       param.b_repeat_headers = 1;
       param.b_annexb = 1;
       x264_param_apply_profile(&param, "baseline");

       param.i_log_level = X264_LOG_ERROR;

       encoder = x264_encoder_open(&param);
       pic_in.i_type            = X264_TYPE_AUTO;
       pic_in.i_qpplus1         = 0;
       pic_in.img.i_csp         = X264_CSP_I420;
       pic_in.img.i_plane       = 3;


       x264_picture_alloc(&pic_in, X264_CSP_I420, 1024, 768);

       convertCtx = sws_getContext(1024, 768, PIX_FMT_RGBA, 1024, 768, PIX_FMT_YUV420P, SWS_FAST_BILINEAR, NULL, NULL, NULL);
       eventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);
    }

    H264FramedSource::~H264FramedSource()
    {
       envir().taskScheduler().deleteEventTrigger(eventTriggerId);
       eventTriggerId = 0;
    }

    void H264FramedSource::AddToBuffer(uint8_t* buf, int surfaceSizeInBytes)
    {
       uint8_t* surfaceData = (new uint8_t[surfaceSizeInBytes]);

       memcpy(surfaceData, buf, surfaceSizeInBytes);

       int srcstride = 1024*4;
       sws_scale(convertCtx, &surfaceData, &srcstride, 0, 768, pic_in.img.plane, pic_in.img.i_stride);
       x264_nal_t* nals = NULL;
       int i_nals = 0;
       int frame_size = -1;


       frame_size = x264_encoder_encode(encoder, &nals, &i_nals, &pic_in, &pic_out);

       static bool finished = false;

       if (frame_size >= 0)
       {
       static bool alreadydone = false;
       if(!alreadydone)
       {

           x264_encoder_headers(encoder, &nals, &i_nals);
           alreadydone = true;
       }
       for(int i = 0; i < i_nals; ++i)
       {
           m_queue.push(nals[i]);
       }
       }
       delete [] surfaceData;
       surfaceData = nullptr;

       envir().taskScheduler().triggerEvent(eventTriggerId, this);
    }

    void H264FramedSource::doGetNextFrame()
    {
       deliverFrame();
    }

    void H264FramedSource::deliverFrame0(void* clientData)
    {
       ((H264FramedSource*)clientData)->deliverFrame();
    }

    void H264FramedSource::deliverFrame()
    {
       x264_nal_t nalToDeliver;

       if (fPlayTimePerFrame > 0 && fPreferredFrameSize > 0) {
       if (fPresentationTime.tv_sec == 0 &amp;&amp; fPresentationTime.tv_usec == 0) {
           // This is the first frame, so use the current time:
           gettimeofday(&fPresentationTime, NULL);
       } else {
           // Increment by the play time of the previous data:
           unsigned uSeconds   = fPresentationTime.tv_usec + fLastPlayTime;
           fPresentationTime.tv_sec += uSeconds/1000000;
           fPresentationTime.tv_usec = uSeconds%1000000;
       }

       // Remember the play time of this data:
       fLastPlayTime = (fPlayTimePerFrame*fFrameSize)/fPreferredFrameSize;
       fDurationInMicroseconds = fLastPlayTime;
       } else {
       // We don't know a specific play time duration for this data,
       // so just record the current time as being the 'presentation time':
       gettimeofday(&fPresentationTime, NULL);
       }

       if(!m_queue.empty())
       {
       m_queue.wait_and_pop(nalToDeliver);

       uint8_t* newFrameDataStart = (uint8_t*)0xD15EA5E;

       newFrameDataStart = (uint8_t*)(nalToDeliver.p_payload);
       unsigned newFrameSize = nalToDeliver.i_payload;

       // Deliver the data here:
       if (newFrameSize > fMaxSize) {
           fFrameSize = fMaxSize;
           fNumTruncatedBytes = newFrameSize - fMaxSize;
       }
       else {
           fFrameSize = newFrameSize;
       }

       memcpy(fTo, nalToDeliver.p_payload, nalToDeliver.i_payload);

       FramedSource::afterGetting(this);
       }
    }

    Relevant part of the RTSP server thread

     RTSPServer* rtspServer = RTSPServer::createNew(*(parent->env), 8554, NULL);
     if (rtspServer == NULL) {
       *(parent->env) << "Failed to create RTSP server: " << (parent->env)->getResultMsg() << "\n";
       exit(1);
     }
     char const* streamName = "Stream";
     parent->h264FramedSource = H264FramedSource::createNew(*(parent->env), 0, 0);
     H264VideoStreamServerMediaSubsession *h264VideoStreamServerMediaSubsession = H264VideoStreamServerMediaSubsession::createNew(*(parent->env), true);
     h264VideoStreamServerMediaSubsession->parent = parent;
     sms->addSubsession(h264VideoStreamServerMediaSubsession);
     rtspServer->addServerMediaSession(sms);

     parent->env->taskScheduler().doEventLoop(); // does not return
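
     The snippet uses sms without showing where it comes from; in a typical Live555 setup it is a ServerMediaSession created just before the subsession is added, roughly like this (assumed, not part of the original post):

     // Assumed, not shown in the original post: create the session that
     // h264VideoStreamServerMediaSubsession is added to.
     ServerMediaSession* sms = ServerMediaSession::createNew(*(parent->env), streamName,
                                                             streamName, "Session streamed by the OpenGL renderer");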

    Once a connection exists, the render loop calls

    h264FramedSource->AddToBuffer(videoData, 1024*768*4);