Media (91)

Other articles (65)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Improvements to the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-selection fields. See the two images below for a comparison.
    To use it, simply enable the Chosen plugin (General site configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying the form elements to enhance, for example select[multiple] for multiple-selection lists (...)

  • Emballe médias: what is it for?

    4 February 2011

    This plugin is designed to manage sites that publish documents of all types online.
    It creates "media" items, that is: a "media" item is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only a single document can be linked to a "media" article;

On other sites (6285)

  • Webcam stream with FFMpeg on iPhone

    6 December 2011, by Saphrosit

    I'm trying to send and show a webcam stream from a Linux server to an iPhone app. I don't know if it's the best solution, but I downloaded and installed FFMpeg on the Linux server (following, for those who want to know, this tutorial).
    FFMpeg is working fine. After a lot of wandering, I managed to send a stream to the client by launching

    ffmpeg  -s 320x240 -f video4linux2 -i /dev/video0 -f mpegts -vcodec libx264 udp://192.168.1.34:1234

    where 192.168.1.34 is the address of the client. Actually the client is a Mac, but it is supposed to be an iPhone. I know the stream is sent and received correctly (tested in different ways).
    However, I haven't managed to watch the stream directly on the iPhone.
    I thought of several possible solutions:

    • First solution: store the incoming data in an NSMutableData object. Then, when the stream ends, save it to a file and play it using an MPMoviePlayerController. Here's the code:

      [video writeToFile:@"videoStream.m4v" atomically:YES];
      NSURL *url = [NSURL fileURLWithPath:@"videoStream.m4v"];

      MPMoviePlayerController *videoController = [[MPMoviePlayerController alloc] initWithContentURL:url];

      [videoController.view setFrame:CGRectMake(100, 100, 150, 150)];

      [self.view addSubview:videoController.view];

      [videoController play];

      The problem with this solution is that nothing is played (I only see a black square), even though the video is saved correctly (I can play it directly from my disk using VLC). Besides, it's not such a great idea; it's just to get things working.

    • Second solution: use CMSampleBufferRef to store the incoming video. Many more problems come with this solution: first of all, there's no CoreMedia.framework on my system. Besides, I don't really understand what this type represents or what I should do to make it work: I mean, if I start (somehow) filling this "SampleBuffer" with the bytes I receive from the UDP connection, will it automatically call the CMSampleBufferMakeDataReadyCallback function I set during creation? If so, when? When a single frame is complete, or when the whole stream has been received?

    • Third solution: use the AVFoundation framework (not actually available on my Mac either). I couldn't tell whether it's possible to start recording from a remote source, or even from an NSMutableData, a char* or something like that. In the AVFoundation Programming Guide I didn't find any reference saying whether this is possible or not.

    I don't know which of these solutions is best for my purpose. ANY suggestion would be appreciated.

    Besides, there's another problem: I didn't use any segmenter program to send the video. Now, if I'm not mistaken, a segmenter splits the source video into smaller/shorter chunks that are easier to send. If that's right, then maybe it isn't strictly necessary to get things working (it could be added later). However, since the server runs Linux, I cannot use Apple's mediastreamsegmenter. Can someone suggest an open-source segmenter to use with FFMpeg? (See the sketch below.)


    UPDATE: I edited my question to add more information on what I have done so far and what my doubts are.
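
    As a sketch only: recent FFMpeg builds ship a built-in HLS/segment muxer, so a separate segmenter may not be needed at all (the 2011-era build used above probably does not include it). Assuming such a build, a command along these lines writes MPEG-TS segments plus an .m3u8 playlist that an iPhone can fetch over HTTP and play natively with MPMoviePlayerController (the /var/www/ output path is only an illustrative web root):

    # capture from the webcam, encode with x264 and segment for HTTP Live Streaming
    ffmpeg -f video4linux2 -s 320x240 -i /dev/video0 -vcodec libx264 \
           -f hls -hls_time 10 -hls_list_size 0 /var/www/stream.m3u8

    The playlist and the .ts segments then just need to be served by any web server on the Linux box.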

  • ffmpeg mp3 to mp4 with image compatible with iphone [duplicate]

    3 May 2016, by neoDev

    I need to convert an mp3 to an mp4 with an image, using ffmpeg from my OS X command line.

    I already installed the latest version of ffmpeg (3.0.2) and tried the following command, as described here:

    ./ffmpeg -loop 1 -i test.jpg -i test.mp3 -c:a copy -c:v libx264 -shortest test.mp4

    but the resulting mp4 file is not compatible with my iPhone 5.

    Now this is what I am trying:

    ./ffmpeg -loop 1 -i test.png -i test.mp3 -profile:v baseline -level 3.0 test.mp4

    but I get the error:

    Error while opening encoder for output stream #0:0 - maybe incorrect
    parameters such as bit_rate, rate, width or height
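
    That error usually means libx264 refused the encode parameters, most often because the source image has odd dimensions or the PNG decodes to a pixel format the baseline profile cannot use. As a hedged sketch (bitrate and flags are only examples), forcing yuv420p, rounding the dimensions down to even numbers and encoding the audio as AAC typically produces an iPhone-playable file:

    # yuv420p and even dimensions are what iOS-compatible baseline H.264 expects
    ./ffmpeg -loop 1 -i test.png -i test.mp3 \
             -c:v libx264 -profile:v baseline -level 3.0 -pix_fmt yuv420p \
             -vf "scale=trunc(iw/2)*2:trunc(ih/2)*2" \
             -c:a aac -b:a 192k -shortest -movflags +faststart test.mp4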

  • ffmpeg link errors when building on iPhone 4.3 SDK

    12 September 2011, by YuzaKen

    After a rather trying few days, I finally got ffmpeg to compile under Xcode 4 with SDK 4.3. The issue now is a series of (39) link errors. They fall into at least two cases: assembly language routines and static arrays defined in header files. My belief is that it is generating C symbol names for the assembly routines while the .c files containing the references to those routines are generating different names (munging).

     Undefined symbols for architecture armv7:

     "_ff_vector_fmul_vfp", referenced from:
         _ff_dsputil_init_vfp in libavcodec.a(dsputil_init_vfp.o)
     "_main", referenced from:
         start in crt1.3.1.o
     "_av_solve_lls", referenced from:
         _ff_lpc_calc_coefs in libavcodec.a(lpc.o)
     "_ff_inv_aanscales", referenced from:
         _dct_quantize_trellis_c in libavcodec.a(mpegvideo_enc.o)
         _decode_frame in libavcodec.a(eamad.o)
         _tgq_decode_frame in libavcodec.a(eatgq.o)
         _tqi_decode_frame in libavcodec.a(eatqi.o)
     "_ff_add_pixels_clamped_armv6", referenced from:
         _ff_dsputil_init_armv6 in libavcodec.a(dsputil_init_armv6.o)
     "_ff_cga_palette", referenced from:
         _tmv_decode_frame in libavcodec.a(tmv.o)
     "_ff_svq1_inter_multistage_vlc", referenced from:
         _encode_block in libavcodec.a(svq1enc.o)
         _svq1_decode_init in libavcodec.a(svq1dec.o)
     "_ff_simple_idct_armv6", referenced from:
         _ff_dsputil_init_armv6 in libavcodec.a(dsputil_init_armv6.o)
     "_BZ2_bzDecompressInit", referenced from:
         _matroska_decode_buffer in libavformat.a(matroskadec.o)
     "_ff_put_pixels8_y2_arm", referenced from:
         _ff_put_pixels16_y2_arm in libavcodec.a(dsputil_init_arm.o)
         _dsputil_init_arm in libavcodec.a(dsputil_init_arm.o)
     "_ff_simple_idct_add_armv6", referenced from:

    ...and so on.

    Anyone with experience with ffmpeg on iPhone? Successfully?
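
    As a sketch only: undefined ARM assembly symbols such as _ff_vector_fmul_vfp usually mean the .S sources were never assembled for armv7 (Apple's assembler needs the gas-preprocessor.pl wrapper to digest ffmpeg's GNU-style assembly), _BZ2_bzDecompressInit means libbz2 was not linked (add -lbz2 to the linker flags, or configure ffmpeg with --disable-bzlib), and the missing _main just means the link is producing an executable without any main function, i.e. the libraries are not being linked into the app target that provides one. A configure invocation of the kind commonly used with the 4.3 SDK might look like the following; every path is illustrative, not taken from the question:

    # cross-compile ffmpeg for armv7 against the iOS 4.3 SDK (paths are examples only)
    ./configure --enable-cross-compile --target-os=darwin --arch=arm --cpu=cortex-a8 \
        --cc=/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc \
        --as='gas-preprocessor.pl /Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc' \
        --sysroot=/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS4.3.sdk \
        --extra-cflags='-arch armv7' --extra-ldflags='-arch armv7' \
        --disable-bzlib

    The resulting libav*.a archives are then added to the Xcode target, which supplies its own main.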