
Other articles (25)

  • What is an editorial?

    21 June 2013

    Write your point of view in an article. It will be filed in a section set aside for this purpose.
    An editorial is a text-only article. Its purpose is to gather points of view in a dedicated section. Only one editorial is featured on the home page; to read previous ones, consult the dedicated section.
    You can customize the editorial creation form.
    Editorial creation form: in the case of a document of the editorial type, the (...)

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and to the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible, and development is based on expanding the (...)

  • Customizing by adding your logo, banner or background image

    5 September 2013

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

On other sites (3991)

  • FFMPEG : Get the Scene Change Detection value for all frame

    14 November 2022, by celacanto

    I'm trying to measure how "fast" a movie is (more action on screen and quick scene changes). I don't want just a single value for the whole movie, but values along its length, to see how the action varies as it goes. After normalizing the frame rate of the movies (10 fps), my idea is to compare each frame with the previous one. I'm not only interested in whether the scene has changed but also, if there was no cut, in how much movement there is: not only people/object movement but also camera movement. In summary, the pace (I think that's the term) of the scenes.
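
    For context, a minimal sketch of the frame-rate normalization step mentioned above (the file names are placeholders, and dropping audio with -an is only an assumption to keep the intermediate file small):

    # Re-encode at a constant 10 fps; "movie.mp4" and "movie_10fps.mp4" are placeholder names
    ffmpeg -i movie.mp4 -vf fps=10 -an movie_10fps.mp4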

    



    My idea was to use the scene function from ffmpeg as a metric. But looking at the documentation and examples online, it seems I can only use the Scene Change Detection value as a threshold for returning frame information; I can't get ffmpeg to return the value itself. Is that right? Is there any way I can make it return the value?
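
    For what it is worth, here is a sketch of one way to dump a per-frame scene score, assuming a build where the select filter exposes the lavfi.scene_score frame metadata ("movie_10fps.mp4" is a placeholder, and on newer ffprobe versions the field is pts_time rather than pkt_pts_time):

    # select='gte(scene,0)' keeps every frame while still running scene-change analysis,
    # so each frame carries a lavfi.scene_score tag that ffprobe can print as CSV
    ffprobe -f lavfi -i "movie=movie_10fps.mp4,select='gte(scene,0)'" \
            -show_entries frame=pkt_pts_time:frame_tags=lavfi.scene_score \
            -of csv=p=0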

    


  • use AVMutableVideoComposition rotate video after ffmpge can't get rotate info

    29 January 2018, by ladeng

    Before the video is rotated, I can use an FFmpeg command to get the rotation information, like this:

    ffprobe -v quiet -print_format json -show_format -show_streams recordVideo.mp4

    or

    ffmpeg -i recordVideo.mp4
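
    A narrower query, assuming the rotation is stored as a classic rotate stream tag (newer FFmpeg builds may report it as display-matrix side data instead), might look like this:

    # Print only the rotate tag of the first video stream, if present
    ffprobe -v quiet -select_streams v:0 -show_entries stream_tags=rotate \
            -of default=noprint_wrappers=1:nokey=1 recordVideo.mp4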

    When I use AVMutableVideoComposition to rotate the video, the rotated video loses the rotation information. Rotation sample: RoatetVideoSimpleCode. Code below:

    -(void)performWithAsset:(AVAsset*)asset complateBlock:(void(^)(void))complateBlock{
    AVMutableComposition *mutableComposition;
    AVMutableVideoComposition *mutableVideoComposition;
    cacheRotateVideoURL = [[NSURL alloc] initFileURLWithPath:[NSString pathWithComponents:@[NSTemporaryDirectory(), kCacheCertVideoRotate]]];

    AVMutableVideoCompositionInstruction *instruction = nil;
    AVMutableVideoCompositionLayerInstruction *layerInstruction = nil;
    CGAffineTransform t1;
    CGAffineTransform t2;

    AVAssetTrack *assetVideoTrack = nil;
    AVAssetTrack *assetAudioTrack = nil;
    // Check if the asset contains video and audio tracks
    if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
       assetVideoTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
    }
    if ([[asset tracksWithMediaType:AVMediaTypeAudio] count] != 0) {
       assetAudioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];
    }

    CMTime insertionPoint = kCMTimeZero;
    NSError *error = nil;

    //    CGAffineTransform rotateTranslate;
    // Step 1
    // Create a composition with the given asset and insert audio and video tracks into it from the asset
    if (!mutableComposition) {

       // Check whether a composition has already been created, i.e, some other tool has already been applied
       // Create a new composition
       mutableComposition = [AVMutableComposition composition];

       // Insert the video and audio tracks from AVAsset
       if (assetVideoTrack != nil) {
           AVMutableCompositionTrack *compositionVideoTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
           [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetVideoTrack atTime:insertionPoint error:&error];

       }
       if (assetAudioTrack != nil) {
           AVMutableCompositionTrack *compositionAudioTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
           [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetAudioTrack atTime:insertionPoint error:&error];
       }

    }


    // Step 2
    // Translate the composition to compensate the movement caused by rotation (since rotation would cause it to move out of frame)
    //    t1 = CGAffineTransformMakeTranslation(assetVideoTrack.naturalSize.height, 0.0);
    // Rotate transformation
    //    t2 = CGAffineTransformRotate(t1, degreesToRadians(90.0));

    CGFloat degrees = 90;
    //--
    if (degrees != 0) {
       //        CGAffineTransform mixedTransform;
       if(degrees == 90){
           //90°
           t1 = CGAffineTransformMakeTranslation(assetVideoTrack.naturalSize.height,0.0);
           t2 = CGAffineTransformRotate(t1,M_PI_2);
       }else if(degrees == 180){
           //180°
           t1 = CGAffineTransformMakeTranslation(assetVideoTrack.naturalSize.width, assetVideoTrack.naturalSize.height);
           t2 = CGAffineTransformRotate(t1,M_PI);
       }else if(degrees == 270){
           //270°
           t1 = CGAffineTransformMakeTranslation(0.0, assetVideoTrack.naturalSize.width);
           t2 = CGAffineTransformRotate(t1,M_PI_2*3.0);
       }
    }

    // Step 3
    // Set the appropriate render sizes and rotational transforms
    if (!mutableVideoComposition) {

       // Create a new video composition
       mutableVideoComposition = [AVMutableVideoComposition videoComposition];
       mutableVideoComposition.renderSize = CGSizeMake(assetVideoTrack.naturalSize.height,assetVideoTrack.naturalSize.width);
       mutableVideoComposition.frameDuration = CMTimeMake(1, 30);

       // The rotate transform is set on a layer instruction
       instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
       instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [mutableComposition duration]);
       layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:(mutableComposition.tracks)[0]];
       [layerInstruction setTransform:t2 atTime:kCMTimeZero];
       //           [layerInstruction setTransform:rotateTranslate atTime:kCMTimeZero];

    } else {

       mutableVideoComposition.renderSize = CGSizeMake(mutableVideoComposition.renderSize.height, mutableVideoComposition.renderSize.width);

       // Extract the existing layer instruction on the mutableVideoComposition
       instruction = (mutableVideoComposition.instructions)[0];
       layerInstruction = (instruction.layerInstructions)[0];

       // Check if a transform already exists on this layer instruction, this is done to add the current transform on top of previous edits
       CGAffineTransform existingTransform;

       if (![layerInstruction getTransformRampForTime:[mutableComposition duration] startTransform:&existingTransform endTransform:NULL timeRange:NULL]) {
           [layerInstruction setTransform:t2 atTime:kCMTimeZero];
       } else {
           // Note: the point of origin for rotation is the upper left corner of the composition, t3 is to compensate for origin
           CGAffineTransform t3 = CGAffineTransformMakeTranslation(-1*assetVideoTrack.naturalSize.height/2, 0.0);
           CGAffineTransform newTransform = CGAffineTransformConcat(existingTransform, CGAffineTransformConcat(t2, t3));
           [layerInstruction setTransform:newTransform atTime:kCMTimeZero];
       }

    }


    // Step 4
    // Add the transform instructions to the video composition
    instruction.layerInstructions = @[layerInstruction];
    mutableVideoComposition.instructions = @[instruction];

    //write video
    if ([[NSFileManager  defaultManager] fileExistsAtPath:cacheRotateVideoURL.path]) {
       NSError *error = nil;
       BOOL removeFlag = [[NSFileManager  defaultManager] removeItemAtURL:cacheRotateVideoURL error:&error];
       SPLog(@"remove rotate file:%@ %@",cacheRotateVideoURL.path,removeFlag?@"Success":@"Failed");
    }

    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality] ;

    exportSession.outputURL = cacheRotateVideoURL;
    exportSession.outputFileType = AVFileTypeMPEG4;
    exportSession.videoComposition = mutableVideoComposition;
    exportSession.shouldOptimizeForNetworkUse = YES;
    exportSession.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);

    [exportSession exportAsynchronouslyWithCompletionHandler:^{
       SPLog(@"cache write done");
       AVAsset* asset = [AVURLAsset URLAssetWithURL: cacheRotateVideoURL options:nil];
       SPLog(@"rotate recrod video time: %lf",CMTimeGetSeconds(asset.duration));

       ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
       [library writeVideoAtPathToSavedPhotosAlbum:cacheRotateVideoURL
                                   completionBlock:^(NSURL *assetURL, NSError *error) {
                                       if (error) {
                                           NSLog(@"Save video fail:%@",error);
                                       } else {
                                           NSLog(@"Save video succeed.");
                                       }
                                   }];

       complateBlock();
    }];

    }

    Can anyone tell me why this happens?
    How can I write the rotation information back when I rotate the video?
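
    As a side note, a lost rotation tag can sometimes be re-attached without re-encoding; a sketch, assuming an FFmpeg build that still honours the rotate stream tag (newer builds expose a -display_rotation input option instead) and using placeholder file names:

    # Copy the streams untouched and write a rotate=90 tag on the first video stream
    ffmpeg -i rotated.mp4 -c copy -metadata:s:v:0 rotate=90 rotated_tagged.mp4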

  • Moving an overlay smoothly

    6 December 2017, by techr

    I have a sliding video overlay on top of a background screen and move it from point A to point B.

    The resulting video produces choppy movement: the overlay moves a step at a time instead of in one continuous motion (see the sketch after the console output below).

    Here’s an example of the code:

    ffmpeg -y -stream_loop -1 -i b.mpg -i bg.jpg -filter_complex "
    [0:v]scale=320:240[vid];
    [1][vid]overlay=x='(180+(30-180)*(t-0)/60)*between(t,0,60)+(30+(290-30)*(t-60)/60)*between(t,60,120)':y='(120+(120-120)*(t-0)/60)*between(t,0,60)+(120+(30-120)*(t-60)/60)*between(t,60,120)'[out]
    " -map "[out]" -r 29.97 -aspect 4:3 -b:v 4000k -minrate 4000k -maxrate 4000k -bufsize 2000k -t 64 t.mpg

    Input #0, mpegvideo, from 'b.mpg':
      Duration: 00:00:32.34, bitrate: 4000 kb/s
        Stream #0:0: Video: mpeg2video (Main), yuv420p(tv, progressive), 720x480 [SAR 32:27 DAR 16:9], 4000 kb/s, 29.97 fps, 29.97 tbr, 1200k tbn, 59.94 tbc
    Input #1, image2, from 'bg.jpg':
      Duration: 00:00:00.04, start: 0.000000, bitrate: 19637 kb/s
        Stream #1:0: Video: mjpeg, yuvj444p(pc, bt470bg/unknown/unknown), 640x480, 25 tbr, 25 tbn, 25 tbc
    Stream mapping:
      Stream #0:0 (mpeg2video) -> scale
      Stream #1:0 (mjpeg) -> overlay:main
      overlay -> Stream #0:0 (mpeg1video)
    Press [q] to stop, [?] for help
    [swscaler @ 0000000002adc740] deprecated pixel format used, make sure you did set range correctly
    Output #0, mpeg, to 't.mpg':
      Metadata:
        encoder         : Lavf57.73.100
        Stream #0:0: Video: mpeg1video, yuv420p, 640x480 [SAR 1:1 DAR 4:3], q=2-31, 4000 kb/s, 29.97 fps, 90k tbn, 29.97 tbc (default)
        Metadata:
          encoder         : Lavc57.99.100 mpeg1video
        Side data:
          cpb: bitrate max/min/avg: 4000000/4000000/4000000 buffer size: 2000000 vbv_delay: -1
    warning, clipping 1 dct coefficients to -255..255 (repeated 2 times)

    Link to video here: https://youtu.be/EE_hrjy4ilg
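
    For reference, a sketch of the variant hinted at above. The assumption (not verified against this exact clip) is that the still image, as the main overlay input, only delivers frames at its own rate, so the position expressions are re-evaluated too rarely; looping the image at the output frame rate lets t advance on every output frame (output file name is a placeholder):

    ffmpeg -y -stream_loop -1 -i b.mpg -loop 1 -framerate 29.97 -i bg.jpg -filter_complex "
    [0:v]scale=320:240[vid];
    [1:v][vid]overlay=shortest=1:x='(180+(30-180)*(t-0)/60)*between(t,0,60)+(30+(290-30)*(t-60)/60)*between(t,60,120)':y='(120+(120-120)*(t-0)/60)*between(t,0,60)+(120+(30-120)*(t-60)/60)*between(t,60,120)'[out]
    " -map "[out]" -r 29.97 -aspect 4:3 -b:v 4000k -minrate 4000k -maxrate 4000k -bufsize 2000k -t 64 t_smooth.mpg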