
Other articles (72)

  • Organising by category

    17 May 2013, by

    In MediaSPIP, a section has two names: category and section (rubrique).
    The various documents stored in MediaSPIP can be filed under different categories. A category can be created by clicking on "publish a category" in the publish menu at the top right (after logging in). A category can itself be filed inside another category, so a tree of categories can be built.
    The next time a document is published, the newly created category will be offered (...)

  • Retrieving information from the master site when installing an instance

    26 November 2010, by

    Purpose
    On the main site, a mutualised instance is defined by several things: its data in the spip_mutus table; its logo; its main author (the id_admin in the spip_mutus table, matching an id_auteur in the spip_auteurs table), who will be the only one able to finalise the creation of the instance;
    It can therefore be quite useful to retrieve some of this information in order to complete the installation of an instance, for example to retrieve the (...)

  • Encoding and processing into web-friendly formats

    13 avril 2011, par

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in Ogv and WebM (supported by HTML5) and MP4 (supported by both HTML5 and Flash).
    Audio files are encoded in Ogg (supported by HTML5) and MP3 (supported by both HTML5 and Flash).
    Where possible, text is analyzed in order to extract the data needed for search-engine indexing, and then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)

On other sites (5593)

  • FFMPEG - Watermark in Random Position

    13 October 2018, by gbvisconti

    I need to add a watermark to my video with the following loop:

    1. Appear in a random position on the video.
    2. Fade out after 3 seconds.
    3. Appear again after random seconds.

    I have read the documentation and also found a forum post with a similar solution, which is almost what I need.

    ffmpeg -i video.mp4 -c:v libx264 -preset veryfast -crf 23 -tune zerolatency -pix_fmt yuv420p -filter:v "drawtext=fontfile=arial.ttf:text='WatermarkTextHere':fontcolor=black@0.5:fontsize=16:x=if(eq(mod(n\,200)\,0)\,sin(random(1))*w\,x):y=if(eq(mod(n\,200)\,0)\,sin(random(1))*h\,y)" -c:a copy outvideo2.mp4

    I don’t know how to adapt it correctly yet.

    EDIT

    After some hours I managed to get this. It fades in and out, but it is not really random: logo.png mostly appears in the upper-left area of the video. How can I make it appear anywhere in the frame?

    ffmpeg -i input.mp4 -loop 1 -i logo.png -filter_complex "[1]trim=0:30,fade=in:st=0:d=1:alpha=1,fade=out:st=9:d=1:alpha=1,loop=999:750:0,setpts=N/25/TB[w];[0][w]overlay=shortest=1:x=if(eq(mod(n\,200)\,0)\,sin(random(1))*w\,x):y=if(eq(mod(n\,200)\,0)\,sin(random(1))*h\,y)" output.mp4
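    One alternative to the expression-only approach (whose repeated sin(random(1)) is biased and hard to control) is to precompute the random positions in a script and emit a plain overlay filtergraph built from per-appearance constants via the overlay filter's timeline enable option. A minimal sketch, not tested against a real video; input.mp4, logo.png, the frame and logo sizes, and the durations are placeholder assumptions, and the fade steps are omitted for brevity:

```python
import random

def random_overlay_exprs(video_w, video_h, logo_w, logo_h,
                         show_len=3.0, total_len=30.0, seed=0):
    """Return (x_expr, y_expr, enable_expr) for ffmpeg's overlay filter.

    Each appearance gets a fresh random position computed here in Python,
    so the logo can land anywhere in the frame while staying fully
    visible. The pause before each repeat is random too.
    """
    rng = random.Random(seed)
    xs, ys, ens = [], [], []
    t = 0.0
    while t + show_len <= total_len:
        x = int(rng.uniform(0, video_w - logo_w))
        y = int(rng.uniform(0, video_h - logo_h))
        seg = f"between(t,{t:.2f},{t + show_len:.2f})"
        ens.append(seg)                    # overlay is enabled in this window
        xs.append(f"{seg}*{x}")            # constant position per window
        ys.append(f"{seg}*{y}")
        t += show_len + rng.uniform(1.0, 5.0)   # random pause before repeat
    return "+".join(xs), "+".join(ys), "+".join(ens)

x_expr, y_expr, en_expr = random_overlay_exprs(1280, 720, 200, 100)
filtergraph = f"[0][1]overlay=x='{x_expr}':y='{y_expr}':enable='{en_expr}'"
cmd = ["ffmpeg", "-i", "input.mp4", "-i", "logo.png",
       "-filter_complex", filtergraph, "-c:a", "copy", "output.mp4"]
print(" ".join(cmd))
```

    Because each window's `between(t,...)` term is 1 only inside its own interval, the summed expressions evaluate to that window's constant x/y while the overlay is enabled.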
  • Use AVMutableVideoComposition to rotate a video; afterwards FFmpeg can't get the rotation info

    29 January 2018, by ladeng

    Before the video is rotated, I can use an FFmpeg command to get the rotation information, like this:

    ffprobe -v quiet -print_format json -show_format -show_streams recordVideo.mp4

    or

    ffmpeg -i recordVideo.mp4
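    For scripting, the JSON output of the ffprobe command above can be parsed directly for the stream's rotate tag. A small sketch; the `sample` string is a hypothetical excerpt of real ffprobe output:

```python
import json

def rotation_from_ffprobe(ffprobe_json: str) -> int:
    """Return the rotation in degrees recorded in the video stream's
    `rotate` tag, or 0 when no tag is present."""
    data = json.loads(ffprobe_json)
    for stream in data.get("streams", []):
        if stream.get("codec_type") == "video":
            return int(stream.get("tags", {}).get("rotate", 0))
    return 0

# Hypothetical excerpt of `ffprobe -print_format json -show_streams` output:
sample = '{"streams": [{"codec_type": "video", "tags": {"rotate": "90"}}]}'
print(rotation_from_ffprobe(sample))  # 90
```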

    When I use AVMutableVideoComposition to rotate the video, the video loses its rotation information. A simple rotation example is RoatetVideoSimpleCode;
    the code is below:

    -(void)performWithAsset:(AVAsset*)asset complateBlock:(void(^)(void))complateBlock{
    AVMutableComposition *mutableComposition;
    AVMutableVideoComposition *mutableVideoComposition;
    cacheRotateVideoURL = [[NSURL alloc] initFileURLWithPath:[NSString pathWithComponents:@[NSTemporaryDirectory(), kCacheCertVideoRotate]]];

    AVMutableVideoCompositionInstruction *instruction = nil;
    AVMutableVideoCompositionLayerInstruction *layerInstruction = nil;
    CGAffineTransform t1;
    CGAffineTransform t2;

    AVAssetTrack *assetVideoTrack = nil;
    AVAssetTrack *assetAudioTrack = nil;
    // Check if the asset contains video and audio tracks
    if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
       assetVideoTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
    }
    if ([[asset tracksWithMediaType:AVMediaTypeAudio] count] != 0) {
       assetAudioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];
    }

    CMTime insertionPoint = kCMTimeZero;
    NSError *error = nil;

    //    CGAffineTransform rotateTranslate;
    // Step 1
    // Create a composition with the given asset and insert audio and video tracks into it from the asset
    if (!mutableComposition) {

       // Check whether a composition has already been created, i.e, some other tool has already been applied
       // Create a new composition
       mutableComposition = [AVMutableComposition composition];

       // Insert the video and audio tracks from AVAsset
       if (assetVideoTrack != nil) {
           AVMutableCompositionTrack *compositionVideoTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
           [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetVideoTrack atTime:insertionPoint error:&error];

       }
       if (assetAudioTrack != nil) {
           AVMutableCompositionTrack *compositionAudioTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
           [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetAudioTrack atTime:insertionPoint error:&error];
       }

    }


    // Step 2
    // Translate the composition to compensate the movement caused by rotation (since rotation would cause it to move out of frame)
    //    t1 = CGAffineTransformMakeTranslation(assetVideoTrack.naturalSize.height, 0.0);
    // Rotate transformation
    //    t2 = CGAffineTransformRotate(t1, degreesToRadians(90.0));

    CGFloat degrees = 90;
    //--
    if (degrees != 0) {
       //        CGAffineTransform mixedTransform;
       if(degrees == 90){
           //90°
           t1 = CGAffineTransformMakeTranslation(assetVideoTrack.naturalSize.height,0.0);
           t2 = CGAffineTransformRotate(t1,M_PI_2);
       }else if(degrees == 180){
           //180°
           t1 = CGAffineTransformMakeTranslation(assetVideoTrack.naturalSize.width, assetVideoTrack.naturalSize.height);
           t2 = CGAffineTransformRotate(t1,M_PI);
       }else if(degrees == 270){
           //270°
           t1 = CGAffineTransformMakeTranslation(0.0, assetVideoTrack.naturalSize.width);
           t2 = CGAffineTransformRotate(t1,M_PI_2*3.0);
       }
    }

    // Step 3
    // Set the appropriate render sizes and rotational transforms
    if (!mutableVideoComposition) {

       // Create a new video composition
       mutableVideoComposition = [AVMutableVideoComposition videoComposition];
       mutableVideoComposition.renderSize = CGSizeMake(assetVideoTrack.naturalSize.height,assetVideoTrack.naturalSize.width);
       mutableVideoComposition.frameDuration = CMTimeMake(1, 30);

       // The rotate transform is set on a layer instruction
       instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
       instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [mutableComposition duration]);
       layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:(mutableComposition.tracks)[0]];
       [layerInstruction setTransform:t2 atTime:kCMTimeZero];
       //           [layerInstruction setTransform:rotateTranslate atTime:kCMTimeZero];

    } else {

       mutableVideoComposition.renderSize = CGSizeMake(mutableVideoComposition.renderSize.height, mutableVideoComposition.renderSize.width);

       // Extract the existing layer instruction on the mutableVideoComposition
       instruction = (mutableVideoComposition.instructions)[0];
       layerInstruction = (instruction.layerInstructions)[0];

       // Check if a transform already exists on this layer instruction, this is done to add the current transform on top of previous edits
       CGAffineTransform existingTransform;

       if (![layerInstruction getTransformRampForTime:[mutableComposition duration] startTransform:&existingTransform endTransform:NULL timeRange:NULL]) {
           [layerInstruction setTransform:t2 atTime:kCMTimeZero];
       } else {
           // Note: the point of origin for rotation is the upper left corner of the composition, t3 is to compensate for origin
           CGAffineTransform t3 = CGAffineTransformMakeTranslation(-1*assetVideoTrack.naturalSize.height/2, 0.0);
           CGAffineTransform newTransform = CGAffineTransformConcat(existingTransform, CGAffineTransformConcat(t2, t3));
           [layerInstruction setTransform:newTransform atTime:kCMTimeZero];
       }

    }


    // Step 4
    // Add the transform instructions to the video composition
    instruction.layerInstructions = @[layerInstruction];
    mutableVideoComposition.instructions = @[instruction];

    //write video
    if ([[NSFileManager  defaultManager] fileExistsAtPath:cacheRotateVideoURL.path]) {
       NSError *error = nil;
       BOOL removeFlag = [[NSFileManager  defaultManager] removeItemAtURL:cacheRotateVideoURL error:&error];
       SPLog(@"remove rotate file:%@ %@",cacheRotateVideoURL.path,removeFlag?@"Success":@"Failed");
    }

    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality];

    exportSession.outputURL = cacheRotateVideoURL;
    exportSession.outputFileType = AVFileTypeMPEG4;
    exportSession.videoComposition = mutableVideoComposition;
    exportSession.shouldOptimizeForNetworkUse = YES;
    exportSession.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);

    [exportSession exportAsynchronouslyWithCompletionHandler:^{
       SPLog(@"cache write done");
       AVAsset* asset = [AVURLAsset URLAssetWithURL: cacheRotateVideoURL options:nil];
       SPLog(@"rotate record video time: %lf",CMTimeGetSeconds(asset.duration));

       ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
       [library writeVideoAtPathToSavedPhotosAlbum:cacheRotateVideoURL
                                   completionBlock:^(NSURL *assetURL, NSError *error) {
                                       if (error) {
                                           NSLog(@"Save video fail:%@",error);
                                       } else {
                                           NSLog(@"Save video succeed.");
                                       }
                                   }];

       complateBlock();
    }];

    }

    Can anyone tell me why this is so?
    How can I write the rotation information when I rotate the video?
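    A likely explanation: AVMutableVideoComposition physically re-renders the rotated frames, so the exported file is already upright and simply has no `rotate` tag for ffprobe to report. If a tag is wanted anyway (for example to rotate at display time instead of baking it in), it can be re-added without re-encoding via ffmpeg's per-stream metadata option. A sketch with placeholder filenames:

```python
def remux_with_rotation(src: str, dst: str, degrees: int) -> list[str]:
    """Build an ffmpeg command that stream-copies `src` and writes a
    `rotate` tag on the first video stream, so players rotate at display
    time. Filenames here are placeholders."""
    return ["ffmpeg", "-i", src,
            "-c", "copy",                          # no re-encoding
            "-metadata:s:v:0", f"rotate={degrees}",  # tag the video stream
            dst]

cmd = remux_with_rotation("rotated.mp4", "rotated_tagged.mp4", 90)
print(" ".join(cmd))
```

    The command could then be run with subprocess.run(cmd, check=True).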

  • Video duration is the double of what it should be

    12 June 2019, by lulas

    I was trying to record my computer screen; for that I am using Accord.

    I had everything working fine with Accord version 3.8.0.0, but when I upgraded to version 3.8.2.0-alpha my recorded videos are slowed down, and they take approximately double the time to finish.

    So a 10-second video actually takes 20 seconds to play (while the video player still reports its duration as 10 seconds).

    I searched on Google and didn’t manage to find anything useful, probably because of the alpha status of the release.

    I managed to find a project from one of the maintainers of the Accord project that uses this 3.8.2.0-alpha version: https://github.com/cesarsouza/screencast-capture

    I downloaded the zip file, extracted the project, built it, fixed a compilation error (changed a Dispose() to Stop()) and then ran the application, but the problem still exists... the generated video files are still slowed down...

    The main methods of my code with the 3.8.2.0-alpha version are these (they were taken from the project I downloaded):

    public void StartRecording()
    {
       if (IsRecording || !IsPlaying)
           return;

       int height = area.Height;
       int width = area.Width;
       Rational framerate = new Rational(1000, screenStream.FrameInterval);
       int videoBitRate = 1200 * 1000;

       OutputPath = Path.Combine(main.CurrentDirectory, fileName);
       RecordingStartTime = DateTime.MinValue;
       videoWriter = new VideoFileWriter();
       videoWriter.BitRate = videoBitRate;
       videoWriter.FrameRate = framerate;
       videoWriter.Width = width;
       videoWriter.Height = height;
       videoWriter.VideoCodec = VideoCodec.H264;
       videoWriter.VideoOptions["crf"] = "18";
       videoWriter.VideoOptions["preset"] = "veryfast";
       videoWriter.VideoOptions["tune"] = "zerolatency";
       videoWriter.VideoOptions["x264opts"] = "no-mbtree:sliced-threads:sync-lookahead=0";

       videoWriter.Open(OutputPath);

       HasRecorded = false;
       IsRecording = true;
    }

    void VideoPlayer_NewFrameReceived(object sender, Accord.Video.NewFrameEventArgs eventArgs)
    {
       DateTime currentFrameTime = eventArgs.CaptureFinished;

       // Encode the last frame at the same time we prepare the new one
       Task.WaitAll(
           Task.Run(() =>
           {
               lock (syncObj) // Save the frame to the video file.
               {
                   if (IsRecording)
                   {
                       if (RecordingStartTime == DateTime.MinValue)
                           RecordingStartTime = DateTime.Now;

                       TimeSpan timestamp = currentFrameTime - RecordingStartTime;
                       if (timestamp > TimeSpan.Zero)
                           videoWriter.WriteVideoFrame(this.lastFrame, timestamp, this.lastFrameRegion);
                   }
               }
           }),

           Task.Run(() =>
           {
               // Adjust the window according to the current capture
               // mode. Also adjusts to keep even widths and heights.
               CaptureRegion = AdjustWindow();

               // Crop the image if the mode requires it
               if (CaptureMode == CaptureRegionOption.Fixed ||
                   CaptureMode == CaptureRegionOption.Window)
               {
                   crop.Rectangle = CaptureRegion;

                   eventArgs.Frame = croppedImage = crop.Apply(eventArgs.Frame, croppedImage);
                   eventArgs.FrameSize = crop.Rectangle.Size;
               }

               //// Draw extra information on the screen
               bool captureMouse = Settings.Default.CaptureMouse;
               bool captureClick = Settings.Default.CaptureClick;
               bool captureKeys = Settings.Default.CaptureKeys;

               if (captureMouse || captureClick || captureKeys)
               {
                   cursorCapture.CaptureRegion = CaptureRegion;
                   clickCapture.CaptureRegion = CaptureRegion;
                   keyCapture.Font = Settings.Default.KeyboardFont;

                   using (Graphics g = Graphics.FromImage(eventArgs.Frame))
                   {
                       g.CompositingQuality = CompositingQuality.HighSpeed;
                       g.SmoothingMode = SmoothingMode.HighSpeed;

                       float invWidth = 1; // / widthScale;
                       float invHeight = 1; // / heightScale;

                       if (captureMouse)
                           cursorCapture.Draw(g, invWidth, invHeight);

                       if (captureClick)
                           clickCapture.Draw(g, invWidth, invHeight);

                       if (captureKeys)
                           keyCapture.Draw(g, invWidth, invHeight);
                   }
               }
           })
       );

       // Save the just processed frame and mark it to be encoded in the next iteration:
       lastFrame = eventArgs.Frame.Copy(lastFrame);

       lastFrameRegion = new Rectangle(0, 0, eventArgs.FrameSize.Width, eventArgs.Frame.Height);
    }
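    As an aside on StartRecording above, `new Rational(1000, screenStream.FrameInterval)` encodes frames per second as 1000 divided by the capture interval in milliseconds. A one-line sketch of the same arithmetic (the 40 ms interval is a made-up example):

```python
from fractions import Fraction

def framerate_from_interval(frame_interval_ms: int) -> Fraction:
    """Mirror of Accord's `new Rational(1000, screenStream.FrameInterval)`:
    frames per second expressed as 1000 / interval-in-milliseconds."""
    return Fraction(1000, frame_interval_ms)

fps = framerate_from_interval(40)   # hypothetical 40 ms capture interval
print(fps, float(fps))              # 25 25.0
```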

    Does anyone know what might be causing this slowdown?

    EDIT:

    I think I managed to find my problem:

    pts:4.032000e+003   pts_time:0.252  dts:2.016000e+003   dts_time:0.126  duration:6.720000e+002  duration_time:0.042 stream_index:0
    pts:6.720000e+003   pts_time:0.42   dts:3.360000e+003   dts_time:0.21   duration:6.720000e+002  duration_time:0.042 stream_index:0
    pts:1.075200e+004   pts_time:0.672  dts:5.376000e+003   dts_time:0.336  duration:6.720000e+002  duration_time:0.042 stream_index:0
    pts:1.344000e+004   pts_time:0.84   dts:6.720000e+003   dts_time:0.42   duration:6.720000e+002  duration_time:0.042 stream_index:0
    pts:1.612800e+004   pts_time:1.008  dts:8.064000e+003   dts_time:0.504  duration:6.720000e+002  duration_time:0.042 stream_index:0
    pts:1.881600e+004   pts_time:1.176  dts:9.408000e+003   dts_time:0.588  duration:6.720000e+002  duration_time:0.042 stream_index:0
    pts:2.150400e+004   pts_time:1.344  dts:1.075200e+004   dts_time:0.672  duration:6.720000e+002  duration_time:0.042 stream_index:0
    pts:2.553600e+004   pts_time:1.596  dts:1.276800e+004   dts_time:0.798  duration:6.720000e+002  duration_time:0.042 stream_index:0
    pts:2.822400e+004   pts_time:1.764  dts:1.411200e+004   dts_time:0.882  duration:6.720000e+002  duration_time:0.042 stream_index:0
    pts:3.091200e+004   pts_time:1.932  dts:1.545600e+004   dts_time:0.966  duration:6.720000e+002  duration_time:0.042 stream_index:0
    pts:3.494400e+004   pts_time:2.184  dts:1.747200e+004   dts_time:1.092  duration:6.720000e+002  duration_time:0.042 stream_index:0
    pts:3.897600e+004   pts_time:2.436  dts:1.948800e+004   dts_time:1.218  duration:6.720000e+002  duration_time:0.042 stream_index:0
    pts:4.166400e+004   pts_time:2.604  dts:2.083200e+004   dts_time:1.302  duration:6.720000e+002  duration_time:0.042 stream_index:0
    pts:4.704000e+004   pts_time:2.94   dts:2.352000e+004   dts_time:1.47   duration:6.720000e+002  duration_time:0.042 stream_index:0
    pts:5.107200e+004   pts_time:3.192  dts:2.553600e+004   dts_time:1.596  duration:6.720000e+002  duration_time:0.042 stream_index:0

    The PTS is always double the DTS, which is why the video plays back slowed down.

    Unfortunately I have no idea why this happens... Does anyone have any clue?
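    The doubling can be checked mechanically by parsing the packet dump above. A quick sketch; the two lines in `log` are copied from the dump:

```python
import re

def parse_packet_line(line: str) -> dict:
    """Extract the numeric key:value fields from one line of an ffprobe
    packet dump (pts, pts_time, dts, dts_time, duration, ...)."""
    return {k: float(v) for k, v in re.findall(r"(\w+):([-+.\deE]+)", line)}

log = [
    "pts:4.032000e+003 pts_time:0.252 dts:2.016000e+003 dts_time:0.126 "
    "duration:6.720000e+002 duration_time:0.042 stream_index:0",
    "pts:6.720000e+003 pts_time:0.42 dts:3.360000e+003 dts_time:0.21 "
    "duration:6.720000e+002 duration_time:0.042 stream_index:0",
]
for line in log:
    p = parse_packet_line(line)
    ratio = p["pts_time"] / p["dts_time"]
    print(f"pts/dts = {ratio:.1f}")  # 2.0 for every packet in the dump
```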