
Other articles (99)
-
Sites built with MediaSPIP
2 May 2011
This page presents a few of the sites running MediaSPIP.
You can of course add yours using the form at the bottom of the page. -
Retrieving information from the master site when installing an instance
26 November 2010
Purpose
On the main site, a pooled instance is defined by several things: its data in the spip_mutus table; its logo; its main author (the id_admin in the spip_mutus table, matching an id_auteur in the spip_auteurs table), who will be the only one able to finalize the creation of the pooled instance.
It can therefore make good sense to retrieve some of this information in order to complete the installation of an instance, for example to retrieve the (...) -
The farm's regular Cron tasks
1 December 2010
Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of every instance in the pool on a regular basis. Combined with a system Cron on the central site of the pool, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...)
On other sites (12812)
-
Swift - Workaround/Alternative to M3u8 to play mp4 segment or merge segments into mp4
14 May 2020, by STerrier
I used AVAssetExportSession to download a session URL, but since you can't download a live stream, the stream is split into 10-second mp4 segments whose URLs are built from an m3u8 playlist. I then use AVAssetExportSession to merge those mp4 segments.

I can merge the clips one by one into a single mp4 file, which is what I want, but the merge gets slower as the file grows, and since I am dealing with thousands of segments this becomes impractical.

I considered AVPlayerLooper, but it doesn't let me scrub, rewind, or fast-forward through the mp4 segments as if they were a single video.

Is there a way to combine the mp4 clips so they play as one video, as the m3u8 does, without merging them? Or is there a faster way to merge videos?

Note: the server uses FFmpeg, but I am not allowed to use FFmpeg or pods in the app.

Below is the function I use to merge the videos:



var mp4Array: [AVAsset] = []
var avAssetExportSession: AVAssetExportSession?

var firstAsset: AVAsset?
var secondAsset: AVAsset?

func mergeVideos() {
    firstAsset = mp4Array.first
    secondAsset = mp4Array[1]

    guard let firstAsset = firstAsset, let secondAsset = secondAsset else { return }
    let mixComposition = AVMutableComposition()

    // Track for the first segment, inserted at time zero
    guard let firstTrack = mixComposition.addMutableTrack(withMediaType: .video,
                                                          preferredTrackID: Int32(kCMPersistentTrackID_Invalid)) else { return }
    do {
        try firstTrack.insertTimeRange(CMTimeRangeMake(start: .zero, duration: firstAsset.duration),
                                       of: firstAsset.tracks(withMediaType: .video)[0],
                                       at: .zero)
    } catch {
        print("Couldn't load track 1")
        return
    }

    // Track for the second segment, inserted right after the first
    guard let secondTrack = mixComposition.addMutableTrack(withMediaType: .video,
                                                           preferredTrackID: Int32(kCMPersistentTrackID_Invalid)) else { return }
    do {
        try secondTrack.insertTimeRange(CMTimeRangeMake(start: .zero, duration: secondAsset.duration),
                                        of: secondAsset.tracks(withMediaType: .video)[0],
                                        at: firstAsset.duration)
    } catch {
        print("Couldn't load track 2")
        return
    }

    // Hide the first track once its duration has elapsed so the second shows
    let mainInstruction = AVMutableVideoCompositionInstruction()
    mainInstruction.timeRange = CMTimeRangeMake(start: .zero,
                                                duration: CMTimeAdd(firstAsset.duration, secondAsset.duration))

    let firstAssetInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: firstTrack)
    firstAssetInstruction.setOpacity(0.0, at: firstAsset.duration)

    let secondAssetInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: secondTrack)

    mainInstruction.layerInstructions = [firstAssetInstruction, secondAssetInstruction]
    let mainComposition = AVMutableVideoComposition()
    mainComposition.instructions = [mainInstruction]
    mainComposition.frameDuration = CMTimeMake(value: 1, timescale: 30)
    mainComposition.renderSize = firstTrack.naturalSize

    guard let documentDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first else { return }
    let url = documentDirectory.appendingPathComponent("MergedVideos/mergeVideo\(videoInt).mp4")

    guard let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality) else { return }

    exporter.outputURL = url
    exporter.outputFileType = AVFileType.mp4
    exporter.shouldOptimizeForNetworkUse = true
    exporter.videoComposition = mainComposition

    exporter.exportAsynchronously {
        if exporter.status == .completed {
            let avasset = AVAsset(url: url)
            self.mergeUrl = avasset
            if self.mp4Array.count > 1 {
                // Replace the two merged segments with the merged result at the
                // front of mp4Array (bringToFront is a custom Array extension
                // in this project)
                print("Adding the merged video to the front of mp4Array")
                self.mp4Array.remove(at: 1)
                self.mp4Array.removeFirst()
                self.videoInt += 1
                self.mp4Array.append(self.mergeUrl!)
                self.mp4Array.bringToFront(item: self.mp4Array.last!)
            }

            if self.mp4Array.count > 1 {
                if self.mergeUrl != nil {
                    // Recurse until everything is merged into a single file
                    self.mergeVideos()
                }
            } else {
                // Clean up the intermediate merge files
                var numberOfVideosDeleted = 0
                while numberOfVideosDeleted < self.videoInt - 1 {
                    do {
                        print("deleting")
                        let url = documentDirectory.appendingPathComponent("MergedVideos/mergeVideo\(numberOfVideosDeleted).mp4")
                        try FileManager.default.removeItem(at: url)
                        numberOfVideosDeleted += 1
                    } catch {
                        print("Error removing videos")
                    }
                }

                self.deleteCurrentSegementsInFolder()
            }
        }
    }
}
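For playback without exporting, one possibility (a sketch under assumptions, not part of the original question) is to insert every downloaded segment into a single AVMutableComposition and hand that composition straight to AVPlayer as one seekable item; `segmentURLs` below is a hypothetical array of local segment file URLs:

```swift
import AVFoundation

// Sketch: play the downloaded mp4 segments as one seekable timeline
// without ever running an export. `segmentURLs` is assumed to point at
// the locally downloaded 10-second segments, in order.
func makePlayerItem(from segmentURLs: [URL]) -> AVPlayerItem? {
    let composition = AVMutableComposition()
    guard let track = composition.addMutableTrack(withMediaType: .video,
                                                  preferredTrackID: kCMPersistentTrackID_Invalid) else { return nil }
    var cursor = CMTime.zero
    for url in segmentURLs {
        let asset = AVAsset(url: url)
        guard let sourceTrack = asset.tracks(withMediaType: .video).first else { continue }
        do {
            try track.insertTimeRange(CMTimeRangeMake(start: .zero, duration: asset.duration),
                                      of: sourceTrack,
                                      at: cursor)
            cursor = CMTimeAdd(cursor, asset.duration)
        } catch {
            print("Couldn't append segment \(url.lastPathComponent)")
        }
    }
    // AVPlayer can scrub, rewind, and seek across the whole composition
    // as a single video.
    return AVPlayerItem(asset: composition)
}
```

Since nothing is re-encoded, building the composition is cheap even for thousands of segments; if a single file on disk is still needed afterwards, the same composition could be exported once with AVAssetExportPresetPassthrough.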



-
Extract the 0th frame of every second on a live video
14 April 2020, by geo-freak
I have a command to extract the zeroth frame of every second. I got the command from here.



ffmpeg -i input.ts -vf "select=between(mod(n\, 25)\, 0\, 0), setpts=N/24/TB" output-%04d.png




But when I run the above command on a live feed, it extracts more than 100,000 frames. The command does not work on a live recording. Can anyone suggest how to extract the very first frame of each second from a live recording? Thanks in advance.



P.S.: For my testing, I am running the above command on a tcr video.
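As a hedged aside (not from the original question): the select/mod filter above assumes the input runs at exactly 25 fps, which a live feed need not honour. The fps filter instead picks one frame per second of media time regardless of input frame rate, and works while the input is still being written:

```shell
# One frame per second of media time, independent of the input frame rate
# (the select='mod(n,25)' approach silently assumes exactly 25 fps).
ffmpeg -i input.ts -vf fps=1 output-%04d.png

# Or, to grab only the very first frame and stop:
ffmpeg -i input.ts -vframes 1 first-frame.png
```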


-
MPEG-DASH Encoding for Live Streaming
16 May 2016, by Mcp
I want to encode a live stream for MPEG-DASH at various bitrates and resolutions for live playback.
Everything I have found so far either uses only the source resolution (Nimble, nginx-rtmp-module) or seems to be meant for VOD streaming only (DASHEncoder).
Is it possible to use DASHEncoder with a live input (RTMP stream), and how would I do that?
If not, is it possible to use nginx-rtmp + ffmpeg for what I want to do?
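One possible direction (a sketch under assumptions, not from the original question): ffmpeg ships a dash muxer, so an RTMP input can be transcoded into several renditions and packaged as live MPEG-DASH in a single process. The input URL, bitrates, and resolutions below are placeholders:

```shell
# Sketch: transcode one RTMP input into two H.264 renditions plus AAC audio
# and package them as a live DASH manifest. All values are illustrative.
ffmpeg -i rtmp://example.com/live/stream \
  -map 0:v -map 0:v -map 0:a \
  -c:v libx264 -c:a aac \
  -b:v:0 2400k -s:v:0 1280x720 \
  -b:v:1 800k  -s:v:1 640x360 \
  -adaptation_sets "id=0,streams=v id=1,streams=a" \
  -f dash -window_size 5 -remove_at_exit 1 manifest.mpd
```

This keeps nginx-rtmp (or any RTMP source) in front purely for ingest, with ffmpeg doing both the multi-bitrate encode and the DASH packaging.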