
Other articles (105)
-
Sites built with MediaSPIP
2 May 2011. This page presents a few of the sites running MediaSPIP.
You can of course add yours using the form at the bottom of the page. -
Accepted formats
28 January 2010. The following commands give information about the formats and codecs supported by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
Accepted input video formats
This list is not exhaustive; it highlights the main formats used: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
Possible output video formats
To begin with, we (...) -
Adding notes and captions to images
7 February 2011. To be able to add notes and captions to images, the first step is to install the "Légendes" plugin.
Once the plugin is activated, you can configure it in the configuration area in order to change the rights for creating, modifying and deleting notes. By default, only site administrators can add notes to images.
Changes when adding a media item
When adding a media item of type "image", a new button appears above the preview (...)
On other sites (8724)
-
FFmpeg overlay positioning issue: Converting frontend center coordinates to FFmpeg top-left coordinates
25 January, by tarun. I'm building a web-based video editor where users can:


Add multiple videos
Add images
Add text overlays with background color


The frontend sends coordinates where each element's (x, y) represents its center position.
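For context, FFmpeg's overlay and drawtext filters interpret x/y as the top-left corner of the overlaid element rather than its center, so every center-based coordinate has to be shifted by half the element's size at some point. A minimal illustration of that arithmetic in Python (not code from the project itself):

# Illustrative only: convert a center-based position into the top-left
# position that FFmpeg's overlay/drawtext filters expect.
def center_to_top_left(cx, cy, width, height):
    return int(cx - width / 2), int(cy - height / 2)

# An element centred at (960, 540) that is 200x100 px starts at (860, 490).
print(center_to_top_left(960, 540, 200, 100))  # -> (860, 490)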
On clicking the export button, I want all the data to be exported as one final video.
On click, I send the data to the backend like this:


const exportAllVideos = async () => {
 try {
 const formData = new FormData();
 
 
 const normalizedVideos = videos.map(video => ({
 ...video,
 startTime: parseFloat(video.startTime),
 endTime: parseFloat(video.endTime),
 duration: parseFloat(video.duration)
 })).sort((a, b) => a.startTime - b.startTime);

 
 for (const video of normalizedVideos) {
 const response = await fetch(video.src);
 const blobData = await response.blob();
 const file = new File([blobData], `${video.id}.mp4`, { type: "video/mp4" });
 formData.append("videos", file);
 }

 
 const normalizedImages = images.map(image => ({
 ...image,
 startTime: parseFloat(image.startTime),
 endTime: parseFloat(image.endTime),
 x: parseInt(image.x),
 y: parseInt(image.y),
 width: parseInt(image.width),
 height: parseInt(image.height),
 opacity: parseInt(image.opacity)
 }));

 
 for (const image of normalizedImages) {
 const response = await fetch(image.src);
 const blobData = await response.blob();
 const file = new File([blobData], `${image.id}.png`, { type: "image/png" });
 formData.append("images", file);
 }

 
 const normalizedTexts = texts.map(text => ({
 ...text,
 startTime: parseFloat(text.startTime),
 endTime: parseFloat(text.endTime),
 x: parseInt(text.x),
 y: parseInt(text.y),
 fontSize: parseInt(text.fontSize),
 opacity: parseInt(text.opacity)
 }));

 
 formData.append("metadata", JSON.stringify({
 videos: normalizedVideos,
 images: normalizedImages,
 texts: normalizedTexts
 }));

 const response = await fetch("my_flask_endpoint", {
 method: "POST",
 body: formData
 });

 if (!response.ok) {
 
 console.log('wtf', response);
 
 }

 const finalVideo = await response.blob();
 const url = URL.createObjectURL(finalVideo);
 const a = document.createElement("a");
 a.href = url;
 a.download = "final_video.mp4";
 a.click();
 URL.revokeObjectURL(url);

 } catch (e) {
 console.log(e, "err");
 }
 };



On the frontend, the data for each object (text, image, and video) is stored as an array of objects. Below is the data structure for each object:


// the frontend data for each
 const newVideo = {
 id: uuidv4(),
 src: URL.createObjectURL(videoData.videoBlob),
 originalDuration: videoData.duration,
 duration: videoData.duration,
 startTime: 0,
 playbackOffset: 0,
 endTime: videoData.endTime || videoData.duration,
 isPlaying: false,
 isDragging: false,
 speed: 1,
 volume: 100,
 x: window.innerHeight / 2,
 y: window.innerHeight / 2,
 width: videoData.width,
 height: videoData.height,
 };
 const newTextObject = {
 id: uuidv4(),
 description: text,
 opacity: 100,
 x: containerWidth.width / 2,
 y: containerWidth.height / 2,
 fontSize: 18,
 duration: 20,
 endTime: 20,
 startTime: 0,
 color: "#ffffff",
 backgroundColor: hasBG,
 padding: 8,
 fontWeight: "normal",
 width: 200,
 height: 40,
 };

 const newImage = {
 id: uuidv4(),
 src: URL.createObjectURL(imageData),
 x: containerWidth.width / 2,
 y: containerWidth.height / 2,
 width: 200,
 height: 200,
 borderRadius: 0,
 startTime: 0,
 endTime: 20,
 duration: 20,
 opacity: 100,
 };




BACKEND CODE -


import os
import shutil
import subprocess
from flask import Flask, request, send_file
import ffmpeg
import json
from werkzeug.utils import secure_filename
import uuid
from flask_cors import CORS


app = Flask(__name__)
CORS(app, resources={r"/*": {"origins": "*"}})



UPLOAD_FOLDER = 'temp_uploads'
if not os.path.exists(UPLOAD_FOLDER):
 os.makedirs(UPLOAD_FOLDER)


@app.route('/')
def home():
 return 'Hello World'


OUTPUT_WIDTH = 1920
OUTPUT_HEIGHT = 1080



@app.route('/process', methods=['POST'])
def process_video():
    work_dir = None
    try:
        work_dir = os.path.abspath(os.path.join(UPLOAD_FOLDER, str(uuid.uuid4())))
        os.makedirs(work_dir)
        print(f"Created working directory: {work_dir}")

        metadata = json.loads(request.form['metadata'])
        print("Received metadata:", json.dumps(metadata, indent=2))

        video_paths = []
        videos = request.files.getlist('videos')
        for idx, video in enumerate(videos):
            filename = f"video_{idx}.mp4"
            filepath = os.path.join(work_dir, filename)
            video.save(filepath)
            if os.path.exists(filepath) and os.path.getsize(filepath) > 0:
                video_paths.append(filepath)
                print(f"Saved video to: {filepath} Size: {os.path.getsize(filepath)}")
            else:
                raise Exception(f"Failed to save video {idx}")

        image_paths = []
        images = request.files.getlist('images')
        for idx, image in enumerate(images):
            filename = f"image_{idx}.png"
            filepath = os.path.join(work_dir, filename)
            image.save(filepath)
            if os.path.exists(filepath):
                image_paths.append(filepath)
                print(f"Saved image to: {filepath}")

        output_path = os.path.join(work_dir, 'output.mp4')

        filter_parts = []

        base_duration = metadata["videos"][0]["duration"] if metadata["videos"] else 10
        filter_parts.append(f'color=c=black:s={OUTPUT_WIDTH}x{OUTPUT_HEIGHT}:d={base_duration}[canvas];')

        for idx, (path, meta) in enumerate(zip(video_paths, metadata['videos'])):
            x_pos = int(meta.get("x", 0) - (meta.get("width", 0) / 2))
            y_pos = int(meta.get("y", 0) - (meta.get("height", 0) / 2))

            filter_parts.extend([
                f'[{idx}:v]setpts=PTS-STARTPTS,scale={meta.get("width", -1)}:{meta.get("height", -1)}[v{idx}];',
                f'[{idx}:a]asetpts=PTS-STARTPTS[a{idx}];'
            ])

            if idx == 0:
                filter_parts.append(
                    f'[canvas][v{idx}]overlay=x={x_pos}:y={y_pos}:eval=init[temp{idx}];'
                )
            else:
                filter_parts.append(
                    f'[temp{idx-1}][v{idx}]overlay=x={x_pos}:y={y_pos}:'
                    f'enable=\'between(t,{meta["startTime"]},{meta["endTime"]})\':eval=init'
                    f'[temp{idx}];'
                )

        last_video_temp = f'temp{len(video_paths)-1}'

        if video_paths:
            audio_mix_parts = []
            for idx in range(len(video_paths)):
                audio_mix_parts.append(f'[a{idx}]')
            filter_parts.append(f'{"".join(audio_mix_parts)}amix=inputs={len(video_paths)}[aout];')

        if image_paths:
            for idx, (img_path, img_meta) in enumerate(zip(image_paths, metadata['images'])):
                input_idx = len(video_paths) + idx

                x_pos = int(img_meta["x"] - (img_meta["width"] / 2))
                y_pos = int(img_meta["y"] - (img_meta["height"] / 2))

                filter_parts.extend([
                    f'[{input_idx}:v]scale={img_meta["width"]}:{img_meta["height"]}[img{idx}];',
                    f'[{last_video_temp}][img{idx}]overlay=x={x_pos}:y={y_pos}:'
                    f'enable=\'between(t,{img_meta["startTime"]},{img_meta["endTime"]})\':'
                    f'alpha={img_meta["opacity"]/100}[imgout{idx}];'
                ])
                last_video_temp = f'imgout{idx}'

        if metadata.get('texts'):
            for idx, text in enumerate(metadata['texts']):
                next_output = f'text{idx}' if idx < len(metadata['texts']) - 1 else 'vout'

                escaped_text = text["description"].replace("'", "\\'")

                x_pos = int(text["x"] - (text["width"] / 2))
                y_pos = int(text["y"] - (text["height"] / 2))

                text_filter = (
                    f'[{last_video_temp}]drawtext=text=\'{escaped_text}\':'
                    f'x={x_pos}:y={y_pos}:'
                    f'fontsize={text["fontSize"]}:'
                    f'fontcolor={text["color"]}'
                )

                if text.get('backgroundColor'):
                    text_filter += f':box=1:boxcolor={text["backgroundColor"]}:boxborderw=5'

                if text.get('fontWeight') == 'bold':
                    text_filter += ':font=Arial-Bold'

                text_filter += (
                    f':enable=\'between(t,{text["startTime"]},{text["endTime"]})\''
                    f'[{next_output}];'
                )

                filter_parts.append(text_filter)
                last_video_temp = next_output
        else:
            filter_parts.append(f'[{last_video_temp}]null[vout];')

        filter_complex = ''.join(filter_parts)

        cmd = [
            'ffmpeg',
            *sum([['-i', path] for path in video_paths], []),
            *sum([['-i', path] for path in image_paths], []),
            '-filter_complex', filter_complex,
            '-map', '[vout]'
        ]

        if video_paths:
            cmd.extend(['-map', '[aout]'])

        cmd.extend(['-y', output_path])

        print(f"Running ffmpeg command: {' '.join(cmd)}")
        result = subprocess.run(cmd, capture_output=True, text=True)

        if result.returncode != 0:
            print(f"FFmpeg error output: {result.stderr}")
            raise Exception(f"FFmpeg processing failed: {result.stderr}")

        return send_file(
            output_path,
            mimetype='video/mp4',
            as_attachment=True,
            download_name='final_video.mp4'
        )

    except Exception as e:
        print(f"Error in video processing: {str(e)}")
        return {'error': str(e)}, 500

    finally:
        if work_dir and os.path.exists(work_dir):
            try:
                print(f"Directory contents before cleanup: {os.listdir(work_dir)}")
                if not os.environ.get('FLASK_DEBUG'):
                    shutil.rmtree(work_dir)
                else:
                    print(f"Keeping directory for debugging: {work_dir}")
            except Exception as e:
                print(f"Cleanup error: {str(e)}")

 
if __name__ == '__main__':
 app.run(debug=True, port=8000)




I'm also attaching what the final result looks like in the frontend web app vs. in the downloaded video,
and as you can see, the downloaded video has all the coordinates and positions messed up, for the texts, the images and the videos alike.
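One thing worth checking (this is an assumption on my part, since the preview size isn't shown in the question): the frontend stores x/y relative to the preview container (containerWidth.width / containerWidth.height, or even window.innerHeight for videos), while the backend overlays everything onto a fixed 1920x1080 canvas, so subtracting half the width/height only lines up if the preview happens to be exactly 1920x1080. A hedged sketch of a backend-side adjustment, assuming the frontend also sent its preview size in a hypothetical metadata["canvas"] field (which does not exist in the code above):

# Hypothetical helper, assuming metadata["canvas"] = {"width": ..., "height": ...}
# were sent by the frontend; this field is not part of the current code.
def scaled_top_left(meta, canvas, out_w=1920, out_h=1080):
    sx = out_w / canvas["width"]
    sy = out_h / canvas["height"]
    w = int(meta["width"] * sx)
    h = int(meta["height"] * sy)
    # overlay/drawtext take the top-left corner, so shift by half the scaled size
    x = int(meta["x"] * sx - w / 2)
    y = int(meta["y"] * sy - h / 2)
    return x, y, w, h

The scaled w/h and x/y would then replace the raw metadata values wherever the filter graph builds scale=...:... and overlay=x=...:y=... (and the drawtext x/y), if this assumption about the preview size holds.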




Can somebody please help me figure this out? :)


-
I want it to do the ffmpeg conversion while the application continues to run
2 January, by Mustafa Gemsiz. Downloading with yt-dlp.exe is successful, but it does not convert the downloaded video to mp4 with ffmpeg. When I stop running the project in Visual Studio, it converts the file to mp4 with ffmpeg.


using System;
using System.Diagnostics;
using System.IO;
using System.Threading.Tasks;
using System.Windows.Forms;

namespace YouTube_MP4_indir
{
 public partial class Form1 : Form
 {
 public Form1()
 {
 InitializeComponent();
 pictureBox2.Visible = false;
 pictureBox3.Visible = false;
 }

 private async void btnDownload_Click(object sender, EventArgs e)
 {
 searchBox.Enabled = false;
 btnDownload.Enabled = false;
 pictureBox2.Visible = true;
 pictureBox3.Visible = false;

 string url = searchBox.Text.Trim();
 if (string.IsNullOrEmpty(url))
 {
 DialogResult result = MessageBox.Show("Lütfen geçerli bir YouTube URL'si giriniz.", "Uyarı", MessageBoxButtons.OK, MessageBoxIcon.Warning);

 if (result == DialogResult.OK)
 {
 searchBox.Text = "";
 searchBox.Enabled = true;
 btnDownload.Enabled = true;
 }

 pictureBox2.Visible = false;
 pictureBox3.Visible = false;
 return;
 }

 string desktopPath = Environment.GetFolderPath(Environment.SpecialFolder.Desktop);
 string videosFolderPath = Path.Combine(desktopPath, "Videolar");

 if (!Directory.Exists(videosFolderPath))
 {
 Directory.CreateDirectory(videosFolderPath);
 }

 string videoTitle = await GetVideoTitle(url);

 if (string.IsNullOrEmpty(videoTitle))
 {
 MessageBox.Show("Video başlığı alınamadı.", "Hata", MessageBoxButtons.OK, MessageBoxIcon.Error);
 pictureBox2.Visible = false;
 pictureBox3.Visible = false;
 return;
 }

 string inputFilePath = Path.Combine(videosFolderPath, $"{videoTitle}.mp4");
 string outputFilePath = Path.Combine(videosFolderPath, $"{videoTitle}-dönüştürülmüş.mp4");

 try
 {
 var ytDlpPath = Path.Combine(Application.StartupPath, "files", "yt-dlp.exe");

 var startInfo = new ProcessStartInfo()
 {
 FileName = ytDlpPath,
 Arguments = $"-f bestvideo[height<=1080]+bestaudio/best --merge-output-format mp4 --output \"{inputFilePath}\" {url}",
 UseShellExecute = false,
 CreateNoWindow = true,
 RedirectStandardOutput = true,
 RedirectStandardError = true
 };

 var process = Process.Start(startInfo);
 string output = await process.StandardOutput.ReadToEndAsync();
 string error = await process.StandardError.ReadToEndAsync();

 await process.WaitForExitAsync();

 if (process.ExitCode == 0)
 {
 // Start the FFmpeg conversion in parallel
 _ = Task.Run(() => ConvertToMp4(inputFilePath, outputFilePath));

 MessageBox.Show("İndirme tamamlandı. Video dönüştürülüyor.", "Bilgi", MessageBoxButtons.OK, MessageBoxIcon.Information);

 searchBox.Text = "";
 searchBox.Enabled = true;
 btnDownload.Enabled = true;

 pictureBox2.Visible = false;
 pictureBox3.Visible = true;
 }
 else
 {
 MessageBox.Show("Lütfen sadece video linki giriniz", "Uyarı", MessageBoxButtons.OK, MessageBoxIcon.Warning);

 pictureBox2.Visible = false;
 pictureBox3.Visible = false;
 }
 }
 catch (Exception ex)
 {
 MessageBox.Show("Hata: " + ex.Message, "Hata", MessageBoxButtons.OK, MessageBoxIcon.Error);

 pictureBox2.Visible = false;
 pictureBox3.Visible = false;
 }
 }

 private async Task<string> GetVideoTitle(string url)
 {
 try
 {
 var ytDlpPath = Path.Combine(Application.StartupPath, "files", "yt-dlp.exe");

 var startInfo = new ProcessStartInfo()
 {
 FileName = ytDlpPath,
 Arguments = $"-e {url}",
 UseShellExecute = false,
 CreateNoWindow = true,
 RedirectStandardOutput = true,
 RedirectStandardError = true
 };

 var process = Process.Start(startInfo);
 string output = await process.StandardOutput.ReadToEndAsync();
 await process.WaitForExitAsync();

 return output.Trim();
 }
 catch (Exception ex)
 {
 MessageBox.Show("Hata: " + ex.Message, "Hata", MessageBoxButtons.OK, MessageBoxIcon.Error);
 return null;
 }
 }

 private async Task ConvertToMp4(string inputFilePath, string outputFilePath)
 {
 try
 {
 var ffmpegPath = Path.Combine(Application.StartupPath, "files", "ffmpeg.exe");

 var startInfo = new ProcessStartInfo
 {
 FileName = ffmpegPath,
 Arguments = $"-i \"{inputFilePath}\" -c:v libx264 -preset ultrafast -crf 23 -s hd1080 \"{outputFilePath}\"",
 UseShellExecute = false,
 CreateNoWindow = true,
 RedirectStandardOutput = true,
 RedirectStandardError = true
 };

 var process = Process.Start(startInfo);
 string output = await process.StandardOutput.ReadToEndAsync();
 string error = await process.StandardError.ReadToEndAsync();

 await process.WaitForExitAsync();

 if (process.ExitCode == 0)
 {
 MessageBox.Show("Dönüştürme işlemi başarılı.", "Bilgi", MessageBoxButtons.OK, MessageBoxIcon.Information);

 if (File.Exists(inputFilePath))
 {
 File.Delete(inputFilePath);
 }
 }
 else
 {
 MessageBox.Show("Dönüştürme sırasında bir hata oluştu: " + error, "Hata", MessageBoxButtons.OK, MessageBoxIcon.Error);
 }
 }
 catch (Exception ex)
 {
 MessageBox.Show("Hata: " + ex.Message, "Hata", MessageBoxButtons.OK, MessageBoxIcon.Error);
 }
 }

 private Form2 form2;

 private void button2_Click(object sender, EventArgs e)
 {
 if (form2 == null || form2.IsDisposed)
 {
 form2 = new Form2();
 form2.Show();
 }
 else
 {
 form2.BringToFront();
 }
 }
 }
}


Downloading with yt-dlp.exe is successful. As soon as the video is downloaded, I want to convert it to mp4 with ffmpeg, but it won't convert without stopping the project.
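For what it's worth, the pattern being aimed at (start ffmpeg, let the UI keep running, and only react once the process exits) looks like this in Python with subprocess and a worker thread; the file names and codec options are placeholders that mirror the ConvertToMp4 arguments above, and the same idea applies to Process together with Task.Run or await in the C# code:

import subprocess
import threading

def convert_async(input_path, output_path, on_done):
    """Run ffmpeg in a background thread so the caller is never blocked."""
    def worker():
        # Placeholder arguments mirroring the C# ConvertToMp4 options.
        cmd = ["ffmpeg", "-y", "-i", input_path,
               "-c:v", "libx264", "-preset", "ultrafast", "-crf", "23",
               output_path]
        result = subprocess.run(cmd, capture_output=True, text=True)
        on_done(result.returncode == 0, result.stderr)

    threading.Thread(target=worker, daemon=True).start()

# convert_async("downloaded.mp4", "converted.mp4",
#               lambda ok, err: print("conversion finished:", ok))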


-
MOV to AVCHD conversion via Spring Boot and FFmpeg leads to file system error
31 December 2024, by epicUsername. I am experiencing an issue in a personal project that seeks to convert HEIC files to JPG and MOV files to AVCHD format. The HEIC to JPG conversion works, but the MOV to AVCHD does not, which is where my problems lie.


The intent is to do this with Spring Boot and FFmpeg, using a simple interface done in WindowBuilder.
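For orientation (my own reference point, not part of the project): the MOV-to-AVCHD-style conversion attempted below, MPEG-2 video at 1920x1080 and 4 Mbit/s muxed into MPEG-TS, corresponds roughly to a single ffmpeg command line. A sketch using Python's subprocess, with placeholder file names, just to pin down the target parameters:

import subprocess

# Rough CLI equivalent of the JavaCV pipeline below: decode the MOV,
# re-encode as MPEG-2 video at 1920x1080 / 4 Mbit/s / 25 fps, mux into MPEG-TS.
cmd = [
    "ffmpeg", "-y",
    "-i", "input.mov",      # placeholder source file
    "-c:v", "mpeg2video",
    "-b:v", "4000000",
    "-s", "1920x1080",
    "-r", "25",
    "-g", "12",
    "-pix_fmt", "yuv420p",
    "-f", "mpegts",
    "output.m2ts",          # placeholder destination file
]
subprocess.run(cmd, check=True)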


The relevant bits are the pom file:


<dependencies>

    <dependency>
        <groupId>jmagick</groupId>
        <artifactId>jmagick</artifactId>
        <version>6.6.9</version>
    </dependency>

    <dependency>
        <groupId>net.java.dev.jna</groupId>
        <artifactId>jna</artifactId>
        <version>5.7.0</version>
    </dependency>
    <dependency>
        <groupId>net.java.dev.jna</groupId>
        <artifactId>jna-platform</artifactId>
        <version>5.7.0</version>
    </dependency>

    <dependency>
        <groupId>org.bytedeco</groupId>
        <artifactId>ffmpeg</artifactId>
        <version>7.1-1.5.11</version>
    </dependency>
    <dependency>
        <groupId>org.bytedeco</groupId>
        <artifactId>javacv</artifactId>
        <version>1.5.11</version>
    </dependency>
    <dependency>
        <groupId>org.bytedeco</groupId>
        <artifactId>ffmpeg-platform</artifactId>
        <version>7.1-1.5.11</version>
    </dependency>

    <dependency>
        <groupId>org.bytedeco</groupId>
        <artifactId>javacpp</artifactId>
        <version>1.5.11</version>
    </dependency>
</dependencies>




and the main file with the event handling for the application, based on the interface:


package home.multimeida.mmconverter;

imports...

public class MMConverterInterface extends JFrame {

 public static void main(String[] args) {
 
 
 try {
 System.setProperty("jna.library.path", "absolute/path/to/gstreamer/bin");
 // Gst.init("GStreamer Test");
 System.out.println("GStreamer initialized successfully.");
 } catch (Exception e) {
 e.printStackTrace();
 System.out.println("Failed to initialize GStreamer.");
 }
 EventQueue.invokeLater(new Runnable() {
 public void run() {
 try {
 MMConverterInterface frame = new MMConverterInterface();
 frame.setVisible(true);
 } catch (Exception e) {
 e.printStackTrace();
 }
 }
 });
 }

 /**
 * Create the frame.
 */
 public MMConverterInterface() {
 
 // convert button
 
 btnConvert.addActionListener(e -> {
 
 try {
 
 if (sourceFileLabel.getText().equals("No file chosen...") || destinationFolderLabel.getText().equals("No folder selected...")) {
 JOptionPane.showMessageDialog(null, "Please select both an input file and a save location.", "Validation Error", JOptionPane.WARNING_MESSAGE);
 return;
 }
 
 File sourceFile = new File(sourceFileLabel.getText());
 File destinationFile;
 
 if (rdbtnNewRadioButton.isSelected()) {
 
 System.out.println("Converting HEIC to JPG...");
 
 String outputFileName = sourceFile.getName().replaceFirst("[.][^.]+$", ".jpg");
 
 // Call your conversion logic here
 
 destinationFile = new File(destinationFolderLabel.getText(), outputFileName);
 
 convertHeicToJpg(sourceFile, destinationFile);
 
 } else if (rdbtnNewRadioButton_1.isSelected()) {
 
 if (sourceFileLabel.getText().equals("No file chosen...") || destinationFolderLabel.getText().equals("No folder selected...")) {
 JOptionPane.showMessageDialog(null, "Please select both an input file and a save location.", "Validation Error", JOptionPane.WARNING_MESSAGE);
 return;
 }
 
 // Validate source file
 if (!sourceFile.exists() || !sourceFile.canRead()) {
 JOptionPane.showMessageDialog(null, "Source file does not exist or is not readable.", "File Error", JOptionPane.ERROR_MESSAGE);
 return;
 }
 
 // Validate destination folder
 String destinationPath = destinationFolderLabel.getText();
 if (destinationPath == null || destinationPath.isEmpty() || !(new File(destinationPath).isDirectory())) {
 JOptionPane.showMessageDialog(null, "Invalid destination folder.", "File Error", JOptionPane.ERROR_MESSAGE);
 return;
 }
 
 System.out.println("Converting MOV to AVCHD...");
 
 String currentDate = new SimpleDateFormat("yyyyMMdd").format(new Date());

 // Extract the file name without the extension
 String baseName = sourceFile.getName().replaceFirst("[.][^.]+$", "");

 // Sanitize the base name (replace invalid characters with '_')
 baseName = baseName.replaceAll("[^a-zA-Z0-9-_]", "_");
 
 String sanitizedFileName = baseName + "_" + currentDate;
 sanitizedFileName = sanitizedFileName.replaceAll("[^a-zA-Z0-9._-]", "_"); // Allow alphanumeric, '-', '_', and '.'

 destinationFile = new File(destinationPath, sanitizedFileName);
 
 
 /*
 // Ensure the destination file is writable
 if (!destinationFile.canWrite()) {
 JOptionPane.showMessageDialog(null, "Output file is not writable.", "File Error", JOptionPane.ERROR_MESSAGE);
 return;
 }
 */
 

 convertMovToAvchd(sourceFile, destinationFile);
 
 } else {
 
 JOptionPane.showMessageDialog(null, "Please select a conversion type.");
 
 }
 
 } catch (Exception ex) {
 
 JOptionPane.showMessageDialog(null, "Error: " + ex.getMessage(), "Conversion Error", JOptionPane.ERROR_MESSAGE);
 ex.printStackTrace();
 }
 
 
 });
 
 // cancel button:
 
 btnCancel.addActionListener(e -> {
 System.out.println("Operation canceled.");
 System.exit(0); // Close the application
 });

 }
 
 public void convertMovToAvchd(File sourceFile, File destinationFile) {
 avutil.av_log_set_level(avutil.AV_LOG_DEBUG);
 
 

 AVFormatContext inputFormatContext = null;
 AVFormatContext outputFormatContext = new AVFormatContext(null);
 AVCodecContext inputCodecContext = null;
 AVCodecContext outputCodecContext = null;

 try {
 // Validate input file
 if (!sourceFile.exists() || !sourceFile.canRead()) {
 System.out.println("Source file does not exist or is not readable: " + sourceFile.getAbsolutePath());
 return;
 }
 
 // Validate output file path using the validateFileCreation method
 if (!validateFileCreation(destinationFile)) {
 return; // Exit if destination file validation fails
 }

 // Validate output file path
 if (destinationFile.getParentFile() == null || !destinationFile.getParentFile().exists()) {
 System.out.println("Output directory does not exist: " + destinationFile.getParentFile());
 return;
 }
 if (!destinationFile.getParentFile().canWrite()) {
 System.out.println("Output directory is not writable: " + destinationFile.getParentFile());
 return;
 }

 // Open input file
 inputFormatContext = avformat.avformat_alloc_context();
 if (avformat.avformat_open_input(inputFormatContext, sourceFile.getAbsolutePath(), null, null) < 0) {
 System.out.println("Failed to open input file: " + sourceFile.getAbsolutePath());
 return;
 }

 // Find stream information
 if (avformat.avformat_find_stream_info(inputFormatContext, (PointerPointer) null) < 0) {
 System.out.println("Failed to retrieve input stream information.");
 return;
 }

 // Find video stream
 int videoStreamIndex = avformat.av_find_best_stream(inputFormatContext, avutil.AVMEDIA_TYPE_VIDEO, -1, -1, (AVCodec) null, 0);
 if (videoStreamIndex < 0) {
 System.out.println("Failed to find video stream in input file.");
 return;
 }

 // Initialize input codec context
 inputCodecContext = avcodec.avcodec_alloc_context3(null);
 avcodec.avcodec_parameters_to_context(inputCodecContext, inputFormatContext.streams(videoStreamIndex).codecpar());

 AVCodec decoder = avcodec.avcodec_find_decoder(inputCodecContext.codec_id());
 if (decoder == null || avcodec.avcodec_open2(inputCodecContext, decoder, (PointerPointer) null) < 0) {
 System.out.println("Failed to open video decoder.");
 return;
 }

 // Allocate output format context
 if (avformat.avformat_alloc_output_context2(outputFormatContext, null, "mpegts", destinationFile.getAbsolutePath()) < 0) {
 System.out.println("Failed to allocate output format context.");
 return;
 }

 // Initialize output codec
 AVCodec encoder = avcodec.avcodec_find_encoder_by_name("mpeg2video");
 if (encoder == null) {
 System.out.println("Failed to find MPEG2 video encoder.");
 return;
 }

 outputCodecContext = avcodec.avcodec_alloc_context3(encoder);
 if (outputCodecContext == null) {
 System.out.println("Failed to allocate output codec context.");
 return;
 }
 
 if ((outputFormatContext.oformat().flags() & avformat.AVFMT_GLOBALHEADER) != 0) {
 outputCodecContext.flags(outputCodecContext.flags() | avcodec.AV_CODEC_FLAG_GLOBAL_HEADER);
 }


 //outputCodecContext.codec_id(avcodec.AV_CODEC_ID_MPEG2VIDEO);
 outputCodecContext.codec_id(encoder.id());
 outputCodecContext.codec_type(avutil.AVMEDIA_TYPE_VIDEO);
 outputCodecContext.width(1920);
 outputCodecContext.height(1080);
 outputCodecContext.pix_fmt(avutil.AV_PIX_FMT_YUV420P);
 outputCodecContext.time_base(avutil.av_make_q(1, 25));
 outputCodecContext.bit_rate(4000000);
 outputCodecContext.gop_size(12);

 if ((outputFormatContext.oformat().flags() & avformat.AVFMT_GLOBALHEADER) != 0) {
 outputCodecContext.flags(outputCodecContext.flags() | avcodec.AV_CODEC_FLAG_GLOBAL_HEADER);
 }

 
 
 if (avcodec.avcodec_open2(outputCodecContext, encoder, (PointerPointer) null) < 0) {
 System.out.println("Failed to open video encoder.");
 return;
 }

 // Create output stream
 AVStream videoStream = avformat.avformat_new_stream(outputFormatContext, encoder);
 if (videoStream == null) {
 System.out.println("Failed to create video stream.");
 return;
 }

 avcodec.avcodec_parameters_from_context(videoStream.codecpar(), outputCodecContext);
 
 System.out.println("Destination file path before trying to open the file is: " + destinationFile);

 if ((outputFormatContext.oformat().flags() & avformat.AVFMT_NOFILE) == 0) {
 // Ensure the output path has the correct extension
 String outputPath = destinationFile.getAbsolutePath().replace("\\", "/") + ".avchd";
 System.out.println("Normalized output path: " + outputPath);

 // Try opening the output file
 int ret = avformat.avio_open(outputFormatContext.pb(), outputPath, avformat.AVIO_FLAG_WRITE);
 if (ret < 0) {
 BytePointer errorBuffer = new BytePointer(avutil.AV_ERROR_MAX_STRING_SIZE);
 avutil.av_strerror(ret, errorBuffer, errorBuffer.capacity());
 System.out.println("Failed to open output file: " + errorBuffer.getString());
 return;
 }
 }


 // Write header
 if (avformat.avformat_write_header(outputFormatContext, (PointerPointer) null) < 0) {
 System.out.println("Failed to write header to output file.");
 return;
 }


 // Packet processing loop
 AVPacket packet = new AVPacket();
 while (avformat.av_read_frame(inputFormatContext, packet) >= 0) {
 if (packet.stream_index() == videoStreamIndex) {
 if (avcodec.avcodec_send_packet(inputCodecContext, packet) >= 0) {
 AVFrame frame = avutil.av_frame_alloc();
 while (avcodec.avcodec_receive_frame(inputCodecContext, frame) >= 0) {
 if (avcodec.avcodec_send_frame(outputCodecContext, frame) >= 0) {
 AVPacket encodedPacket = new AVPacket();
 while (avcodec.avcodec_receive_packet(outputCodecContext, encodedPacket) >= 0) {
 encodedPacket.stream_index(videoStream.index());
 avformat.av_interleaved_write_frame(outputFormatContext, encodedPacket);
 avcodec.av_packet_unref(encodedPacket);
 }
 }
 avutil.av_frame_unref(frame);
 }
 avutil.av_frame_free(frame);
 }
 }
 avcodec.av_packet_unref(packet);
 }

 // Write trailer
 avformat.av_write_trailer(outputFormatContext);
 System.out.println("Conversion completed successfully.");
 
 if (avcodec.avcodec_send_frame(outputCodecContext, null) >= 0) {
 AVPacket encodedPacket = new AVPacket();
 while (avcodec.avcodec_receive_packet(outputCodecContext, encodedPacket) >= 0) {
 encodedPacket.stream_index(videoStream.index());
 avformat.av_interleaved_write_frame(outputFormatContext, encodedPacket);
 avcodec.av_packet_unref(encodedPacket);
 }
 }

 } catch (Exception e) {
 e.printStackTrace();
 } finally {
 // Cleanup
 avcodec.avcodec_free_context(inputCodecContext);
 avcodec.avcodec_free_context(outputCodecContext);
 avformat.avformat_close_input(inputFormatContext);

 if (outputFormatContext != null && outputFormatContext.pb() != null) {
 avformat.avio_closep(outputFormatContext.pb());
 }
 avformat.avformat_free_context(outputFormatContext);
 }
 }
 
 private boolean validateFileCreation(File destinationFile) {
 // Check if the parent directory exists and is writable
 File parentDir = destinationFile.getParentFile();
 if (parentDir == null || !parentDir.exists()) {
 System.out.println("Parent directory does not exist: " + parentDir);
 return false;
 }
 if (!parentDir.canWrite()) {
 System.out.println("Cannot write to parent directory: " + parentDir);
 return false;
 }

 // Check if the file exists and is writable
 if (destinationFile.exists()) {
 if (!destinationFile.canWrite()) {
 System.out.println("Destination file is not writable: " + destinationFile);
 return false;
 }
 } else {
 // If the file doesn't exist, try to create it to verify writability
 try {
 if (!destinationFile.createNewFile()) {
 System.out.println("Unable to create destination file: " + destinationFile);
 return false;
 }
 // Delete the file after successful creation to avoid residual files
 destinationFile.delete();
 } catch (IOException e) {
 System.out.println("File creation failed: " + e.getMessage());
 return false;
 }
 }

 return true;
 }
 
}





A few caveats:


-
I did explore both FFmpeg and GStreamer for this project. GStreamer was inconclusive: the versions available for it were too old to use with my current setup of STS 4.27 and Java 17, even though this version of Java is under long-term support...


-
I've used AI to tell me about the options and to suggest ways to build this thing, since multimedia handling is very far from my skill set. I don't have a good conceptual grasp of video formats and how they transform from one to another.








The issue, as I have identified it, occurs at these lines:


// Ensure the destination file is writable
 if (!destinationFile.canWrite()) {
 JOptionPane.showMessageDialog(null, "Output file is not writable.", "File Error", JOptionPane.ERROR_MESSAGE);
 return;
 }



^^ This check, while temporarily commented out for testing, is meant to compensate for an issue that occurs here in the conversion function:


if ((outputFormatContext.oformat().flags() & avformat.AVFMT_NOFILE) == 0) {
 // Ensure the output path has the correct extension
 String outputPath = destinationFile.getAbsolutePath().replace("\\", "/") + ".avchd";
 System.out.println("Normalized output path: " + outputPath);

 // Try opening the output file
 int ret = avformat.avio_open(outputFormatContext.pb(), outputPath, avformat.AVIO_FLAG_WRITE);
 if (ret < 0) {
 BytePointer errorBuffer = new BytePointer(avutil.AV_ERROR_MAX_STRING_SIZE);
 avutil.av_strerror(ret, errorBuffer, errorBuffer.capacity());
 System.out.println("Failed to open output file: " + errorBuffer.getString());
 return;
 }
 }



The idea here is that the avio_open() function requires a valid file path that it can open in order to be able to write to it.


Paradoxically, the file conversion seems to work, but the project crashes with a fatal error in the console:


Selected file: E:\TestConveresions\sample_960x540.mov
Save location: E:\TestConveresions
Converting MOV to AVCHD...
Destination file path before trying to open the file is: E:\TestConveresions\sample_960x540_20241231
Normalized output path: E:/TestConveresions/sample_960x540_20241231.avchd
#
# A fatal error has been detected by the Java Runtime Environment:
#
# EXCEPTION_ACCESS_VIOLATION (0xc0000005) at pc=0x00007ffcffb0868b, pid=11020, tid=14436
#
# JRE version: OpenJDK Runtime Environment Temurin-21.0.5+11 (21.0.5+11) (build 21.0.5+11-LTS)
# Java VM: OpenJDK 64-Bit Server VM Temurin-21.0.5+11 (21.0.5+11-LTS, mixed mode, emulated-client, sharing, tiered, compressed oops, compressed class ptrs, g1 gc, windows-amd64)
# Problematic frame:
# C 0x00007ffcffb0868b
#
# No core dump will be written. Minidumps are not enabled by default on client versions of Windows
#
# An error report file with more information is saved as:
# E:\STS4 Workspace\MMConverter\hs_err_pid11020.log
[80.882s][warning][os] Loading hsdis library failed
#
# If you would like to submit a bug report, please visit:
# https://github.com/adoptium/adoptium-support/issues
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#
[AVFormatContext @ 000002528adcea40] Opening 'E:\TestConveresions\sample_960x540.mov' for reading
[file @ 000002528ae51c40] Setting default whitelist 'file,crypto,data'
[mov,mp4,m4a,3gp,3g2,mj2 @ 000002528adcea40] Format mov,mp4,m4a,3gp,3g2,mj2 probed with size=2048 and score=100
[mov,mp4,m4a,3gp,3g2,mj2 @ 000002528adcea40] ISO: File Type Major Brand: qt 
[mov,mp4,m4a,3gp,3g2,mj2 @ 000002528adcea40] Unknown dref type 0x206c7275 size 12
[mov,mp4,m4a,3gp,3g2,mj2 @ 000002528adcea40] Processing st: 0, edit list 0 - media time: 2002, duration: 400410
[mov,mp4,m4a,3gp,3g2,mj2 @ 000002528adcea40] Offset DTS by 2002 to make first pts zero.
[mov,mp4,m4a,3gp,3g2,mj2 @ 000002528adcea40] Setting codecpar->delay to 2 for stream st: 0
[mov,mp4,m4a,3gp,3g2,mj2 @ 000002528adcea40] Before avformat_find_stream_info() pos: 1320742 bytes read:38225 seeks:1 nb_streams:1
[h264 @ 000002528ae62780] nal_unit_type: 7(SPS), nal_ref_idc: 3
[h264 @ 000002528ae62780] Decoding VUI
[h264 @ 000002528ae62780] nal_unit_type: 8(PPS), nal_ref_idc: 3
[h264 @ 000002528ae62780] nal_unit_type: 7(SPS), nal_ref_idc: 3
[h264 @ 000002528ae62780] Decoding VUI
[h264 @ 000002528ae62780] nal_unit_type: 8(PPS), nal_ref_idc: 3
[h264 @ 000002528ae62780] nal_unit_type: 6(SEI), nal_ref_idc: 0
[h264 @ 000002528ae62780] nal_unit_type: 5(IDR), nal_ref_idc: 3
[h264 @ 000002528ae62780] Format yuv420p chosen by get_format().
[h264 @ 000002528ae62780] Reinit context to 960x544, pix_fmt: yuv420p
[h264 @ 000002528ae62780] no picture 
[mov,mp4,m4a,3gp,3g2,mj2 @ 000002528adcea40] All info found
[mov,mp4,m4a,3gp,3g2,mj2 @ 000002528adcea40] After avformat_find_stream_info() pos: 51943 bytes read:90132 seeks:2 frames:1
[h264 @ 000002528ae62780] nal_unit_type: 7(SPS), nal_ref_idc: 3
[h264 @ 000002528ae62780] Decoding VUI
[h264 @ 000002528ae62780] nal_unit_type: 8(PPS), nal_ref_idc: 3
[mpeg2video @ 000002528ae8e700] intra_quant_bias = 96 inter_quant_bias = 0




If I refer to the error log, I get the following. It is partial, as I'm not sure SO will take all of it (it's quite long), but it might still have enough to be relevant:


Host: Intel(R) Core(TM) i7-8550U CPU @ 1.80GHz, 8 cores, 31G, Windows 11 , 64 bit Build 26100 (10.0.26100.2454)


--------------- T H R E A D ---------------

Current thread (0x00000252d030b340): JavaThread "AWT-EventQueue-0" [_thread_in_native, id=14436, stack(0x000000a4e2b00000,0x000000a4e2c00000) (1024K)]

Stack: [0x000000a4e2b00000,0x000000a4e2c00000], sp=0x000000a4e2bfdf30, free space=1015k
Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
C 0x00007ffcffb0868b

Java frames: (J=compiled Java code, j=interpreted, Vv=VM code)
j org.bytedeco.ffmpeg.global.avformat.avio_open(Lorg/bytedeco/ffmpeg/avformat/AVIOContext;Ljava/lang/String;I)I+0
j home.multimeida.mmconverter.MMConverterInterface.convertMovToAvchd(Ljava/io/File;Ljava/io/File;)V+1120
j home.multimeida.mmconverter.MMConverterInterface.lambda$2(Ljavax/swing/JRadioButton;Ljavax/swing/JRadioButton;Ljava/awt/event/ActionEvent;)V+347
j home.multimeida.mmconverter.MMConverterInterface$$Lambda+0x000002528c0c7778.actionPerformed(Ljava/awt/event/ActionEvent;)V+13
j javax.swing.AbstractButton.fireActionPerformed(Ljava/awt/event/ActionEvent;)V+84 java.desktop@21.0.5
j javax.swing.AbstractButton$Handler.actionPerformed(Ljava/awt/event/ActionEvent;)V+5 java.desktop@21.0.5
j javax.swing.DefaultButtonModel.fireActionPerformed(Ljava/awt/event/ActionEvent;)V+34 java.desktop@21.0.5
j javax.swing.DefaultButtonModel.setPressed(Z)V+117 java.desktop@21.0.5
j javax.swing.plaf.basic.BasicButtonListener.mouseReleased(Ljava/awt/event/MouseEvent;)V+35 java.desktop@21.0.5
j java.awt.Component.processMouseEvent(Ljava/awt/event/MouseEvent;)V+64 java.desktop@21.0.5
j javax.swing.JComponent.processMouseEvent(Ljava/awt/event/MouseEvent;)V+23 java.desktop@21.0.5
J 2581 c1 java.awt.Component.processEvent(Ljava/awt/AWTEvent;)V java.desktop@21.0.5 (220 bytes) @ 0x00000252fa62719c [0x00000252fa627020+0x000000000000017c]
J 2580 c1 java.awt.Container.processEvent(Ljava/awt/AWTEvent;)V java.desktop@21.0.5 (22 bytes) @ 0x00000252fa627d9c [0x00000252fa627cc0+0x00000000000000dc]
J 2406 c1 java.awt.Component.dispatchEventImpl(Ljava/awt/AWTEvent;)V java.desktop@21.0.5 (785 bytes) @ 0x00000252fa670f14 [0x00000252fa670040+0x0000000000000ed4]
J 2325 c1 java.awt.Container.dispatchEventImpl(Ljava/awt/AWTEvent;)V java.desktop@21.0.5 (129 bytes) @ 0x00000252fa64e93c [0x00000252fa64e8a0+0x000000000000009c]
J 2608 c1 java.awt.LightweightDispatcher.retargetMouseEvent(Ljava/awt/Component;ILjava/awt/event/MouseEvent;)V java.desktop@21.0.5 (372 bytes) @ 0x00000252fa61c364 [0x00000252fa61b9e0+0x0000000000000984]
J 2578 c1 java.awt.LightweightDispatcher.processMouseEvent(Ljava/awt/event/MouseEvent;)Z java.desktop@21.0.5 (268 bytes) @ 0x00000252fa628a54 [0x00000252fa6284c0+0x0000000000000594]
J 2474 c1 java.awt.LightweightDispatcher.dispatchEvent(Ljava/awt/AWTEvent;)Z java.desktop@21.0.5 (73 bytes) @ 0x00000252fa699bbc [0x00000252fa699a60+0x000000000000015c]
J 2325 c1 java.awt.Container.dispatchEventImpl(Ljava/awt/AWTEvent;)V java.desktop@21.0.5 (129 bytes) @ 0x00000252fa64e914 [0x00000252fa64e8a0+0x0000000000000074]
J 2473 c1 java.awt.Window.dispatchEventImpl(Ljava/awt/AWTEvent;)V java.desktop@21.0.5 (23 bytes) @ 0x00000252fa699654 [0x00000252fa6994e0+0x0000000000000174]
J 1838 c1 java.awt.EventQueue.dispatchEventImpl(Ljava/awt/AWTEvent;Ljava/lang/Object;)V java.desktop@21.0.5 (139 bytes) @ 0x00000252fa3bec64 [0x00000252fa3beb20+0x0000000000000144]
J 1837 c1 java.awt.EventQueue$4.run()Ljava/lang/Void; java.desktop@21.0.5 (60 bytes) @ 0x00000252fa3c0504 [0x00000252fa3c0460+0x00000000000000a4]
J 1836 c1 java.awt.EventQueue$4.run()Ljava/lang/Object; java.desktop@21.0.5 (5 bytes) @ 0x00000252fa3c0a04 [0x00000252fa3c09c0+0x0000000000000044]
J 1778 c1 java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(Ljava/security/PrivilegedAction;Ljava/security/AccessControlContext;Ljava/security/AccessControlContext;)Ljava/lang/Object; java.base@21.0.5 (22 bytes) @ 0x00000252fa4601d4 [0x00000252fa45ffa0+0x0000000000000234]
J 1832 c1 java.awt.EventQueue.dispatchEvent(Ljava/awt/AWTEvent;)V java.desktop@21.0.5 (80 bytes) @ 0x00000252fa44f14c [0x00000252fa44eae0+0x000000000000066c]
J 1846 c1 java.awt.EventDispatchThread.pumpOneEventForFilters(I)V java.desktop@21.0.5 (106 bytes) @ 0x00000252fa3ba544 [0x00000252fa3ba2e0+0x0000000000000264]
j java.awt.EventDispatchThread.pumpEventsForFilter(ILjava/awt/Conditional;Ljava/awt/EventFilter;)V+35 java.desktop@21.0.5
j java.awt.EventDispatchThread.pumpEventsForHierarchy(ILjava/awt/Conditional;Ljava/awt/Component;)V+11 java.desktop@21.0.5
j java.awt.EventDispatchThread.pumpEvents(ILjava/awt/Conditional;)V+4 java.desktop@21.0.5
j java.awt.EventDispatchThread.pumpEvents(Ljava/awt/Conditional;)V+3 java.desktop@21.0.5
j java.awt.EventDispatchThread.run()V+9 java.desktop@21.0.5
v ~StubRoutines::call_stub 0x00000252fa08100d

siginfo: EXCEPTION_ACCESS_VIOLATION (0xc0000005), writing address 0x0000000000000000




If anyone has a perspective on this, it'd be appreciated.


The catch-22 in this project is that pre-creating the file is not a good idea, since avio_open has a purpose-built method for that (I tried). Error-checking everything about Java's File class in terms of setting paths and creating and deleting files is not problematic. Likewise, permissions are all fine (Full Control in the source and target folders); I've tested default C: drive folders, which have restrictions, as well as a separate volume and removable media, to no effect. Likewise, FFmpeg requires forward slashes ("/") in file paths, whereas Java generally uses backslashes; that has been handled with the replace method in the conditioning above, also to no effect.


The basic contradiction in the project seems to be that the error comes from trying to open a file that does not exist, even though the source and destination files are valid, and if I try to create a placeholder file with an .avchd extension in the event handling for the Convert button, it still errors out. Meanwhile, FFmpeg allegedly handles the file creation at its core, but requires a valid path to be passed; I've tried with and without a filename, with and without an extension. I'm not able to resolve it.


The excessive error-handling conditions are an effort to isolate the problem, which I think I've done.


There also seems to be a compatibility between mpegts and AVCHD, which is why I also had that format specified in the conversion function, without result.


I also want to be able to do this without having to install any libraries locally or set path variables, which is something both GStreamer and FFmpeg otherwise involve.


The nearest suggestion I've found is this: integrate ffmpeg with spring boot


AI remains hopeless for resolving this issue.


-