
Other articles (59)
-
Sites built with MediaSPIP
2 May 2011
This page presents some of the sites running on MediaSPIP.
You can of course add yours using the form at the bottom of the page.
-
Automatic backup of SPIP channels
1 April 2010
As part of setting up an open platform, it is important for hosting providers to have fairly regular backups available in order to guard against any potential problem.
This task relies on two SPIP plugins: Saveauto, which makes a regular backup of the database as a MySQL dump (usable in phpMyAdmin), and mes_fichiers_2, which builds a zip archive of the site's important data (documents, elements (...)
-
Automatic installation script for MediaSPIP
25 April 2011
To work around installation difficulties due mainly to server-side software dependencies, an all-in-one bash installation script was created to make this step easier on a server running a compatible Linux distribution.
You need SSH access to your server and a "root" account in order to use it, so that the dependencies can be installed. Contact your hosting provider if you do not have these.
The documentation on how to use the installation script (...)
On other sites (6341)
-
Create mp4 thumbnail in node.js
21 May 2015, by trdavidson
New to node.js and the AWS framework, so I apologize in advance. I am trying to configure the AWS back end of my app to automatically create thumbnails using AWS Lambda. This works great with the example provided by Amazon for regular .jpg images (walkthrough here: https://alestic.com/2014/11/aws-lambda-cli/).
However, doing the same operation for mp4 files seems far more difficult. After some searching it appears that the way to do this is with the ffmpeg module. The problem is that I do not fully understand the response object returned by AWS, and so I am not sure how to manipulate it so that ffmpeg can use it.
Current code:
// dependencies
var async = require('async');
var AWS = require('aws-sdk');
var gm = require('gm')
    .subClass({ imageMagick: true }); // Enable ImageMagick integration.
var util = require('util');
var ffmpeg = require('ffmpeg');
var stream = require('stream');

// constants
var MAX_WIDTH = 250;
var MAX_HEIGHT = 250;

// get reference to S3 client
var s3 = new AWS.S3();

exports.handler = function(event, context) {
    // Read options from the event.
    console.log("Reading options from event:\n", util.inspect(event, {depth: 5}));
    var srcBucket = event.Records[0].s3.bucket.name;
    // Object key may have spaces or unicode non-ASCII characters.
    var srcKey =
        decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, " "));
    var dstBucket = srcBucket + "small";
    var dstKey = "small-" + srcKey;

    // Sanity check: validate that source and destination are different buckets.
    if (srcBucket == dstBucket) {
        console.error("Destination bucket must not match source bucket.");
        return;
    }

    // Infer the file type from the key's extension.
    var typeMatch = srcKey.match(/\.([^.]*)$/);
    if (!typeMatch) {
        console.error('unable to infer file type for key ' + srcKey);
        return;
    }
    var imageType = typeMatch[1];
    if (imageType != "mp4" && imageType != "avi") {
        console.log('skipping non-video ' + srcKey);
        return;
    }

    // Download the video from S3, transform, and upload to a different S3 bucket.
    async.waterfall([
        function download(next) {
            // Download the video from S3 into a buffer.
            s3.getObject({
                Bucket: srcBucket,
                Key: srcKey
            }, next);
        },
        function transform(response, next) {
            // Wrap the downloaded buffer in a readable stream for ffmpeg.
            var instream = new stream.Readable();
            instream.push(response.Body);
            instream.push(null);
            var outstream = new stream();

            // Grab a single frame from the video as a thumbnail.
            ffmpeg(instream)
                .screenshots({timestamps: 1, size: '200x200'})
                .output('screenshot.png')
                .output(outstream)
                .on('end', function() {
                    console.log('screenshots finished processing!');
                });

            gm(outstream, 'screenshot.png').size(function(err, size) {
                // Infer the scaling factor to avoid stretching the image unnaturally.
                var scalingFactor = Math.min(
                    MAX_WIDTH / size.width,
                    MAX_HEIGHT / size.height
                );
                var width = scalingFactor * size.width;
                var height = scalingFactor * size.height;

                // Transform the image buffer in memory.
                this.resize(width, height)
                    .toBuffer(imageType, function(err, buffer) {
                        if (err) {
                            next(err);
                        } else {
                            next(null, response.ContentType, buffer);
                        }
                    });
            });
        },
        function upload(contentType, data, next) {
            // Stream the transformed image to a different S3 bucket.
            s3.putObject({
                Bucket: dstBucket,
                Key: dstKey,
                Body: data,
                ContentType: contentType
            }, next);
        }
    ], function (err) {
        if (err) {
            console.error(
                'Unable to resize ' + srcBucket + '/' + srcKey +
                ' and upload to ' + dstBucket + '/' + dstKey +
                ' due to an error: ' + err
            );
        } else {
            console.log(
                'Successfully resized ' + srcBucket + '/' + srcKey +
                ' and uploaded to ' + dstBucket + '/' + dstKey
            );
        }
        context.done();
    });
};
Any suggestions are welcome! Thanks.
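One possible direction, offered only as a rough sketch rather than a tested fix: the .screenshots() call used above matches the API of the fluent-ffmpeg package (not the ffmpeg one), and it tends to be simpler to work through files in Lambda's writable /tmp directory than through in-memory streams. Something along these lines could stand in for the transform step; the paths, the 1-second timestamp and the 'image/png' content type are illustrative assumptions, and gm, MAX_WIDTH and MAX_HEIGHT come from the handler above.

// Sketch only: a possible replacement for the transform step above.
// Assumes the fluent-ffmpeg package and an ffmpeg binary available to the function.
var fs = require('fs');
var ffmpeg = require('fluent-ffmpeg');

function transform(response, next) {
    var videoPath = '/tmp/source.mp4';   // hypothetical scratch paths in Lambda's /tmp
    var thumbPath = '/tmp/screenshot.png';

    // Persist the downloaded S3 object so ffmpeg can read it from disk.
    fs.writeFileSync(videoPath, response.Body);

    ffmpeg(videoPath)
        .screenshots({
            timestamps: [1],             // grab one frame at the 1-second mark
            filename: 'screenshot.png',
            folder: '/tmp',
            size: '200x200'
        })
        .on('error', function(err) { next(err); })
        .on('end', function() {
            // Resize the extracted frame (gm keeps the aspect ratio when both
            // dimensions are given) and hand the buffer down the waterfall.
            gm(thumbPath)
                .resize(MAX_WIDTH, MAX_HEIGHT)
                .toBuffer('png', function(err, buffer) {
                    if (err) { next(err); }
                    else { next(null, 'image/png', buffer); }
                });
        });
}

Note that the stock Lambda environment does not ship with an ffmpeg binary, so a static build typically has to be bundled with the function package and registered with ffmpeg.setFfmpegPath() before any of this will run.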
-
Architecture of video-based service for mobile phones
27 June 2015, by David Azar
I guess this is more of a conceptual question than a technical one.
I’m trying to figure out the best way to upload short videos to a server and also be able to download them and watch them on both Android and iOS.
Let's focus on Android for the moment.
I've done some experiments, and my results have been:
-
I'm able to compress a 12-14 MB video down to about 500 KB using the FFMPEG lib with pretty good quality, but it takes about 12 seconds.
-
Next, I'm uploading those videos to my Parse backend as a ParseFile to store them.
-
Finally, I can download and watch them with no problem using a VideoView widget.
Now, for the tests I've been running, these are great results, but I want to see whether there is a better way to manage and scale all of this.
My questions are:
-
Is there a better, lighter way to compress video?
-
Is Parse the right way to go?
-
How can I stream videos instead of downloading them and storing them on local storage before playing them? I know downloading will make my app use significant disk space, and I don't want that (see the sketch after these questions).
-
How do big companies handle this kind of task?
I've heard Amazon S3 is a good fit for projects like this one, and also Google Cloud Platform. I want to understand the best approach before building everything, so I can do it the right way and also provide the best possible user experience for watching these videos.
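On the streaming question: if the videos end up in S3 rather than Parse, one common pattern is for the backend to hand the app a time-limited pre-signed URL, which an Android VideoView can then play directly over HTTP instead of downloading the whole file to disk first. A minimal Node.js sketch, assuming the aws-sdk package; the bucket name and key layout are placeholders:

// Sketch: issue a time-limited S3 URL that a mobile client can stream from.
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

function getStreamUrl(videoKey, callback) {
    s3.getSignedUrl('getObject', {
        Bucket: 'my-video-bucket',   // hypothetical bucket
        Key: videoKey,               // e.g. 'videos/abc123.mp4'
        Expires: 3600                // URL stays valid for one hour
    }, callback);
}

// Usage: getStreamUrl('videos/abc123.mp4', function(err, url) { /* return url to the app */ });

On the device, the returned URL can be passed to VideoView.setVideoURI(), which buffers and plays over the network rather than storing the full file locally.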
-
Rails Thumbnails for videos being uploaded to S3
20 October 2015, by Dani
I have a Rails application where I need to upload videos to an Amazon S3 bucket along with their thumbnails. I am using ffmpeg to generate the thumbnails and carrierwave to handle the video uploads. Here is my video uploader class:
class VideoUploader < CarrierWave::Uploader::Base
  include CarrierWave::Video

  storage :fog

  # Store uploads under a per-model, per-record directory.
  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end

  # Only allow video extensions.
  def extension_white_list
    %w(mp4 flv)
  end
end

The video uploads fine and the video URL column is set in the videos table, but I also want to generate a thumbnail and upload it as well. I know I have to use ffmpeg here but don't know exactly how to do it.
Any help will be appreciated.