
Media (91)
-
Les Miserables
9 December 2019
Updated: December 2019
Language: French
Type: Text
-
VideoHandle
8 November 2019
Updated: November 2019
Language: French
Type: Video
-
Somos millones 1
21 July 2014
Updated: June 2015
Language: French
Type: Video
-
Un test - mauritanie
3 April 2014
Updated: April 2014
Language: French
Type: Text
-
Pourquoi Obama lit il mes mails ?
4 February 2014
Updated: February 2014
Language: French
-
IMG 0222
6 October 2013
Updated: October 2013
Language: French
Type: Image
Other articles (50)
-
Updating from version 0.1 to 0.2
24 June 2013
Explanations of the various notable changes made when moving from version 0.1 of MediaSPIP to version 0.3. What's new?
Regarding software dependencies: use of the latest versions of FFMpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)
-
Adding users manually as an administrator
12 April 2011
The administrator of a channel can, at any time, add one or more other users from the site's configuration area by choosing the "Gestion des utilisateurs" (user management) sub-menu.
On this page it is possible to:
1. decide how users register, via two options: accept registration by visitors of the public site, or refuse visitor registration
2. add, modify or delete a user
In the second form shown, an administrator can add, (...)
-
MediaSPIP Player: the controls
26 May 2010
The player's mouse controls
In addition to the click actions on the visible buttons of the player interface, other actions can also be performed with the mouse: Click: clicking on the video, or on the audio logo, starts or pauses playback depending on its current state; Wheel (scrolling): when the mouse hovers over the area used by the media, the mouse wheel no longer has its usual page-scrolling effect, but instead decreases or (...)
On other sites (4835)
-
What is the best way to fill AVFrame.data
21 September 2015, by Tim Hsu
I want to transfer OpenGL framebuffer data to AVCodec as fast as possible.
I've already converted RGB to YUV with a shader and read it back with glReadPixels.
I still need to fill the AVFrame data manually. Is there any better way?
AVFrame *frame;  // assumed already allocated and writable (e.g. with av_frame_get_buffer)
// copy each pixel of the packed buffer returned by glReadPixels into the
// three planar components of the frame (width, height and the packed
// buffer data are assumed to come from the glReadPixels call)
for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
        int i = y * width + x;
        frame->data[0][y*frame->linesize[0]+x] = data[i*3];     // Y
        frame->data[1][y*frame->linesize[1]+x] = data[i*3+1];   // U
        frame->data[2][y*frame->linesize[2]+x] = data[i*3+2];   // V
    }
}
-
Best approach to real-time HTTP streaming to an HTML5 video client
12 October 2016, by deandob
I'm really stuck trying to understand the best way to stream the real-time output of ffmpeg to an HTML5 client using node.js, as there are a number of variables at play and I don't have a lot of experience in this space, having spent many hours trying different combinations.
My use case is:
1) An IP video camera's RTSP H.264 stream is picked up by FFMPEG and remuxed into an mp4 container using the following FFMPEG settings in node, with output to STDOUT. This is only run on the initial client connection, so that partial content requests don't try to spawn FFMPEG again.
liveFFMPEG = child_process.spawn("ffmpeg", [
"-i", "rtsp://admin:12345@192.168.1.234:554" , "-vcodec", "copy", "-f",
"mp4", "-reset_timestamps", "1", "-movflags", "frag_keyframe+empty_moov",
"-" // output to stdout
], {detached: false});

2) I use the node HTTP server to capture the STDOUT and stream it back to the client upon a client request. When the client first connects, I spawn the above FFMPEG command line and then pipe the STDOUT stream to the HTTP response.
liveFFMPEG.stdout.pipe(resp);
I have also used the stream event to write the FFMPEG data to the HTTP response, but it makes no difference:
liveFFMPEG.stdout.on("data", function(data) {
    resp.write(data);
});

I use the following HTTP headers (which are also used and working when streaming pre-recorded files):
var total = 999999999 // fake a large file
var partialstart = 0
var partialend = total - 1
if (range !== undefined) {
var parts = range.replace(/bytes=/, "").split("-");
var partialstart = parts[0];
var partialend = parts[1];
}
var start = parseInt(partialstart, 10);
var end = partialend ? parseInt(partialend, 10) : total; // fake a large file if no range request
var chunksize = (end-start)+1;
resp.writeHead(206, {
'Transfer-Encoding': 'chunked'
, 'Content-Type': 'video/mp4'
, 'Content-Length': chunksize // large size to fake a file
, 'Accept-Ranges': 'bytes ' + start + "-" + end + "/" + total
});

3) The client has to use HTML5 video tags.
I have no problem streaming a video file previously recorded with the above FFMPEG command line (but saved to a file instead of STDOUT) to the HTML5 client, using fs.createReadStream with HTTP 206 partial content, so I know the FFMPEG stream is correct, and I can even see the video streaming live in VLC when connecting to the node HTTP server.
However, trying to stream live from FFMPEG via node HTTP seems to be a lot harder, as the client will display one frame and then stop. I suspect the problem is that I am not setting up the HTTP connection to be compatible with the HTML5 video client. I have tried a variety of things, like using HTTP 206 (partial content) and 200 responses and putting the data into a buffer before streaming, with no luck, so I need to go back to first principles to make sure I'm setting this up the right way.
Here is my understanding of how this should work; please correct me if I'm wrong:
1) FFMPEG should be set up to fragment the output and use an empty moov (the FFMPEG frag_keyframe and empty_moov mov flags). This means the client does not rely on the moov atom, which normally sits at the end of the file and isn't relevant when streaming (there is no end of file), but it also means no seeking is possible, which is fine for my use case.
2) Even though I use MP4 fragments and an empty MOOV, I still have to use HTTP partial content, as the HTML5 player will otherwise wait until the entire stream is downloaded before playing, which with a live stream never happens, so it is unworkable.
3) I don't understand why piping the STDOUT stream to the HTTP response doesn't work when streaming live, yet if I save to a file I can stream that file easily to HTML5 clients using similar code. Maybe it's a timing issue, as it takes a second for the FFMPEG spawn to start, connect to the IP camera and send chunks to node, and the node data events are irregular as well. However, the bytestream should be exactly the same as when saving to a file, and HTTP should be able to cope with delays.
4) When checking the network log from the HTTP client while streaming an MP4 file created by FFMPEG from the camera, I see there are 3 client requests: a general GET request for the video, for which the HTTP server returns about 40 KB, then a partial content request with a byte range for the last 10 KB of the file, then a final request for the bits in the middle not yet loaded. Maybe the HTML5 client, once it receives the first response, is asking for the last part of the file in order to load the MP4 MOOV atom? If that is the case, it won't work for streaming, as there is no MOOV atom and no end of the file.
5) When checking the network log while trying to stream live, I get an aborted initial request with only about 200 bytes received, then a re-request that is again aborted after 200 bytes, and a third request which is only 2 KB long. I don't understand why the HTML5 client would abort the request, as the bytestream is exactly the same as the one I can successfully use when streaming from a recorded file. It also seems node isn't sending the rest of the FFMPEG stream to the client, yet I can see the FFMPEG data in the .on event routine, so it is reaching the node HTTP server.
6) Although I think piping the STDOUT stream to the HTTP response buffer should work, do I have to build an intermediate buffer and stream that will allow the HTTP partial content requests to work properly, as they do when the client (successfully) reads a file? I think this is the main reason for my problems, but I'm not exactly sure how best to set that up in Node. And I don't know how to handle a client request for the data at the end of the file, as there is no end of file.
7) Am I on the wrong track in trying to handle 206 partial content requests, and should this work with normal 200 HTTP responses? HTTP 200 responses work fine for VLC, so I suspect the HTML5 video client will only work with partial content requests?
As I'm still learning this stuff, it's difficult to work through the various layers of this problem (FFMPEG, node, streaming, HTTP, HTML5 video), so any pointers will be greatly appreciated. I have spent hours researching on this site and the net, and I have not come across anyone who has been able to do real-time streaming in node, but I can't be the first, and I think this should be able to work (somehow!).
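For reference, here is a minimal sketch of how the pieces described above could be wired together in node. It reuses the ffmpeg arguments and RTSP URL from the question, but is simplified to one ffmpeg process per client and a plain 200 response with no Content-Length and no Range handling; the port number and the per-client spawn are assumptions made for the sketch, and it is not claimed to resolve the playback problem raised in points 5 to 7.
var http = require('http');
var child_process = require('child_process');

http.createServer(function (req, res) {
    // spawn ffmpeg for this client (the question spawns it once, on the first
    // client connection; one process per client keeps the sketch self-contained)
    var liveFFMPEG = child_process.spawn("ffmpeg", [
        "-i", "rtsp://admin:12345@192.168.1.234:554", "-vcodec", "copy", "-f",
        "mp4", "-reset_timestamps", "1", "-movflags", "frag_keyframe+empty_moov",
        "-" // fragmented MP4 to stdout
    ], {detached: false});

    // plain 200 response: no Content-Length and no Range handling,
    // since a live stream has no known size and no end
    res.writeHead(200, {'Content-Type': 'video/mp4'});
    liveFFMPEG.stdout.pipe(res);

    // stop ffmpeg when the client disconnects
    req.on('close', function () { liveFFMPEG.kill('SIGKILL'); });
}).listen(8080); // the port is an arbitrary choice for this sketch
The client would then point a video element at this endpoint; whether a given browser accepts such a stream without Media Source Extensions is exactly the uncertainty raised in point 7.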
-
Issues with processing media on Windows Azure
23 September 2015, by Ahmed Mujtaba
I have a website built on ASP.NET Web Forms that works as a media portal for users to upload videos. I'm using ffmpeg encoders to produce video content to be streamed in the browser. I'm using the Web Deploy method to publish the site to the Azure server. The website gets deployed properly, however I get the following issues on the live site.
-
The video never gets encoded and published. I get some sort of error.
-
The video gets published, but the process of uploading and encoding the video is way too slow on the web server.
My project solution contains upload.ashx, which handles the upload requests and makes the call to encode.ashx, which is responsible for encoding and publishing the videos. I tried to remotely debug the site, but the debugger never gets to encode.ashx.
I was wondering if these issues could be resolved by deploying the website to a VM?
Script that uploads the video file:
var filesuploaded = 0;
var faileduploaded = 0;
$(function () {
var uploader = new plupload.Uploader({
runtimes: 'gears,html5,flash,silverlight,browserplus',
browse_button: '<%= pickfiles.ClientID %>',
container: 'container',
max_file_size: '<%= MaxMediaSize %>mb',
url: '<%=Config.GetUrl() %>videos/upload/upload.ashx',
flash_swf_url: '<%=Config.GetUrl() %>plupload/js/plupload.flash.swf',
silverlight_xap_url: '<%=Config.GetUrl() %>plupload/js/plupload.silverlight.xap',
chunk_size: '4mb',
<%= UniqueNames %>
filters: [
{ title: '<%= AllowedFormatsDisplay %>', extensions: '<%= AllowedFormats %>'}],
headers: { UName: '<%=UserName %>', MTP: '<%= MediaType %>' }
});
//uploader.bind('Init', function (up, params) {
// $('#filelist').html("<div>Current runtime: " + params.runtime + "</div>");
//});
uploader.init();
$('#uploadfiles').click(function (e) {
uploader.start();
e.preventDefault();
$("#uploadfiles").hide();
$("#<%= embd.ClientID %>").hide();
});
uploader.bind('FilesAdded', function (up, files) {
$("#uploadfiles").show();
$("#<%= msg.ClientID %>").html("");
var count=0;
$.each(files, function (i, file) {
$('#filelist').append(
'<div class="item_pad_4 bx_br_bt">' + (count + 1) + ': ' + file.name + ' (' + plupload.formatSize(file.size) + ') <b></b></div>' );
count++;
});
var maxupload = <%= MaxVideoUploads %>;
if(count > maxupload)
{
$.each(files, function(i, file) {
uploader.removeFile(file);
});
$('#filelist').html("");
$("#uploadfiles").hide();
Display_Message("#<%= msg.ClientID %>", "Can't upload more than " + maxupload + " records at once!", 1, 1);
return false;
}
else {
$("#tfiles").html(count);
$("#uploadfiles").removeClass("disabled");
$("#<%= pickfiles.ClientID %>").hide();
}
up.refresh(); // Reposition Flash/Silverlight
});
uploader.bind('UploadProgress', function (up, file) {
$('#' + file.id + " b").html(file.percent + "%");
});
uploader.bind('Error', function (up, err) {
$('#filelist').append("<div>Error: " + err.code +
", Message: " + err.message +
(err.file ? ", File: " + err.file.name : "") +
"</div>"
);
up.refresh(); // Reposition Flash/Silverlight
});
var failedstatus = 0;
uploader.bind('FileUploaded', function (up, file, info) {
// encode started
if (info.response != "failed" && info.response != "") {
EncodeVD(file.id, info.response, file.size);
Display_Message('#' + file.id, "Please wait for final processing", 0, 1);
if (failedstatus == 0)
Redirect(info.response);
filesuploaded++;
}
else {
Display_Message('#' + file.id, "Response is: " + info.response, 0, 1);
}
});
});
var redcnt = 0;
function Redirect(filename) {
var IntervalID = setInterval(function () {
redcnt++;
if (redcnt > 2) {
clearInterval(IntervalID);
var tfiles = $("#tfiles").html();
if(tfiles == faileduploaded) { // break further processing all videos failed to upload
}
else if (filesuploaded >= tfiles) {
document.location = "<%=ConfirmPageUrl %>?fn=" + filename + "&gid=<%=GalleryID %>&uvids=" + $("#tfiles").html() + "&mpid=" + $("#maxpid").html().trim() + "<%=GroupParam %>";
}
}
}, 2000);
}
function EncodeVD(mid, mfile, msize) {
var params = '<%= EncodingParams %>&fn=' + mfile;
$.ajax({
type: 'GET',
url: '<%= Encoding_Handler_Path %>',
data: params,
async: true,
success: function (msg) {
if (msg == "Success" || msg == "") {
$('#' + mid).html('<strong>Uploading Completed Successfully - Wait for Processing.</strong>');
}
else {
failedstatus = 1;
faileduploaded++;
Display_Message('#' + mid, "Response is: " + msg, 0, 1);
}
}
});
}
Server side code for processing the file upload:
private int MediaType = 0; // 0 : video, 1: audio
public void ProcessRequest (HttpContext context) {
try
{
context.Response.ContentType = "text/plain";
context.Response.Write(ProcessMedia(context));
}
catch (Exception ex)
{
context.Response.Write("error|" + ex.Message);
}
}
public string ProcessMedia(HttpContext context)
{
if (context.Request.Files.Count > 0)
{
int chunk = context.Request["chunk"] != null ? int.Parse(context.Request["chunk"]) : 0;
string fileName = context.Request["name"] != null ? context.Request["name"] : string.Empty;
//string _fileName = fileName.Remove(fileName.LastIndexOf(".")) + "-" + Guid.NewGuid().ToString().Substring(0, 6) + "" + fileName.Remove(0, fileName.LastIndexOf("."));
HttpPostedFile fileUpload = context.Request.Files[0];
string upath = "";
if (context.Request.Headers["UName"] != null)
upath = context.Request.Headers["UName"].ToString();
//if (CloudSettings.EnableCloudStorage && upath != "")
// _fileName = upath.Substring(0, 3) + "-" + _fileName; // avoid duplication in cloud storage
if (context.Request.Headers["MTP"] != null)
MediaType = Convert.ToInt32(context.Request.Headers["MTP"]);
//string extensions = "";
//if (MediaType == 0)
// extensions = Site_Settings.Video_Allowed_Formats;
//else
// extensions = Site_Settings.Audio_Allowed_Formats;
//bool sts = UtilityBLL.Check_File_Extension(extensions, fileName.ToLower());
//if (sts == false)
//{
// return "Invalid format, please upload proper video!"; // Invalid video format, please upload proper video
//}
int allowable_size_mb = 0;
if (MediaType == 0)
{
allowable_size_mb = Site_Settings.Video_Max_Size;
}
else
{
allowable_size_mb = Site_Settings.Audio_Max_Size;
}
int UploadSize = allowable_size_mb * 1000000;
if (fileUpload.ContentLength > UploadSize)
{
return "Video Limit Exceeds";
}
string uploadPath = "";
// check whether audio / mp3 encoding enabled
if (this.MediaType == 1)
{
// audio encoding
if (fileName.EndsWith(".mp3"))
{
// upload mp3 directly in mp3 path instead of default path
if (upath == "")
uploadPath = UrlConfig.MP3_Path(); // source video path
else
uploadPath = UrlConfig.MP3_Path(upath); // source video path
}
else
{
// default path
if (upath == "")
uploadPath = UrlConfig.Source_Video_Path(); // source video path
else
uploadPath = UrlConfig.Source_Video_Path(upath); // source video path
}
}
else
{//azure
// default path
if (upath == "")
uploadPath = UrlConfig.Source_Video_Path(); // source video path
else
uploadPath = UrlConfig.Source_Video_Path(upath); // source video path
}
FileStream fs;
using (fs = new FileStream(Path.Combine(uploadPath, fileName), chunk == 0 ? FileMode.Create : FileMode.Append))
{
byte[] buffer = new byte[fileUpload.InputStream.Length];
fileUpload.InputStream.Read(buffer, 0, buffer.Length);
fs.Write(buffer, 0, buffer.Length);
}
return fileName; // "Success";
}
else
{
return "failed";
}
return "";
}
public bool IsReusable {
get {
return false;
}
}

Code in encode.aspx responsible for encoding the video:
private string EncodeMedia(HttpContext context)
{
string sourcepath = "";
string publishedpath = "";
string mp3path = "";
string thumbpath = "";
if (this.UserName != "")
{//azure
sourcepath = UrlConfig.Source_Video_Path(this.UserName);
publishedpath = UrlConfig.Published_Video_Path(this.UserName);
mp3path = UrlConfig.MP3_Path(this.UserName);
thumbpath = UrlConfig.Thumbs_Path(this.UserName);
}
else
{
sourcepath = UrlConfig.Source_Video_Path();
publishedpath = UrlConfig.Published_Video_Path();
mp3path = UrlConfig.MP3_Path();
thumbpath = UrlConfig.Thumbs_Path();
}
if (this.FileName.EndsWith(".mp3") && this.MediaType == 1)
{
// mp3 and audio format
if (!File.Exists(mp3path + "/" + this.FileName))
{
return "Audio file not found!";
}
}
else
{
// rest normal video and audio encoding
if (!File.Exists(sourcepath + "/" + this.FileName))
{
return "Source file not found!";
}
}
if (CloudSettings.EnableCloudStorage && this.UserName != "")
this.FileName = this.UserName.Substring(0, 3) + "-" + this.FileName; // avoid duplication in cloud storage
//double f_contentlength = 0;
//if (Site_Settings.Feature_Packages == 1)
//{
// if (Config.GetMembershipAccountUpgradeType() != 1)
// {
// // Check whether user have enough space to upload content
// // Restriction only for normal or premium users
// f_contentlength = (double)fileUpload.ContentLength / 1000000;
// string media_field_name = "space_video";
// if (MediaType == 1)
// media_field_name = "space_audio";
// if (!User_PackagesBLL.Check_User_Space_Status(upath, media_field_name, f_contentlength) && !isAdmin)
// {
// // insufficient credits to upload content
// return "Insufficient credits to upload media file"; // Response.Redirect(Config.GetUrl("myaccount/packages.aspx?status=" + media_field_name), true);
// }
// }
//}
this.backgroundpublishing = true; // should be true on direct encoding
// Video Processing
string flv_filename = "";
string original_filename = "";
string thumb_filename = "";
string duration = "";
int duration_sec = 0;
// set video actions : 1 -> on, 0 -> off
int isenabled = 1;
int ispublished = 1;
int isreviewed = 1;
int isresponse = 0;
if (Response_VideoID > 0)
isresponse = 1;
string flv_url = "none";
string thumb_url = "none";
string org_url = "none";
string _embed = "";
string errorcode = "0";
VideoInfo info = null;
if (Site_Settings.Content_Approval == 0)
isreviewed = 0;
// check whether audio / mp3 encoding enabled
if (this.FileName.EndsWith(".mp3") && this.MediaType==1)
{
// audio encoding
// mp3 file already
// so no encoding happens
MediaHandler _minfo = new MediaHandler();
_minfo.FFMPEGPath = Encoding_Settings.FFMPEGPATH;
_minfo.FileName = FileName;
_minfo.InputPath = mp3path;
info = _minfo.Get_Info();
flv_filename = FileName;
original_filename = FileName;
duration = info.Duration;
duration_sec = info.Duration_Sec;
isenabled = 1; // enabled
}
else if (this.directpublishing)
{
// publish video
ArrayList itags = new ArrayList();
MHPEncoder encoder = new MHPEncoder();
//if (this.FileName.EndsWith(".mpeg") || this.FileName.EndsWith(".mpg")) // use mpg compatible ffmpeg encoder
// encoder.FfmpegPath = HttpContext.Current.Server.MapPath(HttpContext.Current.Request.ApplicationPath) + "\\ffmpeg\\ffmpegbk\\ffmpeg.exe";
//encoder.ThumbFfmpegPath = Encoding_Settings.FFMPEGPATH; // use normal ffmpeg encoder for thumbs processing
//azure
encoder.FfmpegPath = Encoding_Settings.FFMPEGPATH; // use normal ffmpeg encoder
encoder.FlvToolPath = Encoding_Settings.FLVTOOLPATH; // set meta information for flv videos
encoder.Mp4BoxPath = Encoding_Settings.MP4BoxPath; // set meta information for mp4 videos
encoder.SourcePath = sourcepath;
encoder.SourceFileName = this.FileName;
// No cloud storage on direct encoding
//if (CloudSettings.EnableCloudStorage)
// encoder.EnableCloudStorage = true;
if (MediaType == 1)
{
// audio encoding
itags.Add("14");
encoder.iTags = itags;
encoder.GrabThumbs = false;
encoder.PublishedPath = mp3path;
//_vprocess.OutputPath = this.MP3Path;
//_vprocess.isAudio = true;
}
else
{
// video encoding
itags.Add(EncoderSettings.DefaultITagValue.ToString()); // 5 for 360p mp4 encoding
//itags.Add(7); // this will call 7 case settings to publish next video ending with _7.mp4 instead of _5.mp4
// so there will be 2 videos with different resolutions published at the end of the process?
// yes, make sure to use proper settings; first test it directly via the command line.
// okay, I got it. But I'm going to have to use a different media player to incorporate those settings
// once published you can load different videos for different users by checking the _7.mp4 (end) va
// okay, got it.
//azure
encoder.PublishedPath = publishedpath;
encoder.iTags = itags;
encoder.ThumbsDirectory = thumbpath;
encoder.TotalThumbs = 15;
//_vprocess.ThumbPath = this.ThumbPath;
//_vprocess.OutputPath = this.FLVPath;
//if (Config.isPostWaterMark())
//{
// // script for posting watermark on video
// _vprocess.WaterMarkPath = Server.MapPath(Request.ApplicationPath) + "\\contents\\watermark";
// _vprocess.WaterMarkImage = "watermark.gif";
//}
}
int deleteoption = Site_Settings.Video_Delete_Original;
if (deleteoption == 1)
{
encoder.DeleteSource = true;
}
// background processing
if (this.backgroundpublishing && this.MediaType==0)
{
encoder.BackgroundProcessing = true;
// get information from source video in order to store it in database
MediaHandler _minfo = new MediaHandler();
//if (this.FileName.EndsWith(".mpeg") || this.FileName.EndsWith(".mpg")) // use mpg compatible ffmpeg encoder
// encoder.FfmpegPath = HttpContext.Current.Server.MapPath(HttpContext.Current.Request.ApplicationPath) + "\\ffmpeg\\ffmpegbk\\ffmpeg.exe";
//else
_minfo.FFMPEGPath = Encoding_Settings.FFMPEGPATH;
_minfo.FileName = FileName;
_minfo.InputPath = sourcepath;
info = _minfo.Get_Info();
}
// encode video processing
Video_Information vinfo = encoder.Process();
if (vinfo.ErrorCode > 0)
{
errorcode = vinfo.ErrorCode.ToString();
ErrorLgBLL.Add_Log("Encoding Failed Log", "", "encoding error: " + vinfo.ErrorCode.ToString() + "<br />Description: " + vinfo.ErrorDescription.ToString());
//return vinfo.ErrorDescription;
}
// Double check validation
// if published video exist
// if thumb exist
// then proceed further
if (MediaType == 0)
{
if (!File.Exists(encoder.PublishedPath + "/" + vinfo.FLVVideoName))
{
return "Video failed to published properly.";
}
if (!File.Exists(encoder.ThumbsDirectory + "/" + vinfo.ThumbFileName))
{
return "Thumbs failed to grab from video properly.";
}
}
else
{
if (vinfo.FLVVideoName == "")
{
vinfo.FLVVideoName = this.FileName.Remove(this.FileName.LastIndexOf(".")) + "_14.mp3"; // mp3 file path name
}
if (!File.Exists(encoder.PublishedPath + "/" + vinfo.FLVVideoName))
{
return "Audio failed to published properly.";
}
}
// Now thumbs and video published, procceed for data processing
// get information from vinfo object
if (this.backgroundpublishing && this.MediaType == 0)
{
string OutputFileName = this.FileName.Remove(this.FileName.LastIndexOf("."));
flv_filename = OutputFileName + "_" + EncoderSettings.DefaultITagValue + "." + EncoderSettings.Return_Output_Extension(EncoderSettings.DefaultITagValue);
original_filename = vinfo.OriginalVideoName;
thumb_filename = OutputFileName + "_008.jpg"; // info.ThumbFileName;
duration = info.Duration;
duration_sec = info.Duration_Sec;
}
else
{
flv_filename = vinfo.FLVVideoName;
original_filename = vinfo.OriginalVideoName;
thumb_filename = vinfo.ThumbFileName;
duration = vinfo.Duration;
duration_sec = vinfo.Duration_Sec;
isenabled = vinfo.isEnabled;
}
// No cloud storage on direct encoding.
// Note: cloud storage only works if background processing is disabled,
// or in case of scheduled processing
if (CloudSettings.EnableCloudStorage && errorcode == "0")
{
flv_url = "amazon";
org_url = "https://s3.amazonaws.com/" + CloudSettings.OriginalVideoBucketName + "/" + this.FileName;
thumb_url = "https://s3.amazonaws.com/" + CloudSettings.ThumbsBucketName + "/" + thumb_filename;
}
}
else
{
// set publishing status off.
ispublished = 0;
original_filename = this.FileName;
}
// Store video information in database
string ipaddress = context.Request.ServerVariables["REMOTE_ADDR"].ToString();
// Store media information in database
Video_Struct vd = new Video_Struct();
vd.CategoryID = 0; // store categoryname or term instead of category id
vd.Categories = Categories;
vd.UserName = UserName;
vd.Title = "";
vd.Description = "";
vd.Tags = Tags;
vd.Duration = duration;
vd.Duration_Sec = duration_sec;
vd.OriginalVideoFileName = original_filename;
vd.VideoFileName = flv_filename;
vd.ThumbFileName = thumb_filename;
vd.isPrivate = Privacy;
vd.AuthKey = PAuth;
vd.isEnabled = isenabled;
vd.Response_VideoID = Response_VideoID; // video responses
vd.isResponse = isresponse;
vd.isPublished = ispublished;
vd.isReviewed = isreviewed;
vd.FLV_Url = flv_url;
vd.Thumb_Url = thumb_url;
vd.Org_Url = org_url;
vd.Embed_Script = _embed;
vd.isExternal = 0; // website own video, 1: embed video
vd.IPAddress = ipaddress;
vd.Type = MediaType;
vd.YoutubeID = "";
vd.isTagsreViewed = 1;
vd.Mode = 0; // filter videos based on website sections
//vd.ContentLength = f_contentlength;
vd.GalleryID = GID;
vd.ErrorCode = Convert.ToInt32(errorcode);
long videoid = VideoBLL.Process_Info(vd, false);
// Process tags
if (Tags != "")
{
int tag_type = 0; // represent videos
if (MediaType == 1)
tag_type = 4; // represent audio file
TagsBLL.Process_Tags(Tags, tag_type, 0);
}
if (Response_VideoID > 0)
{
VideoBLL.Update_Responses(Response_VideoID);
}
return "Success";
}
-