
Other articles (78)
-
User profiles
12 April 2011
Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is automatically created when MediaSPIP is initialized, visible only when the visitor is logged in on the site.
The user can reach the profile editor from their author page; a "Modifier votre profil" (edit your profile) link in the navigation is (...) -
Configuring language support
15 November 2010
Accessing the configuration and adding supported languages
To configure support for new languages, you need to go to the "Administrer" (administer) section of the site.
From there, in the navigation menu, you can reach a "Gestion des langues" (language management) section where support for additional languages can be enabled.
Each newly added language can still be deactivated as long as no object has been created in that language. In that case, it is greyed out in the configuration and (...) -
XMP PHP
13 May 2011
According to Wikipedia, XMP stands for:
Extensible Metadata Platform, or XMP, is an XML-based metadata format used in PDF, photography, and graphics applications. It was launched by Adobe Systems in April 2001 and integrated into version 5.0 of Adobe Acrobat.
Being based on XML, it handles a set of dynamic tags for use in the context of the Semantic Web.
XMP makes it possible to record, as an XML document, information about a file: title, author, history (...)
On other sites (8876)
-
Long Overdue MediaWiki Upgrade
5 February 2014, by Multimedia Mike — General
What do I do? What do I do? This library book is 42 years overdue!
I admit that it's mine, yet I can't pay the fine,
Should I turn it in or should I hide it again?
What do I do? What do I do?
I internalized the foregoing paean to the perils of procrastination by Shel Silverstein in my formative years. It's probably why I've never paid a single cent in late fees in my entire life.
However, I have been woefully negligent as the steward of the MediaWiki software that drives the world-famous MultimediaWiki, the internet's central repository of obscure technical knowledge related to multimedia. It is currently running version 1.6 of the software. The latest version is 1.22.
The Story So Far
According to my records, I first set up the wiki late in 2005. I don't know which MediaWiki release I was using at the time. I probably conducted a few upgrades in the early days, but that went by the wayside perhaps in 2007. My web host stopped allowing shell access, and the MediaWiki upgrade process pretty much requires running a PHP script from a command line. Upgrade time came around and I put off the project. Weeks turned into months turned into years until, according to some notes, the wiki abruptly stopped working in July 2011. Suddenly, there were PHP errors about "Namespace" being a reserved word.
While I finally laid out a plan to upgrade the wiki after all these years, I eventually found that the problem had been caused when my webhost upgraded from PHP 5.2 -> 5.3. I also learned of a small number of code changes that made the problem go away, thus kicking the can down the road once more.
Then a new problem showed up last week. I think it might be related to a new version of PHP again. This time, a few other things on my site broke, and I learned that my webhost now allows me to select a PHP version to use (with the version then set to “auto”, which didn’t yield much information). Rolling back to an earlier version of PHP might have solved the problem easily.
But NO! I made the determination that this goes no further. I want this wiki upgraded.
The Arduous Upgrade Path
There are two general upgrade paths I can think of:
- Upgrade in place on the server
- Upgrade offline and put the site back on the server
Approach #1 is problematic since I don’t have direct shell access, though I considered using something like PHP Shell. Approach #2 involves getting the entire set of wiki files and a backup of the MySQL tables. This is workable since I keep automated backups of these items anyway.
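Roughly, the automated backup amounts to something like this (the paths, database name, and credentials below are just placeholders):
# Sketch of an automated backup job; paths and database name are placeholders.
tar -czf wiki-files-$(date +%Y%m%d).tar.gz /path/to/wiki
mysqldump --single-transaction wikidb > wiki-db-$(date +%Y%m%d).sql   # credentials read from ~/.my.cnf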
In fairly short order, I was able to set up a working copy of the MultimediaWiki hosted on a local Linux machine. Now what's the move? The MediaWiki software I'm running is 1.6.10. The very latest, as of this upgrade project, is 1.22.2. I suppose it's way too much to hope that the software will upgrade cleanly from 1.6.x straight to 1.22.x, but I guess it's worth a shot…
HA! No chance. Okay, the next idea is to march through the various versions and upgrade each in turn. MediaWiki has all their historic releases online, all the way back to the 1.3 lineage. I decided that the latest of each lineage should upgrade cleanly from anything in the previous lineage. E.g., 1.6.10 should upgrade cleanly to 1.7.3 (the last in the 1.7 series). This seemed to be a workable strategy. So I downloaded the latest of each series, unpacked it, copied all the wiki files over the working installation, and ran 'php update.php' in the maintenance/ directory.
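Each step in that march looked roughly like this (the download URL, version numbers, and paths are illustrative, not exact):
# Sketch of one upgrade step; URL, version, and paths are illustrative.
wget https://releases.wikimedia.org/mediawiki/1.7/mediawiki-1.7.3.tar.gz
tar -xzf mediawiki-1.7.3.tar.gz
cp -R mediawiki-1.7.3/* /path/to/working/wiki/       # copy new wiki files over the working installation
cd /path/to/working/wiki/maintenance
php update.php                                       # run the database/schema updater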
The process is tedious and not without its obstacles. I consider this penance for my years of wiki neglect. First, I run into the "PHP Parse error: syntax error, unexpected T_NAMESPACE, expecting T_STRING" issue, the same one I saw years ago after the webhost transitioned from PHP 5.2 -> 5.3. I could solve this by editing assorted files and changing "Namespace" -> "MWNamespace" (which is what MediaWiki did by version 1.13). But I would prefer not to.
Instead, I downloaded the source for PHP 5.2 and compiled it in a separate directory, then called ‘/path/to/php/5.2/bin/php update.php’. Problem solved.
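In other words, something along these lines (the exact 5.2.x version, configure flags, and install prefix are assumptions):
# Build an old PHP 5.2 into its own prefix and use it for the updater;
# version, flags, and prefix are assumptions.
cd php-5.2.17
./configure --prefix=$HOME/php-5.2 --with-mysql
make && make install
$HOME/php-5.2/bin/php /path/to/working/wiki/maintenance/update.php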
The next problem is that a bunch of the database update scripts are specifying "Type=InnoDB". This isn't supported by modern MySQL databases. Now, it's "Engine=InnoDB". A quick search & replace at the command line fixes this for 1.6.x… and 1.7.x… and 1.8 through 1.12. Finally, at 1.13, it was no longer necessary. As a bonus, at 1.13, I was able to test the installation since Namespace had been renamed to MWNamespace. I would later learn that the table type modifications probably could have been simplified by changing "$wgDBmysql4 = true;" to "$wgDBmysql5 = true;" somewhere in LocalSettings.php.
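The search & replace itself was a one-liner along these lines (assuming the table-type strings live in the SQL scripts under maintenance/; the glob and casing are guesses):
# Swap the obsolete table-type syntax in the maintenance SQL scripts;
# file globs and casing are assumptions.
sed -i 's/TYPE=InnoDB/ENGINE=InnoDB/g; s/Type=InnoDB/Engine=InnoDB/g' \
    maintenance/*.sql maintenance/archives/*.sql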
Command line upgrading worked smoothly up through the 1.18 series, when I got a new fatal error:
PHP Fatal error: Call to a member function addMessages() on a non-object in /mnt/sdb1/archive/wiki/extensions/Cite.php on line 68
Best I could do was comment out that line. I hope that doesn't break anything important.
In the home stretch, the very last transition (1.21 -> 1.22) failed:
PHP Fatal error: Cannot redeclare wfProfileIn() (previously declared in /mnt/sdb1/archive/wiki/includes/profiler/Profiler.php:33) in /mnt/sdb1/archive/wiki/includes/ProfilerStub.php on line 25
Apparently, this problem has arisen occasionally since 1.18. I found a way around it thanks to this page: delete the file StartProfiler.php. Who am I to argue?
Upon completing the transition to 1.22, the wiki doesn't look correct: the pictures aren't showing up. The solution was to fix the temporary directory via LocalSettings.php.
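I didn't note the exact setting, but one plausible candidate is pointing $wgTmpDirectory at a writable location in LocalSettings.php, roughly like this (the setting choice and path are just a guess):
# Guess: point MediaWiki at a writable temp directory via LocalSettings.php.
echo '$wgTmpDirectory = "/path/to/working/wiki/images/temp";' >> /path/to/working/wiki/LocalSettings.php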
Back To Production
Okay, it all works again! Locally, that is. How to get it back to the server? My first idea was, knowing that this upgrade process can succeed, to try stepping through the upgrade process again, but telling the update.php scripts to access the database tables on multimedia.cx. This seemed to be working for a while, even though the database update phase often took 4-5 minutes. However, the transition from 1.8.5 -> 1.9.6 took 75 minutes and then timed out. According to my notes, "This isn't going to work."
The new process:
- Dump the database tables from the local database.
- Create a new database remotely (melanson_wiki_ng).
- Dump the database table into melanson_wiki_ng.
- Move the index.php file out of the wiki files directory temporarily (or rename).
- Modify the LocalSettings.php to talk to the new database.
- Perform a lftp mirror operation in order to send all the files up to the server.
- Send the index.php file and hope beyond hope that everything magically works.
And that's the story of how the updated MultimediaWiki came back online. Despite the database dump file being over 110 MB, it only took MySQL 1m45s to transmit it all to the remote server (let's hear it for the '--compress' option). For comparison, inserting the tables back into a fresh local database took 1m07s.
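Put together, those steps boil down to roughly the following (host, user, and directory names are placeholders):
# Sketch of the migration; host, user, and directory names are placeholders.
mysqldump --single-transaction wiki_local > wiki.sql                            # dump the local tables
mysql --compress -h db.example.com -u dbuser -p melanson_wiki_ng < wiki.sql     # load into the new remote database
mv wiki/index.php wiki/index.php.hold                                           # keep the site from serving pages mid-upload
# (edit LocalSettings.php to point at melanson_wiki_ng)
lftp -u ftpuser ftp://www.example.com -e "mirror -R wiki /public_html/wiki; quit"   # push all the files up
mv wiki/index.php.hold wiki/index.php
lftp -u ftpuser ftp://www.example.com -e "put -O /public_html/wiki wiki/index.php; quit"   # send index.php last and hope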
When the MultimediaWiki was first live again, it loaded, but ever so slowly. This is when I finally looked into optimization and found that I was lacking any caching. So as a bonus, the MultimediaWiki should be much faster now.
Going Forward
For all I know, I did everything described here in the hardest way possible. But at least I got it done. Unless I learn of a better process, future upgrades will probably look similar to this. Additionally, I should probably take some time to figure out what new features are part of the standard MediaWiki distribution nowadays.
-
Correcting errors and inefficiency when concatenating hundreds of clips with time offsets in FFMPEG
16 July 2023, by Timmortal
I am attempting to automate multicam editing with FFMPEG. Basically there is some number of time-synced videos, typically up to 7. There is also a list of "clips" from the source videos (e.g. Camera 1 from 0-4 sec, Camera 2 from 4-5.5 sec, Camera 1 from 5.5-8 sec). I have tried two methods to render a final video, overlay and concat, and I have not had success with either.

Attempted Method 1: Overlays


The idea here was for each source video to be an input, then create a large chain of overlay filters that places each cut/clip at the correct time. The audio is given as the first input and a black screen is given as the second input.


FFMPEG command:


ffmpeg -y -i audio.flac -f lavfi -i color=c=black:s=1920x1080 \
 -i video1.mp4 -i video2.mp4 -i video3.mp4 -i video4.mp4 -i video5.mp4 \
 -filter_complex_script ffmpeg_filters.txt -map "[edit289]" -map 0:a \
 -vcodec libx264 -c:a copy -t 4629 "output.mp4"




Example ffmpeg_filters.txt:

[1:v][2:v]overlay=enable='between(t,0,4.99229)'[edit1];
[edit1][3:v]overlay=enable='between(t,4.99229,10.0078)'[edit2];
[edit2][2:v]overlay=enable='between(t,10.0078,14.0016)'[edit3];
[edit3][5:v]overlay=enable='between(t,14.0016,15.9985)'[edit4];
...
[edit287][2:v]overlay=enable='between(t,4503,4624)'[edit288];
[edit288][4:v]overlay=enable='between(t,4624,4629.99)'[edit289]



The issue here is that it renders at something like 0.04x speed. The final video is something like 1-2 hours, so it would take well over 24 hours to render. This seems absurdly slow. Additionally, the documentation specifically says "You can chain together more overlays but you should test the efficiency of such approach." So perhaps this is not a good way to do it. Is there any way to speed this method up?

Attempted Method 2: Render Clips and Concatenate


Since the first method was proving to be far too slow, I attempted to render each clip into its own file named edit###.mp4 and then concatenate them, but I ran into issues there as well.

Rendering clips to files (with no audio):


ffmpeg -y -i video1.mp4 -ss 0 -t 4.99229 -an -c:v libx264 edit000.mp4
ffmpeg -y -i video2.mp4 -ss 4.99229 -t 5.01551 -an -c:v libx264 edit001.mp4
ffmpeg -y -i video1.mp4 -ss 10.0078 -t 3.99383 -an -c:v libx264 edit002.mp4
...
ffmpeg -y -i video3.mp4 -ss 4624 -t 209.001 -an -c:v libx264 edit288.mp4



This appears to mostly work; however, some of the resulting files are under 1K and are not valid video files. I believe they may be too short. I can work around that, so I'm not worried about that part.
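(For reference, the workaround I have in mind is just to skip anything suspiciously small when building the concat list, roughly like this; the 1K cutoff and GNU stat are arbitrary choices:)
# Drop clips under ~1 KB, which ffmpeg produced as invalid stubs; cutoff is arbitrary.
: > input_clips.txt                        # start a fresh concat list
for f in edit*.mp4; do
  if [ "$(stat -c%s "$f")" -lt 1024 ]; then
    echo "skipping $f (too small to be a valid clip)" >&2
  else
    echo "file '$f'" >> input_clips.txt
  fi
done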


Concatenating clips:


ffmpeg -y -i audio.flac -f lavfi -i color=c=black:s=1920x1080 -safe 0 \
 -f concat -i input_clips.txt -filter_complex_script clip_filters.txt \
 -map [0:a] -map "[outv]" -c:v libx264 -t 4833 -c:a aac output_clips.mp4



Since the list of input files is so long, I put them into input_clips.txt. clip_filters.txt contains all the time offsets for each input file.

input_clips.txt


file 'edit000.mp4'
file 'edit001.mp4'
file 'edit002.mp4'
...
file 'edit288.mp4'



clip_filters.txt


[2:v]setpts=PTS-STARTPTS+0/TB[v0];
[3:v]setpts=PTS-STARTPTS+4.99229/TB[v1];
[4:v]setpts=PTS-STARTPTS+10.0078/TB[v2];
[5:v]setpts=PTS-STARTPTS+14.0016/TB[v3];
...
[290:v]setpts=PTS-STARTPTS+4624/TB[v288];
[v0][v1][v2][v3]...[v290]concat=n=289:v=1[outv]



This call causes an error:


...
Input #2, concat, from 'input_clips.txt':
 Duration: N/A, start: 0.000000, bitrate: 4781 kb/s
 Stream #2:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, bt709, progressive), 1920x1080 [SAR 1:1 DAR 16:9], 4781 kb/s, 59.94 fps, 59.94 tbr, 60k tbn
 Metadata:
 handler_name : GoPro AVC
 vendor_id : [0][0][0][0]
 encoder : Lavc60.2.100 libx264
 timecode : 10:20:07:10
 Stream #2:1: Unknown: none
Invalid file index 3 in filtergraph description [2:v]setpts=PTS-STARTPTS+0/TB[v0];
[3:v]setpts=PTS-STARTPTS+4.99229/TB[v1];
[4:v]setpts=PTS-STARTPTS+10.0078/TB[v2];
[5:v]setpts=PTS-STARTPTS+14.0016/TB[v3];
...



Is it having problems reading in the 'input_clips.txt' or something? It's not clear to me what is causing the errors. I have tried using full paths, relative paths, no paths (as shown), and using file file:edit###.mp4 syntax.

Bottom Line

- Since neither of these methods has been too successful, is there a better way to go about rendering these time-offset cuts to a single video?
- Is there any way to speed up the overlay render?
- What is causing the error in the concat method?








-
Cloaked Archive Wiki
16 May 2011, by Multimedia Mike — General
Google's Chrome browser has made me phenomenally lazy. I don't even attempt to type proper, complete URLs into the address bar anymore. I just type something vaguely related to the address and let the search engine take over. I saw something weird when I used this method to visit Archive Team's site:
There's greater detail when you elect to view more results from the site:
As the administrator of a MediaWiki installation like the one that archiveteam.org runs on, I was a little worried that they might have a spam problem. However, clicking through to any of those out-of-place pages does not indicate anything related to pharmaceuticals. Viewing source also reveals nothing amiss.
I quickly deduced that this is a textbook example of website cloaking. This is when a website reports different content to a search engine than it reports to normal web browsers (humans, presumably). General pseudocode:

if (web_request.user_agent_string == CRAWLER_USER_AGENT)
    return cloaked_data;
else
    return real_data;
You can verify this for yourself using the wget command line utility:

$ wget --quiet --user-agent="Mozilla/5.0" \
  http://www.archiveteam.org/index.php?title=Geocities -O - | grep \<title\>
<title>GeoCities - Archiveteam</title>

$ wget --quiet --user-agent="Googlebot/2.1" \
  http://www.archiveteam.org/index.php?title=Geocities -O - | grep \<title\>
<title>Cheap xanax | Online Drug Store, Big Discounts</title>

I guess the little web prank worked because the phaux-pharma stuff got indexed. It makes me wonder if there's a MediaWiki plugin that does this automatically.
For extra fun, here's a site called the CloakingDetector which purports to be able to detect whether a page employs cloaking. This is just one humble observer's opinion, but I don't think the site works too well: