
Media (91)
-
Valkaama DVD Cover Outside
4 October 2011, by
Updated: October 2011
Language: English
Type: Image
-
Valkaama DVD Label
4 October 2011, by
Updated: February 2013
Language: English
Type: Image
-
Valkaama DVD Cover Inside
4 October 2011, by
Updated: October 2011
Language: English
Type: Image
-
1,000,000
27 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Demon Seed
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
The Four of Us are Dying
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
Other articles (25)
-
MediaSPIP v0.2
21 June 2013, by
MediaSPIP 0.2 is the first stable version of MediaSPIP.
Its official release date is 21 June 2013 and it is announced here.
The zip file provided here contains only the MediaSPIP sources in the standalone version.
As with the previous version, all of the software dependencies must be installed manually on the server.
If you want to use this archive for a farm-mode installation, you will also need to make other changes (...)
-
The farm's regular Cron tasks
1 December 2010, by
Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of all the instances of the shared-hosting farm on a regular basis. Coupled with a system Cron on the central site of the farm, it generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...)
-
Making files available
14 April 2011, by
By default, when it is initialised, MediaSPIP does not let visitors download files, whether they are originals or the result of their transformation or encoding. It only lets them be viewed.
However, it is possible, and easy, to allow visitors to access these documents in various forms.
All of this happens on the template configuration page. You need to go to the channel's administration area and choose, in the navigation (...)
On other sites (3284)
-
avformat/daudenc: force 2000 sample packet size with a bsf
3 March 2024, by Marton Balint
avformat/daudenc: force 2000 sample packet size with a bsf
The samples I found all have 2000 sample packets, and by forcing the packet size with a bsf we could automagically make muxing work for packets containing more than 3640 samples.
Signed-off-by: Marton Balint <cus@passwd.hu>
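For illustration only, here is roughly what forcing a fixed packet size with a bitstream filter looks like from the ffmpeg command line. This is a sketch that assumes the filter in question is pcm_rechunk (the commit text above does not name it) and that the input already satisfies the D-Cinema audio muxer's other constraints; the file names are made up:
# rechunk the encoded audio into fixed 2000-sample packets before muxing (illustrative only)
ffmpeg -i input.wav -ar 96000 -c:a pcm_s24daud -bsf:a pcm_rechunk=n=2000 output.302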
-
Using ImageMagick to efficiently stitch together a line scan image
2 October 2018, by rkantos
I'm looking for alternatives to line scan cameras to be used in sports timing, or rather for the part where placing needs to be figured out. I found that common industrial cameras can readily match the speed of commercial camera solutions at >1000 frames per second. For my needs, the timing accuracy is usually not important, but the relative placing of athletes is. I figured I could use one of the cheapest Basler, IDS or any other area scan industrial cameras for this purpose. Of course there are line scan cameras that can do a lot more than a few thousand fps (or Hz), but it is possible to get area scan cameras that can do the required 1000-3000 fps for less than 500€.
My holy grail would of course be the near-real-time image composition capabilities of FinishLynx (or any other line scan system), basically this part: https://youtu.be/7CWZvFcwSEk?t=23s
The whole process I was thinking of for my alternative is:
- Use Basler Pylon Viewer (or other software) to record 2px wide images at the camera's fastest read speed. For the camera I am currently using, this means it has to be turned on its side and the height needs to be reduced, since that is the only way it will read 1920x2px frames @ >250fps.
- Make a program or batch script that then stitches these 1920x2px frames together: for example, one second of recording gives 1000 frames of 1920x2px, meaning a resulting image with a resolution of 1920x2000px (horizontal x vertical).
- Finally, using the same program or another way, rotate the image so it reflects how the camera is positioned, thus achieving an image with a resolution of 2000x1920px (again horizontal x vertical).
- Open the image in an analyzing program (currently ImageJ) to quickly analyze the results.
I am no programmer, but this is what I was able to put together just using batch scripts, with the help of stackoverflow of course.
- Currently, recording a whole 10 seconds, for example, to disk as a raw/MJPEG (avi/mkv) stream can be done in real time.
- Recording individual frames as TIFF or BMP, or using FFmpeg to save them as PNG or JPG, takes 20-60 seconds. The appending and rotation then take a further 45-60 seconds.
This all needs to be achieved in less than 60 seconds for 10 seconds of footage (1000-3000 fps @ 10 s = 10000-30000 frames), which is why I need something faster.
I was able to figure out how to be pretty efficient with ImageMagick:
magick convert -limit file 16384 -limit memory 8GiB -interlace Plane -quality 85 -append +rotate 270 "%folder%\Basler*.Tiff" "%out%"
#%out% has a .jpg filename that is dynamically built from the folder name and the number of frames.
This command works and gets me 10000 frames encoded in about 30 seconds on an i5-2520M (most of the processing seems to use only one thread though, since it is working at 25% CPU usage). This is the resulting image: https://i.imgur.com/OD4RqL7.jpg (19686x1928px)
However, since recording TIFF frames using Basler's Pylon Viewer takes that much longer than recording an MJPEG video stream, I would like to use the MJPEG (avi/mkv) file as a source for the appending. I noticed FFmpeg has an "image2pipe" muxer, which should be able to feed images directly to ImageMagick. I was not able to get this working though:
$ ffmpeg.exe -threads 4 -y -i "Basler acA1920-155uc (21644989)_20180930_043754312.avi" -f image2pipe - | convert - -interlace Plane -quality 85 -append +rotate 270 "%out%" >> log.txt
ffmpeg version 3.4 Copyright (c) 2000-2017 the FFmpeg developers
built with gcc 7.2.0 (GCC)
configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-bzlib --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-cuda --enable-cuvid --enable-d3d11va --enable-nvenc --enable-dxva2 --enable-avisynth --enable-libmfx
libavutil 55. 78.100 / 55. 78.100
libavcodec 57.107.100 / 57.107.100
libavformat 57. 83.100 / 57. 83.100
libavdevice 57. 10.100 / 57. 10.100
libavfilter 6.107.100 / 6.107.100
libswscale 4. 8.100 / 4. 8.100
libswresample 2. 9.100 / 2. 9.100
libpostproc 54. 7.100 / 54. 7.100
Invalid Parameter - -interlace
[mjpeg @ 000000000046b0a0] EOI missing, emulating
Input #0, avi, from 'Basler acA1920-155uc (21644989)_20180930_043754312.avi':
Duration: 00:00:50.02, start: 0.000000, bitrate: 1356 kb/s
Stream #0:0: Video: mjpeg (MJPG / 0x47504A4D), yuvj422p(pc, bt470bg/unknown/unknown), 1920x2, 1318 kb/s, 200 fps, 200 tbr, 200 tbn, 200 tbc
Stream mapping:
Stream #0:0 -> #0:0 (mjpeg (native) -> mjpeg (native))
Press [q] to stop, [?] for help
Output #0, image2pipe, to 'pipe:':
Metadata:
encoder : Lavf57.83.100
Stream #0:0: Video: mjpeg, yuvj422p(pc), 1920x2, q=2-31, 200 kb/s, 200 fps, 200 tbn, 200 tbc
Metadata:
encoder : Lavc57.107.100 mjpeg
Side data:
cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
av_interleaved_write_frame(): Invalid argument
Error writing trailer of pipe:: Invalid argument
frame= 1 fps=0.0 q=1.6 Lsize= 0kB time=00:00:00.01 bitrate= 358.4kbits/s speed=0.625x
video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.000000%
Conversion failed!
If I go a bit higher for the height, I no longer get the "[mjpeg @ 000000000046b0a0] EOI missing, emulating" error. However, the whole thing will only work with <2px high/wide footage.
Edit: Oh yes, I can also use
ffmpeg -i file.mpg -r 1/1 $filename%03d.bmp
or
ffmpeg -i file.mpg $filename%03d.bmp
to extract all the frames from the MJPEG/RAW stream. However, this is an extra step I do not want to take (just deleting a folder of 30000 JPGs takes 2 minutes alone…). Can someone think of a working solution for the piping method, or a totally different way of handling this?
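An untested sketch of one way the pipe might be made to work, offered as an assumption rather than a verified fix: call ImageMagick 7 through its magick entry point (on Windows, a bare convert can resolve to the system's own convert.exe, which would produce a message like "Invalid Parameter - -interlace") and have FFmpeg emit an uncompressed format such as PPM, which ImageMagick can read as a multi-image stream:
ffmpeg -i "Basler acA1920-155uc (21644989)_20180930_043754312.avi" -f image2pipe -vcodec ppm - | magick ppm:- -interlace Plane -quality 85 -append +rotate 270 "%out%"
The intent is simply to avoid handing ImageMagick MJPEG data it cannot delimit on the pipe, while keeping the append and rotate steps unchanged.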
-
Small Time DevOps
1 January 2021, by Multimedia Mike — General
When you are a certain type of nerd who has been on the internet for long enough, you might run the risk of accumulating a lot of projects and websites. Website-wise, I have this multimedia.cx domain on which I host a bunch of ancient static multimedia documents as well as this PHP/MySQL-based blog. Further, there are 3 other PHP/MySQL-based blogs hosted on subdomains. Also, there is the wiki, another PHP/MySQL web app. A few other custom PHP- and Python-based apps are running around on the server as well.
While things largely run on auto-pilot, I need to concern myself every now and then with their ongoing upkeep.
If you ask N different people about the meaning of the term ‘DevOps’, you will surely get N different definitions. However, whenever I have to perform VM maintenance, I like to think I am at least dipping my toes into the DevOps domain. At the very least, the job seems to be concerned with making infrastructure setup and upgrades reliable and repeatable.
Even if it’s not fully automated, at the very least, I have generated a lot of lists for how to make things work (I’m a big fan of Trello’s Kanban boards for this), so it gets easier every time (ideally, anyway).
Infrastructure History
For a solid decade, from 2004 to 2014, everything was hosted on shared, cPanel-based web hosting. In mid-2014, I moved from the shared hosting over to my own VPSs, hosted on DigitalOcean. I must have used Ubuntu 14.04 at the time, as I look down the list of Ubuntu LTS releases. It was with much trepidation that I undertook this task (knowing that anything that might go wrong with the stack, from the OS up to the apps, would all be firmly my fault), but it turned out not to be that bad. The earliest lesson you learn for such a small-time setup is to have a frontend VPS (web server) and a backend VPS (database server). That way, a surge in HTTP requests has no chance of crashing the database server due to depleted memory.
At the end of 2016, I decided to refresh the VMs. I brought them up to Ubuntu 16.04 at the time.
Earlier this year, I decided it would be a good idea to refresh the VMs again since it had been more than 3 years. The VMs were getting long in the tooth. Plus, I had seen an article speculating that Azure, another notable cloud hosting environment, might be getting full. It made me feel like I should grab some resources while I still could (resource-hoarding was in this year).
I decided to use 18.04 for these refreshed VMs, even though 20.04 was available. I think I was a little nervous about 20.04 because I heard weird things about something called snap packages being the new standard for distributing software for the platform and I wasn’t ready to take that plunge.
Which brings me to this month’s VM refresh in which I opted to take the 20.04 plunge.
Oh MediaWiki
I've been the maintainer and caretaker of the MultimediaWiki for 15 years now (wow! Where does the time go?). It doesn't see a lot of updating these days, but I know it still serves as a resource for lots of obscure technical multimedia information. I still get requests for new accounts because someone has uncovered some niche technical data and wants to make sure it gets properly documented.
MediaWiki is quite an amazing bit of software and it undergoes constant development and improvement. According to the version history, I probably started the MultimediaWiki with the 1.5 series. As of this writing, 1.35 is the latest and therefore greatest lineage.
This pace of development can make it a bit of a chore to keep up to date. This was particularly true in the old days of the shared hosting, when you didn't have direct shell access and so it was something you put off for a long time.
Honestly, to be fair, the upgrade process is pretty straightforward (a rough sketch follows the two steps below):
- Unpack a set of new files on top of the existing tree
- Run a PHP script to perform any database table upgrades
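A rough sketch of those two steps, assuming a tarball upgrade into a hypothetical /var/www/wiki tree and MediaWiki's standard maintenance/update.php script (the version number is only an example):
# unpack the new release on top of the existing tree
wget https://releases.wikimedia.org/mediawiki/1.35/mediawiki-1.35.1.tar.gz
tar -xzf mediawiki-1.35.1.tar.gz --strip-components=1 -C /var/www/wiki
# run any database table upgrades
php /var/www/wiki/maintenance/update.php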
Pretty straightforward, assuming that there are no hiccups along the way, right ? And the vast majority of the time, that’s the case. Until it’s not. I had an upgrade go south about a year and a half ago (I wasn’t the only MW installation to have the problem at the time, I learned). While I do have proper backups, it still threw me for a loop and I worked for about an hour to restore the previous version of the site. That experience understandably left me a bit gun-shy about upgrading the wiki.
But upgrades must happen, especially when security notices come out. Eventually, I created a Trello template with a solid, 18-step checklist for upgrading MW as soon as a new version shows up. It’s still a chore, just not so nerve-wracking when the steps are all enumerated like that.
As I compose this post, I think I recall my impetus for wanting to refresh from the 16.04 VM. 16.04 used PHP 7.0. I wanted to upgrade to the latest MW, but if I tried to do so, it warned me that it needed PHP 7.4. So I initialized the new 18.04 VM as described above… only to realize that PHP 7.2 is the default on 18.04. You need to go all the way to 20.04 for 7.4 standard. I'm sure it's possible to install later versions of PHP on 16.04 or 18.04, but I appreciate going with the defaults provided by the distro.
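For reference, a sketch of what installing a newer PHP on 18.04 could look like, assuming the widely used third-party ondrej/php PPA (purely illustrative, not what was done here):
# add the third-party PPA and install PHP 7.4 plus extensions MediaWiki typically wants
sudo add-apt-repository ppa:ondrej/php
sudo apt update
sudo apt install php7.4 php7.4-mysql php7.4-xml php7.4-mbstring php7.4-intl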
I figured I would just stay with MediaWiki 1.34 series and eschew 1.35 series (requiring PHP 7.4) for the time being… until I started getting emails that 1.34 would go end-of-life soon. Oh, and there are some critical security updates, but those are only for 1.35 (and also 1.31 series which is still stubbornly being maintained for some reason).
So here I am with a fresh Ubuntu 20.04 VM running PHP 7.4 and MediaWiki 1.35 series.
How Much Process ?
Anyone who decides to host on VPSs vs., say, shared hosting is (or ought to be) versed on the matter that all your data is your own problem, that glitches sometimes happen, and that your VM might just suddenly disappear. (Indeed, I've read rants about VMs disappearing and taking entire un-backed-up websites with them, and also watched as the ranters get no sympathy: "yeah, it's a VM; the data is your responsibility".) So I like to make sure I have enough notes so that I could bring up a new VM quickly if I ever needed to.
But the process is a lot of manual steps. Sometimes I wonder if I need to use some automation software like Ansible in order to bring a new VM to life. Why do that if I only update the VM once every 1-3 years? Well, perhaps I should update more frequently in order to ensure the process is solid?
Seems like a lot of effort for a few websites which really don’t see much traffic in the grand scheme of things. But it still might be an interesting exercise and might be good preparation for some other websites I have in mind.
Besides, if I really wanted to go off the deep end, I would wrap everything up in containers and deploy using D-O’s managed Kubernetes solution.
The post Small Time DevOps first appeared on Breaking Eggs And Making Omelettes.