Other articles (30)

  • Media-specific libraries and software

    10 December 2010

    For correct and optimal operation, several things need to be taken into consideration.
    After installing apache2, mysql and php5, it is important to install the other required software, whose installation is described in the related links: a set of multimedia libraries (x264, libtheora, libvpx) used for encoding and decoding video and audio so as to support as many file types as possible (cf. this tutorial); FFMpeg with the maximum number of decoders and (...)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Enhancing it visually

    10 April 2011

    MediaSPIP is based on a system of themes and templates ("squelettes"). The templates define how information is placed on the page, defining a specific use of the platform, while the themes define the overall graphic design.
    Anyone can propose a new graphic theme or template and make it available to the community.

On other sites (6259)

  • Android ExoPlayer stream mp3 over HTTP

    8 July 2016, by kevintcoughlin

    I’m trying to get a grasp on the new ExoPlayer library introduced this year at Google I/O 2014 so that I can incorporate it into my application.

    I’m attempting to stream an mp3 over HTTP, but so far have been unsuccessful. I’m not sure if it’s possible, but I’m trying to accomplish this without extending any of the base Source/Sample classes. My code is as follows:

    In my Activity

    SampleSource s = new FrameworkSampleSource(this, Uri.parse("http://traffic.libsyn.com/joeroganexp/p518.mp3"), null, 1);

    // Since I only have 1 audio renderer
    ExoPlayer player = ExoPlayer.Factory.newInstance(1);

    MediaCodecAudioTrackRenderer audioRenderer = new MediaCodecAudioTrackRenderer(s);
    player.prepare(audioRenderer);
    player.setPlayWhenReady(true);

    Logcat

    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ SniffFFMPEG
    07-06 15:52:34.080    3363-3376/com.kevintcoughlin.smodr I/FFmpegExtractor﹕ android-source:0xb7c53e00
    07-06 15:52:34.084    3363-3376/com.kevintcoughlin.smodr D/FFMPEG﹕ android source begin open
    07-06 15:52:34.084    3363-3376/com.kevintcoughlin.smodr D/FFMPEG﹕ android open, url: android-source:0xb7c53e00
    07-06 15:52:34.084    3363-3376/com.kevintcoughlin.smodr D/FFMPEG﹕ ffmpeg open android data source success, source ptr: 0xb7c53e00
    07-06 15:52:34.088    3363-3376/com.kevintcoughlin.smodr D/FFMPEG﹕ android source open success
    07-06 15:52:34.108    3363-3363/com.kevintcoughlin.smodr W/EGL_genymotion﹕ eglSurfaceAttrib not implemented
    07-06 15:52:34.116    3363-3363/com.kevintcoughlin.smodr E/OpenGLRenderer﹕ Getting MAX_TEXTURE_SIZE from GradienCache
    07-06 15:52:34.120    3363-3363/com.kevintcoughlin.smodr E/OpenGLRenderer﹕ MAX_TEXTURE_SIZE: 16384
    07-06 15:52:34.128    3363-3363/com.kevintcoughlin.smodr E/OpenGLRenderer﹕ Getting MAX_TEXTURE_SIZE from Caches::initConstraints()
    07-06 15:52:34.128    3363-3363/com.kevintcoughlin.smodr E/OpenGLRenderer﹕ MAX_TEXTURE_SIZE: 16384
    07-06 15:52:34.128    3363-3363/com.kevintcoughlin.smodr D/OpenGLRenderer﹕ Enabling debug mode 0
    07-06 15:52:34.640    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ [mp3 @ 0xb7c57040] Estimating duration from bitrate, this may be inaccurate
    07-06 15:52:34.640    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ Input #0, mp3, from 'android-source:0xb7c53e00':
    07-06 15:52:34.640    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ Metadata:
    07-06 15:52:34.640    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ TSS             : Logic Pro 9.1.8
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ title           : #518 - Matt Fulchiron
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ artist          : Joe Rogan Experience
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ album_artist    : Joe Rogan Experience
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ Duration: 02:57:10.21, start: 0.000000, bitrate: 127 kb/s
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ Stream #0:0: Audio: mp3, 44100 Hz, stereo, s16p, 128 kb/s
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr I/FFmpegExtractor﹕ FFmpegExtrator, url: android-source:0xb7c53e00, format_name: mp3, format_long_name: MP2/3 (MPEG audio layer 2/3)
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr I/FFmpegExtractor﹕ list the formats suppoted by ffmpeg:
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr I/FFmpegExtractor﹕ ========================================
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[00]: mpeg
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[01]: mpegts
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[02]: mov,mp4,m4a,3gp,3g2,mj2
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[03]: matroska,webm
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[04]: asf
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[05]: rm
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[06]: flv
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[07]: swf
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[08]: avi
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[09]: ape
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[10]: dts
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[11]: flac
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[12]: ac3
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[13]: wav
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[14]: ogg
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[15]: vc1
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[16]: hevc
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr I/FFmpegExtractor﹕ ========================================
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr D/FFMPEG﹕ android source close
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr W/FFmpegExtractor﹕ sniff through BetterSniffFFMPEG failed, try LegacySniffFFMPEG
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr I/FFmpegExtractor﹕ source url:http://traffic.libsyn.com/joeroganexp/p518.mp3
    07-06 15:52:34.644    3363-3376/com.kevintcoughlin.smodr D/FFMPEG﹕ android source begin open
    07-06 15:52:34.648    3363-3376/com.kevintcoughlin.smodr D/FFMPEG﹕ android open, url: android-source:0xb7c53e00|file:http://traffic.libsyn.com/joeroganexp/p518.mp3
    07-06 15:52:34.648    3363-3376/com.kevintcoughlin.smodr D/FFMPEG﹕ ffmpeg open android data source success, source ptr: 0xb7c53e00
    07-06 15:52:34.648    3363-3376/com.kevintcoughlin.smodr D/FFMPEG﹕ android source open success
    07-06 15:52:34.708    3363-3394/com.kevintcoughlin.smodr D/dalvikvm﹕ GC_FOR_ALLOC freed 361K, 4% free 10852K/11288K, paused 75ms, total 75ms
    07-06 15:52:34.844    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ [mp3 @ 0xb7ca7700] Estimating duration from bitrate, this may be inaccurate
    07-06 15:52:34.844    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ Input #0, mp3, from 'android-source:0xb7c53e00|file:http://traffic.libsyn.com/joeroganexp/p518.mp3':
    07-06 15:52:34.844    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ Metadata:
    07-06 15:52:34.844    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ TSS             : Logic Pro 9.1.8
    07-06 15:52:34.848    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ title           : #518 - Matt Fulchiron
    07-06 15:52:34.848    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ artist          : Joe Rogan Experience
    07-06 15:52:34.848    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ album_artist    : Joe Rogan Experience
    07-06 15:52:34.848    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ Duration: 02:57:10.21, start: 0.000000, bitrate: 127 kb/s
    07-06 15:52:34.848    3363-3376/com.kevintcoughlin.smodr I/FFMPEG﹕ Stream #0:0: Audio: mp3, 44100 Hz, stereo, s16p, 128 kb/s
    07-06 15:52:34.848    3363-3376/com.kevintcoughlin.smodr I/FFmpegExtractor﹕ FFmpegExtrator, url: android-source:0xb7c53e00|file:http://traffic.libsyn.com/joeroganexp/p518.mp3, format_name: mp3, format_long_name: MP2/3 (MPEG audio layer 2/3)
    07-06 15:52:34.852    3363-3376/com.kevintcoughlin.smodr I/FFmpegExtractor﹕ list the formats suppoted by ffmpeg:
    07-06 15:52:34.852    3363-3376/com.kevintcoughlin.smodr I/FFmpegExtractor﹕ ========================================
    07-06 15:52:34.852    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[00]: mpeg
    07-06 15:52:34.852    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[01]: mpegts
    07-06 15:52:34.852    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[02]: mov,mp4,m4a,3gp,3g2,mj2
    07-06 15:52:34.852    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[03]: matroska,webm
    07-06 15:52:34.852    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[04]: asf
    07-06 15:52:34.852    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[05]: rm
    07-06 15:52:34.852    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[06]: flv
    07-06 15:52:34.852    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[07]: swf
    07-06 15:52:34.852    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[08]: avi
    07-06 15:52:34.852    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[09]: ape
    07-06 15:52:34.852    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[10]: dts
    07-06 15:52:34.856    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[11]: flac
    07-06 15:52:34.856    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[12]: ac3
    07-06 15:52:34.856    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[13]: wav
    07-06 15:52:34.856    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[14]: ogg
    07-06 15:52:34.856    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[15]: vc1
    07-06 15:52:34.856    3363-3376/com.kevintcoughlin.smodr V/FFmpegExtractor﹕ format_names[16]: hevc
    07-06 15:52:34.856    3363-3376/com.kevintcoughlin.smodr I/FFmpegExtractor﹕ ========================================
    07-06 15:52:34.856    3363-3376/com.kevintcoughlin.smodr D/FFMPEG﹕ android source close
    07-06 15:52:34.856    3363-3376/com.kevintcoughlin.smodr D/FFmpegExtractor﹕ SniffFFMPEG failed to sniff this source
    07-06 15:52:34.856    3363-3397/com.kevintcoughlin.smodr D/dalvikvm﹕ GC_FOR_ALLOC freed 120K, 3% free 12342K/12696K, paused 29ms, total 60ms
    07-06 15:52:34.860    3363-3397/com.kevintcoughlin.smodr I/dalvikvm-heap﹕ Grow heap (frag case) to 13.981MB for 1960012-byte allocation
    07-06 15:52:34.872    3363-3387/com.kevintcoughlin.smodr D/dalvikvm﹕ GC_FOR_ALLOC freed <1K, 3% free 14256K/14612K, paused 10ms, total 10ms
    07-06 15:52:34.880    3363-3376/com.kevintcoughlin.smodr I/OMXClient﹕ Using client-side OMX mux.
    07-06 15:52:34.976    3363-3400/com.kevintcoughlin.smodr I/OMXClient﹕ Using client-side OMX mux.
    07-06 15:52:35.144    3363-3393/com.kevintcoughlin.smodr D/dalvikvm﹕ GC_FOR_ALLOC freed 1512K, 12% free 13705K/15532K, paused 12ms, total 36ms
    07-06 15:52:35.152    3363-3393/com.kevintcoughlin.smodr I/dalvikvm-heap﹕ Grow heap (frag case) to 15.315MB for 1962812-byte allocation
    07-06 15:52:35.200    3363-3388/com.kevintcoughlin.smodr D/dalvikvm﹕ GC_FOR_ALLOC freed <1K, 11% free 15621K/17452K, paused 39ms, total 39ms
    07-06 15:52:35.356    3363-3400/com.kevintcoughlin.smodr E/ACodec﹕ OMX/mediaserver died, signalling error!
    07-06 15:52:35.356    3363-3400/com.kevintcoughlin.smodr E/MediaCodec﹕ Codec reported an error. (omx error 0x8000100d, internalError -32)
    07-06 15:52:35.376    3363-3400/com.kevintcoughlin.smodr A/ACodec﹕ frameworks/av/media/libstagefright/ACodec.cpp:499 CHECK(mem.get() != NULL) failed.
    07-06 15:52:35.376    3363-3400/com.kevintcoughlin.smodr A/libc﹕ Fatal signal 4 (SIGILL) at 0xb76d563d (code=2), thread 3400 (MediaCodec_loop)

    To my untrained eye, it seems like I can get the TrackInfo correctly, but I’m having a problem reading the data over HTTP. It also seems that the framework’s mimetype sniffing is failing, even though I do receive the type ’audio/mpeg’.

    Again, I appreciate any direction I receive. I realize that these APIs are very new.

    Thanks!

  • Adventures In NAS

    1 January, by Multimedia Mike — General

    In my post last year about my out-of-control single-board computer (SBC) collection, which included my meager network attached storage (NAS) solution, I noted that:

    I find that a lot of my fellow nerds massively overengineer their homelab NAS setups. I’ll explore this in a future post. For my part, people tend to find my homelab NAS solution slightly underengineered.

    So here I am, exploring this in a future post. I’ve been in the home NAS game a long time, but have never had very elaborate solutions for it. For my part, I tend to take an obsessively reductionist view of what constitutes a NAS: any small computer with a pool of storage and a network connection, running the Linux operating system and the Samba file sharing service.


    Simple hard drive and ethernet cable

    Many home users prefer to buy turnkey boxes, usually ones that let you install the hard drives yourself and then configure the box and its services through a friendly UI. My fellow weird computer nerds often buy cast-off enterprise hardware and set up more resilient, over-engineered solutions, as long as they have strategies to mitigate the noise and dissipate the heat, and don’t mind the electricity bills.

    If it works, awesome! As an old hand at this, however, I am rather stuck in my ways, preferring to do my own stunts, both with the hardware and the software solutions.

    My History With Home NAS Setups
    In 1998, I bought myself a new computer — a beige box tower PC, as was the style at the time. This was when normal people had at most one computer. It ran Windows, but I was curious about this new thing called “Linux” and learned to dual boot into that. Later that year, it dawned on me that nothing prevented me from buying a second ugly beige box PC and running Linux exclusively on it. Further, it could be a headless Linux box, connected by ethernet, and I could consolidate files into a single place using this file sharing software named Samba.

    I remember it being fairly onerous to get Samba working in those days, and the internet was not nearly as helpful back then. I recall that the thing that blocked me for a while was needing to know that I had to add an entry for the Samba server machine to the LMHOSTS (Lanman hosts) file on the Windows 95 machine.

    However, after I cracked that code, I have pretty much always had some kind of ad-hoc home NAS setup, often combined with a headless Linux development box.

    In the early 2000s, I built a new beige box PC to act as a file server, with a new hard disk, and a coworker tutored me on setting up a (P)ATA UDMA 133 (or was it 150? anyway, it was (P)ATA’s last hurrah before SATA conquered all) expansion card, and I remember profiling that the attached hard drive read at a full 21 MBytes/s. It was pretty slick. Except I hadn’t really thought things through. You see, I had a hand-me-down ethernet hub, cast off from my job at the time, which I wanted to use. It was a 100 Mbps repeater hub, not a switch, so the catch was that all connected machines had to be capable of 100 Mbps. So, after getting all of my machines (3 at the time) upgraded to support 10/100 ethernet (the old off-brand PowerPC running Linux was the biggest challenge), I profiled transfers and realized that the best this repeater hub could achieve was about 3.6 MBytes/s. For a long time after that, I just assumed that was the upper limit of what a 100 Mbps network could achieve. Obviously, I now know that the upper limit ought to be around 11.2 MBytes/s, and if I had gamed out that fact in advance, I would have realized it didn’t make sense to care about super-fast (for the time) disk performance.

    At this time, I was doing a lot of development for MPlayer/xine/FFmpeg. I stored all of my multimedia material on this NAS. I remember being confused when I was working with Y4M data, which is raw frames, which is a lot of data. xine, which employed a pre-buffering strategy, would play fine for a few seconds and then stutter. Eventually, I reasoned out that the files I was working with had a data rate about twice what my awful repeater hub supported, which is probably the first time I came to really understand and respect streaming speeds and their implications for multimedia playback.

    Smaller Solutions
    For a period, I didn’t have a NAS. Then I got an Apple AirPort Extreme, which I noticed had a USB port. So I bought a dual drive brick to plug into it and used that for a time. Later (2009), I had this thing called the MSI Wind Nettop which is the only PC I’ve ever seen that can use a CompactFlash (CF) card for a boot drive. So I did just that, and installed a large drive so it could function as a NAS, as well as a headless dev box. I’m still amazed at what a low-power I/O beast this thing is, at least when compared to all the ARM SoCs I have tried in the intervening 1.5 decades. I’ve had spinning hard drives in this thing that could read at 160 MBytes/s (‘dd’ method) and have no trouble saturating the gigabit link at 112 MBytes/s, all with its early Intel Atom CPU.

    Around 2015, I wanted a more capable headless dev box and discovered Intel’s line of NUCs. I got one of the fat models that can hold a conventional 2.5″ spinning drive in addition to the M.2 SATA SSD and I was off and running. That served me fine for a few years, until I got into the ARM SBC scene. One major limitation here is that 2.5″ drives aren’t available in nearly the capacities that make a NAS solution attractive.

    Current Solution
    My current NAS solution, chronicled in my last SBC post, is the ODroid-HC2, a highly compact ARM SoC board with an integrated USB3-SATA bridge so that a SATA drive can be connected directly to it:


    ODROID-HC2 NAS


    I tend to be weirdly proficient at recalling dates, so I’m surprised that I can’t recall when I ordered this and put it into service. But I’m pretty sure it was circa 2018. It’s only equipped with an 8 TB drive now, but I seem to recall that it started out with only a 4 TB drive. I think I upgraded to the 8 TB drive early in the pandemic in 2020, when ISPs were implementing temporary data cap amnesty and I was doing what a r/DataHoarder does.

    The HC2 has served me well, even though it has a number of shortcomings for hardware chartered for NAS duty:

    1. While it has a gigabit ethernet port, it’s documented that it never really exceeds about 70 MBytes/s, due to the SoC’s limitations
    2. The specific ARM chip (Samsung Exynos 5422; more than a decade old as of this writing) lacks cryptography instructions, slowing down encryption if that’s your thing (e.g., LUKS)
    3. While the SoC supports USB3, that block is tied up for the SATA interface; the remaining USB port is only capable of USB2 speeds
    4. 32-bit ARM, which prevented me from running certain bits of software I wanted to try (like Minio)
    5. Only 1 drive, so no possibility for RAID (again, if that’s your thing)

    I also love to brag about the HC2’s power usage: I once profiled the unit for a month using a Kill-A-Watt, and under normal usage (with the drive spinning up only when in active use), the unit consumed 4.5 kWh… in an entire month.

    New Solution
    Enter the ODroid-HC4 (I purchased mine from Ameridroid, but Hardkernel works with numerous distributors):


    ODroid-HC4 with an SSD and a conventional drive


    I ordered this earlier in the year and, after many months of procrastinating and obsessing over the best approach to take with its general usage, I finally have it in service as my new NAS. Comparing point by point with the HC2:

    1. The gigabit ethernet runs at full speed (though a few things on my network run at 2.5 GbE now, so I guess I’ll always be behind)
    2. The ARM chip (Amlogic S905X3) has AES cryptography acceleration and handles all the LUKS stuff without breaking a sweat; “cryptsetup benchmark” reports between 500 and 600 MBytes/s on all the AES variants
    3. The USB port is still only USB2, so no improvement there
    4. 64-bit ARM, which means I can run Minio to simulate block storage in a local dev environment for some larger projects I would like to undertake
    5. Supports 2 drives, if RAID is your thing

    How I Set It Up
    How should the drives be configured? As should be apparent from the photo above, I opted for an SSD (500 GB) for speed, paired with a conventional spinning HDD (18 TB) for sheer capacity. I’m not particularly trusting of RAID: I’ve watched it fail too many times, on systems that I don’t even manage, not to mention the aforementioned RAID brick that I had attached to the Apple AirPort Extreme.

    I had long been planning to use bcache, the block caching interface for Linux, which can use the SSD as a speedy cache in front of the more capacious disk. There is also LVM cache, which is supposed to achieve something similar. And then I had to evaluate the trade-offs in whether I wanted write-back, write-through, or write-around configurations.

    This was all predicated on the assumption that the spinning drive would not be able to saturate the gigabit connection. When I got around to setting up the hardware and trying some basic tests, I found that the conventional HDD had no trouble keeping up with the gigabit data rate, both reading and writing, somewhat obviating the need for SSD acceleration using any elaborate caching mechanisms.

    Maybe that’s because I sprang for the WD Red Pro series this time, rather than the Red Plus? I’m also guessing that conventional drives do deteriorate over the years. I’ll find out.

    For the operating system, I stuck with my newest favorite Linux distro: DietPi. While HardKernel (parent of ODroid) makes images for the HC units, I had also been using DietPi on the HC2 for the past few years, as it tends to stay more up to date.

    Then I rsync’d my data from the HC2 to the HC4. It was only about 6.5 TB of data in total, but it took days, as the old WD Red Plus drive is only capable of reading at around 10 MBytes/s these days. Painful.

    For file sharing, I’m pretty sure most normal folks have nice web UIs in their NAS boxes which allow them to easily configure and monitor the shares. I know there are such applications I could set up, but I’ve been doing this so long that I just do a bare-bones setup through the terminal. I installed regular Samba and then brought over my smb.conf file from the HC2. One by one, I verified that each of the old shares was activated on the new NAS and deactivated on the old NAS. I also set up a new share for the SSD. I guess that will just serve as a fast I/O scratch space on the NAS.

    The conventional drive spins up and down. That’s annoying when I’m actively working on something but manage not to hit the drive for 5 minutes or so, and then an application blocks while the drive wakes back up. I suppose I could configure it to always keep running. Instead, I micro-manage this with a custom bash script I wrote a long time ago, which logs into the NAS and runs the “date” command every 2 minutes, appending the output to a file. As a bonus, it also prints data rate up/down stats every 5 seconds. The spinning file (“nas-main/zz-keep-spinning/keep-spinning.txt”) has never been cleared and has nearly a quarter million lines. I suppose that implies that it has kept the drive spinning for half a million minutes, which works out to around 347 total days. I should compare that against the drive’s SMART stats, if I can remember how. The earliest timestamp in the file is from March 2018, so I know the HC2 NAS has been in service at least that long.

    For tasks, vintage cron still does everything I could need. In this case, that means reaching out to websites (like this one) and automatically backing up static files.

    I also need a special script for starting up. Fortunately, I was able to bring this over from the HC2 and tweak it. The data disks (though not the boot disk) are encrypted, and they need to be unlocked before it is safe for the Samba and Minio services to start. So one script does all that heavy lifting in the rare case of a reboot (this is the type of system that’s well worth having on a reliable UPS).

    Further Work
    I need to figure out how to use the OLED display on the NAS, and how to make it show something more useful than the current time and date, which is what it does in its default configuration with HardKernel’s own Linux distro. With DietPi, it does nothing by default. I’m thinking it should be able to show the percent usage of each of the 2 drives, at a minimum.

    I also need to establish a more responsible backup regimen. I’m way too lazy about this. Fortunately, I reason that I can keep the original HC2 in service, repurposed to accept backups from the main NAS. Again, I’m sort of micro-managing this since a huge amount of data isn’t worth backing up (remember the whole DataHoarder bit), but the most important stuff will be shipped off.

    The post Adventures In NAS first appeared on Breaking Eggs And Making Omelettes.

  • Subtitling Sierra RBT Files

    2 June 2016, by Multimedia Mike — Game Hacking

    This is part 2 of the adventure started in my Subtitling Sierra VMD Files post. After I completed the VMD subtitling, The Translator discovered a wealth of animation files in a format called RBT (this apparently stands for “Robot”, but I think “Ribbit” format could be more fun). What are we going to do? We had come so far by solving the VMD subtitling problem for Phantasmagoria. It would be a shame if the effort ground to a halt due to this.

    Fortunately, the folks behind the ScummVM project already figured out enough of the format to be able to decode the RBT files in Phantasmagoria.

    In the end, I was successful in creating a completely standalone tool that can take a Robot file and a subtitle file and create a new Robot file with subtitles. The source code is here (subtitle-rbt.c). Here’s what the final result looks like:


    Spanish refrigerator
    “What’s in the refrigerator?” I should note at this juncture that I am not sure if this particular Robot file even has sound or dialogue since I was conducting these experiments on a computer with non-working audio.

    The RBT Format
    I have created a new MultimediaWiki page describing the Robot Animation format, based on the ScummVM source code. I have not worked with a format quite like this before. These are paletted animations which consist of a sequence of independent frames that are designed to be overlaid on top of a static background. Because of this, each frame encodes its own unique dimensions and origin coordinate within the scene. While the Phantasmagoria VMD files are usually 288×144 (usually pixel-doubled for the benefit of a 640×400 Super VGA canvas), these frames are meant to be plotted on a game field that was roughly 576×288 (288×144 double-sized).

    For example, here are 2 minimalist animation frames from a desk investigation Robot file:


    Robot Animation Frame #1
    100×147

    Robot Animation Frame #2
    101×149
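
    Putting those pieces together, here is a rough C sketch of the per-frame parameters a decoder needs to track, based on the description above. The struct layout and field names are my own for illustration; this is not the actual on-disk format of an RBT frame header.

    #include <stdint.h>

    /* Hypothetical in-memory view of the per-frame parameters described
     * above; field names are illustrative, not the real RBT layout. */
    typedef struct
    {
        int      width;      /* this frame's own width in pixels */
        int      height;     /* this frame's own height in pixels */
        int      origin_x;   /* where the frame lands on the ~576x288 game field */
        int      origin_y;
        int      scale;      /* per-frame scale factor, usually 100 (see Scaling below) */
        uint8_t *pixels;     /* paletted pixel data, width * height bytes */
    } robot_frame;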

    As for compression, my first impression was that the algorithm was the same as VMD. This is wrong. It evidently uses an unmodified version of a standard algorithm called Lempel-Ziv-Stac (LZS). It shows up in several RFCs and was apparently used in MS-DOS’s transparent disk compression scheme.

    Approach
    Thankfully, many of the lessons I learned from the previous project are applicable to this project, including: subtitle library interfacing, subtitling in the paletted colorspace, and replacing encoded frames from the original file instead of trying to create a new file.

    Here is the pitch for this project:

    • Create a C program that can traverse an input file, piece by piece, and generate an output file. The result should be a bitwise identical file (see the sketch after this list).
    • Adapt the LZS decompression algorithm from ScummVM into the new tool. Make the tool dump raw Portable Anymap (PNM) files of varying dimensions and verify that they look correct.
    • Compress using LZS.
    • Stretch the frames and draw subtitles.
    • More compression. Find the minimum window for each frame.
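
    As a starting point for the first bullet above, a minimal pass-through skeleton in C might look like the following. This is only an illustrative sketch of the copy-everything baseline, not code from subtitle-rbt.c; a real traversal would parse the Robot header and frame tables instead of copying blindly.

    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        if (argc < 3)
        {
            fprintf(stderr, "usage: %s <in.rbt> <out.rbt>\n", argv[0]);
            return 1;
        }

        FILE *in  = fopen(argv[1], "rb");
        FILE *out = fopen(argv[2], "wb");
        if (!in || !out)
        {
            perror("fopen");
            return 1;
        }

        /* Copy the input to the output in chunks; the result must be
         * bitwise identical to the original file. */
        unsigned char buffer[4096];
        size_t bytes;
        while ((bytes = fread(buffer, 1, sizeof(buffer), in)) > 0)
            fwrite(buffer, 1, bytes, out);

        fclose(in);
        fclose(out);
        return 0;
    }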

    Compression
    Normally, my first goal would be to decompress the video and store the data in a raw form. However, this turned out to be mathematically intractable. While the format does support both compressed and uncompressed frames (even though ScummVM indicates that the uncompressed path is as yet unexercised), the goal of this project requires making the frames so large that they overflow certain parameters of the file.

    A Robot file has a sequence of frames and 2 tables describing the size of each frame. One table describes the entire frame size (audio + video) while the second table describes just the video portion. Since these tables only use 16 bits to specify a size, the maximum frame size is 65535 bytes. Leaving space for the audio portion of the frame, this only leaves a per-frame byte budget of about 63000 bytes for the video. Expanding the frame to 576×288 (165,888 pixels, i.e. 165,888 bytes at one byte per paletted pixel) would overflow this limit.

    Anyway, the upshot is that I needed to compress the data up front.

    Fortunately, the LZS compressor is pretty straightforward, at least if you have experience writing codecs built around variable-length codes (VLC). While the algorithm revolves around back references, my approach was to essentially write an RLE encoder. My compressor searches for runs of data (plentiful once I started to stretch the frames for subtitling purposes). When a run of n=3 or more of the same pixel is found, it encodes the pixel by itself, and then stores a back reference of offset -1 and length (n-1). It took a little while to iron out a few problems, but I eventually got it to work perfectly.
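
    Here is a small sketch of that run-detection strategy in C. The emit_literal() and emit_backref() helpers are hypothetical stand-ins for the real LZS bitstream writer in subtitle-rbt.c (replaced with printf stubs here); only the scanning logic matters.

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical stand-ins for a real LZS bitstream writer. */
    static void emit_literal(uint8_t pixel)
    {
        printf("literal %u\n", pixel);
    }

    static void emit_backref(int offset, size_t length)
    {
        printf("backref offset -%d, length %zu\n", offset, length);
    }

    /* RLE-style LZS strategy described above: encode one pixel, then
     * reference it repeatedly with an offset of -1. */
    static void compress_row(const uint8_t *pixels, size_t count)
    {
        size_t i = 0;
        while (i < count)
        {
            /* measure the run of identical pixels starting at i */
            size_t run = 1;
            while (i + run < count && pixels[i + run] == pixels[i])
                run++;

            if (run >= 3)
            {
                emit_literal(pixels[i]);           /* the pixel itself */
                emit_backref(1, run - 1);          /* offset -1, length n-1 */
            }
            else
            {
                for (size_t j = 0; j < run; j++)   /* short runs stay literal */
                    emit_literal(pixels[i + j]);
            }
            i += run;
        }
    }

    int main(void)
    {
        const uint8_t row[] = { 7, 7, 7, 7, 7, 3, 9, 9 };
        compress_row(row, sizeof(row));
        return 0;
    }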

    I have to say, however, that the format is a little bit weird in how it codes very large numbers. The length encoding is somewhat Golomb-like, i.e., smaller values are encoded with fewer bits. However, when it gets to large numbers, it starts encoding counts of 15 as blocks of 1111. For example, 24 is bigger than 7. Thus, emit 1111 into the bitstream and subtract 8 from 24 -> 16. Still bigger than 15, so stuff another 1111 into the bitstream and subtract 15. Now we’re at 1, so stuff 0001. So 24 is 11111111 0001. 12 bits is not too horrible. But the encoded size grows linearly with the value, at roughly (value / 30) bytes, since each byte’s worth of 1111 nibbles only accounts for 30 more counts. So a value of 300 takes around 10 bytes (80 bits) to encode.
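
    A quick C sketch of that large-value path, following the description above (the shorter codes for lengths of 7 or less are omitted, and emit_nibble() is a hypothetical bit writer that just prints its 4 bits):

    #include <stdio.h>

    /* Hypothetical 4-bit writer; a real encoder would pack these bits
     * into the output bitstream. */
    static void emit_nibble(unsigned nibble)
    {
        printf("%u%u%u%u ", (nibble >> 3) & 1, (nibble >> 2) & 1,
                            (nibble >> 1) & 1, nibble & 1);
    }

    /* Encode a length of 8 or more: the leading 1111 accounts for 8,
     * each further 1111 accounts for 15 more, and a final nibble below
     * 15 carries the remainder. Lengths of 7 or less use the shorter
     * Golomb-like codes (not shown). */
    static void encode_long_length(unsigned length)
    {
        emit_nibble(0xF);
        length -= 8;
        while (length >= 15)
        {
            emit_nibble(0xF);
            length -= 15;
        }
        emit_nibble(length);
    }

    int main(void)
    {
        encode_long_length(24);   /* prints 1111 1111 0001, matching the example */
        printf("\n");
        return 0;
    }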

    Palette Slices
    As in the VMD subtitling project, I took the subtitle color offered in the subtitle spec file as a suggestion and used Euclidean distance to match it to the closest available color in the palette. One problem, however, is that the palette is a lot smaller in these animations. According to my notes, for the set of animations I scanned, only about 80 colors were specified, starting at palette index 55. I hypothesize that different slices of the palette are reserved for different uses, e.g., animation, background, and user interface. Thus, there is a smaller pool of colors to draw upon for subtitling purposes.
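
    For illustration, the nearest-color search can be as simple as the following C sketch. The 55..134 index range comes from the notes above (about 80 usable colors starting at index 55); the function and type names are my own, not from subtitle-rbt.c.

    #include <stdint.h>

    /* One palette entry: 8-bit red, green, blue components. */
    typedef struct { uint8_t r, g, b; } rgb_color;

    /* Return the palette index whose color has the smallest squared
     * Euclidean distance to the requested subtitle color, searching only
     * the slice of the palette actually populated for these animations. */
    int nearest_palette_index(const rgb_color palette[256], rgb_color want,
                              int first, int last)
    {
        int  best_index = first;
        long best_dist  = -1;

        for (int i = first; i <= last; i++)
        {
            long dr = (long)palette[i].r - want.r;
            long dg = (long)palette[i].g - want.g;
            long db = (long)palette[i].b - want.b;
            long dist = dr * dr + dg * dg + db * db;   /* no sqrt needed */
            if (best_dist < 0 || dist < best_dist)
            {
                best_dist  = dist;
                best_index = i;
            }
        }
        return best_index;
    }

    A call such as nearest_palette_index(palette, subtitle_color, 55, 134) would then give the index to use when drawing the subtitle glyphs into this particular palette slice.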

    Scaling
    One bit of residual weirdness in this format is the presence of a per-frame scale factor. While most frames set this to 100 (100% scale), I have observed 70%, 80%, and 90%. ScummVM is a bit unsure about how to handle these, so I am as well. However, I eventually realized I didn’t really need to care, at least not when decoding and re-encoding the frame. Just preserve the scale factor. I intend to modify the tool further to take scale factor into account when creating the subtitle.

    The Final Resolution
    Right around the time that I was composing this post, The Translator emailed me and notified me that he had found a better way to subtitle the Robot files by modifying the scripts, rendering my entire approach moot. The result is much cleaner:


    Proper RBT Subtitles
    Turns out that the engine supported subtitles all along

    It’s a good thing that I enjoyed the challenge or I might be annoyed at this point.

    See Also: Subtitling Sierra VMD Files (part 1 of this adventure).

    The post Subtitling Sierra RBT Files first appeared on Breaking Eggs And Making Omelettes.