
Other articles (68)

  • General document management

    13 May 2011

    MediaSPIP never modifies the original document uploaded to the site.
    For each uploaded document, it performs two successive operations: it creates an additional version that can easily be viewed online, while keeping the original available for download in case the original document cannot be read in a web browser; and it retrieves the original document’s metadata to describe the file textually.
    The tables below explain what MediaSPIP can do (...)

  • Sites built with MediaSPIP

    2 May 2011

    This page presents some of the sites running MediaSPIP.
    You can of course add your own using the form at the bottom of the page.

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (7062)

  • Adventures In NAS

    1 January, by Multimedia Mike — General

    In my post last year about my out-of-control single-board computer (SBC) collection which included my meager network attached storage (NAS) solution, I noted that:

    I find that a lot of my fellow nerds massively overengineer their homelab NAS setups. I’ll explore this in a future post. For my part, people tend to find my homelab NAS solution slightly underengineered.

    So here I am, exploring this in a future post. I’ve been in the home NAS game a long time, but have never had very elaborate solutions for such. For my part, I tend to take an obsessively reductionist view of what constitutes a NAS: any small computer with a pool of storage and a network connection, running the Linux operating system and the Samba file-sharing service.


    Simple hard drive and ethernet cable

    Many home users prefer to buy turnkey boxes, usually ones that let you install the hard drives yourself and then configure the box and its services through a friendly UI. My fellow weird computer nerds often buy cast-off enterprise hardware and set up more resilient, over-engineered solutions, as long as they have strategies to mitigate the noise and dissipate the heat, and don’t mind the electricity bills.

    If it works, awesome! As an old hand at this, however, I am rather stuck in my ways, preferring to do my own stunts with both the hardware and the software.

    My History With Home NAS Setups
    In 1998, I bought myself a new computer — a beige box tower PC, as was the style at the time. This was when normal people had at most one computer. It ran Windows, but I was curious about this new thing called “Linux” and learned to dual-boot it. Later that year, it dawned on me that nothing prevented me from buying a second ugly beige box PC and running Linux exclusively on it. Further, it could be a headless Linux box, connected by ethernet, and I could consolidate files into a single place using this file sharing software named Samba.

    I remember it being fairly onerous to get Samba working in those days, and the internet was not nearly as helpful back then. I recall that what blocked me for a while was needing to know that I had to add an entry for the Samba server machine to the LMHOSTS (LAN Manager hosts) file on the Windows 95 machine.

    However, after I cracked that code, I have pretty much always had some kind of ad hoc home NAS setup, often combined with a headless Linux development box.

    In the early 2000s, I built a new beige box PC for a file server, with a new hard disk, and a coworker tutored me on setting up a (P)ATA UDMA 133 (or was it 150? anyway, it was (P)ATA’s last hurrah before SATA conquered all) expansion card, and I remember profiling that the attached hard drive worked at a full 21 MBytes/s reading. It was pretty slick. Except I hadn’t really thought things through. You see, I had a hand-me-down ethernet hub, cast off from my job at the time, which I wanted to use. It was a 100 Mbps repeater hub, not a switch, so the catch was that all connected machines had to be capable of 100 Mbps. So, after getting all of my machines (3 at the time) upgraded to support 10/100 ethernet (the old off-brand PowerPC running Linux was the biggest challenge), I profiled transfers and realized that the best this repeater hub could achieve was about 3.6 MBytes/s. For a long time after that, I just assumed that was the upper limit of what a 100 Mbps network could achieve. Obviously, I now know that the upper limit ought to be around 11.2 MBytes/s (100 Mbps is 12.5 MBytes/s of raw bits, and ethernet framing plus TCP/IP overhead eats roughly the last 10%), and if I had gamed out that fact in advance, I would have realized it didn’t make sense to care about super-fast (for the time) disk performance.

    At this time, I was doing a lot of development for MPlayer/xine/FFmpeg. I stored all of my multimedia material on this NAS. I remember being confused when I was working with Y4M data, which is raw frames, and therefore a lot of data. xine, which employed a pre-buffering strategy, would play fine for a few seconds and then stutter. Eventually, I reasoned out that the files I was working with had a data rate about twice what my awful repeater hub supported, which is probably the first time I came to really understand and respect streaming speeds and their implications for multimedia playback.
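
    To make the arithmetic concrete, here is a quick sketch of how fast raw Y4M frames arrive; the resolution and frame rate are illustrative assumptions, not figures from the post:

// Raw Y4M video in YUV 4:2:0 carries 1.5 bytes per pixel per frame.
// The resolution and frame rate below are hypothetical examples.
function y4mBytesPerSecond(width, height, fps) {
  return width * height * 1.5 * fps;
}

const mbytes = y4mBytesPerSecond(640, 480, 30) / (1024 * 1024);
console.log(mbytes.toFixed(1) + " MBytes/s"); // ~13.2 MBytes/s

    Even a modest standard-definition stream outruns a 3.6 MBytes/s repeater hub several times over.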

    Smaller Solutions
    For a period, I didn’t have a NAS. Then I got an Apple AirPort Extreme, which I noticed had a USB port, so I bought a dual-drive brick to plug into it and used that for a time. Later (2009), I had this thing called the MSI Wind Nettop, which is the only PC I’ve ever seen that can use a CompactFlash (CF) card as a boot drive. So I did just that, and installed a large drive so it could function as a NAS as well as a headless dev box. I’m still amazed at what a low-power I/O beast this thing is, at least when compared to all the ARM SoCs I have tried in the intervening 1.5 decades. I’ve had spinning hard drives in this thing that could read at 160 MBytes/s (‘dd’ method) and have no trouble saturating the gigabit link at 112 MBytes/s, all with its early Intel Atom CPU.

    Around 2015, I wanted a more capable headless dev box and discovered Intel’s line of NUCs. I got one of the fat models that can hold a conventional 2.5″ spinning drive in addition to the M.2 SATA SSD and I was off and running. That served me fine for a few years, until I got into the ARM SBC scene. One major limitation here is that 2.5″ drives aren’t available in nearly the capacities that make a NAS solution attractive.

    Current Solution
    My current NAS solution, chronicled in my last SBC post, is the ODroid-HC2, a highly compact ARM SoC board with an integrated USB3-SATA bridge so that a SATA drive can be connected directly to it:


    ODROID-HC2 NAS


    I tend to be weirdly proficient at recalling dates, so I’m surprised that I can’t recall when I ordered this and put it into service. But I’m pretty sure it was circa 2018. It’s only equipped with an 8 TB drive now, but I seem to recall that it started out with only a 4 TB drive. I think I upgraded to the 8 TB drive early in the pandemic in 2020, when ISPs were implementing temporary data cap amnesty and I was doing what an r/DataHoarder does.

    The HC2 has served me well, even though it has a number of shortcomings for a hardware setup chartered for NAS duty:

    1. While it has a gigabit ethernet port, it’s documented that it never really exceeds about 70 MBytes/s, due to the SoC’s limitations
    2. The specific ARM chip (Samsung Exynos 5422; more than a decade old as of this writing) lacks cryptography instructions, slowing down encryption if that’s your thing (e.g., LUKS)
    3. While the SoC supports USB3, that block is tied up for the SATA interface; the remaining USB port is only capable of USB2 speeds
    4. 32-bit ARM, which prevented me from running certain bits of software I wanted to try (like Minio)
    5. Only 1 drive, so no possibility for RAID (again, if that’s your thing)

    I also love to brag about the HC2’s power usage: I once profiled the unit for a month using a Kill-A-Watt, under normal usage (with the drive spinning only when in active use), and the unit consumed 4.5 kWh… in an entire month. That works out to an average draw of just over 6 watts.

    New Solution
    Enter the ODroid-HC4 (I purchased mine from Ameridroid but Hardkernel works with numerous distributors):


    ODroid-HC4 with an SSD and a conventional drive


    I ordered this earlier in the year and after many months of procrastinating and obsessing over the best approach to take with its general usage, I finally have it in service as my new NAS. Comparing point by point with the HC2:

    1. The gigabit ethernet runs at full speed (though a few things on my network run at 2.5 GbE now, so I guess I’ll always be behind)
    2. The ARM chip (Amlogic S905X3) has AES cryptography acceleration and handles all the LUKS stuff without breaking a sweat; “cryptsetup benchmark” reports between 500-600 MBytes/s on all the AES variants
    3. The USB port is still only USB2, so no improvement there
    4. 64-bit ARM, which means I can run Minio to simulate block storage in a local dev environment for some larger projects I would like to undertake
    5. Supports 2 drives, if RAID is your thing

    How I Set It Up
    How to set up the drive configuration? As should be apparent from the photo above, I elected for an SSD (500 GB) for speed, paired with a conventional spinning HDD (18 TB) for sheer capacity. I’m not particularly trusting of RAID; I’ve watched it fail too many times, on systems that I don’t even manage, not to mention the aforementioned RAID brick that I had attached to the Apple AirPort Extreme.

    I had long been planning to use bcache, the block-caching layer for Linux, which can use the SSD as a speedy cache in front of the more capacious disk. There is also LVM cache, which is supposed to achieve something similar. And then I had to evaluate the trade-offs among write-back (writes land on the SSD first and are flushed to the HDD later, which is fastest but riskier if the cache device dies), write-through (writes hit both devices before completing, which is safe but caps writes at HDD speed), and write-around (writes bypass the SSD entirely, so only reads are cached) configurations.

    This was all predicated on the assumption that the spinning drive would not be able to saturate the gigabit connection. When I got around to setting up the hardware and trying some basic tests, I found that the conventional HDD had no trouble keeping up with the gigabit data rate, both reading and writing, somewhat obviating the need for SSD acceleration using any elaborate caching mechanisms.

    Maybe that’s because I sprung for the WD Red Pro series this time, rather than the Red Plus? I’m guessing that conventional drives do deteriorate over the years. I’ll find out.

    For the operating system, I stuck with my newest favorite Linux distro: DietPi. While HardKernel (parent of ODroid) makes images for the HC units, I had also used DietPi for the HC2 for the past few years, as it tends to stay more up to date.

    Then I rsync’d my data from HC2 -> HC4. It was only about 6.5 TB of total data but it took days as this WD Red Plus drive is only capable of reading at around 10 MBytes/s these days. Painful.

    For file sharing, I’m pretty sure most normal folks have nice web UIs in their NAS boxes which allow them to easily configure and monitor the shares. I know there are such applications I could set up. But I’ve been doing this so long, I just do a bare-bones setup through the terminal. I installed regular Samba and then brought over my smb.conf file from the HC2. One by one, I tested that each of the old shares was activated on the new NAS and deactivated on the old NAS. I also set up a new share for the SSD. I guess that will just serve as a fast I/O scratch space on the NAS.

    The conventional drive spins up and down. That’s annoying when I’m actively working on something but manage not to hit the drive for like 5 minutes and then an application blocks while the drive wakes up. I suppose I could set it up so that it is always running. Instead, I micro-manage this with a custom bash script I wrote a long time ago, which logs into the NAS and runs the “date” command every 2 minutes, appending the output to a file. As a bonus, it also prints data rate up/down stats every 5 seconds. The spinning file (“nas-main/zz-keep-spinning/keep-spinning.txt”) has never been cleared and has nearly a quarter million lines. At 2 minutes per line, that implies it has kept the drive spinning for roughly half a million minutes, which works out to around 347 total days. I should compare that against the drive’s SMART stats, if I can remember how. The earliest timestamp in the file is from March 2018, so I know the HC2 NAS has been in service at least that long.
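
    The actual script is bash and isn’t reproduced in the post; here is a minimal reconstruction of the idea sketched in Node, where the “nas” SSH host alias and the error handling are my own assumptions and only the file path comes from the post:

import { execFile } from "node:child_process";

// Hypothetical reconstruction of the keep-spinning loop described above.
const FILE = "nas-main/zz-keep-spinning/keep-spinning.txt";

function touchDrive() {
  // Appending the date forces periodic disk activity so the drive stays awake.
  execFile("ssh", ["nas", `date >> ${FILE}`], (err) => {
    if (err) console.error("keep-spinning write failed:", err.message);
  });
}

touchDrive();
setInterval(touchDrive, 2 * 60 * 1000); // every 2 minutes, per the post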

    For tasks, vintage cron still does everything I could need. In this case, that means reaching out to websites (like this one) and automatically backing up static files.

    I also have to have a special script for starting up. Fortunately, I was able to bring this over from the HC2 and tweak it. The data disks (though not the boot disk) are encrypted. Those need to be unlocked, and only then is it safe for the Samba and Minio services to start up. So one script does all that heavy lifting in the rare case of a reboot (this is the type of system that’s well worth having on a reliable UPS).

    Further Work
    I need to figure out how to use the OLED display on the NAS, and how to make it show something more useful than the current time and date, which is what it does in its default configuration with HardKernel’s own Linux distro. With DietPi, it does nothing by default. I’m thinking it should be able to show the percent usage of each of the 2 drives, at a minimum.

    I also need to establish a more responsible backup regimen. I’m way too lazy about this. Fortunately, I reason that I can keep the original HC2 in service, repurposed to accept backups from the main NAS. Again, I’m sort of micro-managing this since a huge amount of data isn’t worth backing up (remember the whole DataHoarder bit), but the most important stuff will be shipped off.

    The post Adventures In NAS first appeared on Breaking Eggs And Making Omelettes.

  • A Quick Start Guide to the Payment Services Directive (PSD2)

    22 November 2024, by Daniel Crough — Banking and Financial Services, Privacy

    In 2023, there were 266.2 billion real-time payments, indicating that the demand for secure transactions has never been higher. As we move towards a more open banking system, a host of new payment solutions offer convenience and efficiency, but they also present new risks.

    The Payment Services Directive 2 (PSD2) is one of many regulations established to address these concerns. PSD2 is a European Union (EU) business initiative to offer smooth payment experiences while helping customers feel safe from online threats. 

    In this post, learn what PSD2 includes, how it improves security for online payments, and how Matomo supports banks and financial institutions with PSD2 compliance.

    What is PSD2?

    PSD2 is an EU directive that aims to improve the security of electronic payments across the EU. It enforces strong customer authentication and allows third-party access to consumer accounts with explicit consent. 

    Its main objectives are:

    • Strengthening security and data privacy measures around digital payments.
    • Encouraging innovation by allowing third-party providers access to banking data.
    • Improving transparency with clear communication regarding fees, terms and conditions associated with payment services.
    • Establishing a framework for sharing customer data securely through APIs for PSD2 open banking.

    Rationale behind PSD2 

    PSD2’s primary purpose is to engineer a more integrated and efficient European payment market without compromising the security of online transactions. 

    The original directive aimed to standardise payment services across EU member states, but as technology evolved, an updated version was needed.

    PSD2 is mandatory for various entities within the European Economic Area (EEA), like:

    • Banks and credit institutions
    • Electronic money institutions or digital banks like Revolut
    • Card issuing and acquiring institutions
    • Fintech companies
    • Multi-national organisations operating in the EU

    PSD2 implementation timeline

    With several important milestones, PSD2 has reshaped how payment services work in Europe. Here’s a closer look at the pivotal events that paved the way for its launch.

    • 2002: The banking industry creates the European Payments Council (EPC), which drives the Single Euro Payments Area (SEPA) initiative to include non-cash payment instruments across European regions.
    • 2007: PSD1 goes into effect.
    • 2013: The European Commission proposes PSD2 to include protocols for upcoming payment services.
    • 2015: The Council of the European Union passes PSD2 and gives member states two years to incorporate it.
    • 2018: PSD2 goes into effect.
    • 2019: The final deadline for all companies within the EU to comply with PSD2’s regulations and rules for strong customer authentication.

    PSD2: Key components

    PSD2 introduces several key components. Let’s take a look at each one.

    Strong Customer Authentication (SCA)

    The Regulatory Technical Standards (RTS) under PSD2 outline specific requirements for SCA. 

    SCA requires multi-factor authentication for online transactions. When customers make a payment online, they need to verify their identity using at least two of the three following elements:

    • Knowledge: Something they know (like a password, a code or a secret answer)
    • Possession: Something they have (like their phone or card)
    • Inherence: Something they are (like biometrics — fingerprints or facial features)

    Strong customer authentication: the three factors

    Before SCA, banks verified an individual’s identity using only a password. Requiring a second factor means only authorised users can complete transactions, which reduces fraud and increases the security of electronic payments.

    SCA implementation varies across payment methods. Debit and credit cards use the 3D Secure (3DS) protocol. E-wallets and other local payment methods often have their own SCA-compliant steps.

    3DS is an extra step to authenticate a customer’s identity. Most European debit and credit card companies implement it. Also, with 3DS, liability for fraudulent chargebacks shifts to the issuing bank rather than the business.

    However, certain transactions are exempt from SCA (codified in the sketch after this list):

    • Low-risk transactions: Transactions from an issuer or acquirer whose fraud level is below a specific threshold. If the acquirer deems a transaction low risk, it can request to skip SCA.
    • Low-value transactions: Transactions under €30.
    • Trusted beneficiaries: Trusted merchants that customers choose to safelist.
    • Recurring payments: Recurring transactions for a fixed amount are exempt from SCA after the first transaction.
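
    Written as a quick sketch, that exemption logic looks roughly like this (the field names and transaction shape are illustrative, not from any official PSD2 API):

// Returns true when SCA must be applied, per the four exemptions listed above.
function scaRequired(tx) {
  const exempt =
    tx.lowRiskPerAcquirer ||                        // acquirer fraud rate below threshold
    tx.amountEur < 30 ||                            // low-value transaction
    tx.payeeOnTrustedList ||                        // trusted beneficiary (safelisted)
    (tx.recurringFixedAmount && !tx.firstInSeries); // recurring, after the first payment
  return !exempt;
}

console.log(scaRequired({ amountEur: 12 })); // false: exempt as a low-value transaction
console.log(scaRequired({ amountEur: 80 })); // true: no exemption applies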

    Third-party payment service providers (TPPs) framework

    TPPs are entities authorised to access customer banking data and initiate payments. There are three types of TPPs:

    Account Information Service Providers (AISPs)

    AISPs are services that can view customers’ account details, but only with their permission. For example, a budgeting app might use AISP services to gather transaction data from a user’s bank account, helping them monitor expenses and oversee finances. 

    Payment Initiation Service Providers (PISPs)

    PISPs enable clients to initiate payments directly from their bank accounts, bypassing the need for conventional payment options such as debit or credit cards. After the customer makes a payment, PISPs immediately contact the merchant to ensure the user can access the online services or products they bought. 

    Card-Based Payment Instrument Issuers (CBPIIs)

    CBPIIs are services that issue payment cards linked to customer accounts.

    Requirements for TPPs

    To operate effectively under PSD2, TPPs must meet several requirements:

    Consumer consent: Customers must explicitly authorise TPPs to retrieve their financial data. This way, users can control who can view their information and for what purpose.

    Security compliance: TPPs must follow SCA and secure communication guidelines to protect users from fraud and unauthorised access.

    API availability: Banks must make their Application Programming Interfaces (APIs) accessible and allow TPPs to connect securely with the bank’s systems. This availability enables easy integration and lets TPPs access essential data.
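
    As an illustration only, a TPP’s consent-gated account-information request might look like the sketch below; the host, endpoint and header names are hypothetical (loosely in the style of the Berlin Group’s NextGenPSD2 framework), so the bank’s actual API documentation is authoritative:

import { randomUUID } from "node:crypto";

// Hypothetical account-information call; nothing here is a real bank endpoint.
async function fetchAccounts(consentId, accessToken) {
  const res = await fetch("https://api.example-bank.eu/v1/accounts", {
    headers: {
      Authorization: `Bearer ${accessToken}`, // the TPP's access token
      "Consent-ID": consentId,                // proof of explicit customer consent
      "X-Request-ID": randomUUID(),           // per-request trace ID
    },
  });
  if (!res.ok) throw new Error(`bank API returned ${res.status}`);
  return res.json();
}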

    Consumer protection methods

    PSD2 implements various consumer protection measures to increase trust and transparency between consumers and financial institutions. Here’s a closer look at some of these key methods:

    • Prohibition of unjustified fees: PSD2 requires banks to clearly communicate any additional charges or fees for international transfers or account maintenance. This ensures consumers are fully aware of the actual costs and charges.
    • Timely complaint resolution: PSD2 mandates that payment service providers (PSPs) have a straightforward complaint procedure. If a customer faces any problems, the provider must respond within 15 business days. This requirement encourages consumers to engage more confidently with financial services.
    • Refund in case of unauthorised payment: Customers are entitled to a full refund for payments made without their consent.
    • Surcharge ban: Additional charges on credit and debit card payments aren’t allowed. Businesses can’t impose extra fees on these payment methods, which increases customers’ purchasing power.

    Benefits of PSD2 

    Businesses — particularly those in banking, fintech, finserv, etc. — stand to benefit from PSD2 in several ways.

    Access to customer data

    With customer consent, banks can analyse spending patterns to develop tailored financial products that match customer needs, from personalised savings accounts to more relevant loan offerings.

    Innovation and cost benefits 

    PSD2 opened payment processing up to more market competition. New payment companies bring fresh approaches to banking services, making daily transactions more efficient while driving down processing fees across the sector.

    Also, banks now work alongside payment technology providers, combining their strengths to create better services. This collaboration brings faster payment options to businesses, helping them stay competitive while reducing operational costs.

    Improved customer trust and experience

    Due to PSD2 guidelines, modern systems handle transactions quickly without compromising the safety of payment data, creating a balanced approach to digital banking.

    PSD2 compliance benefits

    Banking customers now have more control over their financial information. Clear processes allow consumers to view and adjust their financial preferences as needed.

    Strong security standards form the foundation of these new payment systems. Payment provider platforms must adhere to strict regulations and implement additional protection measures.

    Challenges in PSD2 compliance 

    What challenges can banks and financial institutions face regarding PSD2 compliance? Let’s examine them.

    Resource requirements

    For many businesses, the new requirements come with a high price tag. PSD2 requires banks and fintechs to build and update their systems so that other providers can access customer data safely. For example, they must develop APIs to allow TPPs to acquire customer data. 

    Many banks still use older systems that can’t meet PSD2’s added requirements. In addition to the cost of upgrades, complying with PSD2 requires banks to devote resources to training staff and monitoring compliance.

    The significant cost of updating legacy systems and IT infrastructure while keeping services running remains a major challenge.

    Risks and penalties

    Organisations that fail to comply with PSD2 regulations can face significant penalties.

    Additionally, the overlapping requirements of PSD2 and other regulations, such as the General Data Protection Regulation (GDPR), can create confusion. 

    Banks need clear agreements with TPPs about who’s responsible when things go wrong. This includes handling data breaches, preventing data misuse and protecting customer information. 

    Increased competition 

    Introducing new players in the financial ecosystem, such as AISPs and PISPs, creates competition. Banks must adapt their services to stay competitive while managing compliance costs.

    PSD2 aims to protect customers but the stronger authentication requirements can make banking less convenient. Banks must balance security with user experience. Focused time, effort and continuous monitoring are needed for businesses to stay compliant and competitive.

    How Matomo can help 

    Matomo gives banks and financial institutions complete control over their data through privacy-focused web analytics, keeping collected information in-house rather than letting it be used for marketing or other purposes.

    Its advanced security setup includes access controls, audit logs, SSL encryption, single sign-on and two-factor authentication. This creates a secure environment where sensitive data remains accessible only to authorised staff.

    While prioritising privacy, Matomo provides tools to understand user flow and customer segments, such as session recordings, heatmaps and A/B testing.

    Financial institutions particularly benefit from several key features:

    • Tools for obtaining explicit consent before processing personal data, such as the Do Not Track preference shown below
    • Insights into how financial institutions integrate TPPs (including API usage, user engagement and potential authentication drop-off points)
    • Tracking of failed login attempts or unusual access patterns
    • IP anonymisation to analyse traffic patterns and detect potential fraud

    Matomo’s Do Not Track preference selection screen

    PSD3: The next step

    In recent years, we have seen the rise of innovative payment companies and increasingly clever fraud schemes. This has prompted regulators to propose updates to payment rules.

    PSD3’s scope is to adapt payment rules to the ongoing digital transformation and to better handle these fraud risks. The proposed measures:

    • Encourage PSPs to share fraud-related information.
    • Make customers aware of the different types of fraud.
    • Strengthen customer authentication standards.
    • Provide non-bank PSPs restricted access to EU payment systems. 
    • Enact payment rules in a directly applicable regulation and harmonise and enforce the directive.

    Web analytics that respect user privacy 

    Achieving compliance with PSD2 may be a long road for some businesses. With Matomo, organisations can enjoy peace of mind knowing their data practices align with legal requirements.

    Ready to stop worrying over compliance with regulations like PSD2 and take control of your data ? Start your 21-day free trial with Matomo.

  • Assembling frames into a video in Node.js using fluent-ffmpeg [closed]

    14 November 2024, by Andrei

    I have the original video and its modified frames in a folder. Now I want to assemble the modified frames into a new video, take the audio from the original video, and add it to the new video.

    My latest code, using fluent-ffmpeg, is below, but it’s not working:
// Assumes an ES module context (the snippet uses top-level await);
// processedFramesDir, videoPath and outputVideoPath are defined elsewhere.
import ffmpeg from "fluent-ffmpeg";
import path from "node:path";

await new Promise((resolve, reject) => {
  ffmpeg(path.join(processedFramesDir, "frame-%d.png"))
    .inputOptions(["-framerate 30", "-start_number 1"])
    .input(videoPath)
    .outputOptions([
      "-c:v",
      "libx264",
      "-c:a",
      "copy",
      "-shortest",
      "-r",
      "30",
      "-pix_fmt",
      "yuv420p",
    ])
    .outputOptions(["-map 0:v:0", "-map 1:a:0"])
    .save(outputVideoPath)
    .on("start", (commandLine) => {
      console.log("FFmpeg command: " + commandLine);
    })
    .on("stderr", (stderrLine) => {
      console.log("FFmpeg stderr: " + stderrLine);
    })
    .on("end", resolve)
    .on("error", (err, stdout, stderr) => {
      console.error("Error assembling video:", err);
      console.error("FFmpeg stderr:", stderr);
      reject(err);
    });
});


    


    I get this error even though the frames exist and they are valid PNGs:

FFmpeg command: ffmpeg -framerate 30 -start_number 1 -i /tmp/video-processing-CyULOE/processedFrames/frame-%05d.png -i /tmp/video-processing-CyULOE/input.mp4 -y -c:v libx264 -c:a copy -shortest -r 30 -pix_fmt yuv420p -map 0:v:0 -map 1:a:0 /tmp/video-processing-CyULOE/output.mp4
ffmpeg version 4.1.11-0+deb10u1 Copyright (c) 2000-2023 the FFmpeg developers
  built with gcc 8 (Debian 8.3.0-6)
  configuration: --prefix=/usr --extra-version=0+deb10u1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
  libavutil      56. 22.100 / 56. 22.100
  libavcodec     58. 35.100 / 58. 35.100
  libavformat    58. 20.100 / 58. 20.100
  libavdevice    58.  5.100 / 58.  5.100
  libavfilter     7. 40.101 /  7. 40.101
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  3.100 /  5.  3.100
  libswresample   3.  3.100 /  3.  3.100
  libpostproc    55.  3.100 / 55.  3.100
[png @ 0x5630fb89a880] Invalid PNG signature 0x52494646F8A20000.
[png @ 0x5630fb89a880] Invalid PNG signature 0x52494646C89F0000.
[png @ 0x5630fb89a880] Invalid PNG signature 0x524946464EA10000.
[png @ 0x5630fb89a880] Invalid PNG signature 0x5249464628A40000.
[png @ 0x5630fb89a880] Invalid PNG signature 0x52494646AAA30000.
[png @ 0x5630fb89a880] Invalid PNG signature 0x524946467CA10000.
[png @ 0x5630fb89a880] Invalid PNG signature 0x52494646AAA30000.
[png @ 0x5630fb89a880] Invalid PNG signature 0x52494646BAA10000.
[png @ 0x5630fb89a880] Invalid PNG signature 0x52494646ACA30000.
[png @ 0x5630fb89a880] Invalid PNG signature 0x5249464670A80000.
[png @ 0x5630fb89a880] Invalid PNG signature 0x52494646E8A60000.
[png @ 0x5630fb89a880] Invalid PNG signature 0x524946469AA60000.
[png @ 0x5630fb89a880] Invalid PNG signature 0x5249464672A90000.
[png @ 0x5630fb89a880] Invalid PNG signature 0x52494646B8A50000.
[png @ 0x5630fb89a880] Invalid PNG signature 0x52494646AAA80000.
[png @ 0x5630fb89a880] Invalid PNG signature 0x52494646A0A70000.
    Last message repeated 1 times
[png @ 0x5630fb89a880] Invalid PNG signature 0x5249464646A80000.
[image2 @ 0x5630fb898a40] decoding for stream 0 failed
[image2 @ 0x5630fb898a40] Could not find codec parameters for stream 0 (Video: png, none(pc)): unspecified size
Consider increasing the value for the 'analyzeduration' and 'probesize' options
Input #0, image2, from '/tmp/video-processing-CyULOE/processedFrames/frame-%05d.png':
  Duration: 00:00:00.60, start: 0.000000, bitrate: N/A
    Stream #0:0: Video: png, none(pc), 30 fps, 30 tbr, 30 tbn, 30 tbc
Input #1, mov,mp4,m4a,3gp,3g2,mj2, from '/tmp/video-processing-CyULOE/input.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf60.16.100
  Duration: 00:00:00.62, start: 0.000000, bitrate: 4289 kb/s
    Stream #1:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 4395 kb/s, 30 fps, 30 tbr, 15360 tbn, 60 tbc (default)
    Metadata:
      handler_name    : VideoHandler
    Stream #1:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 2 kb/s (default)
    Metadata:
      handler_name    : SoundHandler
Stream mapping:
  Stream #0:0 -> #0:0 (png (native) -> h264 (libx264))
  Stream #1:1 -> #0:1 (copy)
Press [q] to stop, [?] for help
[image2 @ 0x5630fb898a40] Thread message queue blocking; consider raising the thread_queue_size option (current value: 8)
[png @ 0x5630fb9b7700] Invalid PNG signature 0x52494646F8A20000.
[png @ 0x5630fb988dc0] Invalid PNG signature 0x52494646C89F0000.
[png @ 0x5630fba3edc0] Invalid PNG signature 0x524946464EA10000.
[png @ 0x5630fba40600] Invalid PNG signature 0x5249464628A40000.
Error while decoding stream #0:0: Invalid data found when processing input
[png @ 0x5630fb8c7900] Invalid PNG signature 0x52494646AAA30000.
Error while decoding stream #0:0: Invalid data found when processing input
[png @ 0x5630fb9b7700] Invalid PNG signature 0x524946467CA10000.
Error while decoding stream #0:0: Invalid data found when processing input
[png @ 0x5630fb988dc0] Invalid PNG signature 0x52494646AAA30000.
Error while decoding stream #0:0: Invalid data found when processing input
[png @ 0x5630fba3edc0] Invalid PNG signature 0x52494646BAA10000.
[png @ 0x5630fba40600] Invalid PNG signature 0x52494646ACA30000.
Error while decoding stream #0:0: Invalid data found when processing input
    Last message repeated 1 times
[png @ 0x5630fb8c7900] Invalid PNG signature 0x5249464670A80000.
Error while decoding stream #0:0: Invalid data found when processing input
[png @ 0x5630fb9b7700] Invalid PNG signature 0x52494646E8A60000.
Error while decoding stream #0:0: Invalid data found when processing input
[png @ 0x5630fb988dc0] Invalid PNG signature 0x524946469AA60000.
Error while decoding stream #0:0: Invalid data found when processing input
[png @ 0x5630fba3edc0] Invalid PNG signature 0x5249464672A90000.
Error while decoding stream #0:0: Invalid data found when processing input
[png @ 0x5630fba40600] Invalid PNG signature 0x52494646B8A50000.
Error while decoding stream #0:0: Invalid data found when processing input
[png @ 0x5630fb8c7900] Invalid PNG signature 0x52494646AAA80000.
Error while decoding stream #0:0: Invalid data found when processing input
[png @ 0x5630fb9b7700] Invalid PNG signature 0x52494646A0A70000.
[png @ 0x5630fb988dc0] Invalid PNG signature 0x52494646A0A70000.
Error while decoding stream #0:0: Invalid data found when processing input
[png @ 0x5630fba3edc0] Invalid PNG signature 0x5249464646A80000.
Error while decoding stream #0:0: Invalid data found when processing input
    Last message repeated 4 times
Cannot determine format of input stream 0:0 after EOF
Error marking filters as finished
Conversion failed!

Error assembling video: Error: ffmpeg exited with code 1: Cannot determine format of input stream 0:0 after EOF
Error marking filters as finished
Conversion failed!

    at ChildProcess.<anonymous> (/usr/src/app/node_modules/fluent-ffmpeg/lib/processor.js:180:22)
    at ChildProcess.emit (node:events:519:28)
    at ChildProcess._handle.onexit (node:internal/child_process:294:12)


    The code used to extract the frames:


ffmpeg(videoPath)
  .screenshots({
    folder: framesDir,
    filename: "frame-%i.png",
    size: "?x1080",
    count: 10,
  })
  .on("end", resolve)
  .on("error", reject);


    The PNGs are valid and display fine in the Windows Photos viewer.
