
Media (91)
- 999,999
  26 September 2011
  Updated: September 2011
  Language: English
  Type: Audio
- The Slip - Artworks
  26 September 2011
  Updated: September 2011
  Language: English
  Type: Text
- Demon seed (wav version)
  26 September 2011
  Updated: April 2013
  Language: English
  Type: Audio
- The four of us are dying (wav version)
  26 September 2011
  Updated: April 2013
  Language: English
  Type: Audio
- Corona radiata (wav version)
  26 September 2011
  Updated: April 2013
  Language: English
  Type: Audio
- Lights in the sky (wav version)
  26 September 2011
  Updated: April 2013
  Language: English
  Type: Audio
Other articles (19)
- Keeping control of your media in your hands
  13 April 2011 – The vocabulary used on this site, and around MediaSPIP in general, aims to avoid reference to Web 2.0 and the companies that profit from media-sharing.
  While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
  MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
  MediaSPIP aims to be accessible to as many people as possible, and development is based on expanding the (...)
- Videos
  21 April 2011 – As with "audio" documents, MediaSPIP displays videos wherever possible using the HTML5 video tag.
  One drawback of this tag is that it is not recognised correctly by some browsers (Internet Explorer, to name one) and that each browser natively supports only certain video formats.
  Its main advantage is that video playback is handled natively by the browser, which removes the need for Flash and (...)
- Submit bugs and patches
  13 April 2011 – Unfortunately, software is never perfect.
  If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including the exact version; as precise an explanation of the problem as possible; if possible, the steps that led to the problem; and a link to the site/page in question.
  If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
  You may also (...)
On other sites (1800)
- Overcoming Fintech and Finserv’s Biggest Data Analytics Challenges
Data powers innovation in financial technology (fintech), from personalized banking services to advanced fraud detection systems. Industry leaders recognize the value of strong security measures and customer privacy. A recent survey highlights this focus, with 72% of finance Chief Risk Officers identifying cybersecurity as their primary concern.
Beyond cybersecurity, fintech and financial services (finserv) companies are bogged down with massive amounts of data spread across disconnected systems. Between this, a complex regulatory landscape, and an increasingly tech-savvy and sceptical consumer base, fintech and finserv companies have a lot on their plates.
How can marketing teams get the information they need while staying focused on compliance and providing customer value?
This article will examine strategies to address common challenges in the finserv and fintech industries. We’ll focus on using appropriate tools, following effective data management practices, and learning from traditional banks’ approaches to similar issues.
What are the biggest fintech data analytics challenges, and how do they intersect with traditional banking?
Recent years have been tough for the fintech industry, especially after the pandemic. This period has brought new hurdles in data analysis and made existing ones more complex. As the market stabilises, both fintech and finserv companies must tackle these evolving data issues.
Let’s examine some of the most significant data analytics challenges facing the fintech industry, starting with an issue that’s prevalent across the financial sector:
1. Battling data silos
In a recent survey by InterSystems, 54% of financial institution leaders said data silos are their biggest barrier to innovation, while 62% said removing silos is their priority data strategy for the next year.
Data silos segregate data repositories across departments, products and other divisions. This is a major issue in traditional banking and something fintech companies should avoid inheriting at all costs.
Siloed data makes it harder for decision-makers to view business performance with 360-degree clarity. It’s also expensive to maintain and operationalise and can evolve into privacy and data compliance issues if left unchecked.
To avoid or remove data silos, develop a data governance framework and centralise your data repositories. Next, simplify your analytics stack into as few integrated tools as possible because complex tech stacks are one of the leading causes of data silos.
Use an analytics system like Matomo that incorporates web analytics, marketing attribution and CRO testing into one toolkit.
Matomo’s support plans help you implement a data system to meet the unique needs of your business and avoid issues like data silos. We also offer data warehouse exporting as a feature to bring all of your web analytics, customer data, support data, etc., into one centralised location.
Try Matomo for free today, or contact our sales team to discuss support plans.
2. Compliance with laws and regulations
A survey by Alloy reveals that 93% of fintech companies find it difficult to meet compliance regulations. The cost of staying compliant tops their list of worries (23%), outranking even the financial hit from fraud (21%) – and this in a year marked by cyber threats.
Data privacy laws are constantly changing, and the landscape varies across global regions, making adherence even more challenging for fintechs and traditional banks operating in multiple markets.
In the US market, companies grapple with regulations at both federal and state levels. Here are some of the state-level laws coming into effect between 2024 and 2026:
- Oregon – Oregon Consumer Privacy Act (July 1, 2024)
- Florida – Florida Digital Bill of Rights (July 1, 2024)
- Texas – Texas Data Privacy and Security Act (July 1, 2024)
- Montana – Montana Consumer Data Privacy Act (October 1, 2024)
- Delaware – Delaware Personal Privacy Act (January 1, 2025)
- Iowa – Iowa Consumer Data Protection Act (January 1, 2025)
- New Jersey – SB 332 (January 15, 2025)
- Tennessee – Tennessee Information Protection Act (July 1, 2025)
- Indiana – Indiana Consumer Data Protection Act (January 1, 2026)
Other countries are also ramping up regional regulations. For instance, Canada has Quebec’s Act Respecting the Protection of Personal Information in the Private Sector and British Columbia’s Personal Information Protection Act (BC PIPA).
Ignorance of country- or region-specific laws will not stop companies from suffering the consequences of violating them.
The only answer is to invest in adherence and manage business growth accordingly. Ultimately, compliance is more affordable than non-compliance – not only in terms of the potential fines but also the potential risks to reputation, consumer trust and customer loyalty.
This is an expensive lesson that fintech and traditional financial companies have had to learn together. GDPR regulators hit CaixaBank S.A., one of Spain’s largest banks, with multiple multi-million-euro fines, and fined Klarna Bank AB, a popular Swedish fintech company, €720,000.
To avoid similar fates, companies should:
- Build solid data systems
- Hire compliance experts
- Train their teams thoroughly
- Choose data analytics tools carefully
Remember, even popular tools like Google Analytics aren’t automatically safe. Find out how Matomo helps you gather useful insights while sticking to rules like GDPR.
3. Protecting against data security threats
Cyber threats are increasing in volume and sophistication, with the financial sector becoming the most breached in 2023.
The cybersecurity risks will only worsen, with the WEF estimating annual cybercrime costs of up to US$10.5 trillion globally by 2025, up from US$3 trillion in 2015.
While technology brings new security solutions, it also amplifies existing risks and creates new ones. A 2024 McKinsey report warns that the risk of data breaches will continue to increase as the financial industry relies ever more heavily on third-party data tools and cloud computing services, unless firms simultaneously improve their security posture.
The reality is that adopting a third-party data system without taking the proper precautions means adopting its security vulnerabilities.
In 2023, the MOVEit data breach affected companies worldwide, including financial institutions using its file transfer system. One hack created a global data crisis, potentially affecting the customer data of every company using this one software product.
The McKinsey report emphasises choosing tools wisely. Why? Because when customer data is compromised, it’s your company that takes the heat, not the tool provider. As the report states:
“Companies need reliable, insightful metrics and reporting (such as security compliance, risk metrics and vulnerability tracking) to prove to regulators the health of their security capabilities and to manage those capabilities.”
Don’t put user or customer data in the hands of companies you can’t trust. Work with providers that care about security as much as you do. With Matomo, you own all of your data, ensuring it’s never used for unknown purposes.
4. Protecting users’ privacy
With security threats increasing, fintech companies and traditional banks must prioritise user privacy protection. Users are also increasingly aware of privacy threats and ready to walk away from companies that lose their trust.
Cisco’s 2023 Data Privacy Benchmark Study reveals some eye-opening statistics:
- 94% of companies said their customers wouldn’t buy from them if their data wasn’t protected, and
- 95% see privacy as a business necessity, not just a legal requirement.
Modern financial companies must balance data collection and management with increasing privacy demands. This may sound contradictory for companies reliant on dated practices like third-party cookies, but they need to learn to thrive in a cookieless web as customers move to banks and service providers that have strong data ethics.
This privacy protection journey starts with implementing web analytics ethically from the very first session.
The most important elements of ethically sound web analytics in fintech are (a minimal example follows the list):
- 100% data ownership: Make sure your data isn’t used in other ways by the tools that collect it.
- Respecting user privacy: Only collect the data you absolutely need to do your job and avoid personally identifiable information.
- Regulatory compliance: Stick with solutions built for compliance to stay out of legal trouble.
- Data transparency: Know how your tools use your data and let your customers know how you use it.
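As an illustration, a privacy-minded Matomo tracking snippet could look like the sketch below. This is a minimal sketch only: the instance URL and site ID are placeholders, and which settings you actually enable depends on your own compliance requirements.

var _paq = window._paq = window._paq || [];
// Privacy-related settings go before trackPageView.
_paq.push(['setDoNotTrack', true]);   // honour the browser's DoNotTrack signal
_paq.push(['disableCookies']);        // cookieless tracking
_paq.push(['trackPageView']);
_paq.push(['enableLinkTracking']);
(function () {
  var u = 'https://analytics.example.com/';   // placeholder: your self-hosted Matomo instance
  _paq.push(['setTrackerUrl', u + 'matomo.php']);
  _paq.push(['setSiteId', '1']);              // placeholder site ID
  var d = document, g = d.createElement('script'), s = d.getElementsByTagName('script')[0];
  g.async = true; g.src = u + 'matomo.js'; s.parentNode.insertBefore(g, s);
})();
// IP anonymisation and data retention periods are configured server-side in Matomo's privacy settings.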
Read our guide to ethical web analytics for more information.
5. Comparing customer trust across industries
While fintech companies are making waves in the financial world, they’re still playing catch-up when it comes to earning customer trust. According to RFI Global, fintech has a consumer trust score of 5.8/10 in 2024, while traditional banking scores 7.6/10.
This trust gap isn’t just about perception – it’s rooted in real issues:
- Security breaches are making headlines more often.
- Privacy regulations like GDPR are making consumers more aware of their rights.
- Some fintech companies are struggling to handle fraud effectively.
According to the UK’s Payment Systems Regulator, digital banking brands Monzo and Starling had some of the highest fraudulent activity rates in 2022. Yet, Monzo only reimbursed 6% of customers who reported suspicious transactions, compared to 70% for NatWest and 91% for Nationwide.
So, what can fintech firms do to close this trust gap?
- Start with privacy-centric analytics from day one. This shows customers you value their privacy from the get-go.
- Build and maintain a long-term reputation free of data leaks and privacy issues. One major breach can undo years of trust-building.
- Learn from traditional banks when it comes to handling issues like fraudulent transactions, identity theft, and data breaches. Prompt, customer-friendly resolutions go a long way.
- Remember: cutting-edge financial technology doesn’t make up for poor customer care. If your digital bank won’t refund customers who’ve fallen victim to credit card fraud, they’ll likely switch to a traditional bank that will.
The fintech sector has made strides in innovation, but there’s still work to do in establishing trustworthiness. By focusing on robust security, transparent practices, and excellent customer service, fintech companies can bridge the trust gap and compete more effectively with traditional banks.
6. Collecting quality data
Adhering to data privacy regulations, protecting user data and implementing ethical analytics raise another challenge. How can companies do all of these things and still collect reliable, quality data?
Google’s answer is using predictive models, but this replaces real data with calculations and guesswork. The worst part is that Google Analytics doesn’t even let you use all of the data you collect in the first place. Instead, it uses something called data sampling once you pass certain thresholds.
In practice, this means that Google Analytics uses a limited set of your data to calculate reports. We’ve discussed GA4 data sampling at length before, but there are two key problems for companies here:
- A sample size that’s too small won’t give you a full representation of your data.
- The more visitors that come to your site, the less accurate your reports will become.
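As a toy illustration of the first problem (all figures below are invented), extrapolating from a small sample can noticeably distort a report:

// Toy example: a report built from a 5% sample of sessions.
// All numbers are invented; the point is the error introduced by extrapolation.
const totalSessions = 500000;   // real traffic for the period
const trueConversions = 4150;   // what full, unsampled data would show

const sampleRate = 0.05;
const sampledSessions = Math.round(totalSessions * sampleRate);  // 25,000 sessions inspected

// Suppose the sample under-represents a high-converting campaign
// and only contains 190 converting sessions.
const sampledConversions = 190;

// The report extrapolates the sample back to 100% of traffic.
const reportedConversions = Math.round(sampledConversions / sampleRate);

console.log(`Sampled ${sampledSessions} of ${totalSessions} sessions`);
console.log(`Reported conversions: ${reportedConversions}, actual: ${trueConversions}`);
// Reported conversions: 3800, actual: 4150 -- roughly an 8% undercount from sampling alone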
For high-growth companies, data sampling simply can’t keep up. Financial marketers widely recognise the shortcomings of big tech analytics providers. In fact, 80% of them say they’re concerned about data bias from major providers like Google and Meta affecting valuable insights.
This is precisely why CRO:NYX Digital approached us after discovering Google Analytics wasn’t providing accurate campaign data. We set up an analytics system to suit the company’s needs and tested it alongside Google Analytics for multiple campaigns. In one instance, Google Analytics failed to register 6,837 users in a single day, approximately 9.8% of the total tracked by Matomo.
In another instance, Google Analytics only tracked 600 visitors over 24 hours, while Matomo recorded nearly 71,000 visitors – an 11,700% discrepancy.
Financial companies need a more reliable, privacy-centric alternative to Google Analytics that captures quality data without putting users at potential risk. This is why we built Matomo and why our customers love having total control and visibility of their data.
Unlock the full power of fintech data analytics with Matomo
Fintech companies face many data-related challenges, so compliant web analytics shouldn’t be one of them.
With Matomo, you get:
- An all-in-one solution that handles traditional web analytics, behavioural analytics and more with strong integrations to minimise the likelihood of data siloing
- Full compliance with GDPR, CCPA, PIPL and more
- Complete ownership of your data to minimise cybersecurity risks caused by negligent third parties
- An abundance of ways to protect customer privacy, like IP address anonymisation and respect for DoNotTrack settings
- The ability to import data from Google Analytics and distance yourself from big tech
- High-quality data that doesn’t rely on sampling
- A tool built with financial analytics in mind
Don’t let big tech companies limit the power of your data with sketchy privacy policies and counterintuitive systems like data sampling.
Start your Matomo free trial or request a demo to unlock the full power of fintech data analytics without putting your customers’ personal information at unnecessary risk.
- Stream sent via FFMPEG (NodeJS) to RTMP (YouTube) not being received
10 December 2024, by Qumber – I am writing a very basic Chrome extension that captures a video stream and sends it to a Node.js server, which in turn sends it to the YouTube Live server.


Here is my implementation of the backend, which receives data via WebRTC and sends it to YouTube using FFmpeg:


const express = require('express');
const cors = require('cors');
const { RTCPeerConnection, RTCSessionDescription } = require('@roamhq/wrtc');
const { spawn } = require('child_process');

const app = express();
app.use(express.json());
app.use(cors());

app.post('/webrtc', async (req, res) => {
 const peerConnection = new RTCPeerConnection();

 // Start ffmpeg process for streaming
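 // ffmpeg reads FLV from stdin (pipe:0), re-encodes the video with libx264 and pushes it to YouTube's RTMP ingest URL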
 const ffmpeg = spawn('ffmpeg', [
 '-f', 'flv',
 '-i', 'pipe:0',
 '-c:v', 'libx264',
 '-preset', 'veryfast',
 '-maxrate', '3000k',
 '-bufsize', '6000k',
 '-pix_fmt', 'yuv420p',
 '-g', '50',
 '-f', 'flv',
 'rtmp://a.rtmp.youtube.com/live2/MY_KEY'
 ]);

 ffmpeg.on('error', (err) => {
 console.error('FFmpeg error:', err);
 });

 ffmpeg.stderr.on('data', (data) => {
 console.error('FFmpeg stderr:', data.toString());
 });

 ffmpeg.stdout.on('data', (data) => {
 console.log('FFmpeg stdout:', data.toString());
 });

 // Handle incoming tracks
 peerConnection.ontrack = (event) => {
 console.log('Track received:', event.track.kind);
 const track = event.track;

 // Stream the incoming track to FFmpeg
 track.onunmute = () => {
 console.log('Track unmuted:', track.kind);
 const reader = track.createReadStream();
 reader.on('data', (chunk) => {
 console.log('Forwarding chunk to FFmpeg:', chunk.length);
 ffmpeg.stdin.write(chunk);
 });
 reader.on('end', () => {
 console.log('Stream ended');
 ffmpeg.stdin.end();
 });
 };

 track.onmute = () => {
 console.log('Track muted:', track.kind);
 };
 };

 // Set the remote description (offer) received from the client
 await peerConnection.setRemoteDescription(new RTCSessionDescription(req.body.sdp));

 // Create an answer and send it back to the client
 const answer = await peerConnection.createAnswer();
 await peerConnection.setLocalDescription(answer);

 res.json({ sdp: peerConnection.localDescription });
});

app.listen(3000, () => {
 console.log('WebRTC to RTMP server running on port 3000');
});




This is the output I get, but nothing gets sent to YouTube:




FFmpeg stderr: ffmpeg version 7.0.2 Copyright (c) 2000-2024 the FFmpeg developers
 built with Apple clang version 15.0.0 (clang-1500.3.9.4)

FFmpeg stderr: configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/7.0.2_1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags='-Wl,-ld_classic' --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libaribb24 --enable-libbluray --enable-libdav1d --enable-libharfbuzz --enable-libjxl --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-audiotoolbox --enable-neon

FFmpeg stderr: libavutil 59. 8.100 / 59. 8.100
 libavcodec 61. 3.100 / 61. 3.100
 libavformat 61. 1.100 / 61. 1.100
 libavdevice 61. 1.100 / 61. 1.100

FFmpeg stderr: libavfilter 10. 1.100 / 10. 1.100
 libswscale 8. 1.100 / 8. 1.100
 libswresample 5. 1.100 / 5. 1.100
 libpostproc 58. 1.100 / 58. 1.100





I do not understand what I am doing wrong. Any help would be appreciated.



Optionally, here’s the frontend code from the extension, which (to me) appears to be recording and sending the capture:


popup.js & popup.html




document.addEventListener('DOMContentLoaded', () => {
 document.getElementById('openCapturePage').addEventListener('click', () => {
 chrome.tabs.create({
 url: chrome.runtime.getURL('capture.html')
 });
 });
});






 
[popup.html: page titled "StreamSavvy", loads popup.js]

capture.js & capture.html




let peerConnection;

async function startStreaming() {
 try {
 const stream = await navigator.mediaDevices.getDisplayMedia({
 video: {
 cursor: "always"
 },
 audio: false
 });

 peerConnection = new RTCPeerConnection({
 iceServers: [{
 urls: 'stun:stun.l.google.com:19302'
 }]
 });

 stream.getTracks().forEach(track => peerConnection.addTrack(track, stream));

 const offer = await peerConnection.createOffer();
 await peerConnection.setLocalDescription(offer);

 const response = await fetch('http://localhost:3000/webrtc', {
 method: 'POST',
 headers: {
 'Content-Type': 'application/json'
 },
 body: JSON.stringify({
 sdp: peerConnection.localDescription
 })
 });

 const {
 sdp
 } = await response.json();
 await peerConnection.setRemoteDescription(new RTCSessionDescription(sdp));

 console.log("Streaming to server via WebRTC...");
 } catch (error) {
 console.error("Error starting streaming:", error.name, error.message);
 }
}

async function stopStreaming() {
 if (peerConnection) {
 // Stop all media tracks
 peerConnection.getSenders().forEach(sender => {
 if (sender.track) {
 sender.track.stop();
 }
 });

 // Close the peer connection
 peerConnection.close();
 peerConnection = null;
 console.log("Streaming stopped");
 }
}

document.addEventListener('DOMContentLoaded', () => {
 document.getElementById('startCapture').addEventListener('click', startStreaming);
 document.getElementById('stopCapture').addEventListener('click', stopStreaming);
});






 
[capture.html: page titled "StreamSavvy Capture", loads capture.js]


background.js (service worker)




chrome.runtime.onInstalled.addListener(() => {
 console.log("StreamSavvy Extension Installed");
});

chrome.runtime.onMessage.addListener((message, sender, sendResponse) => {
 if (message.type === 'startStreaming') {
 chrome.tabs.create({
 url: chrome.runtime.getURL('capture.html')
 });
 sendResponse({
 status: 'streaming'
 });
 } else if (message.type === 'stopStreaming') {
 chrome.tabs.query({
 url: chrome.runtime.getURL('capture.html')
 }, (tabs) => {
 if (tabs.length > 0) {
 chrome.tabs.sendMessage(tabs[0].id, {
 type: 'stopStreaming'
 });
 sendResponse({
 status: 'stopped'
 });
 }
 });
 }
 return true; // Keep the message channel open for sendResponse
});







- Error Opening RTMP Stream through FFmpeg command when executed through exec package [closed]
3 October 2024, by Akhil – I have been trying to transcode the live stream from an RTMP server running on rtmp://localhost:1936/live/test with FFmpeg in a Go application using the os/exec package, but it doesn’t seem to work and gives an input/output error (attached below). The exact same ffmpeg command works as it’s supposed to when I execute it in a terminal. I’m not sure why that is; here is my code for reproducing and analysing the problem.

ffmpegCmd := fmt.Sprintf("ffmpeg -nostdin -i rtmp://localhost:1936/live/%s -c:v libx264 -s %s -f %s %s/stream.mpd",
 streamKey, resolution, sp.OutputFormat, outputPath)
 log.Printf("Executing FFmpeg command: %s", ffmpegCmd)

 // Prepare the command execution with a timeout context
 ctx, cancel := context.WithTimeout(context.Background(), 60*time.Second) // Set a 60-second timeout
 defer cancel()

 cmd := exec.CommandContext(ctx, "bash", "-c", ffmpegCmd)



The ffmpeg command looks like this:

ffmpeg -nostdin -i rtmp://localhost:1936/live/test -c:v libx264 -s 1920x1080 -f dash output/test/1080p/stream.mpd


I get the following error:


Error opening input: Input/output error

Error opening input file rtmp://localhost:1936/live/test.

Error opening input files: Input/output error

Exiting normally, received signal 2.

signal: interrupt



I have already tried breaking the command into separate arguments and then executing it, something like:


cmd := exec.CommandContext(ctx,
 "ffmpeg",
 "-nostdin",
 "-i", "rtmp://localhost:1936/live/"+streamKey,
 "-c:v", "libx264",
 "-s", resolution,
 "-f", sp.OutputFormat,
 outputPath+"/stream.mpd")



After running the ffmpeg command with -loglevel debug and -report:


Here are the logs and errors I get:


When I run it within the Go application:


ffmpeg started on 2024-10-02 at 12:00:06
Report written to "ffmpeg-20241002-120006.log"
Log level: 48
Command line:
ffmpeg -loglevel debug -report -i rtmp://localhost:1936/live/test -c:v libx264 -s 1920x1080 -f dash ./output/test/1080p/stream.mpd
ffmpeg version 7.0.2 Copyright (c) 2000-2024 the FFmpeg developers
 built with Apple clang version 15.0.0 (clang-1500.3.9.4)
 configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/7.0.2_1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags='-Wl,-ld_classic' --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libaribb24 --enable-libbluray --enable-libdav1d --enable-libharfbuzz --enable-libjxl --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-audiotoolbox --enable-neon
 libavutil 59. 8.100 / 59. 8.100
 libavcodec 61. 3.100 / 61. 3.100
 libavformat 61. 1.100 / 61. 1.100
 libavdevice 61. 1.100 / 61. 1.100
 libavfilter 10. 1.100 / 10. 1.100
 libswscale 8. 1.100 / 8. 1.100
 libswresample 5. 1.100 / 5. 1.100
 libpostproc 58. 1.100 / 58. 1.100
Splitting the commandline.
Reading option '-loglevel' ... matched as option 'loglevel' (set logging level) with argument 'debug'.
Reading option '-report' ... matched as option 'report' (generate a report) with argument '1'.
Reading option '-i' ... matched as input url with argument 'rtmp://localhost:1936/live/test'.
Reading option '-c:v' ... matched as option 'c' (select encoder/decoder ('copy' to copy stream without reencoding)) with argument 'libx264'.
Reading option '-s' ... matched as option 's' (set frame size (WxH or abbreviation)) with argument '1920x1080'.
Reading option '-f' ... matched as option 'f' (force container format (auto-detected otherwise)) with argument 'dash'.
Reading option './output/test/1080p/stream.mpd' ... matched as output url.
Finished splitting the commandline.
Parsing a group of options: global .
Applying option loglevel (set logging level) with argument debug.
Applying option report (generate a report) with argument 1.
Successfully parsed a group of options.
Parsing a group of options: input url rtmp://localhost:1936/live/test.
Successfully parsed a group of options.
Opening an input file: rtmp://localhost:1936/live/test.
[AVFormatContext @ 0x13f721f90] Opening 'rtmp://localhost:1936/live/test' for reading
[rtmp @ 0x13f6040e0] No default whitelist set
[tcp @ 0x13f7223d0] No default whitelist set
[tcp @ 0x13f7223d0] Original list of addresses:
[tcp @ 0x13f7223d0] Address ::1 port 1936
[tcp @ 0x13f7223d0] Address 127.0.0.1 port 1936
[tcp @ 0x13f7223d0] Interleaved list of addresses:
[tcp @ 0x13f7223d0] Address ::1 port 1936
[tcp @ 0x13f7223d0] Address 127.0.0.1 port 1936
[tcp @ 0x13f7223d0] Starting connection attempt to ::1 port 1936
[tcp @ 0x13f7223d0] Connection attempt to ::1 port 1936 failed: Connection refused
[tcp @ 0x13f7223d0] Starting connection attempt to 127.0.0.1 port 1936
[tcp @ 0x13f7223d0] Successfully connected to 127.0.0.1 port 1936
[rtmp @ 0x13f6040e0] Handshaking...
[rtmp @ 0x13f6040e0] Type answer 3
[rtmp @ 0x13f6040e0] Server version 13.14.10.13
[rtmp @ 0x13f6040e0] Proto = rtmp, path = /live/test, app = live, fname = test
[rtmp @ 0x13f6040e0] Window acknowledgement size = 5000000
[rtmp @ 0x13f6040e0] Max sent, unacked = 5000000
[rtmp @ 0x13f6040e0] New incoming chunk size = 4096
[rtmp @ 0x13f6040e0] Creating stream...
[rtmp @ 0x13f6040e0] Sending play command for 'test'
[rtmp @ 0x13f6040e0] Deleting stream...
[in#0 @ 0x13f721d40] Error opening input: Input/output error
Error opening input file rtmp://localhost:1936/live/test.
Error opening input files: Input/output error
Exiting normally, received signal 2.



This is what I get when I run the same command in the terminal:


(same as above, but please scroll further)

[rtmp @ 0x1437144c0] No default whitelist set
[tcp @ 0x143604f20] No default whitelist set
[tcp @ 0x143604f20] Original list of addresses:
[tcp @ 0x143604f20] Address ::1 port 1936
[tcp @ 0x143604f20] Address 127.0.0.1 port 1936
[tcp @ 0x143604f20] Interleaved list of addresses:
[tcp @ 0x143604f20] Address ::1 port 1936
[tcp @ 0x143604f20] Address 127.0.0.1 port 1936
[tcp @ 0x143604f20] Starting connection attempt to ::1 port 1936
[tcp @ 0x143604f20] Connection attempt to ::1 port 1936 failed: Connection refused
[tcp @ 0x143604f20] Starting connection attempt to 127.0.0.1 port 1936
[tcp @ 0x143604f20] Successfully connected to 127.0.0.1 port 1936
[rtmp @ 0x1437144c0] Handshaking...
[rtmp @ 0x1437144c0] Type answer 3
[rtmp @ 0x1437144c0] Server version 13.14.10.13
[rtmp @ 0x1437144c0] Proto = rtmp, path = /live/test, app = live, fname = test
[rtmp @ 0x1437144c0] Window acknowledgement size = 5000000
[rtmp @ 0x1437144c0] Max sent, unacked = 5000000
[rtmp @ 0x1437144c0] New incoming chunk size = 4096
[rtmp @ 0x1437144c0] Creating stream...
[rtmp @ 0x1437144c0] Sending play command for 'test'
[flv @ 0x143604b30] Format flv probed with size=2048 and score=100
[flv @ 0x143604b30] Before avformat_find_stream_info() pos: 13 bytes read:2263 seeks:0 nb_streams:0
Transform tree:
 mdct_inv_float_c - type: mdct_float, len: 64, factors[2]: [2, any], flags: [unaligned, out_of_place, inv_only]
 fft32_ns_float_neon - type: fft_float, len: 32, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
 mdct_inv_float_c - type: mdct_float, len: 64, factors[2]: [2, any], flags: [unaligned, out_of_place, inv_only]
 fft32_ns_float_neon - type: fft_float, len: 32, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
 mdct_pfa_15xM_inv_float_c - type: mdct_float, len: 120, factors[2]: [15, any], flags: [unaligned, out_of_place, inv_only]
 fft4_fwd_float_neon - type: fft_float, len: 4, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
 mdct_inv_float_c - type: mdct_float, len: 128, factors[2]: [2, any], flags: [unaligned, out_of_place, inv_only]
 fft_sr_ns_float_neon - type: fft_float, len: 64, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
 mdct_pfa_15xM_inv_float_c - type: mdct_float, len: 480, factors[2]: [15, any], flags: [unaligned, out_of_place, inv_only]
 fft16_ns_float_neon - type: fft_float, len: 16, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
 mdct_inv_float_c - type: mdct_float, len: 512, factors[2]: [2, any], flags: [unaligned, out_of_place, inv_only]
 fft_sr_ns_float_neon - type: fft_float, len: 256, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
 mdct_pfa_15xM_inv_float_c - type: mdct_float, len: 960, factors[2]: [15, any], flags: [unaligned, out_of_place, inv_only]
 fft32_ns_float_neon - type: fft_float, len: 32, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
 mdct_inv_float_c - type: mdct_float, len: 1024, factors[2]: [2, any], flags: [unaligned, out_of_place, inv_only]
 fft_sr_ns_float_neon - type: fft_float, len: 512, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
 mdct_fwd_float_c - type: mdct_float, len: 1024, factors[2]: [2, any], flags: [unaligned, out_of_place, fwd_only]
 fft_sr_ns_float_neon - type: fft_float, len: 512, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
[NULL @ 0x144124920] nal_unit_type: 7(SPS), nal_ref_idc: 3
[NULL @ 0x144124920] Decoding VUI
[NULL @ 0x144124920] nal_unit_type: 8(PPS), nal_ref_idc: 3
[NULL @ 0x144124920] Decoding VUI
[h264 @ 0x144124920] nal_unit_type: 7(SPS), nal_ref_idc: 3
[h264 @ 0x144124920] Decoding VUI
[h264 @ 0x144124920] nal_unit_type: 8(PPS), nal_ref_idc: 3
[h264 @ 0x144124920] nal_unit_type: 7(SPS), nal_ref_idc: 3
[h264 @ 0x144124920] nal_unit_type: 8(PPS), nal_ref_idc: 3
[h264 @ 0x144124920] nal_unit_type: 5(IDR), nal_ref_idc: 3
[h264 @ 0x144124920] Decoding VUI
[h264 @ 0x144124920] Format yuv420p chosen by get_format().
[h264 @ 0x144124920] Reinit context to 1280x720, pix_fmt: yuv420p
[h264 @ 0x144124920] no picture 
[flv @ 0x143604b30] All info found
[flv @ 0x143604b30] rfps: 29.666667 0.016552
[flv @ 0x143604b30] rfps: 29.750000 0.009347
[flv @ 0x143604b30] rfps: 29.750000 0.009347
[flv @ 0x143604b30] rfps: 29.833333 0.004197
[flv @ 0x143604b30] rfps: 29.916667 0.001104
[flv @ 0x143604b30] rfps: 29.916667 0.001104
[flv @ 0x143604b30] rfps: 30.000000 0.000067
[flv @ 0x143604b30] rfps: 30.000000 0.000067
[flv @ 0x143604b30] rfps: 60.000000 0.000270
[flv @ 0x143604b30] rfps: 60.000000 0.000270
[flv @ 0x143604b30] rfps: 120.000000 0.001079
[flv @ 0x143604b30] rfps: 120.000000 0.001079
[flv @ 0x143604b30] rfps: 240.000000 0.004316
[flv @ 0x143604b30] rfps: 240.000000 0.004316
[flv @ 0x143604b30] rfps: 29.970030 0.000204
[flv @ 0x143604b30] rfps: 29.970030 0.000204
[flv @ 0x143604b30] rfps: 59.940060 0.000814
[flv @ 0x143604b30] rfps: 59.940060 0.000814
[flv @ 0x143604b30] After avformat_find_stream_info() pos: 496783 bytes read:496783 seeks:0 frames:179
Input #0, flv, from 'rtmp://localhost:1936/live/test':
 Metadata:
 |RtmpSampleAccess: true
 Server : NGINX RTMP (github.com/arut/nginx-rtmp-module)
 displayWidth : 1280
 displayHeight : 720
 fps : 30
 profile : 
 level : 
 Duration: 00:00:00.00, start: 6.742000, bitrate: N/A
 Stream #0:0, 138, 1/1000: Audio: aac (LC), 48000 Hz, stereo, fltp, 163 kb/s
 Stream #0:1, 41, 1/1000: Video: h264 (High), 1 reference frame, yuv420p(tv, bt709, progressive, left), 1280x720 [SAR 1:1 DAR 16:9], 0/1, 2560 kb/s, 30 fps, 30 tbr, 1k tbn
Successfully opened the file.
Parsing a group of options: output url ./output/test/1080p/stream.mpd.
Applying option c:v (select encoder/decoder ('copy' to copy stream without reencoding)) with argument libx264.
Applying option s (set frame size (WxH or abbreviation)) with argument 1920x1080.
Applying option f (force container format (auto-detected otherwise)) with argument dash.
Successfully parsed a group of options.
Opening an output file: ./output/test/1080p/stream.mpd.
[out#0/dash @ 0x123707480] No explicit maps, mapping streams automatically...
[vost#0:0/libx264 @ 0x123707d60] Created video stream from input stream 0:1
detected 10 logical cores
[h264 @ 0x123607b70] nal_unit_type: 7(SPS), nal_ref_idc: 3
[h264 @ 0x123607b70] Decoding VUI
[h264 @ 0x123607b70] nal_unit_type: 8(PPS), nal_ref_idc: 3
[aost#0:1/aac @ 0x144028080] Created audio stream from input stream 0:0
Transform tree:
 mdct_inv_float_c - type: md

(it simply starts working)


I am not sure if it has something to do with permissions.