Advanced search

Media (0)

Keyword: - Tags -/xmlrpc

No media matching your criteria is available on this site.

Other articles (88)

  • Customizing by adding your logo, banner or background image

    5 September 2013

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Organizing by category

    17 May 2013

    In MédiaSPIP, a section (rubrique) has two names: category and section.
    The various documents stored in MédiaSPIP can be filed in different categories. You can create a category by clicking on "publish a category" in the publish menu at the top right (after logging in). A category can also be filed inside another category, which means you can build a tree of categories.
    When a document is next published, the newly created category will be offered (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSPIP installation is at version 0.2 or later. If needed, contact the administrator of your MédiaSPIP to find out.

On other sites (6339)

  • asm SIMD sniffer

    1 August 2023, by Андрей Тернити

    There is x264. It uses a lot of x86 asm files, for example pixel-32.asm.
These files can use different SIMD instruction sets: MMX, 3DNow!, the SSE family, and others.

    


    I need a simple way to automatically analyze every file: I want to know which SIMD families are used in which files. How? (A rough scanning sketch follows at the end of this post.)

    


    I think every asm file should contain information about which SIMD family it uses (or that it uses no SIMD at all). Without this information it is a very bad idea to try to use these files...
I am angry: my x86 CPU supports only MMX and 3DNow!, but x264 tries to call SSE, so I sometimes get "Illegal instruction". I plan to make a patch for x264.

    


    P.S. If you can open issues in the official repo, let me know.

    


    P.P.S. This thread on Doom9 (mirror).
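
    For illustration only, here is a minimal Python sketch of such a scan. It assumes a hand-picked (and incomplete) table of representative mnemonics and x86inc macro names per SIMD family; a real tool would need a complete mnemonic table or a disassembler.

#!/usr/bin/env python3
# Rough sketch: guess which SIMD families an x86 .asm file uses by pattern
# matching. The marker table is an illustrative assumption, not a complete
# list of mnemonics, so results are only a first approximation.
import re
import sys
from pathlib import Path

FAMILY_MARKERS = {
    "MMX":    [r"\bmovq\b", r"\bpaddw\b", r"\binit_mmx\b"],
    "3DNow!": [r"\bpfadd\b", r"\bpfmul\b", r"\bfemms\b"],
    "SSE":    [r"\bmovaps\b", r"\bxorps\b", r"\binit_xmm\b"],
    "SSE2":   [r"\bmovdqa\b", r"\bpshufd\b"],
    "AVX":    [r"\bvmovaps\b", r"\bvpxor\b", r"\binit_ymm\b"],
}

def families_used(path):
    """Return the set of SIMD families whose markers appear in the file."""
    text = path.read_text(errors="replace").lower()
    return {fam for fam, pats in FAMILY_MARKERS.items()
            if any(re.search(p, text) for p in pats)}

if __name__ == "__main__":
    root = Path(sys.argv[1]) if len(sys.argv) > 1 else Path(".")
    for asm in sorted(root.rglob("*.asm")):
        fams = families_used(asm)
        print(f"{asm}: {', '.join(sorted(fams)) or 'no SIMD markers found'}")

    Run over x264's asm directories, this should at least flag the files that go beyond MMX and 3DNow!.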

    


  • How do I convert a .wav file to 16-bit 44.1 kHz using ffmpeg or another utility [closed]

    26 May 2023, by Seth Edwards

    A preface:
I am building an environment for my own streaming box. Since building the UI, I turned to the now-obsolete MSNTV box to find its UI sound effects.

    


    I found the dump on GitHub, downloaded it, and located where the sounds were stored.

    


    I listened to them one by one and noticed that they are WAV files, but they sound low quality and may have been compressed before being turned into WAV files.

    


    I was using the Apple Files app on an iPhone 6s running iOS 15.7.1.

    


    They play back fine.

    


    I tried importing them into GarageBand for iOS, and it gave me an error saying that it only allows 16-bit 44.1 kHz files. This confirmed my suspicion that they are low quality.

    


    I then tried playing them on a Dell Chromebook 3100 running ChromeOS. Chrome’s player would also not play the files.

    


    I need to find out how to convert them to 16-bit 44.1 kHz WAV files (a rough conversion sketch follows at the end of this post).

    


    My guess is that since the MSNTV had a small amount of storage space, they compressed the audio.

    


    I tried converting the files to MP3, and they are noticeably worse.

    


    Does anyone know how to convert these files so they can be played back normally?

    


    In the end, I plan to play these files using the pygame library.

    


    I have tried changing the metadata and converting to MP3.
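
    For illustration only, here is a minimal conversion sketch, assuming ffmpeg is installed and on PATH and using placeholder file names; it is wrapped in Python since the files are ultimately meant for pygame.

#!/usr/bin/env python3
# Minimal sketch: re-encode WAV files to 16-bit 44.1 kHz PCM with ffmpeg.
# Assumes ffmpeg is installed and on PATH; paths and names are placeholders.
import subprocess
import sys
from pathlib import Path

def convert(src, dst):
    """Re-encode src to 16-bit (pcm_s16le) 44.1 kHz WAV at dst."""
    subprocess.run(
        ["ffmpeg", "-y",        # overwrite the output if it already exists
         "-i", str(src),        # input file, whatever codec the WAV wraps
         "-c:a", "pcm_s16le",   # 16-bit signed little-endian PCM
         "-ar", "44100",        # 44.1 kHz sample rate
         str(dst)],
        check=True,
    )

if __name__ == "__main__":
    folder = Path(sys.argv[1]) if len(sys.argv) > 1 else Path(".")
    for wav in folder.glob("*.wav"):
        convert(wav, wav.with_name(wav.stem + "_44k16.wav"))

    The equivalent plain command is ffmpeg -i in.wav -c:a pcm_s16le -ar 44100 out.wav; the resulting files should import into GarageBand and play in pygame.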

    


  • Is there any open source solution to display a remote stream inside a Hololens2 UWP Vuforia application?

    19 April 2023, by T777

    What do we need?

    


    We are trying to develop an application for quality management in which we show a hologram on a metal part as an assistance marking (using HoloLens 2 + Vuforia + Model Targets). The employee uses a sensor to follow this assistance marking, and the data is analyzed live by a test device. The results are output on a screen / are visible in a closed-source application from the manufacturer of the test device.

    


    Capturing the video output:
The current plan is to capture the video stream of the test device via a capture card, add a video panel via MRTK2 inside the Vuforia app, and stream the captured video to the HoloLens 2 using OBS or an OpenCV Python script for screen recording (a rough sketch of the OpenCV route is included at the end of this post).

    


    What we have tried so far

    


    1) Sending a raw UDP stream
via RTMP, decoding and converting it with a GStreamer server, and writing our own library in Unity for receiving.
Result: temporarily stopped, because receiving the UDP streams needs connection/session management (signalling), frame syncing, and agreement on video size, color format, frame rate, etc., and we have no solution.
Our own implementation of any of this would be highly complex and consume a lot of time.

    


    2) Using available protocols that I can find on the web.
There are already some protocols developed for session creation and streaming:

    


      

    • HTTP streaming (HLS) (Transport + Session)
    • RTMP (Transport + Session)
    • RTP (Transport) + RTSP (Session)
    • WebRTC: possible with different protocol stacks, RTP/TCP/UDP (Transport) + SDP (a standardized format for video parameters) + ICE (p2p) / WHIP (HTTP, client-server) / WebSocket (client-server) as signaling protocols, and there are some good open-source streaming servers (GStreamer, MediaMTX and SRS)


    


    When using these, the video will typically be encoded with H.264 (x264) and needs to be decoded on the HoloLens 2. There are APIs for C/C++ native (hardware) decoding libraries, like unity-vlc and ffmpeg.NET, which need the ffmpeg media library. I could figure out (not tested) that there is a hardware H.264 decoder on the HoloLens 2, but I have no clue how to access it, since I couldn't discover any information about HoloLens 2 media libraries.

    


    3) Using Unity packages

    


    


    I will be testing other compile options tomorrow.

    


      

    • Mixed Reality WebRTC (https://github.com/microsoft/MixedReality-WebRTC):
various protocol support; Microsoft brought WebRTC specifically to HoloLens.
Deprecated, and as far as I can see it only supports HoloLens 1 and ARM32, so I cannot evaluate whether trying it is worth it.


    


    What are the next options?

    


      

    • Developing a raw UDP streaming library with Unity directly.
    • Rebuilding the application to be compatible with VisionLib (ARM32) and MixedReality-WebRTC (ARM32).
    • Porting ffmpeg + API to UWP?
    • There also seem to be some efforts to make WebRTC in general available on UWP platforms: https://github.com/microsoft/winrtc


    


    The questions

    


      

    • Does Vuforia support ARM32?
    • How can I access the hardware decoder of the HoloLens 2 from Unity code?
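
    Going back to the capture/publish side mentioned above ("Capturing the video output"), here is a minimal Python sketch of the OpenCV route, purely as an illustration: the device index, resolution, frame rate and RTMP URL are placeholder assumptions, and the HoloLens-side receiving/decoding is still the open question.

#!/usr/bin/env python3
# Minimal sketch: read frames from a capture card with OpenCV and pipe them
# to an ffmpeg process that encodes with libx264 and publishes an RTMP stream.
# Device index, resolution, frame rate and the RTMP URL are assumptions.
import subprocess
import cv2

DEVICE_INDEX = 0                              # capture card as seen by the OS
WIDTH, HEIGHT, FPS = 1280, 720, 30            # placeholder capture settings
RTMP_URL = "rtmp://example.local/live/test"   # placeholder target

cap = cv2.VideoCapture(DEVICE_INDEX)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, WIDTH)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, HEIGHT)
cap.set(cv2.CAP_PROP_FPS, FPS)

# ffmpeg reads raw BGR frames from stdin and streams H.264 over RTMP.
ffmpeg = subprocess.Popen(
    ["ffmpeg", "-y",
     "-f", "rawvideo", "-pix_fmt", "bgr24",
     "-s", f"{WIDTH}x{HEIGHT}", "-r", str(FPS), "-i", "-",
     "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
     "-f", "flv", RTMP_URL],
    stdin=subprocess.PIPE,
)

try:
    while True:
        ok, frame = cap.read()
        if not ok or frame.shape[1] != WIDTH or frame.shape[0] != HEIGHT:
            break  # stop if the device delivers nothing or a different size
        ffmpeg.stdin.write(frame.tobytes())
finally:
    cap.release()
    ffmpeg.stdin.close()
    ffmpeg.wait()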