Advanced search

Media (0)

Keyword: - Tags -/organisation

No media matching your criteria is available on this site.

Other articles (102)

  • Supporting all media types

    13 April 2011

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

  • Sounds

    15 May 2013
  • Configuring language support

    15 November 2010

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administer" section of the site.
    From there, in the navigation menu, you can access a "Language management" section that lets you enable support for new languages.
    Each newly added language can still be disabled as long as no object has been created in that language; once one has, it becomes greyed out in the configuration and (...)

On other sites (9385)

  • ffmpeg - spark - azure databricks - error writing trailer of "filename.mp3": Operation not supported

    4 July 2021, by CRAFTY DBA

    I have been trying to figure out this tough problem.

    I am trying to convert *.mp4 files to *.mp3 files.

    I tried using MoviePy, but I found out that it uses ffmpeg under the hood and hit the same issue.

    I used these two articles to get the latest version of ffmpeg installed on the Azure Databricks cluster during startup. I am using a single-node cluster for this POC code.

    Pyspark: Use ffmpeg on the driver and workers
    https://ubuntuhandbook.org/index.php/2020/06/install-ffmpeg-4-3-via-ppa-ubuntu-18-04-16-04

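    For reference, a cluster init script based on that second article might look like the sketch below. The PPA name is taken from the linked post, and apt/add-apt-repository are assumed to be available on the cluster image, so treat this as a starting point rather than a known-good recipe:

#!/bin/bash
# Init-script sketch: install a newer ffmpeg from a PPA at cluster startup.
# Assumptions: Ubuntu 18.04 base image and ppa:jonathonf/ffmpeg-4, the PPA
# described in the linked ubuntuhandbook article.
add-apt-repository -y ppa:jonathonf/ffmpeg-4
apt-get update -y
apt-get install -y ffmpeg
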
    The issue is that even the simplest command results in errors.

    %%bash
    ffmpeg -i /dbfs/Craftydba/recording.mp4 /dbfs/Craftydba/recording.mp3

    I even tried .wav as an output format and still had the same issue.

    I retested this command on a Data Science VM in Azure with Python and FFMPEG. It works fine on that OS/build.

    It has something to do with the version of the code on the Spark cluster.

    Any help is appreciated.

    Sincerely,

    John Miner

    PS: I am adding a dump of the OS version as well as the ffmpeg error.

    OS Version Dump

    NAME="Ubuntu"
VERSION="18.04.5 LTS (Bionic Beaver)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 18.04.5 LTS"
VERSION_ID="18.04"
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
VERSION_CODENAME=bionic
UBUNTU_CODENAME=bionic

    FFMPEG Dump

    ffmpeg version 4.3.2-0york0~18.04 Copyright (c) 2000-2021 the FFmpeg developers
  built with gcc 7 (Ubuntu 7.5.0-3ubuntu1~18.04)
  configuration: --prefix=/usr --extra-version='0york0~18.04' --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librabbitmq --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libzimg --enable-pocketsphinx --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
  libavutil      56. 51.100 / 56. 51.100
  libavcodec     58. 91.100 / 58. 91.100
  libavformat    58. 45.100 / 58. 45.100
  libavdevice    58. 10.100 / 58. 10.100
  libavfilter     7. 85.100 /  7. 85.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  7.100 /  5.  7.100
  libswresample   3.  7.100 /  3.  7.100
  libpostproc    55.  7.100 / 55.  7.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/dbfs/Craftydba/recording.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2
    creation_time   : 2021-06-18T19:07:17.000000Z
  Duration: 00:04:48.64, start: 0.000000, bitrate: 1065 kb/s
    Stream #0:0(eng): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 1920x1080, 1000 kb/s, 7.96 fps, 8 tbr, 10k tbn, 20k tbc (default)
    Metadata:
      creation_time   : 2021-06-18T19:07:17.000000Z
    Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 16000 Hz, mono, fltp, 63 kb/s (default)
    Metadata:
      creation_time   : 2021-06-18T19:07:17.000000Z
Stream mapping:
  Stream #0:1 -> #0:0 (aac (native) -> mp3 (libmp3lame))
Press [q] to stop, [?] for help
Output #0, mp3, to '/dbfs/Craftydba/recording.mp3':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2
    TSSE            : Lavf58.45.100
    Stream #0:0(eng): Audio: mp3 (libmp3lame), 16000 Hz, mono, fltp (default)
    Metadata:
      creation_time   : 2021-06-18T19:07:17.000000Z
      encoder         : Lavc58.91.100 libmp3lame
Error writing trailer of /dbfs/Craftydba/recording.mp3: Operation not supported
size=     846kB time=00:04:48.65 bitrate=  24.0kbits/s speed=86.9x    
video:0kB audio:846kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.036945%

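    The "Error writing trailer" line at the end of the dump points at the output location rather than the ffmpeg build: when ffmpeg finalizes an MP3 it seeks back to the start of the file to update the Xing/LAME header, and the /dbfs FUSE mount has historically not supported that kind of random write. A minimal workaround sketch, reusing the paths from the question, is to encode to node-local storage first and copy the finished file onto DBFS afterwards:

%%bash
# Workaround sketch: write the MP3 to local disk, where ffmpeg is free to seek
# back and rewrite the header when it finalizes the file...
ffmpeg -i /dbfs/Craftydba/recording.mp4 /tmp/recording.mp3
# ...then copy the finished file onto the DBFS mount in one sequential pass.
cp /tmp/recording.mp3 /dbfs/Craftydba/recording.mp3

    Alternatively, adding -write_xing 0 to the ffmpeg command should avoid the backwards seek entirely, at the cost of less accurate duration reporting in some players.
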
  • webm video file is around 1GB (expected to be 45 minutes), but only plays 5 minutes and stops. How to repair it with ffmpeg?

    25 July 2021, by AcidMicrowave

    I have a .webm video file that is expected to be 45 minutes long (this is how long the recording was). It was 45 minutes of 720p, and the file size is around 1GB.

    The video file stops playing after 5 minutes. When I tried converting it with ffmpeg to see if there is any damage, the conversion happens, but only the first 5 minutes are output, as a 30MB file. During conversion, the console shows the following warnings:

[webm @ 0x7ffa2181f000] Non-monotonous DTS in output stream 0:1; previous: 299933, current: 299893; changing to 299933. This may result in incorrect timestamps in the output file.
[webm @ 0x7ffa2181f000] Non-monotonous DTS in output stream 0:1; previous: 299933, current: 299913; changing to 299933. This may result in incorrect timestamps in the output file.
[libopus @ 0x7ffa2180f800] Queue input is backward in time

    (These errors appear many, many times, I assume over 1000x. Are these corrupted frames?)

    I would appreciate any help.

    Thank you very much.

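    A generic first-aid step for a recording like this is to remux it without re-encoding while letting ffmpeg regenerate timestamps; the repeated non-monotonous DTS warnings indicate broken timestamps rather than (necessarily) corrupted frames, so if the player is merely stopping at the first bad timestamp this alone may recover the full 45 minutes. A sketch, with input/output names assumed:

# Remux only (lossless): regenerate missing/broken presentation timestamps
# and rewrite the container index. File names here are placeholders.
ffmpeg -fflags +genpts -i broken.webm -c copy repaired.webm

# If the remux still stops at the 5-minute mark, a full re-encode that tells
# the decoder to ignore errors can sometimes salvage more of the stream.
ffmpeg -err_detect ignore_err -i broken.webm -c:v libvpx -c:a libopus reencoded.webm
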
  • Installing ffmpeg, librosa and pydub in an Apache Spark container

    17 April 2023, by yaviens

    I'm working on a Python Spark streaming project that needs pydub and librosa to process audio; these libraries require the ffmpeg library to be installed. I'm having trouble building the Spark containers with these libraries, and I don't know how to solve the problem.

    I use a docker-compose.yml to build the images, define the ports, etc. of the Spark master and workers.
docker-compose:

    version: "3.3"
services:
  spark-master:
    build:
      context: ./
      dockerfile: Dockerfile
    #image: docker.io/bitnami/spark:3.3
    ports:
      - "9090:8080"
      - "7077:7077"
    volumes:
       - ./apps:/opt/spark-apps
       - ./data:/opt/spark-data
       - ./data:/data
       - ./src:/src
       - ./output:/output
    environment:
      - SPARK_LOCAL_IP=spark-master
      - SPARK_WORKLOAD=master
  spark-worker-a:
    build:
      context: ./
      dockerfile: Dockerfile
    #image: docker.io/bitnami/spark:3.3
    ports:
      - "9091:8080"
      - "7000:7000"
    depends_on:
      - spark-master
    environment:
      - SPARK_MASTER=spark://spark-master:7077
      - SPARK_WORKER_CORES=1
      - SPARK_WORKER_MEMORY=1G
      - SPARK_DRIVER_MEMORY=1G
      - SPARK_EXECUTOR_MEMORY=1G
      - SPARK_WORKLOAD=worker
      - SPARK_LOCAL_IP=spark-worker-a
    volumes:
       - ./apps:/opt/spark-apps
       - ./data:/opt/spark-data
       - ./data:/data
       - ./src:/src
       - ./output:/output       
  spark-worker-b:
    build:
      context: ./
      dockerfile: Dockerfile
    #image: docker.io/bitnami/spark:3.3
    ports:
      - "9092:8080"
      - "7001:7000"
    depends_on:
      - spark-master
    environment:
      - SPARK_MASTER=spark://spark-master:7077
      - SPARK_WORKER_CORES=1
      - SPARK_WORKER_MEMORY=1G
      - SPARK_DRIVER_MEMORY=1G
      - SPARK_EXECUTOR_MEMORY=1G
      - SPARK_WORKLOAD=worker
      - SPARK_LOCAL_IP=spark-worker-b
    volumes:
        - ./apps:/opt/spark-apps
        - ./data:/opt/spark-data
        - ./data:/data
        - ./src:/src
        - ./output:/output

    In the same directory as the docker-compose.yml is the Dockerfile I'm using to build the image:

    # builder step used to download and configure spark environment
FROM openjdk:11.0.11-jre-slim-buster as builder

# Add Dependencies for PySpark
RUN apt-get update && apt-get install -y curl vim wget software-properties-common ssh net-tools ca-certificates python3 python3-pip python3-numpy python3-matplotlib python3-scipy python3-pandas python3-simpy

RUN update-alternatives --install "/usr/bin/python" "python" "$(which python3)" 1

# Fix the value of PYTHONHASHSEED
# Note: this is needed when you use Python 3.3 or greater
ENV SPARK_VERSION=3.0.2 \
HADOOP_VERSION=3.2 \
SPARK_HOME=/opt/spark \
PYTHONHASHSEED=1

# Download and uncompress spark from the apache archive
RUN wget --no-verbose -O apache-spark.tgz "https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" \
&& mkdir -p /opt/spark \
&& tar -xf apache-spark.tgz -C /opt/spark --strip-components=1 \
&& rm apache-spark.tgz


# Apache spark environment
FROM builder as apache-spark

WORKDIR /opt/spark

ENV SPARK_MASTER_PORT=7077 \
SPARK_MASTER_WEBUI_PORT=8080 \
SPARK_LOG_DIR=/opt/spark/logs \
SPARK_MASTER_LOG=/opt/spark/logs/spark-master.out \
SPARK_WORKER_LOG=/opt/spark/logs/spark-worker.out \
SPARK_WORKER_WEBUI_PORT=8080 \
SPARK_WORKER_PORT=7000 \
SPARK_MASTER="spark://spark-master:7077" \
SPARK_WORKLOAD="master"

EXPOSE 8080 7077 6066

RUN mkdir -p $SPARK_LOG_DIR && \
touch $SPARK_MASTER_LOG && \
touch $SPARK_WORKER_LOG && \
ln -sf /dev/stdout $SPARK_MASTER_LOG && \
ln -sf /dev/stdout $SPARK_WORKER_LOG

# Install ffmpeg lib
RUN apt-get -y update
RUN apt-get -y upgrade
RUN apt-get install -y ffmpeg
RUN apt-get -y install apt-utils gcc libpq-dev libsndfile-dev

#RUN apt-get update \
#&& apt-get upgrade -y \
#&& apt-get install -y \
#&& apt-get -y install apt-utils gcc libpq-dev libsndfile-dev

# Install required python libs
COPY requirements.txt .
RUN pip3 install -r requirements.txt

COPY start-spark.sh /

CMD ["/bin/bash", "/start-spark.sh"]

    The start-spark.sh:

    #!/bin/bash
. "https://net.cloudinfrastructureservices.co.uk/opt/spark/bin/load-spark-env.sh"
# When the spark work_load is master run class org.apache.spark.deploy.master.Master
if [ "$SPARK_WORKLOAD" == "master" ];
then

export SPARK_MASTER_HOST=`hostname`

cd /opt/spark/bin && ./spark-class org.apache.spark.deploy.master.Master --ip $SPARK_MASTER_HOST --port $SPARK_MASTER_PORT --webui-port $SPARK_MASTER_WEBUI_PORT >> $SPARK_MASTER_LOG

elif [ "$SPARK_WORKLOAD" == "worker" ];
then
# When the spark work_load is worker run class org.apache.spark.deploy.worker.Worker
cd /opt/spark/bin && ./spark-class org.apache.spark.deploy.worker.Worker --webui-port $SPARK_WORKER_WEBUI_PORT $SPARK_MASTER >> $SPARK_WORKER_LOG

elif [ "$SPARK_WORKLOAD" == "submit" ];
then
    echo "SPARK SUBMIT"
else
    echo "Undefined Workload Type $SPARK_WORKLOAD, must specify: master, worker, submit"
fi

    When I execute "docker-compose up" I get the following error message:

    [+] Running 4/4
 - Network spark_container_default             Created                                                                                0.9s 
 - Container spark_container-spark-master-1    Created                                                                                0.3s 
 - Container spark_container-spark-worker-a-1  Created                                                                                0.4s 
 - Container spark_container-spark-worker-b-1  Created                                                                                0.4s 
Attaching to spark_container-spark-master-1, spark_container-spark-worker-a-1, spark_container-spark-worker-b-1
spark_container-spark-master-1    | /bin/bash: /start-spark.sh: No such file or directory
spark_container-spark-master-1 exited with code 127
spark_container-spark-worker-a-1  | /bin/bash: /start-spark.sh: No such file or directory
spark_container-spark-worker-b-1  | /bin/bash: /start-spark.sh: No such file or directory
spark_container-spark-worker-a-1 exited with code 127
spark_container-spark-worker-b-1 exited with code 127
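
    Note that with the output above, the containers die before pydub, librosa or ffmpeg are ever exercised: bash cannot find /start-spark.sh inside the image even though the Dockerfile appears to COPY it to /. A quick diagnostic sketch, assuming the service names from the compose file, is to rebuild without the cache and inspect the path directly; one common culprit for this symptom on Windows checkouts is a stray carriage return (CRLF line endings) in the Dockerfile's CMD line or in the script itself, which silently becomes part of the path:

# Diagnostic sketch: rebuild the image without cache, then check whether the
# script really landed at /start-spark.sh (no ENTRYPOINT is set, so the trailing
# command simply replaces CMD for this one-off run).
docker-compose build --no-cache spark-master
docker-compose run --rm spark-master ls -l /start-spark.sh

# If the file is present, strip possible Windows line endings from the script
# (and re-save the Dockerfile itself with LF endings), then rebuild.
sed -i 's/\r$//' start-spark.sh

    Once the containers start, the installation steps themselves look reasonable: apt-get install -y ffmpeg plus pip3 install -r requirements.txt (with librosa and pydub listed in requirements.txt) is the standard way to get these libraries into every node's image.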