Advanced search

Media (0)

Word: - Tags -/diogene

No media matching your criteria is available on the site.

Other articles (33)

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, as announced here.
    The zip file provided here contains only the MediaSPIP sources in the standalone version.
    As with the previous version, all of the software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make further modifications (...)

  • Making files available

    14 April 2011

    By default, when initialized, MediaSPIP does not allow visitors to download files, whether they are originals or the result of transformation or encoding. It only allows them to be viewed.
    However, it is possible and easy to give visitors access to these documents, in several different forms.
    All of this happens in the skeleton's configuration page. You need to go to the channel's administration area and choose, in the navigation (...)

  • MediaSPIP version 0.1 Beta

    16 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the MediaSPIP sources in the standalone version.
    To get a working installation, all of the software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make further modifications (...)

On other sites (6522)

  • Real time compression/encoding using ffmpeg in objective c

    20 February 2014, by halfwaythru

    I have a small application written in Objective-C that looks for the video devices on the machine and allows the user to record video. I need to be able to compress this video stream in real time. I do not want to save the whole video; I want to compress it as much as possible and only write out the compressed version.

    I also don't want to use AVFoundation's built-in compression methods and need to use a third-party library like ffmpeg.

    So far, I have been able to record the video and get individual frames using 'AVCaptureVideoDataOutputSampleBufferDelegate' in this method:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
      didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
      fromConnection:(AVCaptureConnection *)connection

    So I have a stream of images, basically, and I want to throw them into ffmpeg (which is all set up on my machine). Do I need to call a terminal command to do this? And if I do, how do I use the image stack as my input to the ffmpeg command instead of a video file? Also, how do I combine all the little videos in the end?

    Any help is appreciated. Thanks!
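
    One widely used answer to exactly this question is to spawn ffmpeg as a child process with a rawvideo input read from stdin, write each captured frame's pixel bytes into the pipe, and let ffmpeg encode incrementally; since there is a single growing output file, there are no little videos to combine at the end. In Objective-C the spawning would be done with NSTask and NSPipe; the sketch below shows the idea in Java (matching the other code on this page), with the dimensions, pixel format, and output name as placeholder assumptions, and assuming ffmpeg is on the PATH.

    import java.io.IOException;
    import java.io.OutputStream;

    public class RawVideoPipe {
        public static void main(String[] args) throws IOException, InterruptedException {
            int width = 640, height = 480, fps = 30; // placeholders: match your capture session

            ProcessBuilder pb = new ProcessBuilder(
                    "ffmpeg",
                    "-f", "rawvideo",           // input is headerless pixel data
                    "-pix_fmt", "bgra",         // must match the capture pixel format
                    "-s", width + "x" + height,
                    "-r", String.valueOf(fps),
                    "-i", "-",                  // read frames from stdin
                    "-c:v", "libx264", "-preset", "ultrafast",
                    "output.mp4");
            pb.redirectErrorStream(true);
            pb.redirectOutput(ProcessBuilder.Redirect.INHERIT); // surface ffmpeg's log
            Process ffmpeg = pb.start();

            // Stand-in for the capture callback: write one frame per iteration.
            try (OutputStream videoIn = ffmpeg.getOutputStream()) {
                byte[] frame = new byte[width * height * 4]; // one BGRA frame (all black)
                for (int i = 0; i < fps * 5; i++) {
                    videoIn.write(frame);
                }
            } // closing stdin is the EOF that lets ffmpeg finalize output.mp4

            System.exit(ffmpeg.waitFor());
        }
    }

    In the delegate method above, the equivalent step is copying the pixel data out of each CMSampleBuffer's CVPixelBuffer and writing it to the pipe; -pix_fmt has to agree with the capture format (kCVPixelFormatType_32BGRA corresponds to bgra).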

  • Extracting audio from video using Xuggler

    13 February 2014, by Sudh

    I am trying to extract audio (mp3) from a video file (flv), but I keep getting exceptions:

    05:02:10.326 [AWT-EventQueue-0] ERROR org.ffmpeg - [aac @ 000000000043B3F0] channel element 0.0 is not allocated
    java.lang.IllegalArgumentException: stream[0] is not video

    I tried this:

    public void runExample(int a) {
       String sourceUrl="F:\\Software\\library\\test1.mp4";
       String destUrl="F:\\Software\\library\\test1.flv";
       IMediaReader reader = null;
       IMediaWriter writer = null;
       try {
           reader = ToolFactory.makeReader(sourceUrl);
           writer = ToolFactory.makeWriter(destUrl, reader);
           reader.addListener(writer);
           int sampleRate = 44100;
           int channels = 1;
           //writer.addAudioStream(0, 0, ICodec.ID.CODEC_ID_MP3, channels, sampleRate);

           while (reader.readPacket() == null) ;
           //Should IMediaReader automatically call close(), only if ERROR_EOF (End of File) is returned from readPacket().
           reader.setCloseOnEofOnly(false);
           //If false the media data will be left in the order in which it is presented to the IMediaWriter.
           //If true IMediaWriter will buffer media data in time stamp order, and only write out data when it has at least one same time or later packet from all streams.
           writer.setForceInterleave(false);
           System.out.println("closed...");
       } catch (Exception ex) {
           ex.printStackTrace();
       }
    }

    Also, when I try this:

    public String seperateAudioStream(String pathToAudioFile) {
        String sourceUrl = "F:\\Software\\library\\test1.mp4";
        String destUrl = "F:\\Software\\library\\test1.mp3";

        IMediaReader reader = ToolFactory.makeReader(sourceUrl);
        reader.open();
        IMediaWriter writer = ToolFactory.makeWriter(destUrl, reader);
        reader.addListener(writer);
        int sampleRate = 44100;
        int channels = 1;
        writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_MP3, channels, sampleRate);

        while (reader.readPacket() == null) ;
        return null;

        // Second, container-level attempt (unreachable after the return above):
        IContainer container = IContainer.make();
        int result = container.open(sourceUrl, IContainer.Type.READ, null);
        // check if the operation was successful
        if (result < 0)
            throw new RuntimeException("Failed to open media file");

        int numStreams = container.getNumStreams();

        int audioStreamId = -1;

        IContainer writer = IContainer.make();
        writer.open(destUrl, IContainer.Type.WRITE, IContainerFormat.make());

        for (int i = 0; i < numStreams; i++) {
            // ... (the stream lookup and coder setup were lost when the post
            // was formatted; coder, audioCoder and packet come from here) ...
            if (writer.writeHeader() >= 0) {
                while (container.readNextPacket(packet) >= 0) {
                    if (packet.getStreamIndex() == audioStreamId) {
                        if (coder.isOpen()) {
                            System.out.println("Writing audio ...");
                            writer.writePacket(packet);
                        } else { throw new RuntimeException("Could not open Coder"); }
                    }
                }
            } else { throw new RuntimeException("Header not Written for writer container."); }

            coder.close();
            audioCoder.close();
        }
        writer.writeTrailer();
        writer.close();
        return null;
    }

    I get the error ERROR org.ffmpeg - channel element 0.0 is not allocated multiple times.

    The documentation is unclear, to say the least; Xuggler's website looks broken and none of the tutorial videos play. Even on Stack Overflow, most questions related to this are unanswered.
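
    For later readers: the java.lang.IllegalArgumentException: stream[0] is not video is thrown because CODEC_ID_MP3 is an audio codec being registered through addVideoStream(); audio codecs are registered with addAudioStream(). Below is a minimal, untested sketch of the usual mediatool approach, which forwards only decoded audio samples so the mp3 writer never sees the video stream; the paths and audio parameters are copied from the question.

    import com.xuggle.mediatool.IMediaReader;
    import com.xuggle.mediatool.IMediaWriter;
    import com.xuggle.mediatool.MediaToolAdapter;
    import com.xuggle.mediatool.ToolFactory;
    import com.xuggle.mediatool.event.IAudioSamplesEvent;
    import com.xuggle.xuggler.ICodec;

    public class AudioExtractor {
        public static void main(String[] args) {
            String sourceUrl = "F:\\Software\\library\\test1.mp4";
            String destUrl = "F:\\Software\\library\\test1.mp3";

            IMediaReader reader = ToolFactory.makeReader(sourceUrl);
            final IMediaWriter writer = ToolFactory.makeWriter(destUrl);
            // MP3 is audio, so it must go through addAudioStream, not addVideoStream.
            writer.addAudioStream(0, 0, ICodec.ID.CODEC_ID_MP3, 1, 44100);

            // Forward only decoded audio; video frames are ignored, so the mp3
            // container is never asked to hold a video stream.
            reader.addListener(new MediaToolAdapter() {
                @Override
                public void onAudioSamples(IAudioSamplesEvent event) {
                    writer.onAudioSamples(event);
                }
            });

            while (reader.readPacket() == null) ;
            writer.close();
        }
    }

    The repeated channel element 0.0 is not allocated message is an AAC decoder complaint that often indicates a channel-count mismatch, so if the source audio is stereo, using 2 channels instead of 1 may be needed as well.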

  • VLC dead input for RTP stream

    27 March, by CaptainCheese

    I'm working on creating an RTP stream that's meant to display live waveform data from Pioneer prolink players. The motivation for sending this video out is to be able to receive it in a Flutter frontend. I initially just sent a base-24 encoding of the raw ARGB packed ints per frame across a Kafka topic, but processing that data in Flutter proved untenable and bogged down the main UI thread. I'm not sure this is the most optimal way of going about it; I'm just trying to get anything to work if it means some speedup on the frontend. The issue with the following implementation is that when I run vlc --rtsp-timeout=120000 --network-caching=30000 -vvvv stream_1.sdp, where

% cat stream_1.sdp
v=0
o=- 0 1 IN IP4 127.0.0.1
s=RTP Stream
c=IN IP4 127.0.0.1
t=0 0
a=tool:libavformat
m=video 5007 RTP/AVP 96
a=rtpmap:96 H264/90000

    I see (among other questionable logs) the following:

[0000000144c44d10] live555 demux error: no data received in 10s, aborting
[00000001430ee2f0] main input debug: EOF reached
[0000000144b160c0] main decoder debug: killing decoder fourcc `h264'
[0000000144b160c0] main decoder debug: removing module "videotoolbox"
[0000000144b164a0] main packetizer debug: removing module "h264"
[0000000144c44d10] main demux debug: removing module "live555"
[0000000144c45bb0] main stream debug: removing module "record"
[0000000144a64960] main stream debug: removing module "cache_read"
[0000000144c29c00] main stream debug: removing module "filesystem"
[00000001430ee2f0] main input debug: Program doesn't contain anymore ES
[0000000144806260] main playlist debug: dead input
[0000000144806260] main playlist debug: changing item without a request (current 0/1)
[0000000144806260] main playlist debug: nothing to play
[0000000142e083c0] macosx interface debug: Playback has been ended
[0000000142e083c0] macosx interface debug: Releasing IOKit system sleep blocker (37463)

    This is sort of confusing, because when I run ffmpeg -protocol_whitelist file,crypto,data,rtp,udp -i stream_1.sdp -vcodec libx264 -f null - I see a number of logs like:

[h264 @ 0x139304080] non-existing PPS 0 referenced
    Last message repeated 1 times
[h264 @ 0x139304080] decode_slice_header error
[h264 @ 0x139304080] no frame!

    After which I see the stream is received and I start getting telemetry on it:

Input #0, sdp, from 'stream_1.sdp':
  Metadata:
    title           : RTP Stream
  Duration: N/A, start: 0.016667, bitrate: N/A
  Stream #0:0: Video: h264 (Constrained Baseline), yuv420p(progressive), 1200x200, 60 fps, 60 tbr, 90k tbn
Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
Press [q] to stop, [?] for help
[libx264 @ 0x107f04f40] using cpu capabilities: ARMv8 NEON
[libx264 @ 0x107f04f40] profile High, level 3.1, 4:2:0, 8-bit
Output #0, null, to 'pipe:':
  Metadata:
    title           : RTP Stream
    encoder         : Lavf61.7.100
  Stream #0:0: Video: h264, yuv420p(tv, progressive), 1200x200, q=2-31, 60 fps, 60 tbn
      Metadata:
        encoder         : Lavc61.19.101 libx264
      Side data:
        cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
[out#0/null @ 0x60000069c000] video:144KiB audio:0KiB subtitle:0KiB other streams:0KiB global headers:0KiB muxing overhead: unknown
frame= 1404 fps= 49 q=-1.0 Lsize=N/A time=00:00:23.88 bitrate=N/A speed=0.834x

    Not sure why VLC is turning me down like some kind of Berghain bouncer that lets nobody in the entire night.
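
    For anyone hitting the same wall: the non-existing PPS 0 referenced warnings above are a plausible explanation. This SDP carries no sprop-parameter-sets attribute, so a receiver can only start decoding once the SPS/PPS arrive in-band, and in the code below only the throwaway file recorder sets repeat-headers=1; the RTP recorder sets just keyint=60. ffmpeg waits indefinitely and eventually locks on, while VLC's live555 demux gives up after its 10 s timeout. An untested tweak, reusing options already present in the code below, would be to give the RTP recorder the same in-band headers:

initial_recorder.setGopSize(FPS);
// Repeat SPS/PPS in the bitstream before every keyframe so late joiners
// (like VLC reading a bare SDP) can initialize the decoder.
initial_recorder.setVideoOption("x264-params", "repeat-headers=1:keyint=60");

    Alternatively, generating the SDP from the actual encoder output (ffmpeg's -sdp_file option does this for RTP outputs) should embed sprop-parameter-sets so receivers do not have to wait for in-band headers at all.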

    I initially tried just converting the ARGB ints to a YUV420p buffer and used this to create the Frame objects but I couldn't for the life of me figure out how to properly initialize it as the attempts I made kept spitting out garbled junk.
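
    For reference, the buffer that attempt needs is planar I420: all Y bytes, then a quarter-resolution U plane, then V. Here is a hedged, self-contained sketch of that packing using the common BT.601 integer approximation (it assumes even width and height; this is generic math, not code from the project):

public static byte[] argbToI420(int[] argb, int width, int height) {
    byte[] yuv = new byte[width * height * 3 / 2];
    int uOff = width * height;                 // U plane starts after Y
    int vOff = uOff + (width * height) / 4;    // V plane after U
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            int p = argb[y * width + x];
            int r = (p >> 16) & 0xFF, g = (p >> 8) & 0xFF, b = p & 0xFF;
            int luma = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
            yuv[y * width + x] = (byte) Math.max(0, Math.min(255, luma));
            if ((y & 1) == 0 && (x & 1) == 0) { // one chroma sample per 2x2 block
                int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
                int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
                int ci = (y / 2) * (width / 2) + (x / 2);
                yuv[uOff + ci] = (byte) Math.max(0, Math.min(255, u));
                yuv[vOff + ci] = (byte) Math.max(0, Math.min(255, v));
            }
        }
    }
    return yuv;
}

    The code below sidesteps the manual conversion instead: it hands recordImage() RGB24 bytes and lets the recorder's AV_PIX_FMT_YUV420P pixel format drive the conversion inside FFmpeg, which is usually the less error-prone route.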

    Please go easy on me; I've made an unhealthy habit of resolving nearly all of my coding questions by lurking the internet for answers, but that's not really helping me solve this issue.

    Here's the Java I'm working on (the meat of the RTP comms occurs within updateWaveformForPlayer()):

package com.bugbytz.prolink;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.bytedeco.ffmpeg.global.avcodec;
import org.bytedeco.ffmpeg.global.avutil;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.FFmpegLogCallback;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.FrameGrabber;
import org.deepsymmetry.beatlink.CdjStatus;
import org.deepsymmetry.beatlink.DeviceAnnouncement;
import org.deepsymmetry.beatlink.DeviceAnnouncementAdapter;
import org.deepsymmetry.beatlink.DeviceFinder;
import org.deepsymmetry.beatlink.Util;
import org.deepsymmetry.beatlink.VirtualCdj;
import org.deepsymmetry.beatlink.data.BeatGridFinder;
import org.deepsymmetry.beatlink.data.CrateDigger;
import org.deepsymmetry.beatlink.data.MetadataFinder;
import org.deepsymmetry.beatlink.data.TimeFinder;
import org.deepsymmetry.beatlink.data.WaveformDetail;
import org.deepsymmetry.beatlink.data.WaveformDetailComponent;
import org.deepsymmetry.beatlink.data.WaveformFinder;

import java.awt.*;
import java.awt.image.BufferedImage;
import java.io.File;
import java.nio.ByteBuffer;
import java.text.DecimalFormat;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Properties;
import java.util.Set;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

import static org.bytedeco.ffmpeg.global.avutil.AV_PIX_FMT_RGB24;

public class App {
    public static ArrayList<Track> tracks = new ArrayList<>();
    public static boolean dbRead = false;
    public static Properties props = new Properties();
    private static Map<Integer, FFmpegFrameRecorder> recorders = new HashMap<>();
    private static Map<Integer, Integer> frameCount = new HashMap<>();

    private static final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
    private static final int FPS = 60;
    private static final int FRAME_INTERVAL_MS = 1000 / FPS;

    private static Map<Integer, ScheduledFuture<?>> schedules = new HashMap<>();

    private static Set<Integer> streamingPlayers = new HashSet<>();

    public static String byteArrayToMacString(byte[] macBytes) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < macBytes.length; i++) {
            sb.append(String.format("%02X%s", macBytes[i], (i < macBytes.length - 1) ? ":" : ""));
        }
        return sb.toString();
    }

    private static void updateWaveformForPlayer(int player) throws Exception {
        Integer frame_for_player = frameCount.get(player);
        if (frame_for_player == null) {
            frame_for_player = 0;
            frameCount.putIfAbsent(player, frame_for_player);
        }

        if (!WaveformFinder.getInstance().isRunning()) {
            WaveformFinder.getInstance().start();
        }
        WaveformDetail detail = WaveformFinder.getInstance().getLatestDetailFor(player);

        if (detail != null) {
            WaveformDetailComponent component = (WaveformDetailComponent) detail.createViewComponent(
                    MetadataFinder.getInstance().getLatestMetadataFor(player),
                    BeatGridFinder.getInstance().getLatestBeatGridFor(player)
            );
            component.setMonitoredPlayer(player);
            component.setPlaybackState(player, TimeFinder.getInstance().getTimeFor(player), true);
            component.setAutoScroll(true);
            int width = 1200;
            int height = 200;
            Dimension dimension = new Dimension(width, height);
            component.setPreferredSize(dimension);
            component.setSize(dimension);
            component.setScale(1);
            component.doLayout();

            // Create a fresh BufferedImage and clear it before rendering
            BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
            Graphics2D g = image.createGraphics();
            g.clearRect(0, 0, width, height);  // Clear any old content

            // Draw waveform into the BufferedImage
            component.paint(g);
            g.dispose();

            int port = 5004 + player;
            String inputFile = port + "_" + frame_for_player + ".mp4";
            // Initialize the FFmpegFrameRecorder for YUV420P
            FFmpegFrameRecorder recorder_file = new FFmpegFrameRecorder(inputFile, width, height);
            FFmpegLogCallback.set();  // Enable FFmpeg logging for debugging
            recorder_file.setFormat("mp4");
            recorder_file.setVideoCodec(avcodec.AV_CODEC_ID_H264);
            recorder_file.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);  // Use YUV420P format directly
            recorder_file.setFrameRate(FPS);

            // Set video options
            recorder_file.setVideoOption("preset", "ultrafast");
            recorder_file.setVideoOption("tune", "zerolatency");
            recorder_file.setVideoOption("x264-params", "repeat-headers=1");
            recorder_file.setGopSize(FPS);
            try {
                recorder_file.start();  // Ensure this is called before recording any frames
                System.out.println("Recorder started successfully for player: " + player);
            } catch (org.bytedeco.javacv.FFmpegFrameRecorder.Exception e) {
                e.printStackTrace();
            }

            // Get all pixels in one call
            int[] pixels = new int[width * height];
            image.getRGB(0, 0, width, height, pixels, 0, width);
            recorder_file.recordImage(width, height, Frame.DEPTH_UBYTE, 1, 3 * width, AV_PIX_FMT_RGB24, ByteBuffer.wrap(argbToByteArray(pixels, width, height)));
            recorder_file.stop();
            recorder_file.release();
            final FFmpegFrameRecorder recorder = recorders.get(player);
            FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(inputFile);

            try {
                grabber.start();
            } catch (Exception e) {
                e.printStackTrace();
            }
            if (recorder == null) {
                try {
                    String outputStream = "rtp://127.0.0.1:" + port;
                    FFmpegFrameRecorder initial_recorder = new FFmpegFrameRecorder(outputStream, grabber.getImageWidth(), grabber.getImageHeight());
                    initial_recorder.setFormat("rtp");
                    initial_recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
                    initial_recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
                    initial_recorder.setFrameRate(grabber.getFrameRate());
                    initial_recorder.setGopSize(FPS);
                    initial_recorder.setVideoOption("x264-params", "keyint=60");
                    initial_recorder.setVideoOption("rtsp_transport", "tcp");
                    initial_recorder.start();
                    recorders.putIfAbsent(player, initial_recorder);
                    frameCount.putIfAbsent(player, 0);
                    putToRTP(player, grabber, initial_recorder);
                }
                catch (Exception e) {
                    e.printStackTrace();
                }
            }
            else {
                putToRTP(player, grabber, recorder);
            }
            File file = new File(inputFile);
            if (file.exists() && file.delete()) {
                System.out.println("Successfully deleted file: " + inputFile);
            } else {
                System.out.println("Failed to delete file: " + inputFile);
            }
        }
    }

    public static void putToRTP(int player, FFmpegFrameGrabber grabber, FFmpegFrameRecorder recorder) throws FrameGrabber.Exception {
        final Frame frame = grabber.grabFrame();
        int frameCount_local = frameCount.get(player);
        frame.keyFrame = frameCount_local++ % FPS == 0;
        frameCount.put(player, frameCount_local);
        try {
            recorder.record(frame);
        } catch (FFmpegFrameRecorder.Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static byte[] argbToByteArray(int[] argb, int width, int height) {
        int totalPixels = width * height;
        byte[] byteArray = new byte[totalPixels * 3];  // 3 bytes per pixel (RGB)

        for (int i = 0; i < totalPixels; i++) {
            int argbPixel = argb[i];

            byteArray[i * 3] = (byte) ((argbPixel >> 16) & 0xFF);      // Red
            byteArray[i * 3 + 1] = (byte) ((argbPixel >> 8) & 0xFF);   // Green
            byteArray[i * 3 + 2] = (byte) (argbPixel & 0xFF);          // Blue
        }

        return byteArray;
    }

    public static void main(String[] args) throws Exception {
        VirtualCdj.getInstance().setDeviceNumber((byte) 4);
        CrateDigger.getInstance().addDatabaseListener(new DBService());
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "com.bugbytz.prolink.CustomSerializer");
        props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, "20971520");

        VirtualCdj.getInstance().addUpdateListener(update -> {
            if (update instanceof CdjStatus) {
                try (Producer<String, DeviceStatus> producer = new KafkaProducer<>(props)) {
                    DecimalFormat df_obj = new DecimalFormat("#.##");
                    DeviceStatus deviceStatus = new DeviceStatus(
                            update.getDeviceNumber(),
                            ((CdjStatus) update).isPlaying() || !((CdjStatus) update).isPaused(),
                            ((CdjStatus) update).getBeatNumber(),
                            update.getBeatWithinBar(),
                            Double.parseDouble(df_obj.format(update.getEffectiveTempo())),
                            Double.parseDouble(df_obj.format(Util.pitchToPercentage(update.getPitch()))),
                            update.getAddress().getHostAddress(),
                            byteArrayToMacString(DeviceFinder.getInstance().getLatestAnnouncementFrom(update.getDeviceNumber()).getHardwareAddress()),
                            ((CdjStatus) update).getRekordboxId(),
                            update.getDeviceName()
                    );
                    ProducerRecord<String, DeviceStatus> record = new ProducerRecord<>("device-status", "device-" + update.getDeviceNumber(), deviceStatus);
                    try {
                        producer.send(record).get();
                    } catch (InterruptedException ex) {
                        throw new RuntimeException(ex);
                    } catch (ExecutionException ex) {
                        throw new RuntimeException(ex);
                    }
                    producer.flush();
                    if (!WaveformFinder.getInstance().isRunning()) {
                        try {
                            WaveformFinder.getInstance().start();
                        } catch (Exception ex) {
                            throw new RuntimeException(ex);
                        }
                    }
                }
            }
        });
        DeviceFinder.getInstance().addDeviceAnnouncementListener(new DeviceAnnouncementAdapter() {
            @Override
            public void deviceFound(DeviceAnnouncement announcement) {
                if (!streamingPlayers.contains(announcement.getDeviceNumber())) {
                    streamingPlayers.add(announcement.getDeviceNumber());
                    schedules.putIfAbsent(announcement.getDeviceNumber(), scheduler.scheduleAtFixedRate(() -> {
                        try {
                            Runnable task = () -> {
                                try {
                                    updateWaveformForPlayer(announcement.getDeviceNumber());
                                } catch (InterruptedException e) {
                                    System.out.println("Thread interrupted");
                                } catch (Exception e) {
                                    throw new RuntimeException(e);
                                }
                                System.out.println("Lambda thread work completed!");
                            };
                            task.run();
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                    }, 0, FRAME_INTERVAL_MS, TimeUnit.MILLISECONDS));
                }
            }

            @Override
            public void deviceLost(DeviceAnnouncement announcement) {
                if (streamingPlayers.contains(announcement.getDeviceNumber())) {
                    schedules.get(announcement.getDeviceNumber()).cancel(true);
                    streamingPlayers.remove(announcement.getDeviceNumber());
                }
            }
        });
        BeatGridFinder.getInstance().start();
        MetadataFinder.getInstance().start();
        VirtualCdj.getInstance().start();
        TimeFinder.getInstance().start();
        DeviceFinder.getInstance().start();
        CrateDigger.getInstance().start();

        try {
            LoadCommandConsumer consumer = new LoadCommandConsumer("localhost:9092", "load-command-group");
            Thread consumerThread = new Thread(consumer::startConsuming);
            consumerThread.start();

            Runtime.getRuntime().addShutdownHook(new Thread(() -> {
                consumer.shutdown();
                try {
                    consumerThread.join();
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }));
            Thread.sleep(60000);
        } catch (InterruptedException e) {
            System.out.println("Interrupted, exiting.");
        }
    }
}
