
Other articles (111)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011, by

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or higher. If necessary, contact the administrator of your MediaSPIP to find out.

On other sites (8211)

  • Round number of bits read to next byte

    4 December 2014, by watwat2014

    I have a header that can be any number of bits long. A variable called ByteAlign is calculated from the current position in the file relative to the start of the file; its purpose is to pad the header to the next complete byte. So if the header takes up 57 bits, ByteAlign needs to be 7 bits long to pad the header to 64 bits total, or 8 bytes.

    Solutions that don't work:

    Variable % 8 - 8: the result has the right magnitude, but is negative.

    8 % Variable: this is completely inaccurate, giving answers like 29, which is blatantly wrong; the largest it should ever be is 7.

    How exactly do I do this?
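    The usual idiom is `(8 - bits % 8) % 8`: the inner modulo gives how many bits past the last byte boundary we are, subtracting from 8 gives the padding, and the outer modulo maps the already-aligned case (which would otherwise give 8) back to 0. A minimal sketch (class and method names are mine, not from the question):

```java
// Padding bits needed to advance from bitsRead to the next byte boundary.
// (8 - bits % 8) % 8 yields a value in 0..7, and 0 when already aligned.
public class ByteAlign {
    static int paddingBits(int bitsRead) {
        return (8 - (bitsRead % 8)) % 8;
    }

    public static void main(String[] args) {
        System.out.println(paddingBits(57)); // 7: pads a 57-bit header to 64 bits
        System.out.println(paddingBits(64)); // 0: already byte-aligned
    }
}
```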

  • Revision d4a407c051 : [spatial svc]Multiple frame context feature We can use one frame context for ea

    18 August 2014, by Minghai Shang

    Changed Paths:
     Modify /test/svc_test.cc
     Modify /vp9/encoder/vp9_bitstream.c
     Modify /vp9/encoder/vp9_encoder.c
     Modify /vp9/encoder/vp9_svc_layercontext.c
     Modify /vp9/encoder/vp9_svc_layercontext.h
     Modify /vp9/vp9_cx_iface.c
     Modify /vpx/src/svc_encodeframe.c
    [spatial svc] Multiple frame context feature

    We can use one frame context for each layer so that we don't have
    to reset the probs every frame. But we can't use prev_mi, since we
    may drop enhancement layers, so we have to generate a non-VP9-compatible
    bitstream and modify it in the player.
    1. We need to code all frames as invisible frames so that prev_mi
    is not used, but in the bitstream we need to code the show_frame
    flag as 1 so that the publisher will know it is supposed to be a
    visible frame.
    2. In the player we need to change the show_frame flag to 0 for
    all frames, then add a one-byte frame into the superframe to tell
    the decoder which layer we want to show.
    Change-Id: I75b7304cf31f0ab952f043e33c034495e88f01f3

  • Android : how to film a video before extracting its audio

    20 February 2017, by MrOrgon

    Despite many searches, I haven't been able to develop an Android prototype able to film a video and then extract its audio as .wav in a separate activity.

    So far I have developed a simple filming activity which relies on Android's Camera application. My strategy was to pass the video's Uri as an Extra to the next activity before using FFMPEG, but I can't make the transition between the Uri and FFMPEG. Indeed, I'm a fresh Android Studio beginner, so I am still not sure about which concepts to use.

    Here’s my code for the video recording activity.

    import android.app.Activity;
    import android.content.Intent;
    import android.net.Uri;
    import android.os.Build;
    import android.os.Bundle;
    import android.provider.MediaStore;
    import android.widget.Toast;
    import android.widget.VideoView;

    import java.io.File;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.nio.channels.FileChannel;


    public class RecordActivity extends Activity{

    static final int REQUEST_VIDEO_CAPTURE = 0;

    VideoView mVideoView = null;
    Uri videoUri = null;

    @Override
    public void onCreate(Bundle savedInstanceState) {
       super.onCreate(savedInstanceState);
       setContentView(R.layout.activity_record);
       // findViewById only works after setContentView has inflated the layout
       mVideoView = (VideoView) findViewById(R.id.videoVieww);

       Intent takeVideoIntent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);

       Toast.makeText(RecordActivity.this, String.valueOf(Build.VERSION.SDK_INT), Toast.LENGTH_SHORT).show();

       takeVideoIntent.putExtra(MediaStore.EXTRA_OUTPUT, videoUri);
       if (takeVideoIntent.resolveActivity(getPackageManager()) != null) {
           startActivityForResult(takeVideoIntent, REQUEST_VIDEO_CAPTURE);
       }

    }


       @Override
       protected void onActivityResult(int requestCode, int resultCode, Intent intent) {
           if (requestCode == REQUEST_VIDEO_CAPTURE && resultCode == RESULT_OK) {
               videoUri = intent.getData();

               Intent intentForFilterActivity = new Intent(RecordActivity.this, FilterActivity.class);
               intentForFilterActivity.putExtra("VideoToFilter", videoUri.getPath());
               startActivity(intentForFilterActivity);

           }
       }
    }

    Here's the code for the audio extraction activity. It is called "FilterActivity", as its final aim is to filter outdoor noise using additional functions. I'm using WritingMinds' implementation of FFMPEG:
    https://github.com/WritingMinds/ffmpeg-android-java

    import android.app.Activity;
    import android.content.Intent;
    import android.os.Bundle;
    import android.widget.Toast;

    import com.github.hiteshsondhi88.libffmpeg.ExecuteBinaryResponseHandler;
    import com.github.hiteshsondhi88.libffmpeg.FFmpeg;
    import com.github.hiteshsondhi88.libffmpeg.exceptions.FFmpegCommandAlreadyRunningException;


    public class FilterActivity extends Activity {

    @Override
    public void onCreate(Bundle savedInstanceState) {

       super.onCreate(savedInstanceState);
       setContentView(R.layout.activity_filter);

       Intent intentVideo = getIntent();
       String pathIn = intentVideo.getStringExtra("VideoToFilter");

       FFmpeg ffmpeg = FFmpeg.getInstance(FilterActivity.this);
       try {
           String[] cmdExtract = {"-i " + pathIn + " extracted.wav"};
           ffmpeg.execute(cmdExtract, new ExecuteBinaryResponseHandler() {

               @Override
               public void onStart() {}

               @Override
               public void onProgress(String message) {}

               @Override
               public void onFailure(String message) {
                   Toast.makeText(FilterActivity.this, "Failure !", Toast.LENGTH_SHORT).show();
               }

               @Override
               public void onSuccess(String message) {}

               @Override
               public void onFinish() {}
           });
       } catch (FFmpegCommandAlreadyRunningException e) {
       }
    }


    }

    and I always get the "Failure !" message.

    Some parts of the code may look extremely bad. As written previously, I'm a real Android Studio beginner.

    Do you have any correction that could work? Or even just a strategy?

    Thank you in advance!
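    One likely culprit in the code above: ffmpeg-android-java's execute() takes a String[] in which each command-line token must be its own element, so a single concatenated string like "-i /path in.mp4 extracted.wav" is treated as one invalid token. In addition, a bare "extracted.wav" with no directory is unlikely to point at a writable location. A hedged sketch of building the argument array correctly (the helper name and paths are hypothetical, not from the question):

```java
// Hypothetical helper: builds the argv-style array that
// ffmpeg.execute(String[], ...) expects -- one token per element.
public class FfmpegArgs {
    static String[] extractWavCommand(String inputPath, String outputPath) {
        // -vn drops the video stream; outputPath should be an absolute,
        // app-writable location (e.g. under getFilesDir() on Android).
        return new String[] {"-i", inputPath, "-vn", outputPath};
    }

    public static void main(String[] args) {
        String[] cmd = extractWavCommand("/sdcard/in.mp4", "/sdcard/extracted.wav");
        System.out.println(String.join(" ", cmd));
    }
}
```

    In FilterActivity this array would replace the single-element cmdExtract before the ffmpeg.execute(...) call; logging the message parameter in onFailure would also show FFmpeg's actual error text instead of a bare "Failure !".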