Other articles (54)

  • List of compatible distributions

    26 April 2011

    The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.

    Distribution name    Version name            Version number
    Debian               Squeeze                 6.x.x
    Debian               Wheezy                  7.x.x
    Debian               Jessie                  8.x.x
    Ubuntu               The Precise Pangolin    12.04 LTS
    Ubuntu               The Trusty Tahr         14.04
    If you want to help us improve this list, you can provide us with access to a machine whose distribution is not mentioned above, or send the fixes needed to add (...)

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, as announced here.
    The zip file provided here contains only the MediaSPIP sources, in the standalone version.
    As with the previous version, all of the software dependencies must be installed manually on the server.
    If you wish to use this archive for a farm-mode installation, you will also need to make further modifications (...)

  • Making files available

    14 April 2011

    By default, when it is initialised, MediaSPIP does not allow visitors to download files, whether they are originals or the result of transformation or encoding. It only allows them to be viewed.
    However, it is possible, and easy, to give visitors access to these documents in various forms.
    All of this is handled in the template configuration page. Go to the channel's administration area and choose, in the navigation (...)

On other sites (4992)

  • Libav Transcoding to H264 : Frames being dropped

    21 June 2014, by roman

    I’m sorry if my question isn’t too well formulated; I’m only now getting started with FFmpeg and Libav. I’m not too knowledgeable about media formats either; I pretty much learned all I know about the topic this past month. I’ve been doing as much research as I can and have gotten pretty far, but I’ve only now gotten to the point where I’m almost unsure what my question actually is! These are more like observations, but hopefully some of the experts can help me out here.

    I’m trying to transcode Gifs into MP4s using FFmpeg’s libraries, but I’m running into a strange issue when using the H264 Codec. In my transcoding loop, I keep a count of the number of frames that I write out (by verifying the return value of av_write_frame). In a particular sample, I count a total of 166 frames written out. If I examine FFmpeg’s converted file using FFprobe (the functionality I’m wanting to emulate using my program, a conversion from Gif to MP4), FFmpeg’s output file also seems to have 166 frames, but when I examine my output with FFprobe, I seem to only have 144 frames. What I find a bit interesting is that if I simply change my codec from H264 to MPEG4, my output appears to have the 166 frames, matching FFmpeg’s output and my counter. I get very similar results with different Gif files, where my counter of frames written matches FFmpeg’s output’s frame count, but my output seems to drop some frames.
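
    In case it's useful, the frame counts above come from FFprobe; a command along these lines (with output.mp4 standing in for each file under test) prints the decoded frame count of the video stream:

    ffprobe -v error -count_frames -select_streams v:0 -show_entries stream=nb_read_frames -of default=nokey=1:noprint_wrappers=1 output.mp4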

    Encoder settings:

    ostream_codec_context->codec_id = CODEC_IN_USE;  //CODEC_ID_H264 or CODEC_ID_MPEG4
    ostream_codec_context->pix_fmt = AV_PIX_FMT_YUV420P;
    ostream_codec_context->codec_type = AVMEDIA_TYPE_VIDEO;
    ostream_codec_context->flags = CODEC_FLAG_GLOBAL_HEADER;
    ostream_codec_context->profile = FF_PROFILE_MPEG4_SIMPLE;

    ostream_codec_context->gop_size = istream_codec_context->gop_size;
    ostream_codec_context->time_base = istream_codec_context->time_base;

    ostream_codec_context->width  = (istream_codec_context->width  / 2) * 2;
    ostream_codec_context->height = (istream_codec_context->height / 2) * 2;    
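
    One thing I'm not sure about in these settings: FF_PROFILE_MPEG4_SIMPLE is an MPEG-4 profile constant, so it presumably doesn't apply when CODEC_IN_USE is CODEC_ID_H264. Something per-codec like this sketch might be closer to correct (FF_PROFILE_H264_BASELINE is just one possible choice):

    if (ostream_codec_context->codec_id == CODEC_ID_H264)
       ostream_codec_context->profile = FF_PROFILE_H264_BASELINE;  // an H.264 profile constant
    else
       ostream_codec_context->profile = FF_PROFILE_MPEG4_SIMPLE;   // MPEG-4 Simple profile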

    Transcoding loop:

    I’ve omitted some error-checking code and debugging statements.

    avformat_write_header(oformat_context, NULL);
    while (av_read_frame(iformat_context, &packet) == 0)
    {
      if (packet.stream_index == istream_index)
      {
         avcodec_decode_video2(istream_codec_context, ipicture, &full_frame, &packet);
         if (full_frame)
         {
            // Convert the decoded frame to the output size and pixel format.
            sws_scale(image_conversion_context,
                      (uint8_t const * const *) ipicture->data,
                      ipicture->linesize, 0, istream_codec_context->height,
                      opicture->data, opicture->linesize);

            opicture->pts = av_rescale_q(packet.pts, istream_codec_context->time_base,
                                         ostream->time_base);

            ret = avcodec_encode_video2(ostream_codec_context, &packet,
                                        opicture, &got_packet);
            if (ret == 0 && got_packet)  // only write when the encoder emitted a packet
            {
               ret = av_write_frame(oformat_context, &packet);
               if (ret >= 0)             // av_write_frame returns < 0 on error
                  num_frames_written++;
            }
         }
      }
      av_free_packet(&packet);
      av_init_packet(&packet);
    }
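
    Could the missing frames simply be delayed frames that I never flush? H.264 encoders buffer several frames, so once av_read_frame() runs out of input, any packets still inside the encoder are lost. A sketch of the flush pass I believe is needed after the loop (keep encoding with a NULL frame until got_packet comes back 0):

    got_packet = 1;
    while (got_packet)
    {
       av_init_packet(&packet);
       packet.data = NULL;   // let the encoder allocate the output buffer
       packet.size = 0;
       ret = avcodec_encode_video2(ostream_codec_context, &packet, NULL, &got_packet);
       if (ret == 0 && got_packet)
       {
          if (av_write_frame(oformat_context, &packet) >= 0)
             num_frames_written++;   // delayed frames drained from the encoder
          av_free_packet(&packet);
       }
    }
    av_write_trailer(oformat_context);

    That would also fit the MPEG4 observation, since the MPEG-4 Simple profile encodes without frame delay.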

    I’m also having issues with my output’s bit-rate. I can try setting it with the rest of my encoder settings, but the bit-rate that FFprobe shows is not the same as what I give the codec context. I tried setting the bit-rate to constant values just to see how it affected my output, and although my output’s bit-rate isn’t the same as what I give it, I found that my input definitely seems to influence the output’s actual bit-rate. I found a post that seems to deal with my issue, but the solution listed there does not fix it for me.

    http://ffmpeg.org/pipermail/libav-user/2012-July/002492.html

    Another thing worth mentioning is that my various time bases don’t seem to match up with those of FFmpeg’s output. Notably, my output’s TBC seems to be twice the inverse of my output codec context’s time base. I’m not sure whether this is an issue with the Gif file format, but my output codec context’s time base always seems to be set to 1/100.
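
    If I understand the API correctly, packet timestamps are expressed in the stream’s time base rather than the codec context’s, so the rescale in my loop may be converting from the wrong units. A sketch of the alternative (istream being the input AVStream, which I haven’t shown here):

    opicture->pts = av_rescale_q(packet.pts,
                                 istream->time_base,   // stream time base, not istream_codec_context->time_base
                                 ostream->time_base);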

    Bit-rate calculation and setting:

    int calculated_br = istream_codec_context->time_base.den *
                        avpicture_get_size(AV_PIX_FMT_YUV420P, ostream_codec_context->width,
                                           ostream_codec_context->height);
    // i.e. frames per second (time_base.den) times the raw YUV420p frame size in bytes

    ostream_codec_context->bit_rate = calculated_br;
    ostream_codec_context->rc_min_rate = calculated_br;
    ostream_codec_context->rc_max_rate = calculated_br;
    ostream_codec_context->rc_buffer_size = calculated_br;
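
    For what it’s worth, I’ve read that with libx264 the rate control is usually steered through the encoder’s private options rather than hard rc_* caps; a sketch, assuming libx264 is the encoder behind CODEC_ID_H264 and with #include <libavutil/opt.h> added:

    av_opt_set(ostream_codec_context->priv_data, "crf", "23", 0);        // constant-quality mode instead of a fixed bit-rate
    av_opt_set(ostream_codec_context->priv_data, "preset", "medium", 0); // x264 speed/quality trade-off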

    I’ve got a hunch that all these issues could be related to me not setting my PTS/DTS correctly, even though my output’s pts/dts values match those of FFmpeg’s output.

    I would appreciate some help,
    Thank you!

  • Modifying motion vectors in ffmpeg H.264 decoder

    14 February 2012, by qontranami

    For research purposes, I am trying to modify H.264 motion vectors (MVs) for each P- and B-frame prior to motion compensation during the decoding process. I am using FFmpeg for this purpose. An example of a modification is replacing each MV with the mean of its original spatial neighbors, and then using the resultant MVs for motion compensation rather than the original ones. Please direct me appropriately.

    So far, I have been able to do a simple modification of MVs in the file libavcodec/h264_cavlc.c. In the function ff_h264_decode_mb_cavlc(), modifying the mx and my variables (for instance, increasing their values) changes the MVs used during decoding.

    For example, as shown below, the mx and my values are increased by 50, thus lengthening the MVs used in the decoder.

    mx += get_se_golomb(&s->gb)+50;
    my += get_se_golomb(&s->gb)+50;

    However, in this regard, I don't know how to access the neighbors of mx and my for the spatial mean analysis that I mentioned in the first paragraph. I believe that the key to doing so lies in manipulating the array mv_cache.
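
    What I imagine this would look like is something along these lines (a rough, untested sketch: h is the H264Context, and scan8[] is the cache layout in which the left neighbor sits at offset -1 and the top neighbor at -8):

    /* Hypothetical: average the left and top neighbor MVs for list 0. */
    int16_t *mv_left = h->mv_cache[0][scan8[0] - 1];   // left neighbor
    int16_t *mv_top  = h->mv_cache[0][scan8[0] - 8];   // top neighbor
    mx = (mv_left[0] + mv_top[0]) / 2;                 // spatial mean, x component
    my = (mv_left[1] + mv_top[1]) / 2;                 // spatial mean, y component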

    Another experiment that I performed was in the file libavcodec/error_resilience.c. Based on the guess_mv() function, I created a new function, mean_mv(), that is executed in ff_er_frame_end() within the first if-statement. That first if-statement exits ff_er_frame_end() if one of its conditions holds, one of which is a zero error-count (s->error_count == 0). I decided to insert my mean_mv() function at this point so that it is always executed when there is a zero error-count. This experiment somewhat yielded the results I wanted: I could start seeing artifacts in the top portions of the video, but they were restricted to the upper-right corner. I'm guessing that my inserted function is not being run to completion, perhaps to meet playback deadlines.

    Below is the modified if-statement. The only addition is my function, mean_mv(s).

    if(!s->error_recognition || s->error_count==0 || s->avctx->lowres ||
          s->avctx->hwaccel ||
          s->avctx->codec->capabilities&CODEC_CAP_HWACCEL_VDPAU ||
          s->picture_structure != PICT_FRAME || // we dont support ER of field pictures yet, though it should not crash if enabled
          s->error_count==3*s->mb_width*(s->avctx->skip_top + s->avctx->skip_bottom)) {
           //av_log(s->avctx, AV_LOG_DEBUG, "ff_er_frame_end in er.c\n"); //KG
           if(s->pict_type==AV_PICTURE_TYPE_P)
               mean_mv(s);
           return;
    }

    And here's the mean_mv() function I created based on guess_mv().

    static void mean_mv(MpegEncContext *s){
       //uint8_t fixed[s->mb_stride * s->mb_height];
       //const int mb_stride = s->mb_stride;
       const int mb_width = s->mb_width;
       const int mb_height= s->mb_height;
       int mb_x, mb_y, mot_step, mot_stride;

       //av_log(s->avctx, AV_LOG_DEBUG, "mean_mv\n"); //KG

       set_mv_strides(s, &mot_step, &mot_stride);

       for(mb_y=0; mb_y<mb_height; mb_y++){
           for(mb_x=0; mb_x<mb_width; mb_x++){
               const int mb_xy= mb_x + mb_y*s->mb_stride;
               const int mot_index= (mb_x + mb_y*mot_stride) * mot_step;
               int mv_predictor[4][2]={{0}};
               int ref[4]={0};
               int pred_count=0;
               int m, n;

               if(IS_INTRA(s->current_picture.f.mb_type[mb_xy])) continue;
               //if(!(s->error_status_table[mb_xy]&MV_ERROR)){
               //if (1){
               if(mb_x>0){
                   mv_predictor[pred_count][0]= s->current_picture.f.motion_val[0][mot_index - mot_step][0];
                   mv_predictor[pred_count][1]= s->current_picture.f.motion_val[0][mot_index - mot_step][1];
                   ref         [pred_count]   = s->current_picture.f.ref_index[0][4*(mb_xy-1)];
                   pred_count++;
               }

               if(mb_x+1<mb_width){
                   mv_predictor[pred_count][0]= s->current_picture.f.motion_val[0][mot_index + mot_step][0];
                   mv_predictor[pred_count][1]= s->current_picture.f.motion_val[0][mot_index + mot_step][1];
                   ref         [pred_count]   = s->current_picture.f.ref_index[0][4*(mb_xy+1)];
                   pred_count++;
               }

               if(mb_y>0){
                   mv_predictor[pred_count][0]= s->current_picture.f.motion_val[0][mot_index - mot_stride*mot_step][0];
                   mv_predictor[pred_count][1]= s->current_picture.f.motion_val[0][mot_index - mot_stride*mot_step][1];
                   ref         [pred_count]   = s->current_picture.f.ref_index[0][4*(mb_xy-s->mb_stride)];
                   pred_count++;
               }

               if(mb_y+1<mb_height){
                   mv_predictor[pred_count][0]= s->current_picture.f.motion_val[0][mot_index + mot_stride*mot_step][0];
                   mv_predictor[pred_count][1]= s->current_picture.f.motion_val[0][mot_index + mot_stride*mot_step][1];
                   ref         [pred_count]   = s->current_picture.f.ref_index[0][4*(mb_xy+s->mb_stride)];
                   pred_count++;
               }

               if(pred_count==0) continue;

               if(pred_count>=1){
                   int sum_x=0, sum_y=0, sum_r=0;
                   int k;

                   for(k=0; k<pred_count; k++){
                       sum_x+= mv_predictor[k][0]; // Sum all the MVx from MVs avail. for EC
                       sum_y+= mv_predictor[k][1]; // Sum all the MVy from MVs avail. for EC
                       sum_r+= ref[k];
                       // if(k && ref[k] != ref[k-1])
                       // goto skip_mean_and_median;
                   }

                   mv_predictor[pred_count][0] = sum_x/pred_count;
                   mv_predictor[pred_count][1] = sum_y/pred_count;
                   ref         [pred_count]    = sum_r/pred_count;
               }

               s->mv[0][0][0] = mv_predictor[pred_count][0];
               s->mv[0][0][1] = mv_predictor[pred_count][1];

               for(m=0; m<mot_step; m++){
                   for(n=0; n<mot_step; n++){
                       s->current_picture.f.motion_val[0][mot_index + m + n * mot_stride][0] = s->mv[0][0][0];
                       s->current_picture.f.motion_val[0][mot_index + m + n * mot_stride][1] = s->mv[0][0][1];
                   }
               }

               decode_mb(s, ref[pred_count]);

               //}
           }
       }
    }

    I would really appreciate some assistance on how to go about this properly.

  • Banking Data Strategies – A Primer to Zero-party, First-party, Second-party and Third-party data

    25 October 2024, by Daniel Crough (Banking and Financial Services, Privacy)

    Banks hold some of our most sensitive information. Every transaction, loan application, and account balance tells a story about their customers’ lives. Under GDPR and banking regulations, protecting this information isn’t optional – it’s essential.

    Yet banks also need to understand how customers use their services to serve them better. The solution lies in understanding different types of banking data and how to handle each responsibly. From direct customer interactions to market research, each data source serves a specific purpose and requires its own privacy controls.

    Before diving into how banks can use each type of data effectively, let’s look at the key differences between them:

    Data Type     | What It Is                                                 | Banking Example                              | Legal Considerations
    First-party   | Data from direct customer interactions with your services  | Transaction records, service usage patterns  | Different legal bases apply (contract, legal obligation, legitimate interests)
    Zero-party    | Information customers actively provide                     | Stated preferences, financial goals          | Requires a specific legal basis despite being voluntary; may involve profiling
    Second-party  | Data shared through formal partnerships                    | Insurance history from partners              | Must comply with PSD2 and specific data-sharing regulations
    Third-party   | Data from external providers                               | Market analysis, demographic data            | Requires due diligence on sources and specific transparency measures

    What is first-party data?

    First-party data reveals how customers actually use your banking services. When someone logs into online banking, withdraws money from an ATM, or speaks with customer service, they create valuable information about real banking habits.

    This direct interaction data proves more reliable than assumptions or market research because it shows genuine customer behaviour. Banks need specific legal grounds to process this information. Basic banking services fall under contractual necessity, while fraud detection is required by law. Marketing activities need explicit customer consent. The key is being transparent with customers about what information you process and why.

    Start by collecting only what you need for each specific purpose. Store information securely and give customers clear control through privacy settings. This approach builds trust while helping meet privacy requirements under the GDPR’s data minimisation principle.

    What is zero-party data?

    Zero-party data emerges when customers actively share information about their financial goals and preferences. Unlike first-party data, which comes from observing customer behaviour, zero-party data comes through direct communication. Customers might share their retirement plans, communication preferences, or feedback about services.

    Interactive tools create natural opportunities for this exchange. A retirement calculator helps customers plan their future while revealing their financial goals. Budget planners offer immediate value through personalised advice. When customers see clear benefits, they’re more likely to share their preferences.

    However, voluntary sharing doesn’t mean unrestricted use. The ICO’s guidance on purpose limitation applies even to freely shared information. Tell customers exactly how you’ll use their data, document specific reasons for collecting each piece of information, and make it simple to update or remove personal data.

    Regular reviews help ensure you still need the information customers have shared. This aligns with both GDPR requirements and customer expectations about data management. By treating voluntary information with the same care as other customer data, banks build lasting trust.

    What is second-party data?

    Second-party data comes from formal partnerships between banks and trusted companies. For example, a bank might work with an insurance provider to better understand shared customers’ financial needs.

    These partnerships need careful planning to protect customer privacy. The ICO’s Data Sharing Code provides clear guidelines: both organisations must agree on what data they’ll share, how they’ll protect it, and how long they’ll keep it before any sharing begins.

    Transparency builds trust in these arrangements. Tell customers about planned data sharing before it happens. Explain what information you’ll share and how it helps provide better services.

    Regular audits help ensure both partners maintain high privacy standards. Review shared data regularly to confirm it’s still necessary and properly protected. Be ready to adjust or end partnerships if privacy standards slip. Remember that your responsibility to protect customer data extends to information shared with partners.

    Successful partnerships balance improved service with diligent privacy protection. When done right, they help banks understand customer needs better while maintaining the trust that makes banking relationships work.

    What is third-party data?

    Third-party data comes from external sources outside your bank and its partners. Market research firms, data analytics companies, and economic research organizations gather and sell this information to help banks understand broader market trends.

    This data helps fill knowledge gaps about the wider financial landscape. For example, third-party data might reveal shifts in consumer spending patterns across different age groups or regions. It can show how customers interact with different financial services or highlight emerging banking preferences in specific demographics.

    But third-party data needs careful evaluation before use. Since your bank didn’t collect this information directly, you must verify both its quality and compliance with privacy laws. Start by checking how providers collected their data and whether they had proper consent. Look for providers who clearly document their data sources and collection methods.

    Quality varies significantly among third-party data providers. Some key questions to consider before purchasing:

    • How recent is the data?
    • How was it collected?
    • What privacy protections are in place?
    • How often is it updated?
    • Which specific market segments does it cover?

    Consider whether third-party data will truly add value beyond your existing information. Many banks find they can gain similar insights by analysing their first-party data more effectively. If you do use third-party data, document your reasons for using it and be transparent about your data sources.

    Creating your banking data strategy

    A clear data strategy helps your bank collect and use information effectively while protecting customer privacy. This matters most with first-party data – the information that comes directly from your customers’ banking activities.

    Start by understanding what data you already have. Many banks collect valuable information through everyday transactions, website visits, and customer service interactions. Review these existing data sources before adding new ones. Often, you already have the insights you need – they just need better organization.

    Map each type of data to a specific purpose. For example, transaction data might help detect fraud and improve service recommendations. Website analytics could reveal which banking features customers use most. Each data point should serve a clear business purpose while respecting customer privacy.

    Strong data quality standards support better decisions. Create processes to update customer information regularly and remove outdated records. Check data accuracy often and maintain consistent formats across your systems. These practices help ensure your insights reflect reality.

    Remember that strategy means choosing what not to do. You don’t need to collect every piece of data possible. Focus on information that helps you serve customers better while maintaining their privacy.

    Managing multiple data sources

    Banks work with many types of data – from direct customer interactions to market research. Each source serves a specific purpose, but combining them effectively requires careful planning and precise attention to regulations like GDPR and ePrivacy.

    First-party data forms your foundation. It shows how your customers actually use your services and what they need from their bank. This direct interaction data proves most valuable because it reflects real behaviour rather than assumptions. When customers check their balances, transfer money, or apply for loans, they show you exactly how they use banking services.

    Zero-party data adds context to these interactions. When customers share their financial goals or preferences directly, they help you understand the “why” behind their actions. This insight helps shape better services. For example, knowing a customer plans to buy a house helps you offer relevant savings tools or mortgage information at the right time.

    Second-party partnerships can fill specific knowledge gaps. Working with trusted partners might reveal how customers manage their broader financial lives. But only pursue partnerships when they offer clear value to customers. Always explain these relationships clearly and protect shared information carefully.

    Third-party data helps provide market context, but use it selectively. External market research can highlight broader trends or opportunities. However, this data often proves less reliable than information from direct customer interactions. Consider it a supplement to, not a replacement for, your own customer insights.

    Keep these principles in mind when combining data sources:

    • Prioritize direct customer interactions
    • Focus on information that improves services
    • Maintain consistent privacy standards across sources
    • Document where each insight comes from
    • Review regularly whether each source adds value
    • Work with privacy and data experts to ensure customer information is handled properly

    Enhance your web analytics strategy with Matomo

    [Screenshot: Users flow report in Matomo Analytics]

    The financial sector finds powerful and compliant web analytics increasingly valuable as it navigates data management and privacy regulations. Matomo provides a configurable privacy-centric solution that meets the requirements of banks and financial institutions.

    Matomo empowers your organisation to:

    • Collect accurate, GDPR-compliant web data
    • Integrate web analytics with your existing tools and platforms
    • Maintain full control over your analytics data
    • Gain insights without compromising user privacy

    Matomo is trusted by some of the world’s biggest banks and financial institutions. Try Matomo for free for 30 days to see how privacy-focused analytics can get you the insights you need while maintaining compliance and user trust.