android - Where is the frame decoded by ffmpeg stored? -


I am trying to decode a video and convert each frame to the RGB32 or RGB565LE format.

Then I pass the frame from C to Android through a buffer via JNI.

So far, I know how to pass a buffer from C to Android, and how to decode the video and get the decoded frame.

My question is: how do I convert the decoded frame to RGB32 (or RGB565LE), and where is it stored?

The following is my code; I'm not sure whether it is correct or not.

-jargo


img_convert_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height,
                                 pCodecCtx->pix_fmt, 100, 100,
                                 PIX_FMT_RGB32, SWS_BICUBIC,
                                 NULL, NULL, NULL);
if(!img_convert_ctx) return -6;

while(av_read_frame(pFormatCtx, &packet) >= 0) {
    // Is this a packet from the video stream?
    if(packet.stream_index == videoStream) {
        avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);

        // Did we get a video frame?
        if(frameFinished) {
            AVPicture pict;

            if(avpicture_alloc(&pict, PIX_FMT_RGB32, 100, 100) >= 0) {
                sws_scale(img_convert_ctx, (const uint8_t * const *)pFrame->data,
                          pFrame->linesize, 0, pCodecCtx->height,
                          pict.data, pict.linesize);
            }
        } // end of if(frameFinished)
    } // end of if(packet.stream_index == videoStream)

    // Free the packet that was allocated by av_read_frame
    av_free_packet(&packet);
}

The decoded frame goes into pict. (pFrame is the raw frame.)

Don't use a hard-coded 100x100 — you have to calculate the size of pict based on the size of pFrame. I guess it should be pFrame->width * pFrame->height * 32;

You also have to allocate pict yourself.

See the tutorial at http://dranger.com/ffmpeg/

