MediaCodec: the Decoder

1. Introduction

The MediaCodec class provides access to Android's low-level media codecs, that is, the encoder/decoder components. It is part of Android's low-level multimedia support infrastructure and is usually used together with MediaExtractor, MediaSync, MediaMuxer, MediaCrypto, MediaDrm, Image, Surface, and AudioTrack. As a relatively young hardware codec framework for Android, MediaCodec makes hardware decoding on devices much more convenient. The CTS part of the Android source tree also contains many demos covering media encoding and decoding.

2. Decoding

MediaCodec decoding on Android is handled separately for video and audio.
First query how many codecs the device exposes, then use each codec's MediaCodecInfo handle to find out which formats it supports:
MediaCodecList.getCodecCount()
MediaCodecList.getCodecInfoAt(i);
For example, the following test code shows what the device's MediaCodec can decode:

        int n = MediaCodecList.getCodecCount();
        for (int i = 0; i < n; ++i) {
            MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
            String[] supportedTypes = info.getSupportedTypes();
            boolean mime_support = false;
            if (info.isEncoder()) {
                continue;   // skip encoders; this enumeration only cares about decoders
            }
            for (int j = 0; j < supportedTypes.length; ++j) {
                Log.d(TAG, "codec info:" + info.getName()+" supportedTypes:" + supportedTypes[j]);
                if (supportedTypes[j].equalsIgnoreCase(mime)) {
                    mime_support = true;
                }
            }
        }
Sample output on one device (decoder name and supported MIME type):
OMX.amlogic.hevc.decoder.awesome    video/hevc
OMX.amlogic.avc.decoder.awesome    video/avc
OMX.amlogic.mpeg4.decoder.awesome    video/mp4v-es
OMX.amlogic.h263.decoder.awesome    video/3gpp
OMX.amlogic.mpeg2.decoder.awesome    video/mpeg2
OMX.amlogic.vc1.decoder.awesome    video/vc1
OMX.amlogic.vc1.decoder.awesome    video/wvc1
OMX.amlogic.wmv3.decoder.awesome    video/wmv3
OMX.amlogic.mjpeg.decoder.awesome    video/mjpeg
OMX.google.amrnb.decoder    audio/3gpp
OMX.google.amrwb.decoder    audio/amr-wb
OMX.google.aac.decoder    audio/mp4a-latm
OMX.google.adif.decoder    audio/aac-adif
OMX.google.latm.decoder    audio/aac-latm
OMX.google.adts.decoder    audio/adts
OMX.google.g711.alaw.decoder    audio/g711-alaw
OMX.google.g711.mlaw.decoder    audio/g711-mlaw
OMX.google.adpcm.ima.decoder    audio/adpcm-ima
OMX.google.adpcm.ms.decoder    audio/adpcm-ms
OMX.google.vorbis.decoder    audio/vorbis
OMX.google.alac.decoder    audio/alac
OMX.google.wma.decoder    audio/wma
OMX.google.wmapro.decoder    audio/wmapro
OMX.google.ape.decoder    audio/ape
OMX.google.truehd.decoder    audio/truehd
OMX.google.ffmpeg.decoder    audio/ffmpeg
OMX.google.raw.decoder    audio/raw
OMX.google.mpeg4.decoder    video/mp4v-es
OMX.google.h263.decoder    video/3gpp
OMX.google.h264.decoder    video/avc
OMX.google.vp8.decoder    video/x-vnd.on2.vp8
OMX.google.vp9.decoder    video/x-vnd.on2.vp9
OMX.google.vp6.decoder    video/x-vnd.on2.vp6
OMX.google.vp6a.decoder    video/x-vnd.on2.vp6a
OMX.google.vp6f.decoder    video/x-vnd.on2.vp6f
OMX.google.rm10.decoder    video/rm10
OMX.google.rm20.decoder    video/rm20
OMX.google.rm40.decoder    video/rm40
OMX.google.wmv2.decoder    video/wmv2
OMX.google.wmv1.decoder    video/wmv1
AML.google.ac3.decoder    audio/ac3
AML.google.ec3.decoder    audio/eac3
OMX.google.mp2.decoder    audio/mpeg-L2
OMX.google.mp3.decoder    audio/mpeg
AML.google.dtshd.decoder    audio/dtshd
OMX.google.raw.decoder    audio/raw
OMX.google.vp6.decoder    video/x-vnd.on2.vp6
OMX.google.vp6a.decoder    video/x-vnd.on2.vp6a
OMX.google.vp6f.decoder    video/x-vnd.on2.vp6f
OMX.google.h265.decoder    video/hevc
OMX.google.wmv2.decoder    video/wmv2
OMX.google.wmv2.decoder    video/wmv1
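
Note that MediaCodecList.getCodecCount() and getCodecInfoAt() are deprecated since API 21. A minimal sketch of the same enumeration with the newer MediaCodecList constructor (the class and method names here are mine, not from the original code):

import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;

public final class CodecLister {
    private static final String TAG = "CodecLister";

    // Logs every decoder on the device and the MIME types it supports (API 21+).
    public static void dumpDecoders() {
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (info.isEncoder()) {
                continue;   // decoders only
            }
            for (String type : info.getSupportedTypes()) {
                Log.d(TAG, "codec info:" + info.getName() + " supportedTypes:" + type);
            }
        }
    }
}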

Android provides MediaExtractor to demux the audio and video tracks of a local file or a network stream.
First, define a base class shared by the audio and video decoders:

public class MediaDecoder extends Thread{
    
    protected String mVideoFilePath = null; 
    
    public static final long TIME_US = 10000;
    
    protected MediaExtractor mExtractor = null;
    protected MediaCodec mDecoder = null;
    protected MediaFormat mediaFormat;
    protected UpstreamCallback mCallback;
    protected Surface mSurface;
    
    public MediaDecoder(String videoFilePath, Surface surface,UpstreamCallback callback) {
        this.mVideoFilePath = videoFilePath;
        this.mSurface = surface;
        this.mCallback = callback;
    }
    
    @Override
    public void run() {
        // open the media file here; subclasses then select a track and decode in their own run()
        super.run();
        prepare();
    }
    
    public void prepare(){
        try {
            File videoFile = new File(mVideoFilePath);
            mExtractor = new MediaExtractor();
            mExtractor.setDataSource(videoFile.toString());
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    
}
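
The UpstreamCallback interface is not shown in the original article. Judging from how it is invoked later (mCallback.UpstreamCallback(chunk, info.size)), a minimal definition could look like this (a sketch, not the author's actual interface):

// Hypothetical callback definition, inferred from its usage in the decoders below.
public interface UpstreamCallback {
    // Receives one decoded chunk (for example a PCM buffer) and its size in bytes.
    void UpstreamCallback(byte[] data, int size);
}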

① Video decoding:

First, create the video decoder:

            for (int i = 0; i < mExtractor.getTrackCount(); i++) {
                MediaFormat format = mExtractor.getTrackFormat(i);
                String mime = format.getString(MediaFormat.KEY_MIME);
                if (mime.startsWith("video/")) {
                    mExtractor.selectTrack(i);
                    mDecoder = MediaCodec.createDecoderByType(mime);
                    if(mCallback != null){
                        mDecoder.configure(format, null, null, 0);    // no Surface: decoded buffers come back to the app
                    }else{
                        mDecoder.configure(format, mSurface, null, 0);    // render decoded frames directly to the Surface
                    }
                    break;
                }
            }

            if (mDecoder == null) {
                Log.e(TAG, "Can't find video info!");
                return;
            }
            mDecoder.start(); 
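
If a specific component is needed (for example one of the hardware decoders in the capability dump above) rather than whatever createDecoderByType() picks, MediaCodec.createByCodecName() can be used instead. A small sketch (the component name is just an example taken from the dump; availability differs per device):

            // Pick an explicit decoder component instead of letting the framework choose.
            MediaCodec decoder = MediaCodec.createByCodecName("OMX.google.h264.decoder");
            decoder.configure(format, mSurface, null, 0);
            decoder.start();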

When one of MediaCodec's input buffers becomes free (mDecoder.dequeueInputBuffer), fill it with a demuxed video sample and queue it (mDecoder.queueInputBuffer) so the decoder can start decoding:

                    int inIndex = mDecoder.dequeueInputBuffer(TIME_US);
                    if (inIndex >= 0) {
                        ByteBuffer buffer = inputBuffers[inIndex];
                        int sampleSize = mExtractor.readSampleData(buffer, 0);
                        if (sampleSize < 0) {
                            // We shouldn't stop the playback at this point, just pass the EOS
                            // flag to mDecoder, we will get it again from the
                            // dequeueOutputBuffer
                            Log.d(TAG, "InputBuffer BUFFER_FLAG_END_OF_STREAM");
                            mDecoder.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                            isEOS = true;
                        } else {
                            mDecoder.queueInputBuffer(inIndex, 0, sampleSize, mExtractor.getSampleTime(), 0);
                            mExtractor.advance();
                        }
                    }
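
On API 21 and later, getInputBuffers()/getOutputBuffers() are deprecated in favor of per-index accessors. The same input step could be written like this (a sketch reusing the variables from the snippet above):

                    int inIndex = mDecoder.dequeueInputBuffer(TIME_US);
                    if (inIndex >= 0) {
                        // getInputBuffer() replaces the cached inputBuffers[] array (API 21+)
                        ByteBuffer buffer = mDecoder.getInputBuffer(inIndex);
                        int sampleSize = mExtractor.readSampleData(buffer, 0);
                        if (sampleSize < 0) {
                            mDecoder.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                            isEOS = true;
                        } else {
                            mDecoder.queueInputBuffer(inIndex, 0, sampleSize, mExtractor.getSampleTime(), 0);
                            mExtractor.advance();
                        }
                    }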

Decoded output is retrieved with mDecoder.dequeueOutputBuffer; the decoded data can then be processed as needed, for example re-encoded or muxed into a file:

                BufferInfo info = new BufferInfo();
                int outIndex = mDecoder.dequeueOutputBuffer(info, TIME_US);
                switch (outIndex) {
                case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                    Log.d(TAG, "INFO_OUTPUT_BUFFERS_CHANGED");
                    outputBuffers = mDecoder.getOutputBuffers();
                    break;
                case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                    Log.d(TAG, "New format " + mDecoder.getOutputFormat());
                    break;
                case MediaCodec.INFO_TRY_AGAIN_LATER:
                    Log.d(TAG, "dequeueOutputBuffer timed out!");
                    break;
                default:
                    // outIndex >= 0: a decoded output buffer is ready
                    Log.d(TAG, "outIndex:" + outIndex);
                    ByteBuffer buffer = outputBuffers[outIndex];
                    Log.d(TAG, "ByteBuffer limit:" + buffer.limit() + " info size:" + info.size);
                    final byte[] chunk = new byte[info.size];
                    buffer.get(chunk);
                    if(mCallback != null){
                        //mCallback.UpstreamCallback(chunk, info.size);
                    }
                    // clear the buffer, otherwise the next get() would return the previous frame's data
                    buffer.clear();
                    if(DEBUG_VIDEO)Log.v(TAG, "We can't use this buffer but render it due to the API limit, " + buffer);
                    // We use a very simple clock to keep the video FPS, or the video
                    // playback will be too fast
                    while (info.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
                        try {
                            sleep(10);
                        } catch (InterruptedException e) {
                            e.printStackTrace();
                            break;
                        }
                    }
                    mDecoder.releaseOutputBuffer(outIndex, true);
                    break;
                }

The complete video decoding code:

public class VideoDecoder extends MediaDecoder{
    private static final String TAG = "VideoDecode";  
    private static final boolean DEBUG_VIDEO = false;
    
    public VideoDecoder(String videoFilePath,Surface surface,UpstreamCallback callback){
        super(videoFilePath, surface, callback);
    }
    
    @Override
    public void run() {
        super.run();
        VideoDecodePrepare();
    }
    
    public void VideoDecodePrepare() {  
        try {              
            for (int i = 0; i < mExtractor.getTrackCount(); i++) {
                MediaFormat format = mExtractor.getTrackFormat(i);
                String mime = format.getString(MediaFormat.KEY_MIME);
                if (mime.startsWith("video/")) {
                    mExtractor.selectTrack(i);
                    mDecoder = MediaCodec.createDecoderByType(mime);
                    if(mCallback != null){
                        mDecoder.configure(format, null, null, 0);  // no Surface: decoded buffers come back to the app
                    }else{
                        mDecoder.configure(format, mSurface, null, 0);  // render decoded frames directly to the Surface
                    }
                    break;
                }
            }
            
            if (mDecoder == null) {
                Log.e(TAG, "Can't find video info!");
                return;
            }
            mDecoder.start();  
            ByteBuffer[] inputBuffers = mDecoder.getInputBuffers();
            ByteBuffer[] outputBuffers = mDecoder.getOutputBuffers();
            boolean isEOS = false;
            long startMs = System.currentTimeMillis();
            while (!Thread.interrupted()) {
                if (!isEOS) {
                    int inIndex = mDecoder.dequeueInputBuffer(TIME_US);
                    if (inIndex >= 0) {
                        ByteBuffer buffer = inputBuffers[inIndex];
                        int sampleSize = mExtractor.readSampleData(buffer, 0);
                        if (sampleSize < 0) {
                            // We shouldn't stop the playback at this point, just pass the EOS
                            // flag to mDecoder, we will get it again from the
                            // dequeueOutputBuffer
                            Log.d(TAG, "InputBuffer BUFFER_FLAG_END_OF_STREAM");
                            mDecoder.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                            isEOS = true;
                        } else {
                            mDecoder.queueInputBuffer(inIndex, 0, sampleSize, mExtractor.getSampleTime(), 0);
                            mExtractor.advance();
                        }
                    }
                }
                BufferInfo info = new BufferInfo();
                int outIndex = mDecoder.dequeueOutputBuffer(info, TIME_US);
                switch (outIndex) {
                case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                    Log.d(TAG, "INFO_OUTPUT_BUFFERS_CHANGED");
                    outputBuffers = mDecoder.getOutputBuffers();
                    break;
                case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                    Log.d(TAG, "New format " + mDecoder.getOutputFormat());
                    break;
                case MediaCodec.INFO_TRY_AGAIN_LATER:
                    Log.d(TAG, "dequeueOutputBuffer timed out!");
                    break;
                default:
                    // outIndex >= 0: a decoded output buffer is ready
                    Log.d(TAG, "outIndex:" + outIndex);
                    ByteBuffer buffer = outputBuffers[outIndex];
                    Log.d(TAG, "ByteBuffer limit:" + buffer.limit() + " info size:" + info.size);
                    final byte[] chunk = new byte[info.size];
                    buffer.get(chunk);
                    if(mCallback != null){
                        //mCallback.UpstreamCallback(chunk, info.size);
                    }
                    // clear the buffer, otherwise the next get() would return the previous frame's data
                    buffer.clear();
                    if(DEBUG_VIDEO)Log.v(TAG, "We can't use this buffer but render it due to the API limit, " + buffer);
                    // We use a very simple clock to keep the video FPS, or the video
                    // playback will be too fast
                    while (info.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
                        try {
                            sleep(10);
                        } catch (InterruptedException e) {
                            e.printStackTrace();
                            break;
                        }
                    }
                    mDecoder.releaseOutputBuffer(outIndex, true);
                    break;
                }
                // All decoded frames have been rendered, we can stop playing now
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    Log.d(TAG, "OutputBuffer BUFFER_FLAG_END_OF_STREAM");
                    break;
                }
            }
            mDecoder.stop();
            mDecoder.release();
            mExtractor.release();
        } catch (Exception ioe) {  
           Log.d(TAG,"failed init decoder", ioe);  
        }  
    }
}

Note that when video is decoded to a Surface, the output buffers are consumed by rendering. If you want to transcode while also displaying the video, it is better to draw the frames with OpenGL ES (code for that will be provided in a later post) so that the decoded data stays accessible.
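
As a rough sketch of that idea (an assumption about the approach, not the author's code, and it assumes an EGL context is already current on the calling thread), the decoder's Surface can be backed by a SurfaceTexture bound to a GL_TEXTURE_EXTERNAL_OES texture, so each frame can be drawn, read back, or fed to an encoder:

import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.view.Surface;

// Hypothetical helper: creates a Surface backed by an OES texture so decoded
// frames stay available to OpenGL ES instead of being consumed by the display.
public final class DecoderSurfaceFactory {
    public static Surface createOesSurface(SurfaceTexture.OnFrameAvailableListener listener) {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]);
        // The listener fires after releaseOutputBuffer(index, true) delivers a frame;
        // call surfaceTexture.updateTexImage() on the GL thread before drawing it.
        surfaceTexture.setOnFrameAvailableListener(listener);
        return new Surface(surfaceTexture);
    }
}

The Surface returned here would be passed to mDecoder.configure(format, surface, null, 0) in place of the display Surface.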

② Audio decoding:

The approach is essentially the same as for video, except that MediaExtractor selects the audio track and the decoder is configured with that track's MediaFormat.
The audio decoder is created as follows:

          for (int i = 0; i < mExtractor.getTrackCount(); i++) {
                MediaFormat format = mExtractor.getTrackFormat(i);
                String mime = format.getString(MediaFormat.KEY_MIME);
                if (mime.startsWith("audio/")) {
                    mExtractor.selectTrack(i);
                    mSampleRate = format.getInteger(MediaFormat.KEY_SAMPLE_RATE);
                    channel = format.getInteger(MediaFormat.KEY_CHANNEL_COUNT);
                    mDecoder = MediaCodec.createDecoderByType(mime);
                    mDecoder.configure(format, null, null, 0);
                    break;
                }
            }

            if (mDecoder == null) {
                Log.e(TAG, "Can't find audio info!");
                return;
            }
            mDecoder.start();

Here the decoded PCM is played directly through Android's AudioTrack. The complete code:

public class AudioDecoder extends MediaDecoder {
    private static final String TAG = "AudioDecode";
    private static final boolean DEBUG_AUDIO = false;

    private int mSampleRate = 0;
    private int channel = 0;

    public AudioDecoder(String videoFilePath, Surface surface,UpstreamCallback callback) {
        super(videoFilePath, surface, callback);
    }

    @Override
    public void run() {
        super.run();
        AudioDecodePrepare();
    }

    public void AudioDecodePrepare() {
        try {
            for (int i = 0; i < mExtractor.getTrackCount(); i++) {
                MediaFormat format = mExtractor.getTrackFormat(i);
                String mime = format.getString(MediaFormat.KEY_MIME);
                if (mime.startsWith("audio/")) {
                    mExtractor.selectTrack(i);
                    mSampleRate = format.getInteger(MediaFormat.KEY_SAMPLE_RATE);
                    channel = format.getInteger(MediaFormat.KEY_CHANNEL_COUNT);
                    mDecoder = MediaCodec.createDecoderByType(mime);
                    mDecoder.configure(format, null, null, 0);
                    break;
                }
            }

            if (mDecoder == null) {
                Log.e(TAG, "Can't find audio info!");
                return;
            }
            mDecoder.start();
            ByteBuffer[] inputBuffers = mDecoder.getInputBuffers();
            ByteBuffer[] outputBuffers = mDecoder.getOutputBuffers();
            BufferInfo info = new BufferInfo();
            // NOTE: output is hard-coded to stereo 16-bit PCM; the channel count
            // read from the format above is not actually used here.
            int buffsize = AudioTrack.getMinBufferSize(mSampleRate,
                    AudioFormat.CHANNEL_OUT_STEREO,
                    AudioFormat.ENCODING_PCM_16BIT);
            AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
                    mSampleRate, AudioFormat.CHANNEL_OUT_STEREO,
                    AudioFormat.ENCODING_PCM_16BIT, buffsize,
                    AudioTrack.MODE_STREAM);
            audioTrack.play();

            boolean isEOS = false;
            long startMs = System.currentTimeMillis();

            while (!Thread.interrupted()) {
                if (!isEOS) {
                    int inIndex = mDecoder.dequeueInputBuffer(TIME_US);
                    if (inIndex >= 0) {
                        ByteBuffer buffer = inputBuffers[inIndex];
                        int sampleSize = mExtractor.readSampleData(buffer, 0);
                        if (sampleSize < 0) {
                            // Don't stop playback here; just pass the EOS flag to the
                            // decoder, we will see it again from dequeueOutputBuffer.
                            Log.d(TAG, "InputBuffer BUFFER_FLAG_END_OF_STREAM");
                            mDecoder.queueInputBuffer(inIndex, 0, 0, 0,
                                    MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                            isEOS = true;
                        } else {
                            mDecoder.queueInputBuffer(inIndex, 0,
                                    sampleSize, mExtractor.getSampleTime(), 0);
                            mExtractor.advance();
                        }
                    }
                }
                int outIndex = mDecoder.dequeueOutputBuffer(info, TIME_US);
                switch (outIndex) {
                case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                    Log.d(TAG, "INFO_OUTPUT_BUFFERS_CHANGED");
                    outputBuffers = mDecoder.getOutputBuffers();
                    break;
                case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                    MediaFormat format = mDecoder.getOutputFormat();
                    Log.d(TAG, "New format " + format);
                    audioTrack.setPlaybackRate(format
                            .getInteger(MediaFormat.KEY_SAMPLE_RATE));
                    break;
                case MediaCodec.INFO_TRY_AGAIN_LATER:
                    Log.d(TAG, "dequeueOutputBuffer timed out!");
                    break;
                default:
                    ByteBuffer buffer = outputBuffers[outIndex];
                    
                    if(DEBUG_AUDIO)Log.v(TAG,"We can't use this buffer but render it due to the API limit, "+ buffer);
                    final byte[] chunk = new byte[info.size];
                    buffer.get(chunk);
                    if(mCallback != null){
                        mCallback.UpstreamCallback(chunk,info.size);
                    }
                    // clear the buffer, otherwise the next get() would return the previous chunk's data
                    buffer.clear();
                    // A very simple clock to pace playback; without it the audio plays too fast
                    while (info.presentationTimeUs / 1000 > System
                            .currentTimeMillis() - startMs) {
                        try {
                            sleep(10);
                        } catch (InterruptedException e) {
                            e.printStackTrace();
                            break;
                        }
                    }
                    // Write the decoded PCM to the AudioTrack; chunk holds exactly
                    // info.size bytes, so write it starting at offset 0.
                    audioTrack.write(chunk, 0, info.size);
                    mDecoder.releaseOutputBuffer(outIndex, false);
                    break;
                }
                // All decoded frames have been rendered, we can stop playing now
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    Log.d(TAG, "OutputBuffer BUFFER_FLAG_END_OF_STREAM");
                    break;
                }
            }
            mDecoder.stop();
            mDecoder.release();
            mExtractor.release();
            audioTrack.stop();
            audioTrack.release();
        } catch (Exception ioe) {
            throw new RuntimeException("failed init decoder", ioe);
        }
    }
    
}
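
One thing worth noting in the class above: the channel count is read from the track's format, but the AudioTrack is hard-coded to stereo. A small sketch of deriving the channel mask from the format instead (mono/stereo only, as an illustration):

            // Derive the AudioTrack channel mask from the decoded format
            // instead of hard-coding CHANNEL_OUT_STEREO.
            int channelConfig = (channel == 1)
                    ? AudioFormat.CHANNEL_OUT_MONO
                    : AudioFormat.CHANNEL_OUT_STEREO;
            int buffsize = AudioTrack.getMinBufferSize(mSampleRate, channelConfig,
                    AudioFormat.ENCODING_PCM_16BIT);
            AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, mSampleRate,
                    channelConfig, AudioFormat.ENCODING_PCM_16BIT, buffsize,
                    AudioTrack.MODE_STREAM);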

With the classes above, audio and video playback can be implemented through a hardware-decoding path.
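
A hypothetical usage sketch (the file path and the SurfaceView are placeholders; in a real app the Surface comes from your layout and must be valid before decoding starts):

        // Start one thread per track; video renders to the Surface, audio goes to AudioTrack.
        Surface surface = surfaceView.getHolder().getSurface();
        new VideoDecoder("/sdcard/test.mp4", surface, null).start();
        new AudioDecoder("/sdcard/test.mp4", null, null).start();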

3. Closing remarks

That covers the basics of decoding. The next article will look at encoding with MediaCodec. Thanks for following along!

    Original author: Young_Allen
    Original article: https://www.jianshu.com/p/f07d9bb44187