MediaCodec Analysis – create
Reference: https://source.android.google.cn/devices/media

1. Core APIs called by an APK
The minimal flow for an Android APK to play audio/video with the MediaCodec API:
MediaCodec codec = MediaCodec.createDecoderByType("video/avc");
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 320, 480);
codec.configure(format, surface, null, 0);
codec.start();
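For context, here is a fuller sketch of the same flow as an app would actually run it: a MediaExtractor feeds AVC samples into the decoder and decoded frames are released to a Surface. The file path, the Surface, and the AvcPlaybackSketch class are assumptions for illustration; error handling and format-change events are omitted.

import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;
import java.nio.ByteBuffer;

public class AvcPlaybackSketch {
    public void play(String path, Surface surface) throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);

        // Find the first AVC video track.
        MediaFormat format = null;
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat f = extractor.getTrackFormat(i);
            if ("video/avc".equals(f.getString(MediaFormat.KEY_MIME))) {
                extractor.selectTrack(i);
                format = f;
                break;
            }
        }
        if (format == null) {
            extractor.release();
            return; // no AVC track found
        }

        MediaCodec codec = MediaCodec.createDecoderByType("video/avc");
        codec.configure(format, surface, null /* crypto */, 0 /* flags */);
        codec.start();

        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false;
        boolean outputDone = false;
        while (!outputDone) {
            if (!inputDone) {
                int inIndex = codec.dequeueInputBuffer(10_000 /* us */);
                if (inIndex >= 0) {
                    ByteBuffer inBuf = codec.getInputBuffer(inIndex);
                    int size = extractor.readSampleData(inBuf, 0);
                    if (size < 0) {
                        // No more samples: signal end of stream on the input side.
                        codec.queueInputBuffer(inIndex, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        codec.queueInputBuffer(inIndex, 0, size,
                                extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
            int outIndex = codec.dequeueOutputBuffer(info, 10_000 /* us */);
            if (outIndex >= 0) {
                // render == true sends the decoded frame to the Surface.
                codec.releaseOutputBuffer(outIndex, true /* render */);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    outputDone = true;
                }
            }
        }
        codec.stop();
        codec.release();
        extractor.release();
    }
}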
2. MediaCodec codec = MediaCodec.createDecoderByType("video/avc");
This creates the Java-framework-layer MediaCodec instance, whose constructor calls the JNI-layer native_setup(...) function.
frameworks\base\media\java\android\media\MediaCodec.java
public static MediaCodec createDecoderByType(@NonNull String type) throws IOException {
    return new MediaCodec(type, true /* nameIsType */, false /* encoder */);
}

private MediaCodec(@NonNull String name, boolean nameIsType, boolean encoder) {
    ...... // looper/handler set-up
    native_setup(name, nameIsType, encoder);
}

private native final void native_setup(@NonNull String name, boolean nameIsType, boolean encoder);
android_media_MediaCodec_native_setup(...) creates a new JMediaCodec instance and then checks the result of its construction.
frameworks\base\media\jni\android_media_MediaCodec.cpp
static void android_media_MediaCodec_native_setup(
        JNIEnv *env, jobject thiz, jstring name, jboolean nameIsType, jboolean encoder) {
    ...... // argument checks, etc.
    sp<JMediaCodec> codec = new JMediaCodec(env, thiz, tmp, nameIsType, encoder);
    ...... // check mInitStatus, i.e. the result of creating the MediaCodec
    setMediaCodec(env, thiz, codec); // stores a strong reference to the JMediaCodec in the Java
                                     // object's native-context field, so later JNI calls can retrieve it
}
JMediaCodec then creates the native-layer MediaCodec instance through MediaCodec::CreateByType(...) or MediaCodec::CreateByComponentName(...).
frameworks\base\media\jni\android_media_MediaCodec.cpp
JMediaCodec::JMediaCodec(
        JNIEnv *env, jobject thiz, const char *name, bool nameIsType, bool encoder)
    : mClass(NULL),
      mObject(NULL) {
    ...... // mLooper and related member set-up
    if (nameIsType) {
        mCodec = MediaCodec::CreateByType(mLooper, name, encoder, &mInitStatus);
        if (mCodec == nullptr || mCodec->getName(&mNameAtCreation) != OK) {
            mNameAtCreation = "(null)";
        }
    } else {
        mCodec = MediaCodec::CreateByComponentName(mLooper, name, &mInitStatus);
        mNameAtCreation = name;
    }
    CHECK((mCodec != NULL) != (mInitStatus != OK));
}
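These two native branches correspond to the two public factory methods an app can call. A minimal sketch; the component name below is only an example, real names vary by device and Android release:

// createDecoderByType() takes a mime type, so nameIsType == true and the native path
// is MediaCodec::CreateByType(); createByCodecName() takes a component name, so
// nameIsType == false and the path is MediaCodec::CreateByComponentName().
MediaCodec byType = MediaCodec.createDecoderByType("video/avc");
MediaCodec byName = MediaCodec.createByCodecName("OMX.google.h264.decoder"); // example name only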
Take mCodec = MediaCodec::CreateByType(mLooper, name, encoder, &mInitStatus); as the example. In the native layer, MediaCodecList::findMatchingCodecs(...) collects all suitable codecs; the list is then walked, constructing a MediaCodec and calling codec->init(componentName) for each candidate, and the first instance whose init returns OK is returned. If none succeeds, NULL is returned.
frameworks\av\media\libstagefright\MediaCodec.cpp
sp<MediaCodec> MediaCodec::CreateByType(const sp<ALooper> &looper, const AString &mime, bool encoder,
        status_t *err, pid_t pid, uid_t uid) {
    Vector<AString> matchingCodecs;
    MediaCodecList::findMatchingCodecs(mime.c_str(), encoder, 0, &matchingCodecs);
    if (err != NULL) {
        *err = NAME_NOT_FOUND;
    }
    for (size_t i = 0; i < matchingCodecs.size(); ++i) {
        sp<MediaCodec> codec = new MediaCodec(looper, pid, uid);
        AString componentName = matchingCodecs[i];
        status_t ret = codec->init(componentName);
        if (err != NULL) {
            *err = ret;
        }
        if (ret == OK) { // the first successful init exits the loop; *err == OK, i.e. mInitStatus == OK above
            return codec;
        }
        ALOGD("Allocating component '%s' failed (%d), try next one.", componentName.c_str(), ret);
    }
    return NULL;
}
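An app can see roughly the same candidate list that findMatchingCodecs(...) works from by enumerating MediaCodecList itself. A minimal sketch; CodecLister and decodersFor are my own illustrative names, not framework code:

import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import java.util.ArrayList;
import java.util.List;

public class CodecLister {
    // List every installed decoder that advertises support for the given mime type,
    // in MediaCodecList order.
    public static List<String> decodersFor(String mime) {
        List<String> names = new ArrayList<>();
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (info.isEncoder()) continue;
            for (String type : info.getSupportedTypes()) {
                if (type.equalsIgnoreCase(mime)) {
                    names.add(info.getName());
                    break;
                }
            }
        }
        return names;
    }
}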
When the native-layer MediaCodec is instantiated, mState is set to UNINITIALIZED; this initial state is worth highlighting.
frameworks\av\media\libstagefright\MediaCodec.cpp
MediaCodec::MediaCodec(const sp<ALooper> &looper, pid_t pid, uid_t uid)
    : mState(UNINITIALIZED), // the codec starts in the UNINITIALIZED state
      mReleasedByResourceManager(false),
      mLooper(looper),
      mCodec(NULL),
      mReplyID(0),
      mFlags(0),
      mStickyError(OK),
      mSoftRenderer(NULL),
      mIsVideo(false),
      mVideoWidth(0),
      mVideoHeight(0),
      mRotationDegrees(0),
      mDequeueInputTimeoutGeneration(0),
      mDequeueInputReplyID(0),
      mDequeueOutputTimeoutGeneration(0),
      mDequeueOutputReplyID(0),
      mTunneledInputWidth(0),
      mTunneledInputHeight(0),
      mTunneled(false),
      mHaveInputSurface(false),
      mHavePendingInputBuffers(false),
      mCpuBoostRequested(false),
      mLatencyUnknown(0),
      mNumLowLatencyEnables(0),
      mNumLowLatencyDisables(0),
      mIsLowLatencyModeOn(false),
      mIndexOfFirstFrameWhenLowLatencyOn(-1),
      mInputBufferCounter(0) {
    if (uid == kNoUid) {
        mUid = AIBinder_getCallingUid();
    } else {
        mUid = uid;
    }
    mResourceManagerProxy = new ResourceManagerServiceProxy(pid, mUid,
            ::ndk::SharedRefBase::make<ResourceManagerClient>(this));
    initMediametrics(); // initialize the media metrics members
}
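This native state machine is what an app experiences as IllegalStateException when it calls MediaCodec methods out of order. A minimal sketch, relying only on the documented behavior that start() is rejected before configure(); StateSketch is my own illustrative class:

import android.media.MediaCodec;
import java.io.IOException;

public class StateSketch {
    // A freshly created codec is still uninitialized/unconfigured, so start()
    // without configure() is rejected by the framework.
    public static void demo() throws IOException {
        MediaCodec codec = MediaCodec.createDecoderByType("video/avc");
        try {
            codec.start(); // rejected: configure() has not run yet
        } catch (IllegalStateException expected) {
            // expected for a codec that was only created, never configured
        } finally {
            codec.release();
        }
    }
}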
In MediaCodec::init(const AString &name), mCodec = GetCodecBase(localname, owner); creates the concrete codec and stores it in the mCodec member. Note that members named mCodec also appear in other classes later; when you encounter one, be clear about which class it belongs to!
frameworks\av\media\libstagefright\MediaCodec.cpp
status_t MediaCodec::init(const AString &name) {
    mResourceManagerProxy->init();
    ...... // related member set-up
    mCodec = GetCodecBase(localname, owner);
    if (mCodec == NULL) {
        return NAME_NOT_FOUND;
    }
    ...... // looper and callback set-up
    sp<AMessage> msg = new AMessage(kWhatInit, this);
    ...... // msg and related set-up
    for (int i = 0; i <= kMaxRetry; ++i) {
        if (i > 0) {
            // Don't try to reclaim resource for the first time.
            if (!mResourceManagerProxy->reclaimResource(resources)) {
                break;
            }
        }
        sp<AMessage> response;
        err = PostAndAwaitResponse(msg, &response);
        if (!isResourceError(err)) {
            break;
        }
    }
    return err;
}
MediaCodec::GetCodecBase(...) decides which kind of CodecBase to create from the owner and the name prefix. At the time of writing (May 2022), network video generally goes through ACodec, so this article follows the ACodec path.
frameworks\av\media\libstagefright\MediaCodec.cpp
sp<CodecBase> MediaCodec::GetCodecBase(const AString &name, const char *owner) {
    if (owner) {
        if (strcmp(owner, "default") == 0) {
            return new ACodec;
        } else if (strncmp(owner, "codec2", 6) == 0) {
            return CreateCCodec();
        }
    }
    if (name.startsWithIgnoreCase("c2.")) {
        return CreateCCodec();
    } else if (name.startsWithIgnoreCase("omx.")) {
        // at this time only ACodec specifies a mime type.
        return new ACodec;
    } else if (name.startsWithIgnoreCase("android.filter.")) {
        return new MediaFilter;
    } else {
        return NULL;
    }
}
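To see which CodecBase each component on a device would map to, an app can print the installed component names and apply the same prefix checks. A minimal sketch; CodecBaseGuess is my own illustrative class, not framework code:

import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

public class CodecBaseGuess {
    // Mirror the prefix checks in MediaCodec::GetCodecBase() for every installed component.
    public static void dump() {
        for (MediaCodecInfo info : new MediaCodecList(MediaCodecList.REGULAR_CODECS).getCodecInfos()) {
            String name = info.getName();
            String lower = name.toLowerCase();
            String base;
            if (lower.startsWith("c2.")) {
                base = "CCodec";
            } else if (lower.startsWith("omx.")) {
                base = "ACodec";
            } else if (lower.startsWith("android.filter.")) {
                base = "MediaFilter";
            } else {
                base = "unknown";
            }
            System.out.println(name + " -> " + base);
        }
    }
}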
frameworks\av\media\libstagefright\ACodec.cpp
ACodec::ACodec()
    : mSampleRate(0),
      mNodeGeneration(0),
      mUsingNativeWindow(false),
      mNativeWindowUsageBits(0),
      mLastNativeWindowDataSpace(HAL_DATASPACE_UNKNOWN),
      mIsVideo(false),
      mIsImage(false),
      mIsEncoder(false),
      mFatalError(false),
      mShutdownInProgress(false),
      mExplicitShutdown(false),
      mIsLegacyVP9Decoder(false),
      mEncoderDelay(0),
      mEncoderPadding(0),
      mRotationDegrees(0),
      mChannelMaskPresent(false),
      mChannelMask(0),
      mDequeueCounter(0),
      mMetadataBuffersToSubmit(0),
      mNumUndequeuedBuffers(0),
      mRepeatFrameDelayUs(-1LL),
      mMaxPtsGapUs(0LL),
      mMaxFps(-1),
      mFps(-1.0),
      mCaptureFps(-1.0),
      mCreateInputBuffersSuspended(false),
      mTunneled(false),
      mDescribeColorAspectsIndex((OMX_INDEXTYPE)0),
      mDescribeHDRStaticInfoIndex((OMX_INDEXTYPE)0),
      mDescribeHDR10PlusInfoIndex((OMX_INDEXTYPE)0),
      mStateGeneration(0),
      mVendorExtensionsStatus(kExtensionsUnchecked) {
    memset(&mLastHDRStaticInfo, 0, sizeof(mLastHDRStaticInfo));

    mUninitializedState = new UninitializedState(this); // each state keeps a back-pointer to this ACodec (BaseState::mCodec below)
    mLoadedState = new LoadedState(this);
    mLoadedToIdleState = new LoadedToIdleState(this);
    mIdleToExecutingState = new IdleToExecutingState(this);
    mExecutingState = new ExecutingState(this);
    mOutputPortSettingsChangedState = new OutputPortSettingsChangedState(this);
    mExecutingToIdleState = new ExecutingToIdleState(this);
    mIdleToLoadedState = new IdleToLoadedState(this);
    mFlushingState = new FlushingState(this);

    mPortEOS[kPortIndexInput] = mPortEOS[kPortIndexOutput] = false;
    mInputEOSResult = OK;

    mPortMode[kPortIndexInput] = IOMX::kPortModePresetByteBuffer;
    mPortMode[kPortIndexOutput] = IOMX::kPortModePresetByteBuffer;

    memset(&mLastNativeWindowCrop, 0, sizeof(mLastNativeWindowCrop));

    changeState(mUninitializedState);
}

ACodec::UninitializedState::UninitializedState(ACodec *codec)
    : BaseState(codec) {
}

ACodec::BaseState::BaseState(ACodec *codec, const sp<AState> &parentState)
    : AState(parentState),
      mCodec(codec) {
}
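ACodec drives all of the above through AHierarchicalStateMachine: each state object holds a back-pointer to the ACodec (BaseState::mCodec), and changeState(...) swaps which state currently handles incoming messages. A minimal Java sketch of that pattern, with hypothetical class names, only to illustrate the structure:

public class StateMachineSketch {
    interface State { boolean onMessage(String what); }

    static class Owner {
        private State mCurrent;
        final State mUninitialized = new Uninitialized(this);
        final State mLoaded = new Loaded(this);

        void changeState(State next) { mCurrent = next; } // like ACodec::changeState()
        void post(String what) {
            if (!mCurrent.onMessage(what)) {
                System.out.println("unhandled in current state: " + what);
            }
        }
    }

    static class Uninitialized implements State {
        private final Owner mOwner; // back-pointer, like BaseState::mCodec
        Uninitialized(Owner owner) { mOwner = owner; }
        @Override public boolean onMessage(String what) {
            if ("allocateComponent".equals(what)) {
                // ... allocate, then move on, the way onAllocateComponent() does
                mOwner.changeState(mOwner.mLoaded);
                return true;
            }
            return false;
        }
    }

    static class Loaded implements State {
        private final Owner mOwner;
        Loaded(Owner owner) { mOwner = owner; }
        @Override public boolean onMessage(String what) { return false; }
    }

    public static void main(String[] args) {
        Owner owner = new Owner();
        owner.changeState(owner.mUninitialized);
        owner.post("allocateComponent");
    }
}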
First, a note on mCodec = GetCodecBase(localname, owner); in MediaCodec::init(...): once this statement has run, mCodec holds the ACodec instance. Continuing with sp<AMessage> msg = new AMessage(kWhatInit, this); in MediaCodec::init(...): a kWhatInit message is posted through the AHandler-ALooper-AMessage framework. The case kWhatInit branch in MediaCodec::onMessageReceived(...) sets up the callback, the state, and codec-related information, and then calls ACodec's initiateAllocateComponent(...).
frameworks\av\media\libstagefright\MediaCodec.cpp
void MediaCodec::onMessageReceived(const sp<AMessage> &msg) {
    switch (msg->what()) {
        ......
        case kWhatInit:
        {
            ...... // set up the callback, state, and codec-related information
            mCodec->initiateAllocateComponent(format); // this mCodec is the ACodec instance
            break;
        }
        ......
    }
}
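The AMessage/AHandler/ALooper trio used here plays the same role as Message/Handler/Looper in the Java framework: post a message carrying a "what" to a looper thread and dispatch on it in a handler. A minimal Java sketch of that pattern; the kWhatInit value below is arbitrary, not the real native constant:

import android.os.Handler;
import android.os.HandlerThread;
import android.os.Message;

public class MessageSketch {
    private static final int kWhatInit = 1; // arbitrary example value

    public static void run() {
        HandlerThread thread = new HandlerThread("codec-looper");
        thread.start();
        Handler handler = new Handler(thread.getLooper()) {
            @Override public void handleMessage(Message msg) {
                switch (msg.what) {
                    case kWhatInit:
                        // set up callback/state, then kick off component allocation,
                        // analogous to case kWhatInit in MediaCodec::onMessageReceived()
                        break;
                }
            }
        };
        handler.sendMessage(handler.obtainMessage(kWhatInit));
    }
}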
ACodec::initiateAllocateComponent(...) posts a kWhatAllocateComponent message, which is handled in ACodec::UninitializedState::onAllocateComponent(...). There an OMX interface is obtained through an OMXClient, and its allocateNode function is called to allocate the component node. The OMX object and the OMX node are then saved on the ACodec, the mCallback->onComponentAllocated(...) callback is invoked, and finally the state is changed to mLoadedState.
frameworks\av\media\libstagefright\ACodec.cpp
void ACodec::initiateAllocateComponent(const sp<AMessage> &msg) {
    msg->setWhat(kWhatAllocateComponent);
    msg->setTarget(this);
    msg->post();
}

bool ACodec::UninitializedState::onMessageReceived(const sp<AMessage> &msg) {
    bool handled = false;
    switch (msg->what()) {
        ......
        case ACodec::kWhatAllocateComponent:
        {
            onAllocateComponent(msg);
            handled = true;
            break;
        }
        ......
    }
    return handled;
}

bool ACodec::UninitializedState::onAllocateComponent(const sp<AMessage> &msg) {
    ALOGV("onAllocateComponent");
    ...... // codecInfo, componentName and related set-up
    sp<CodecObserver> observer = new CodecObserver(notify);
    sp<IOMX> omx;
    sp<IOMXNode> omxNode;

    status_t err = NAME_NOT_FOUND;
    OMXClient client;
    if (client.connect(owner.c_str()) != OK) {
        mCodec->signalError(OMX_ErrorUndefined, NO_INIT);
        return false;
    }
    omx = client.interface();

    pid_t tid = gettid();
    int prevPriority = androidGetThreadPriority(tid);
    androidSetThreadPriority(tid, ANDROID_PRIORITY_FOREGROUND);
    err = omx->allocateNode(componentName.c_str(), observer, &omxNode);
    androidSetThreadPriority(tid, prevPriority);

    if (err != OK) {
        ALOGE("Unable to instantiate codec '%s' with err %#x.", componentName.c_str(), err);
        mCodec->signalError((OMX_ERRORTYPE)err, makeNoSideEffectStatus(err));
        return false;
    }

    mDeathNotifier = new DeathNotifier(new AMessage(kWhatOMXDied, mCodec));
    auto tOmxNode = omxNode->getHalInterface<IOmxNode>();
    if (tOmxNode && !tOmxNode->linkToDeath(mDeathNotifier, 0)) {
        mDeathNotifier.clear();
    }

    ++mCodec->mNodeGeneration;

    mCodec->mComponentName = componentName; // this mCodec is the ACodec instance (BaseState::mCodec)
    mCodec->mRenderTracker.setComponentName(componentName);
    mCodec->mFlags = 0;
    if (componentName.endsWith(".secure")) {
        mCodec->mFlags |= kFlagIsSecure;
        mCodec->mFlags |= kFlagIsGrallocUsageProtected;
        mCodec->mFlags |= kFlagPushBlankBuffersToNativeWindowOnShutdown;
    }

    mCodec->mOMX = omx;
    mCodec->mOMXNode = omxNode;

    mCodec->mCallback->onComponentAllocated(mCodec->mComponentName.c_str());

    mCodec->changeState(mCodec->mLoadedState);

    return true;
}
A short digression on the mCallback in mCodec->mCallback->onComponentAllocated(mCodec->mComponentName.c_str());. There is no mCallback member declared in ACodec itself, so where does it live, and how does the callback get wired up? Searching forward is awkward, so search backward instead.

Searching for onComponentAllocated finds MediaCodec.cpp:517: void CodecCallback::onComponentAllocated(const char *componentName) {...}; searching for CodecCallback finds MediaCodec.cpp:446: class CodecCallback : public CodecBase::CodecCallback {...}. A reasonable guess is therefore that the mCallback used inside ACodec is also installed from MediaCodec, especially since searching further forward shows no other obvious link.

Searching for mCallback turns up MediaCodec.cpp:2898: mCallback = callback;, CodecBase.h:243: std::unique_ptr<CodecCallback> mCallback;, and CodecBase.h:267: mCallback = std::move(callback);. As the analysis below shows, the mCallback used in ACodec is the one in CodecBase, i.e. CodecBase.h:243: std::unique_ptr<CodecCallback> mCallback;. The assignment at CodecBase.h:267 sits inside CodecBase.h:211: inline void setCallback(std::unique_ptr<CodecCallback> &&callback) {...}. Searching for setCallback then finds MediaCodec.cpp:1210: mCodec->setCallback( (this mCodec is the ACodec instance); the code there creates a new CodecCallback on the spot: mCodec->setCallback(std::unique_ptr<CodecBase::CodecCallback>(new CodecCallback(new AMessage(kWhatCodecNotify, this))));, and the statement lives in status_t MediaCodec::init(const AString &name) {...}. This closes the loop with the findings above.

The remaining piece is the relationship between ACodec and CodecBase. The class declaration ACodec.h:60: struct ACodec : public AHierarchicalStateMachine, public CodecBase {...} shows that ACodec inherits from CodecBase, and therefore also inherits its std::unique_ptr<CodecCallback> mCallback; member.

Putting it together, mCodec->mCallback->onComponentAllocated(mCodec->mComponentName.c_str()); works like this: CodecBase.h declares std::unique_ptr<CodecCallback> mCallback;, which ACodec inherits. When the APK's call reaches the native MediaCodec::init(...), mCodec->setCallback(...), i.e. CodecBase's inline setCallback(...), assigns it. From then on, ACodec's call through mCallback lands in MediaCodec.cpp:517: void CodecCallback::onComponentAllocated(const char *componentName) {...}.
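The same wiring in miniature: the base class owns the callback slot and exposes a setter, the owner installs the callback before use, and the derived codec only ever calls through the inherited slot. A minimal Java sketch with hypothetical names (CodecBase, ACodecLike, and MediaCodecLike here are illustrations, not the real classes):

public class CallbackWiringSketch {
    interface CodecCallback { void onComponentAllocated(String componentName); }

    // Plays the role of CodecBase: declares the mCallback slot and setCallback().
    static abstract class CodecBase {
        protected CodecCallback mCallback;
        final void setCallback(CodecCallback callback) { mCallback = callback; }
        abstract void initiateAllocateComponent(String name);
    }

    // Plays the role of ACodec: inherits mCallback and invokes it.
    static class ACodecLike extends CodecBase {
        @Override void initiateAllocateComponent(String name) {
            // ... allocate the component, then report back to the owner
            mCallback.onComponentAllocated(name);
        }
    }

    // Plays the role of MediaCodec::init(): installs the callback before use.
    static class MediaCodecLike {
        private final CodecBase mCodec = new ACodecLike();
        void init(String componentName) {
            mCodec.setCallback(n -> System.out.println("allocated: " + n));
            mCodec.initiateAllocateComponent(componentName);
        }
    }

    public static void main(String[] args) {
        new MediaCodecLike().init("OMX.google.h264.decoder"); // example name only
    }
}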

















