Rendering YUV Data in Unity ---- Using Android Camera Data as an Example


1 Background

Unity normally renders RGB data directly, but there are special cases where you need to render YUV data — for example, reading the Android camera's YUV frames from Unity and rendering them. This article walks through that scenario.

Reading a byte array from Android into Unity is already expensive; if the YUV-to-RGB conversion is also done in script (i.e., on the CPU), the frame rate suffers badly.
The fix is to move the conversion onto the GPU — in other words, implement it in a shader.
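For contrast, here is roughly what the naive CPU path looks like (an illustrative sketch, not code from this project). For a 1280×720 preview it touches about 920k pixels per frame in managed code, which is exactly the cost we want to avoid:

```csharp
using UnityEngine;

// Illustrative only (not from this project): the naive CPU path we're avoiding.
static class Nv21Cpu
{
    // Converts one NV21 frame to RGBA pixels entirely in managed code.
    public static Color32[] ToRgb(byte[] frame, int width, int height)
    {
        var pixels = new Color32[width * height];
        for (int row = 0; row < height; row++)
        {
            for (int col = 0; col < width; col++)
            {
                int yIdx  = row * width + col;
                // VU pairs live after the Y plane, at quarter resolution.
                int uvIdx = width * height + (row / 2) * width + (col / 2) * 2;
                float y = 1.1643f * (frame[yIdx] / 255f - 0.0625f);
                float v = frame[uvIdx]     / 255f - 0.5f; // NV21: V comes first
                float u = frame[uvIdx + 1] / 255f - 0.5f;
                pixels[yIdx] = new Color32(
                    (byte)Mathf.Clamp((y + 1.403f * v) * 255f, 0f, 255f),
                    (byte)Mathf.Clamp((y - 0.344f * u - 0.714f * v) * 255f, 0f, 255f),
                    (byte)Mathf.Clamp((y + 1.770f * u) * 255f, 0f, 255f),
                    255);
            }
        }
        return pixels;
    }
}
```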

The source and implementation are presented in two parts below: the Android side and the Unity side.

2 The YUV Data Source ---- the Android Side

Android camera preview data is generally in a YUV format, most commonly NV21. Its pixel layout is shown below:
[Figure: NV21 pixel layout — a full-resolution Y plane followed by an interleaved VU plane at quarter resolution]
That is, the bytes are laid out as YYYY…VUVU…
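Concretely, for a width × height frame the Y plane occupies the first width*height bytes, followed by width*height/2 bytes of interleaved VU. A small sketch of the indexing (plain byte math, written here in C#):

```csharp
// Sketch: byte offsets of pixel (px, py) in an NV21 frame of size width x height.
static (byte y, byte u, byte v) SampleNv21(byte[] frame, int width, int height, int px, int py)
{
    int yIndex  = py * width + px;                                   // full-resolution Y plane
    int vuIndex = width * height + (py / 2) * width + (px / 2) * 2;  // VU pairs, quarter resolution
    return (frame[yIndex], frame[vuIndex + 1], frame[vuIndex]);      // NV21 stores V before U
}
```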

The Android side does one job: open the camera, render the preview into a dummy SurfaceTexture, and grab each preview frame's data.
A SimpleCameraPlugin class serves as the interface that Unity calls into. The code is as follows:

```java
package com.qiyi.unitycamerasdk;

import android.app.Activity;
import android.content.Context;
import android.graphics.Point;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.os.Handler;
import android.os.Looper;
import android.os.Message;
import android.util.Log;
import android.util.Size;

/**
 * Reads the raw YUV data directly and lets Unity render it.
 */
public class SimpleCameraPlugin implements SurfaceTexture.OnFrameAvailableListener, Camera.PreviewCallback {
    private static final String TAG = "qymv#CameraPlugin";
    private final static int REQUEST_CODE = 1;
    private final static int MSG_START_PREVIEW = 1;
    private final static int MSG_SWITCH_CAMERA = 2;
    private final static int MSG_RELEASE_PREVIEW = 3;
    private final static int MSG_MANUAL_FOCUS = 4;
    private final static int MSG_ROCK = 5;
    private SurfaceTexture mSurfaceTexture;
    private boolean mIsUpdateFrame;
    private Context mContext;
    private Handler mCameraHanlder;
    private Size mExpectedPreviewSize;
    private Size mPreviewSize;
    private boolean isFocusing;
    private int mWidth;
    private int mHeight;
    private byte[] yBuffer = null;
    private byte[] uvBuffer = null;

    public SimpleCameraPlugin() {
        Log.i(TAG, " create");
        initCameraHandler();
    }

    private void initCameraHandler() {
        mCameraHanlder = new Handler(Looper.getMainLooper()) {
            @Override
            public void handleMessage(Message msg) {
                switch (msg.what) {
                    case MSG_START_PREVIEW:
                        startPreview();
                        break;
                    case MSG_RELEASE_PREVIEW:
                        releasePreview();
                        break;
                    case MSG_SWITCH_CAMERA:
                        //switchCamera();
                        break;
                    case MSG_MANUAL_FOCUS:
                        //manualFocus(msg.arg1, msg.arg2);
                        break;
                    case MSG_ROCK:
                        autoFocus();
                        break;
                    default:
                        break;
                }
            }
        };
    }

    public void releasePreview() {
        CameraUtil.releaseCamera();
//        mCameraSensor.stop();
//        mFocusView.cancelFocus();
        Log.e(TAG, "releasePreview releaseCamera");
    }

    public void startPreview() {
        //if (mPreviewSize != null && requestPermission() ) {
        if (mExpectedPreviewSize != null) {
            if (CameraUtil.getCamera() == null) {
                CameraUtil.openCamera();
                Log.e(TAG, "openCamera");
                //CameraUtil.setDisplay(mSurfaceTexture);
            }
            Camera.Size previewSize = CameraUtil.startPreview((Activity) mContext,
                    mExpectedPreviewSize.getWidth(), mExpectedPreviewSize.getHeight());
            CameraUtil.setCallback(this);
            if (previewSize != null) {
                mWidth = previewSize.width;
                mHeight = previewSize.height;
                mPreviewSize = new Size(previewSize.width, previewSize.height);
                initSurfaceTexture(previewSize.width, previewSize.height);
                initBuffer(previewSize.width, previewSize.height);
                CameraUtil.setDisplay(mSurfaceTexture);
            }
        }
    }

    private void initBuffer(int width, int height) {
        yBuffer = new byte[width * height];
        uvBuffer = new byte[width * height / 2];
    }

    public void autoFocus() {
        if (CameraUtil.isBackCamera() && CameraUtil.getCamera() != null) {
            focus(mWidth / 2, mHeight / 2, true);
        }
    }

    private void focus(final int x, final int y, final boolean isAutoFocus) {
        Log.i(TAG, "focus, position: " + x + " " + y);
        if (CameraUtil.getCamera() == null || !CameraUtil.isBackCamera()) {
            return;
        }
        if (isFocusing && isAutoFocus) {
            return;
        }
        if (mWidth == 0 || mHeight == 0)
            return;
        isFocusing = true;
        Point focusPoint = new Point(x, y);
        Size screenSize = new Size(mWidth, mHeight);
        if (!isAutoFocus) {
            //mFocusView.beginFocus(x, y);
        }
        CameraUtil.newCameraFocus(focusPoint, screenSize, new Camera.AutoFocusCallback() {
            @Override
            public void onAutoFocus(boolean success, Camera camera) {
                isFocusing = false;
                if (!isAutoFocus) {
                    //mFocusView.endFocus(success);
                }
            }
        });
    }

    /**
     * Initialization.
     * Must be called on an EGL thread, otherwise you get:
     * libEGL: call to OpenGL ES API with no current context
     *
     * @param context Android context; preferably pass an activity context
     * @param width   texture width
     * @param height  texture height
     */
    public void start(Context context, int width, int height) {
        Log.w(TAG, "Start  context " + context);
        mContext = context;
        mWidth = width;
        mHeight = height;
        callStartPreview(width, height);
    }

    private void callStartPreview(int width, int height) {
        mExpectedPreviewSize = new Size(width, height);
        mCameraHanlder.sendEmptyMessage(MSG_START_PREVIEW);
        mCameraHanlder.sendEmptyMessageDelayed(MSG_ROCK, 2000);
    }

    public void resume() {
    }

    public void pause() {
        mCameraHanlder.sendEmptyMessage(MSG_RELEASE_PREVIEW);
    }

    private void initSurfaceTexture(int width, int height) {
        Log.i(TAG, "initSurfaceTexture " + " width " + width + " height " + height);
        // Create the texture id the camera output is attached to.
        int videoTextureId = createOESTextureID();
        // Wrap the texture id in a SurfaceTexture; the camera preview renders into it.
        mSurfaceTexture = new SurfaceTexture(videoTextureId);
        mSurfaceTexture.setDefaultBufferSize(width, height);
        mSurfaceTexture.setOnFrameAvailableListener(this);
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        Log.i(TAG, "onFrameAvailable");
        // The preview has started producing output.
        mIsUpdateFrame = true;
        if (frameListener != null) {
            frameListener.onFrameAvailable();
        }
    }

    /**
     * Called from Unity: updates the video texture (which can then be drawn into an FBO).
     */
    public void updateTexture() {
        Log.i(TAG, "updateTexture");
        mIsUpdateFrame = false;
        mSurfaceTexture.updateTexImage();
    }

    public boolean isUpdateFrame() {
        return mIsUpdateFrame;
    }

    public void onActivityResume() {
        resume();
    }

    public void onActivityPause() {
        pause();
    }

    private FrameListener frameListener;

    public void setFrameListener(FrameListener listener) {
        frameListener = listener;
    }

    private synchronized boolean copyFrame(byte[] bytes) {
        Log.i(TAG, "copyFrame start");
        if (yBuffer != null && uvBuffer != null) {
            System.arraycopy(bytes, 0, yBuffer, 0, yBuffer.length);
            int uvIndex = yBuffer.length;
            System.arraycopy(bytes, uvIndex, uvBuffer, 0, uvBuffer.length);
            Log.i(TAG, "copyFrame end");
            return true;
        }
        return false;
    }

    public synchronized byte[] readYBuffer() {
        Log.i(TAG, "readYBuffer");
        return yBuffer;
    }

    public synchronized byte[] readUVBuffer() {
        Log.i(TAG, "readUVBuffer");
        return uvBuffer;
    }

    public int getPreviewWidth() {
        Log.i(TAG, "getPreviewWidth " + mWidth);
        return mWidth;
    }

    public int getPreviewHeight() {
        Log.i(TAG, "getPreviewHeight " + mHeight);
        return mHeight;
    }

    @Override
    public void onPreviewFrame(byte[] bytes, Camera camera) {
        if (copyFrame(bytes)) {
            UnityMsgBridge.notifyGotFrame();
        }
    }

    public interface FrameListener {
        void onFrameAvailable();
    }

    public static int createOESTextureID() {
        int[] texture = new int[1];
        GLES20.glGenTextures(texture.length, texture, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture[0]);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        return texture[0];
    }
}
```

Dependencies are minimal; there is a CameraUtil class that opens the camera:

```java
package com.qiyi.unitycamerasdk;

import android.app.Activity;
import android.content.Context;
import android.content.pm.PackageManager;
import android.graphics.Point;
import android.graphics.Rect;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.os.Build;
import android.util.Log;
import android.util.Size;
import android.view.Surface;

import java.io.IOException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

/**
 * Created By Chengjunsen on 2018/8/23
 */
public class CameraUtil {
    private static final String TAG = "qymv#CameraUtil";
    private static Camera mCamera = null;
    private static int mCameraID = Camera.CameraInfo.CAMERA_FACING_FRONT;

    public static void openCamera() {
        mCamera = Camera.open(mCameraID);
        if (mCamera == null) {
            throw new RuntimeException("Unable to open camera");
        }
    }

    public static Camera getCamera() {
        return mCamera;
    }

    public static void releaseCamera() {
        if (mCamera != null) {
            mCamera.stopPreview();
            mCamera.release();
            mCamera = null;
        }
    }

    public static void setCameraId(int cameraId) {
        mCameraID = cameraId;
    }

    public static void switchCameraId() {
        mCameraID = isBackCamera() ? Camera.CameraInfo.CAMERA_FACING_FRONT : Camera.CameraInfo.CAMERA_FACING_BACK;
    }

    public static boolean isBackCamera() {
        return mCameraID == Camera.CameraInfo.CAMERA_FACING_BACK;
    }

    public static void setDisplay(SurfaceTexture surfaceTexture) {
        try {
            if (mCamera != null) {
                mCamera.setPreviewTexture(surfaceTexture);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static Camera.Size startPreview(Activity activity, int width, int height) {
        if (mCamera != null) {
            int mOrientation = getCameraPreviewOrientation(activity, mCameraID);
            //int mOrientation = 90;
            mCamera.setDisplayOrientation(mOrientation);
            Camera.Parameters parameters = mCamera.getParameters();
            Camera.Size bestPreviewSize = getOptimalSize(parameters.getSupportedPreviewSizes(), width, height);
            parameters.setPreviewSize(bestPreviewSize.width, bestPreviewSize.height);
            Camera.Size bestPictureSize = getOptimalSize(parameters.getSupportedPictureSizes(), width, height);
            parameters.setPictureSize(bestPictureSize.width, bestPictureSize.height);
            parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
            mCamera.setParameters(parameters);
            mCamera.startPreview();
            Log.d(TAG, "camera startPreview: (" + width + " x " + height + ")"
                    + " bestPreviewSize " + bestPreviewSize.width + "X" + bestPreviewSize.height);
            return bestPreviewSize;
        }
        return null;
    }

    public static Camera.Size getPreviewSize() {
        if (mCamera != null && mCamera.getParameters() != null) {
            return mCamera.getParameters().getPreviewSize();
        }
        return null;
    }

    public static void setCallback(Camera.PreviewCallback callback) {
        if (mCamera != null) {
            mCamera.setPreviewCallback(callback);
        }
    }

    /**
     * Pick the supported size that best matches the expected size.
     */
    private static Camera.Size getOptimalSize(List<Camera.Size> supportList, int width, int height) {
        // The camera's width is larger than its height; ensure expectWidth > expectHeight.
        int expectWidth = Math.max(width, height);
        int expectHeight = Math.min(width, height);
        // Sort by width.
        Collections.sort(supportList, new Comparator<Camera.Size>() {
            @Override
            public int compare(Camera.Size pre, Camera.Size after) {
                if (pre.width > after.width) {
                    return 1;
                } else if (pre.width < after.width) {
                    return -1;
                }
                return 0;
            }
        });
        Camera.Size result = supportList.get(0);
        boolean widthOrHeight = false; // whether a size with a matching width or height exists
        // Walk the list looking for the closest size.
        for (Camera.Size size : supportList) {
            // Exact match: return it directly.
            if (size.width == expectWidth && size.height == expectHeight) {
                result = size;
                break;
            }
            // Only the width matches: pick the size with the closest height.
            if (size.width == expectWidth) {
                widthOrHeight = true;
                if (Math.abs(result.height - expectHeight) > Math.abs(size.height - expectHeight)) {
                    result = size;
                }
            }
            // Only the height matches: pick the size with the closest width.
            else if (size.height == expectHeight) {
                widthOrHeight = true;
                if (Math.abs(result.width - expectWidth) > Math.abs(size.width - expectWidth)) {
                    result = size;
                }
            }
            // No dimension has matched so far: pick the size closest in both width and height.
            else if (!widthOrHeight) {
                if (Math.abs(result.width - expectWidth) > Math.abs(size.width - expectWidth)
                        && Math.abs(result.height - expectHeight) > Math.abs(size.height - expectHeight)) {
                    result = size;
                }
            }
        }
        return result;
    }

    public static int getCameraPreviewOrientation(Activity activity, int cameraId) {
        if (mCamera == null) {
            throw new RuntimeException("mCamera is null");
        }
        Camera.CameraInfo info = new Camera.CameraInfo();
        Camera.getCameraInfo(cameraId, info);
        int result;
        int degrees = getRotation(activity);
        // front camera
        if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            result = (info.orientation + degrees) % 360;
            result = (360 - result) % 360;
        }
        // back camera
        else {
            result = (info.orientation - degrees + 360) % 360;
        }
        return result;
    }

    /**
     * Focus.
     * @param focusPoint the focus position
     * @param screenSize the screen size
     * @param callback   callback for focus success or failure
     */
    public static boolean newCameraFocus(Point focusPoint, Size screenSize, Camera.AutoFocusCallback callback) {
        if (mCamera == null) {
            throw new RuntimeException("mCamera is null");
        }
        Point cameraFoucusPoint = convertToCameraPoint(screenSize, focusPoint);
        Rect cameraFoucusRect = convertToCameraRect(cameraFoucusPoint, 100);
        Camera.Parameters parameters = mCamera.getParameters();
        if (Build.VERSION.SDK_INT > 14) {
            if (parameters.getMaxNumFocusAreas() <= 0) {
                return focus(callback);
            }
            clearCameraFocus();
            List<Camera.Area> focusAreas = new ArrayList<Camera.Area>();
            // 100 is the weight
            focusAreas.add(new Camera.Area(cameraFoucusRect, 100));
            parameters.setFocusAreas(focusAreas);
            // set the metering areas
            parameters.setMeteringAreas(focusAreas);
            try {
                mCamera.setParameters(parameters);
            } catch (Exception e) {
                e.printStackTrace();
                return false;
            }
        }
        return focus(callback);
    }

    private static boolean focus(Camera.AutoFocusCallback callback) {
        if (mCamera == null) {
            return false;
        }
        mCamera.cancelAutoFocus();
        mCamera.autoFocus(callback);
        return true;
    }

    /**
     * Clear the focus areas.
     */
    public static void clearCameraFocus() {
        if (mCamera == null) {
            throw new RuntimeException("mCamera is null");
        }
        mCamera.cancelAutoFocus();
        Camera.Parameters parameters = mCamera.getParameters();
        parameters.setFocusAreas(null);
        parameters.setMeteringAreas(null);
        try {
            mCamera.setParameters(parameters);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    /**
     * Convert screen coordinates to camera coordinates.
     */
    private static Point convertToCameraPoint(Size screenSize, Point focusPoint) {
        int newX = focusPoint.y * 2000 / screenSize.getHeight() - 1000;
        int newY = -focusPoint.x * 2000 / screenSize.getWidth() + 1000;
        return new Point(newX, newY);
    }

    private static Rect convertToCameraRect(Point centerPoint, int radius) {
        int left = limit(centerPoint.x - radius, 1000, -1000);
        int right = limit(centerPoint.x + radius, 1000, -1000);
        int top = limit(centerPoint.y - radius, 1000, -1000);
        int bottom = limit(centerPoint.y + radius, 1000, -1000);
        return new Rect(left, top, right, bottom);
    }

    private static int limit(int s, int max, int min) {
        if (s > max) { return max; }
        if (s < min) { return min; }
        return s;
    }

    public static int getRotation(Activity activity) {
        int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
        int degrees = 0;
        switch (rotation) {
            case Surface.ROTATION_0:   degrees = 0;   break;
            case Surface.ROTATION_90:  degrees = 90;  break;
            case Surface.ROTATION_180: degrees = 180; break;
            case Surface.ROTATION_270: degrees = 270; break;
        }
        return degrees;
    }
}
```

2.1 Key code explained

The key code is onPreviewFrame, the camera's frame callback. The bytes array holds one full NV21 frame, so its length is width * height * 1.5 — for the 1600 × 900 preview used later, that is 2,160,000 bytes.

```java
@Override
public void onPreviewFrame(byte[] bytes, Camera camera) {
    if (copyFrame(bytes)) {
        UnityMsgBridge.notifyGotFrame();
    }
}
```
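copyFrame (see the full listing above) simply splits the frame at a fixed offset into the two buffers. Expressed in C# for illustration, with names mirroring the Java fields:

```csharp
using System;

// Sketch of the split that copyFrame performs on the Java side.
static class Nv21Split
{
    // frame:    width * height * 3 / 2 bytes (one full NV21 frame)
    // yBuffer:  width * height bytes
    // uvBuffer: width * height / 2 bytes (interleaved VU)
    public static void Split(byte[] frame, byte[] yBuffer, byte[] uvBuffer)
    {
        Buffer.BlockCopy(frame, 0, yBuffer, 0, yBuffer.Length);
        Buffer.BlockCopy(frame, yBuffer.Length, uvBuffer, 0, uvBuffer.Length);
    }
}
```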

So on every callback the bytes are copied and split into a Y buffer and a UV buffer, and Unity is then notified that a frame has arrived and should be read. The notification method is as follows:

```java
public static void notifyGotFrame() {
    JSONObject jsonObject = new JSONObject();
    try {
        jsonObject.put("type", 2);
    } catch (JSONException e) {
        e.printStackTrace();
    }
    String msg = jsonObject.toString();
    UnityPlayer.UnitySendMessage("Canvas", "OnJavaMessage", msg);
}
```

For a deeper look at how Unity and Android interact, see the companion article on calling Android from Unity synchronously or asynchronously.

3 Reading and Rendering in Unity

First, in Unity, create a RawImage under the Canvas node; this is what the frames will be rendered onto.

Then create a new shader with the following code:

```shaderlab
// Upgrade NOTE: replaced 'mul(UNITY_MATRIX_MVP,*)' with 'UnityObjectToClipPos(*)'
Shader "Custom/UI/YUVRender"
{
    Properties
    {
        _YTex("Texture", 2D) = "white" {}
        _UVTex("Texture", 2D) = "white" {}
    }
    SubShader
    {
        // No culling or depth
        Cull Off ZWrite Off ZTest Always

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
            };

            v2f vert(appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = v.uv;
                return o;
            }

            sampler2D _YTex;
            sampler2D _UVTex;

            fixed4 frag(v2f i) : SV_Target
            {
                fixed4 col = tex2D(_YTex, i.uv);
                fixed4 uv4 = tex2D(_UVTex, i.uv);
                float y = 1.1643 * (col.r - 0.0625);
                float u = uv4.g - 0.5;
                float v = uv4.r - 0.5;
                float r = y + 1.403 * v;
                float g = y - 0.344 * u - 0.714 * v;
                float b = y + 1.770 * u;
                col.rgba = float4(r, g, b, 1.f);
                return col;
            }
            ENDCG
        }
    }
}
```

The shader's main job is to sample the Y data and the UV data from two separate textures and convert them to RGB.
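Written out, the fragment shader computes the following (with Y, U, V sampled as normalized 0–1 values). The 1.1643 and 0.0625 terms expand video-range luma to full range; the remaining factors are the standard BT.601 chroma coefficients:

$$
\begin{aligned}
y &= 1.1643\,(Y - 0.0625)\\
R &= y + 1.403\,(V - 0.5)\\
G &= y - 0.344\,(U - 0.5) - 0.714\,(V - 0.5)\\
B &= y + 1.770\,(U - 0.5)
\end{aligned}
$$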

Next, create a material (e.g. YUVMaterial) that uses this shader,
and assign the material to the RawImage.
[Figure: the YUVMaterial assigned to the RawImage's Material slot]

As the shader shows, the Y data goes into one texture and the UV data into another; the shader samples both, converts YUV to RGB, and renders the result.
Because this conversion runs on the GPU, performance improves dramatically.

Next, here is how to read the YUV data from Android.

Create a script, attach it to the Canvas, and implement it as follows:

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
using System;
using System.IO;

// BaseUIController is the author's UI helper base class (not shown here);
// it provides findObjectByPath and setBtnClickListener.
// JSONObject is a third-party JSON plugin for Unity.
public class TestGetRawYUV : BaseUIController
{
    private const string TAG = "qymv#Main";
    public GameObject YUVRawImage;
    public GameObject StartBtn;
    public GameObject GetDataBtn;
    private AndroidJavaObject nativeObject;
    private int width, height;
    private Texture2D texture2D;
    private AndroidJavaObject activity;
    public Image mYUVImage;
    int ImageXSize = 0;
    int ImageYSize = 0;
    Texture2D m_YImageTex = null;
    Texture2D m_UVImageTex = null;

    // Use this for initialization
    void Start()
    {
        Debug.Log(TAG + "Start");
        YUVRawImage = findObjectByPath("YUVRawImage");
        setBtnClickListener("StartBtn", Click);
        nativeObject = new AndroidJavaObject("com.qiyi.unitycamerasdk.SimpleCameraPlugin");
        // Fetch the current Android activity so it can be passed to start().
        // (The original listing never assigned `activity`; this is the standard way to get it.)
        using (var unityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        {
            activity = unityPlayer.GetStatic<AndroidJavaObject>("currentActivity");
        }
        width = 1600;
        height = 900;
        Debug.Log(TAG + " size: " + width + ", " + height);
    }

    // Receives messages sent via UnityPlayer.UnitySendMessage("Canvas", "OnJavaMessage", msg).
    public void OnJavaMessage(string message)
    {
        Debug.Log("OnJavaMessage " + message);
        JSONObject msgObj = new JSONObject(message);
        float type = -1;
        if (msgObj.HasField("type"))
        {
            type = msgObj["type"].n;
        }
        if (type == 2)
        {
            GetData();
        }
    }

    // Update is called once per frame.
    // Note: texture2D is never assigned in this sample, so this SurfaceTexture update
    // path stays dormant; frames arrive via OnJavaMessage/GetData instead.
    void Update()
    {
        if (texture2D != null && nativeObject != null && nativeObject.Call<bool>("isUpdateFrame"))
        {
            Debug.Log(TAG + " updateTexture");
            nativeObject.Call("updateTexture");
            GL.InvalidateState();
        }
    }

    public void Click()
    {
        Debug.Log(TAG + "Click");
        if (nativeObject == null)
        {
            Debug.LogError(TAG + "invalid obj!!!");
            return;
        }
        nativeObject.Call("start", activity, width, height);
    }

    public void GetData()
    {
        if (nativeObject != null)
        {
            Debug.Log("GetData start");
            byte[] yData = nativeObject.Call<byte[]>("readYBuffer");
            byte[] uvData = nativeObject.Call<byte[]>("readUVBuffer");
            if (yData != null)
            {
                Debug.Log("get yData, size " + yData.Length);
                //File.WriteAllBytes("/sdcard/yData.y", yData);
            }
            else
            {
                Debug.Log("invalid yData");
                return;
            }
            if (uvData != null)
            {
                Debug.Log("get uvData, size " + uvData.Length);
                //File.WriteAllBytes("/sdcard/uvData.uv", uvData);
            }
            else
            {
                Debug.Log("invalid uvData");
                return;
            }
            if (ImageXSize <= 0)
            {
                ImageXSize = nativeObject.Call<int>("getPreviewWidth");
                ImageYSize = nativeObject.Call<int>("getPreviewHeight");
            }
            if (ImageXSize <= 0 || ImageYSize <= 0)
                return;
            if (m_YImageTex == null)
            {
                m_YImageTex = new Texture2D(ImageXSize, ImageYSize, TextureFormat.R8, false);
            }
            m_YImageTex.LoadRawTextureData(yData);
            m_YImageTex.Apply();
            if (m_UVImageTex == null)
            {
                m_UVImageTex = new Texture2D(ImageXSize / 2, ImageYSize / 2, TextureFormat.RG16, false);
            }
            m_UVImageTex.LoadRawTextureData(uvData);
            m_UVImageTex.Apply();
            YUVRawImage.GetComponent<RawImage>().material.SetTexture("_YTex", m_YImageTex);
            YUVRawImage.GetComponent<RawImage>().material.SetTexture("_UVTex", m_UVImageTex);
            Debug.Log("GetData done");
        }
    }
}
```

There are essentially two entry points. Click calls into Android and opens the camera.
GetData runs every time a message arrives from Android: it reads the Y and UV byte arrays, uploads them into the two textures, and sets those textures on the RawImage's material.

Note the formats: the Y texture has one byte per pixel, so its format is TextureFormat.R8 and its size is width × height. The UV texture has two bytes per pixel, so its format is TextureFormat.RG16, and it has a quarter as many pixels as the Y texture (half as many bytes). The size mismatch doesn't matter: in the shader both textures are sampled with the same normalized 0–1 UV coordinates, so the computation is unaffected.
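In isolation, the two allocations and the byte counts they imply (a fragment matching the script above, with width and height being the preview size):

```csharp
// Y: one byte per pixel -> width * height bytes in a full-resolution R8 texture.
var yTex = new Texture2D(width, height, TextureFormat.R8, false);
// VU: two bytes per 2x2 pixel block -> width * height / 2 bytes in a half-width,
// half-height RG16 texture (V lands in .r, U in .g, as the shader expects).
var uvTex = new Texture2D(width / 2, height / 2, TextureFormat.RG16, false);
```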

4 Final steps

Finally, compile the Android code into an AAR, drop it into the Unity project's Assets/Plugins/Android directory, and run — everything should work end to end. (Camera permission handling is not covered here.)

