The steps to play a video in Unity are as follows:

Unity 5.6 substantially extended the VideoPlayer, which makes video playback fairly straightforward. I did some research and experimentation for a project requirement and ran into quite a few pitfalls; a quick Google/Baidu search shows these problems are indeed common. Some of the simpler ones are listed below:

Before walking through the code, let me first explain what TextureView, SurfaceTexture and OpenGL
ES actually are, and how I use them to display a video.

Note: I recently worked on a preliminary research project for live streaming,
so I am recording how the streaming was implemented and how some problems were solved along the way, using the Android implementation as the example.

1. Drag the video to be played into the Project. (Note: the video formats Unity generally supports are .mov, .mpg,
.mpeg, .mp4, .avi and .asf.)

1) Playback has no sound

TextureView
is, as the name suggests, simply a View control that extends View. The official documentation explains it like this:
A TextureView can be used to display a content stream. Such a content
stream can for instance be a video or an OpenGL scene. The content
stream can come from the application’s process as well as a remote
process.

It can display a content stream, such as a video stream or an OpenGL-rendered scene. The stream can come from the local application process or from a remote process, which sounds a bit convoluted; my understanding is that it can, for example, be either a local video stream or a network video stream.
Note that TextureView
is rendered with hardware acceleration, much like hardware versus software video decoding: one relies on the GPU, the other on the CPU.
So how do we actually use this TextureView to play a video?
This is where SurfaceTexture comes in. From the names of the two classes we can tell that TextureView is above all a View, while SurfaceTexture
is above all a Texture. Its official documentation says:
Captures frames from an image stream as an OpenGL ES texture. The image
stream may come from either camera preview or video decode.

In other words, it can capture frames from an image stream and use them as OpenGL textures. The image stream mainly comes from the camera preview or from video decoding. (I suspect this property makes it useful for a lot of things.)
At this point we have a texture, so OpenGL ES can get to work: it binds the texture and draws it frame by frame onto the TextureView, which gives us the video image we see. (For details on SurfaceTexture and TextureView you can refer to their documentation.)
Enough talk, time for some code. Good code should read like good literature. Not that my code is that beautiful, it is just something to aim for.

Project structure

2. Add a RawImage to the scene. (Image renders with a Sprite, whereas RawImage renders with a Texture.)

2) Controlling playback progress with a Slider

Code

Let's start with the MainActivity class:

public class MainActivity extends AppCompatActivity implements TextureView.SurfaceTextureListener,
        MediaPlayer.OnPreparedListener{
    /** Path of the local video file */
    public String videoPath = Environment.getExternalStorageDirectory().getPath()+"/aoa.mkv";
    private TextureView textureView;
    private MediaPlayer mediaPlayer;
    /**
    * The configuration before the video is drawn happens in this object's class.
    * The actual drawing work is done in its subclass VideoTextureSurfaceRenderer.
    */
    private TextureSurfaceRenderer videoRenderer;
    private int surfaceWidth;
    private int surfaceHeight;
    private Surface surface;


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        textureView = (TextureView) findViewById(R.id.id_textureview);
        //Register a SurfaceTextureListener to receive SurfaceTexture callbacks
        textureView.setSurfaceTextureListener(this);

    }
    /**
    * Entry point for playing the video; called when the SurfaceTexture becomes available.
    */
    private void playVideo() {
        if (mediaPlayer == null) {
            videoRenderer = new VideoTextureSurfaceRenderer(this, textureView.getSurfaceTexture(), surfaceWidth, surfaceHeight);
            surface = new Surface(videoRenderer.getSurfaceTexture());
            initMediaPlayer();
        }
    }

    private void initMediaPlayer() {
        this.mediaPlayer = new MediaPlayer();
        try {
            mediaPlayer.setDataSource(videoPath);
            mediaPlayer.setSurface(surface);
            mediaPlayer.prepareAsync();
            mediaPlayer.setOnPreparedListener(this);
            mediaPlayer.setLooping(true);
        } catch (IllegalArgumentException | SecurityException | IllegalStateException | IOException e1) {
            e1.printStackTrace();
        }
    }
    @Override
    public void onPrepared(MediaPlayer mp) {
        try {
            if (mp != null) {
                mp.start(); //the video starts playing here
            }
        } catch (IllegalStateException e) {
            e.printStackTrace();
        }
    }


    @Override
    protected void onResume() {
        super.onResume();
        if (textureView.isAvailable()) {
            playVideo();
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        if (videoRenderer != null) {
            videoRenderer.onPause();  //remember to stop the video drawing thread
        }
        if (mediaPlayer != null) {
            mediaPlayer.release();
            mediaPlayer =null;
        }
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        surfaceWidth = width;
        surfaceHeight = height;
        playVideo();
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {

    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        return false;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {

    }

}

That is the entry class of the program. I won't go into how MediaPlayer plays the video source here; there is actually a lot more to it, and you can look into it yourself. One thing worth pointing out: the parameter passed to MediaPlayer.setSurface(param) is usually a SurfaceView.SurfaceHolder, whereas here I use a Surface directly
(see the documentation on Surface); that is where this playback approach differs from the others. I will stop here for now and write about the core drawing work later when I have time. If anything above is wrong, I hope you will point it out, many thanks!
The next article is already written: TextureView+SurfaceTexture+OpenGL
ES video playback (2)

  • Unity texture plugin and video capture (video source)
    VideoSourceCamera
  • Microphone capture (audio source)
    AudioSourceMIC
  • Video encoding
    VideoEncoder
  • Audio encoding
    AudioEncoder
  • FLV muxing
    MuxerFLV
  • HTTP stream upload (publishing)
    PublisherHttp
  • Stream playback (preview)
    play
  • OpenGL image processing

3. Add a VideoPlayer component under the RawImage, and assign the video to the VideoPlayer by dragging it onto the Video
Clip slot.

3) Video screenshots (Texture -> Texture2D)

Starting from this article I will describe the implementation details of these components and how the dependencies between them are handled.

4. Create a script PlayVodeoOnUGUI; the core code is rawImage.texture =
videoPlayer.texture, i.e. assign the VideoPlayer's texture to the RawImage and you will see the video being played.
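
A minimal sketch of what such a script could look like (the field names here are illustrative; both references are assigned in the Inspector):

    using UnityEngine;
    using UnityEngine.UI;
    using UnityEngine.Video;

    // Sketch: copy the VideoPlayer's output texture to a RawImage every frame.
    public class PlayVodeoOnUGUI : MonoBehaviour
    {
        public RawImage rawImage;       // assigned in the Inspector
        public VideoPlayer videoPlayer; // assigned in the Inspector

        void Update()
        {
            // The texture only becomes valid once the player has prepared and started playing.
            if (videoPlayer.texture != null)
                rawImage.texture = videoPlayer.texture;
        }
    }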

4) Firing an event when the video finishes

(1) — Unity texture plugin

Our live-streaming project serves Unity, and Unity is a cross-platform game engine whose lower layers use DirectX,
OpenGL or OpenGL ES depending on the platform, so the graphics plugin has to be implemented separately for each platform.
(Unity's native graphics plugin documentation)
https://docs.unity3d.com/Manual/NativePluginInterface.html
For live streaming on the Android platform, the main job of the Unity graphics plugin is render-thread notification,
because camera capture, texture creation,
image processing (shaders) and handing the video texture to the encoder all need to run on Unity's render thread.
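
As a rough illustration of this render-thread notification, here is a minimal C# sketch on the Unity side. It is only a sketch: NativeLivePlugin and its two exported functions are hypothetical names standing in for the real plugin interface.

    using System;
    using System.Runtime.InteropServices;
    using UnityEngine;

    // Sketch: create a texture in Unity, hand its native ID to the (hypothetical) plugin,
    // and let the plugin do its GL work on Unity's render thread via GL.IssuePluginEvent.
    public class LivePluginBridge : MonoBehaviour
    {
        [DllImport("NativeLivePlugin")] static extern void SetUnityTexture(IntPtr texPtr, int width, int height);
        [DllImport("NativeLivePlugin")] static extern IntPtr GetRenderEventFunc();

        Texture2D streamTexture;

        void Start()
        {
            streamTexture = new Texture2D(1280, 720, TextureFormat.RGBA32, false);
            // Pass the native texture ID to the plugin once.
            SetUnityTexture(streamTexture.GetNativeTexturePtr(), streamTexture.width, streamTexture.height);
        }

        void Update()
        {
            // Unity invokes the plugin's render event function on the render thread,
            // where it is safe to touch GL state and update the texture with the latest frame.
            GL.IssuePluginEvent(GetRenderEventFunc(), 1);
        }
    }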

  • Unity creates the texture and passes the texture ID to the live-streaming plugin.

  • Open the camera device and get the capture texture ready:
    mCameraGLTexture =
    new GLTexture(width, height, GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
    GLES20.GL_RGBA);
    note: the camera texture is a special texture type, created with the
    GLES11Ext.GL_TEXTURE_EXTERNAL_OES parameter

  • A callback notifies us when each frame of data is ready
    public void onFrameAvailable(final SurfaceTexture surfaceTexture)
    {
    //push the image from the capture thread to the render thread here
    getProcessor().append (new Task() {
    @Override
    public void run() {
    surfaceTexture.updateTexImage();
    }
    });
    }

    The camera texture also needs a special declaration in the fragment shader:

      #extension GL_OES_EGL_image_external : require
      precision mediump float;
      uniform samplerExternalOES uTexture0;
      varying vec2 texCoordinate;
      void main(){
          gl_FragColor = texture2D(uTexture0, texCoordinate);
      }
    
  • Write the camera texture into Unity's texture.
    There are two ways to copy one texture into another:

    • Via glReadPixels, but this causes huge memory copies and heavy CPU load.

    • Render to texture:
      mTextureCanvas = new
      GLRenderTexture(mGLTexture); //declare the render texture

        void renderCamera2Texture()
        {
            mTextureCanvas.begin();
            cameraDrawObject.draw();
            mTextureCanvas.end();
        }
      

      The implementation of GLRenderTexture is as follows:
        GLRenderTexture(GLTexture tex)
        {
            mTex = tex;
            int fboTex = tex.getTextureID();
            GLES20.glGenFramebuffers(1, bufferObjects, 0);
            GLHelper.checkGlError("glGenFramebuffers");
            fobID = bufferObjects[0];

            //create the render buffer
            GLES20.glGenRenderbuffers(1, bufferObjects, 0);
            renderBufferId = bufferObjects[0];
            //bind the frame buffer
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fobID);
            GLHelper.checkGlError("glBindFramebuffer");
            //Bind render buffer and define buffer dimension
            GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, renderBufferId);
            GLHelper.checkGlError("glBindRenderbuffer");
            GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, tex.getWidth(), tex.getHeight());
            GLHelper.checkGlError("glRenderbufferStorage");
            //attach the texture as the framebuffer's color attachment
            GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, fboTex, 0);
            GLHelper.checkGlError("glFramebufferTexture2D");
            //attach the depth renderbuffer
            GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT, GLES20.GL_RENDERBUFFER, renderBufferId);
            GLHelper.checkGlError("glFramebufferRenderbuffer");
            //we are done, reset
            GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, 0);
            GLHelper.checkGlError("glBindRenderbuffer");
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
            GLHelper.checkGlError("glBindFramebuffer");
        }
      
        void begin()
        {
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fobID);
            GLHelper.checkGlError("glBindFramebuffer");
            GLES20.glViewport(0, 0, mTex.getWidth(), mTex.getHeight());
            GLHelper.checkGlError("glViewport");
        }
      
        void end()
        {
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        }
      
  • Beauty filter
    Real-time beauty effects (skin whitening and smoothing) are implemented with shaders.
    (The principles behind the beauty filter can be found at)
    http://meituplus.com/?p=101
    (more real-time shader processing can be found at)
    https://github.com/wuhaoyu1990/MagicCamera

None of these are major problems; the solutions to the issues listed above are given in the highlighted text further down. Let me first introduce how the Video
Player is used, and then come back to those problems.

 

(1) Create a Video Player. You can add the Video Player component under a UI object, or simply right-click → Video → Video Player; after adding it you will see the component shown in the figure below.

(Figure: the Video Player component in the Inspector)

The main parameters to note: Source has two modes, Clip and URL; Clip plays directly from a VideoClip, while URL plays from a URL. Render Mode is the rendering mode and can be Camera, Material Override, Render Texture and so on; for UI playback choose Render
Texture, which is the mode used in this article. Audio Output Mode has several options: None, Direct (not tried) and Audio Source; this article uses Audio Source. In that mode you only need to drag an AudioSource component into the Audio Source slot of the VideoPlayer shown above, with no further setup. Sometimes, however, after dragging it in, the Audio Source slot on the VideoPlayer shows up empty and no sound plays, so it is usually better to add it in code, as shown below:

 

      //add the components in code
        videoPlayer = gameObject.AddComponent<VideoPlayer>();
        //videoPlayer = gameObject.GetComponent<VideoPlayer>();
        audioSource = gameObject.AddComponent<AudioSource>();
        //audioSource = gameObject.GetComponent<AudioSource>();
        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;
        audioSource.Pause();
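
Note that in Audio Source mode the VideoPlayer also has to be told which AudioSource to use; the two lines below come from the full script at the end of this article:

        videoPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;
        videoPlayer.SetTargetAudioSource(0, audioSource);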

 

(2) Controlling video playback is similar to controlling audio or animation playback; the VideoPlayer has Play/Pause and similar methods. See the complete code at the end for details.

         
The video-finished event is loopPointReached (the name here follows someone else's writeup; it is not really a "playback finished" event). As the name implies, it fires when playback reaches the video's loop point: when the VideoPlayer's isLooping property is true (i.e. the video loops), the event is invoked as the video reaches its end. In my tests, when the video is not looping the event was not called at the end, so to use it you can set the video to loop and stop playback inside the handler registered for loopPointReached.
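
A minimal sketch of that workaround, assuming a videoPlayer field like the one created in code above:

    using UnityEngine;
    using UnityEngine.Video;

    // Sketch: treat loopPointReached as a "finished" callback by letting the video loop
    // and stopping it ourselves when the loop point is reached.
    public class VideoFinishWatcher : MonoBehaviour
    {
        public VideoPlayer videoPlayer;

        void Start()
        {
            videoPlayer.isLooping = true;               // keep looping so the event fires at the end
            videoPlayer.loopPointReached += OnReachEnd;
        }

        void OnReachEnd(VideoPlayer player)
        {
            player.Stop();                              // stop at the "end" instead of looping on
            Debug.Log("video finished");
        }
    }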

(3) About the UI used for video playback: when Render Mode is set to Render Texture, you need to assign a Target
Texture.

      
1) In the Project panel, Create → Render Texture, and drag the new Render Texture onto the corresponding slot of the Video Player.

      
2) In the Hierarchy panel create UI → Raw Image, and drag the Render Texture from the previous step onto the RawImage's Texture slot.
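
Both steps can also be done in code at runtime; a small sketch (the 1920x1080 size is just an example, and videoPlayer/rawImage are the same fields used earlier in this article):

    RenderTexture rt = new RenderTexture(1920, 1080, 0);
    videoPlayer.renderMode = VideoRenderMode.RenderTexture;
    videoPlayer.targetTexture = rt;   // step 1: assign the target texture
    rawImage.texture = rt;            // step 2: show it on the RawImage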

      
Actually, none of that is strictly necessary: the VideoPlayer has a texture property, and you can simply assign it to the RawImage's texture in Update, like this:

rawImage.texture = videoPlayer.texture;

      For video screenshots you can save the image through videoPlayer.texture, but the Texture must first be converted to a Texture2D. Although Texture2D inherits from Texture, you cannot simply cast back from the base type; the conversion and image-saving code is as follows:

   private void SaveRenderTextureToPNG(Texture inputTex, string file)
    {
        RenderTexture temp = RenderTexture.GetTemporary(inputTex.width, inputTex.height, 0, RenderTextureFormat.ARGB32);
        Graphics.Blit(inputTex, temp);
        Texture2D tex2D = GetRTPixels(temp);
        RenderTexture.ReleaseTemporary(temp);
        File.WriteAllBytes(file, tex2D.EncodeToPNG());
    }

    private Texture2D GetRTPixels(RenderTexture rt)
    {
        RenderTexture currentActiveRT = RenderTexture.active;
        RenderTexture.active = rt;
        Texture2D tex = new Texture2D(rt.width, rt.height);
        tex.ReadPixels(new Rect(0, 0, tex.width, tex.height), 0, 0);
        RenderTexture.active = currentActiveRT;
        return tex;
    }

 

Finally, a word about controlling playback progress with a Slider.

Driving playback through a Slider has two sides: on the one hand, Update keeps assigning videoPlayer.time
to the slider; on the other hand, the slider's value has to be written back to time. Doing the latter through the slider's OnValueChanged(float
value)
method makes the two fight each other and causes problems. Instead you can use the UI BeginDrag and EndDrag
events: when BeginDrag fires, stop writing to the slider; when EndDrag fires, resume. As shown in the figure below.

(Figure: EventTrigger on the Slider with BeginDrag and EndDrag entries)
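
Since the figure only shows the Inspector setup, here is a sketch of the equivalent wiring in code, adding an EventTrigger to the Slider; OnPointerDown/OnPointerUp are the handlers from the full script below:

    using UnityEngine.EventSystems;

    // Sketch: register BeginDrag/EndDrag on the slider through an EventTrigger,
    // equivalent to configuring the entries in the Inspector.
    EventTrigger trigger = videoSlider.gameObject.AddComponent<EventTrigger>();

    var beginDrag = new EventTrigger.Entry { eventID = EventTriggerType.BeginDrag };
    beginDrag.callback.AddListener(_ => OnPointerDown());
    trigger.triggers.Add(beginDrag);

    var endDrag = new EventTrigger.Entry { eventID = EventTriggerType.EndDrag };
    endDrag.callback.AddListener(_ => OnPointerUp());
    trigger.triggers.Add(endDrag);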

Full code

using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

public class VideoController : MonoBehaviour {
    public GameObject screen;
    public Text videoLength;
    public Text currentLength;
    public Slider volumeSlider;
    public Slider videoSlider;

    private string video1Url;
    private string video2Url;
    private VideoPlayer videoPlayer;
    private AudioSource audioSource;
    private RawImage videoScreen;
    private float lastCountTime = 0;
    private float totalPlayTime = 0;
    private float totalVideoLength = 0;

    private bool b_firstVideo = true;
    private bool b_adjustVideo = false;
    private bool b_skip = false;
    private bool b_capture = false;

    private string imageDir =@"D:\test\Test\bwadmRe";

    // Use this for initialization
    void Start () {
        videoScreen = screen.GetComponent<RawImage>();
        string dir = Path.Combine(Application.streamingAssetsPath,"Test");
        video1Url = Path.Combine(dir, "01.mp4");
        video2Url = Path.Combine(dir, "02.mp4");

        //add the components in code
        videoPlayer = gameObject.AddComponent<VideoPlayer>();
        //videoPlayer = gameObject.GetComponent<VideoPlayer>();
        audioSource = gameObject.AddComponent<AudioSource>();
        //audioSource = gameObject.GetComponent<AudioSource>();
        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;
        audioSource.Pause();

        videoPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;
        videoPlayer.SetTargetAudioSource(0, audioSource);

        VideoInfoInit(video1Url);
        videoPlayer.loopPointReached += OnFinish;
    }

    #region private method
    private void VideoInfoInit(string url)
    {
        videoPlayer.source = VideoSource.Url;
        videoPlayer.url = url;        

        videoPlayer.prepareCompleted += OnPrepared;
        videoPlayer.isLooping = true;

        videoPlayer.Prepare();
    }

    private void OnPrepared(VideoPlayer player)
    {
        player.Play();
        totalVideoLength = videoPlayer.frameCount / videoPlayer.frameRate;
        videoSlider.maxValue = totalVideoLength;
        videoLength.text = FloatToTime(totalVideoLength);

        lastCountTime = 0;
        totalPlayTime = 0;
    }

    private string FloatToTime(float time)
    {
        int hour = (int)time / 3600;
        int min = (int)(time - hour * 3600) / 60;
        int sec = (int)(time - hour * 3600) % 60;
        string text = string.Format("{0:D2}:{1:D2}:{2:D2}", hour, min, sec);
        return text;
    }

    private IEnumerator PlayTime(int count)
    {
        for(int i=0;i<count;i++)
        {
            yield return null;
        }
        videoSlider.value = (float)videoPlayer.time;
        //videoSlider.value = videoSlider.maxValue * (time / totalVideoLength);
    }

    private void OnFinish(VideoPlayer player)
    {
        Debug.Log("finished");        
    }

    private void SaveRenderTextureToPNG(Texture inputTex, string file)
    {
        RenderTexture temp = RenderTexture.GetTemporary(inputTex.width, inputTex.height, 0, RenderTextureFormat.ARGB32);
        Graphics.Blit(inputTex, temp);
        Texture2D tex2D = GetRTPixels(temp);
        RenderTexture.ReleaseTemporary(temp);
        File.WriteAllBytes(file, tex2D.EncodeToPNG());
    }

    private Texture2D GetRTPixels(RenderTexture rt)
    {
        RenderTexture currentActiveRT = RenderTexture.active;
        RenderTexture.active = rt;
        Texture2D tex = new Texture2D(rt.width, rt.height);
        tex.ReadPixels(new Rect(0, 0, tex.width, tex.height), 0, 0);
        RenderTexture.active = currentActiveRT;
        return tex;
    }
    #endregion

    #region public method
    //start playback
    public void OnStart()
    {
        videoPlayer.Play();
    }
    //pause
    public void OnPause()
    {
        videoPlayer.Pause();
    }
    //next video
    public void OnNext()
    {
        string nextUrl = b_firstVideo ? video2Url : video1Url;
        b_firstVideo = !b_firstVideo;

        videoSlider.value = 0;
        VideoInfoInit(nextUrl);
    }
    //volume control
    public void OnVolumeChanged(float value)
    {
        audioSource.volume = value;
    }
    //video progress control
    public void OnVideoChanged(float value)
    {
        //videoPlayer.time = value;
        //print(value);
        //print(value);
    }
    public void OnPointerDown()
    {
        b_adjustVideo = true;
        b_skip = true;
        videoPlayer.Pause();
        //OnVideoChanged();
        //print("down");
    }
    public void OnPointerUp()
    {
        videoPlayer.time = videoSlider.value;

        videoPlayer.Play();
        b_adjustVideo = false;  
        //print("up");
    }
    public void OnCapture()
    {
        b_capture = true;
    }
    #endregion

    // Update is called once per frame
    void Update () {
        if (videoPlayer.isPlaying)
        {            
            videoScreen.texture = videoPlayer.texture;
            float time = (float)videoPlayer.time;
            currentLength.text = FloatToTime(time);

            if(b_capture)
            {
                string name = DateTime.Now.Minute.ToString() + "_" + DateTime.Now.Second.ToString() + ".png";
                SaveRenderTextureToPNG(videoPlayer.texture,Path.Combine(imageDir,name));                
                b_capture = false;
            }

            if(!b_adjustVideo)
            {
                totalPlayTime += Time.deltaTime;
                if (!b_skip)
                {
                    videoSlider.value = (float)videoPlayer.time;
                    lastCountTime = totalPlayTime;
                }                
                if (totalPlayTime - lastCountTime >= 0.8f)
                {
                    b_skip = false;
                }
            }
            //StartCoroutine(PlayTime(15));   

        }
    }
}

 

 

 

If you use the AVPro Video plugin instead, none of these problems arise.
