flash – RED5 1.0 recording problem: NetStream.Buffer.Empty

I have set up RED5 1.0 Final on my Windows 8 system and I'm trying to get recording to work. The tutorials I've read say to buffer the data on the client (in Flash) and then close the connection once the buffer is empty.

The problem I'm running into is that as soon as I start recording, the buffer immediately reports that it is empty (NetStream.Buffer.Empty). I have gotten it to work once or twice, where the buffer really did fill up, but for some reason it stopped working that way.

I can see that even after I detach the camera from the NetStream, the client keeps sending data to the server, because the file on the server side keeps growing. My workaround has been to wait 60 seconds after recording stops before closing the connection.

One thing I noticed is that when there are no more packets to send, the file on the server side switches from mystream.ser to mystream.flv and stops growing. I'm considering writing some server-side code to wait for that event and then let the client know it can close the stream.

This is my first foray into ActionScript, so I may be doing something completely wrong here. Please let me know.

EDIT: Here is the client code:

<?xml version="1.0" encoding="utf-8"?>
<s:Application xmlns:fx="http://ns.adobe.com/mxml/2009"
               xmlns:s="library://ns.adobe.com/flex/spark"
               xmlns:mx="library://ns.adobe.com/flex/mx"
               xmlns:ns1="*"
               minWidth="955" minHeight="600" applicationComplete="init()" >

    <fx:Script>
        <![CDATA[

            import flash.display.DisplayObject;
            import flash.display.Sprite;
            import flash.events.MouseEvent;
            import flash.events.NetStatusEvent;
            import flash.events.StatusEvent;
            import flash.events.TimerEvent;
            import flash.media.Camera;
            import flash.media.H264Level;
            import flash.media.H264Profile;
            import flash.media.H264VideoStreamSettings;
            import flash.media.Microphone;
            import flash.media.Video;
            import flash.net.NetConnection;
            import flash.net.NetStream;
            import flash.utils.Timer;

            private var cam:Camera = Camera.getCamera();
            private var mic:Microphone = Microphone.getMicrophone();
            private var nc:NetConnection = new NetConnection();
            private var activeStream:NetStream;
            private var bufferCheckTimer:Timer;
            private var recordHalted:Boolean = false;


            protected function init(): void{
                recordButton.enabled = false;
                stopButton.enabled = false;
                nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);            
                nc.connect("rtmp://localhost/oflaDemo");
                nc.client = this;
            }

            public function onMetaData(info:Object):void {

                trace("playback called onMetaData");        
            }

            public function onBWDone(... rest) : void {
                // have to have this for an RTMP connection
                trace('onBWDone');
            }

            public function onBWCheck(... rest) : uint {
                trace('onBWCheck');
                //have to return something, so returning anything :)
                return 0;
            }


            protected function onNetStatus(event:NetStatusEvent):void{
                trace(event.info.code);
                if(nc.connected)
                {
                    SetupCameraAndMic();
                    recordButton.enabled = true;
                    stopButton.enabled = true;
                }           
            }

            protected function SetupCameraAndMic(): void{
                activeStream = new NetStream(nc);
                activeStream.bufferTime = 60;   
                activeStream.client = this;
                activeStream.addEventListener(NetStatusEvent.NET_STATUS, handleStreamStatus,false,0,true);


                var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();
                h264Settings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_2);
                activeStream.videoStreamSettings = h264Settings;

                cam.addEventListener(StatusEvent.STATUS, handleCameraStatus, false, 0, true);
                mic.addEventListener(StatusEvent.STATUS, handleMicrophoneStatus, false, 0, true);

                cam.setMode(320,240, 15);
                cam.setQuality(0, 80);
                cam.setKeyFrameInterval(7);

                mic.rate = 44;
                mic.gain = 75;
                mic.setSilenceLevel(0);
                mic.setUseEchoSuppression(true);

                activeStream.attachCamera(cam);
                activeStream.attachAudio(mic);
                videoContainer.attachCamera(cam);
            }

            private function handleCameraStatus(e:StatusEvent):void {
                trace("handleCameraStatus - " + e.code);
                switch(e.code) {
                    case 'Camera.muted':
                        // Show a message
                        break;
                    case 'Camera.Unmuted':              
                        //finishCamAndMicSetup();
                        break;
                }
            }


            private function handleMicrophoneStatus(e:StatusEvent):void {
                trace("handleMicrophoneStatus - " + e.code);
                switch(e.code) {
                    case 'Microphone.Muted':
                        // Show a message
                        break;
                    case 'Microphone.Unmuted':              
                        //finishCamAndMicSetup();
                        break;
                }
            }


            private function handleStreamStatus(e:NetStatusEvent):void {
                switch(e.info.code) {
                    case 'NetStream.Buffer.Empty':
                        trace("NetStream.Buffer.Empty");
                        break;
                    case 'NetStream.Buffer.Full':
                        trace("NetStream.Buffer.Full");
                        break;
                    case 'NetStream.Buffer.Flush':
                        trace("NetStream.Buffer.Flush");
                        break;
                }
            }

            protected function recordButton_clickHandler(event:MouseEvent):void
            {
                if(activeStream == null)
                {
                    SetupCameraAndMic();
                }
                if(activeStream != null){
                    var tempDate:Date = new Date();
                    var uniqueFileName:String = "RecordME_" + String(tempDate.getMinutes()) + String(tempDate.getMilliseconds());

                    bufferLabel.text = ""+ activeStream.bufferTime;
                    activeStream.publish(uniqueFileName, "record");
                    bufferCheckTimer = new Timer(100);
                    bufferCheckTimer.addEventListener(TimerEvent.TIMER, handleBufferCheck, false, 0, true);
                    bufferCheckTimer.start();

                }

            }

            private function handleBufferCheck(e:TimerEvent):void {
                if (activeStream != null) {
                    trace("Buffer: " + activeStream.bufferLength);
                    statusLabel.text = "Buffer: " + activeStream.bufferLength;

                    // After recording has been halted, wait for the send buffer
                    // to drain before closing the stream and tearing down the timer.
                    if (recordHalted && activeStream.bufferLength == 0) {
                        activeStream.close();
                        activeStream = null;

                        bufferCheckTimer.stop();
                        bufferCheckTimer.removeEventListener(TimerEvent.TIMER, handleBufferCheck);
                        bufferCheckTimer = null;

                        // OK - playback time
                        //doRecordingPlayback();
                    }

                    if (bufferCheckTimer != null) {
                        bufferCheckTimer.reset();
                        bufferCheckTimer.start();
                    }
                }
            }

            protected function stopButton_clickHandler(event:MouseEvent):void
            {

                activeStream.attachCamera(null);
                activeStream.attachAudio(null); 
                videoContainer.attachCamera(null);                      
                recordHalted = true;

            }

        ]]>
    </fx:Script>

    <fx:Declarations>
        <!-- Place non-visual elements (e.g., services, value objects) here -->
    </fx:Declarations>
    <mx:VideoDisplay id="videoContainer" x="158" y="53" width="640" height="480"
                    chromeColor="#3C2020" />
    <s:Button id="recordButton" x="396" y="546" label="Record"
              click="recordButton_clickHandler(event)"/>
    <s:Button id="stopButton" x="491" y="546" label="Stop Recording"
              click="stopButton_clickHandler(event)"/>
    <s:Label id="statusLabel" x="158" y="555" width="207"/>
    <s:Label x="14" y="408" text="Buffer Set to:"/>
    <s:Label id="bufferLabel" x="91" y="408" text="0"/>
</s:Application>

Thanks

Best answer: I don't have an RTMP server running right now, so I'm just commenting on what I see in the code.

I think the advice you got about buffering the content when publishing (recording) a stream is probably not a good idea in the first place. Perhaps the tutorial wasn't about publishing but about subscribing to an existing stream, in which case buffering is a good idea.

You are setting bufferTime to 60 seconds. The docs say you should set bufferTime to 0 for live recording, i.e. you want the data to be sent as soon as the camera/microphone produce it.
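In SetupCameraAndMic() above that would be a one-line change (just a sketch against the code in the question, not tested against your Red5 setup):

activeStream = new NetStream(nc);
activeStream.bufferTime = 0; // 0 = live: send camera/mic data as soon as it is captured
activeStream.client = this;
activeStream.addEventListener(NetStatusEvent.NET_STATUS, handleStreamStatus, false, 0, true);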

Next is the Timer you're using. It seems to be checking the buffer length in order to detect that recording has stopped. There are really only two times when recording stops:

> when the user clicks the "Stop" button and your code stops it
> whenever the server or something else causes it to stop (network problems, etc.)

Instead of using a timer to check bufferLength, I'd suggest using your NetStatusEvent handler method (handleStreamStatus()) to check for the message "NetStream.Record.Stop". That way your code can detect when the recording has stopped, not only when the user clicks "Stop".
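A rough sketch of what handleStreamStatus() could look like with that check added (NetStream.Record.Start/Stop/Failed are standard NetStatusEvent info codes; the cleanup logic here is only an illustration, not tested against Red5):

private function handleStreamStatus(e:NetStatusEvent):void {
    trace(e.info.code);
    switch (e.info.code) {
        case 'NetStream.Record.Start':
            // the server has started writing the recording
            break;
        case 'NetStream.Record.Stop':
            // the server has finished writing the recording - safe to clean up
            activeStream.close();
            activeStream = null;
            break;
        case 'NetStream.Record.Failed':
            // recording could not be started or was aborted
            break;
    }
}

With that in place, stopButton_clickHandler() only needs to detach the camera and microphone (as it already does); the actual close happens once the server confirms with NetStream.Record.Stop, and the bufferLength-polling timer can go away entirely.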

The timer may well be the cause of your problem. Even though you set a large bufferTime value, it may not work, the Red5 server may behave differently, or it may be overridden by a server-side setting. In any case, the point is: don't use bufferLength to detect whether recording has stopped.

There are a bunch of useful messages dispatched with NetStatusEvent; I suggest reading through them to see whether any of them are useful in your scenario. They are very reliable and seem to cover most situations that can come up.

The last thing I noticed (not the cause of the problem, but worth fixing): you enable echo suppression on the microphone, but that has no effect unless you get an enhanced microphone:

var mic:Microphone = Microphone.getEnhancedMicrophone();