I have a website (Angular 15 & Ionic 6) where users upload a video, I make some changes to it on a canvas, capture the canvas stream, and output the result to a file.
I first create a MediaStream from the video canvas (where my changes are drawn) via captureStream, plus the audio track from the video file (via captureStream or an AudioContext). I then capture that MediaStream with a MediaRecorder and write out the data onstop.
My current code works when uploading files (many different file types) from a Windows desktop in Chrome, but when I select a video on iOS, if I add the audio track to the MediaStream, then the Blob I capture with the MediaRecorder is 0 bytes. Can anyone help me figure out what I need to do to add the audio track to the stream on iOS Safari?
Here's my HTML for the source video and the canvas I capture. Both are display: none, since the user doesn't see the processing until later:
<video id="uploadedVideo" #uploadedVideo autoplay webkit-playsinline playsinline></video>
<canvas id="canvasVideo" #canvasVideo></canvas>
This is how I access those elements:
@ViewChild('uploadedVideo', { static: false }) videoElementRef: ElementRef<HTMLVideoElement>;
video: HTMLVideoElement;
@ViewChild('fileUpload', { static: false }) fileUpload: ElementRef;
@ViewChild('canvasVideo', { static: false }) videoCanvasElementRef: ElementRef<HTMLCanvasElement>;
videoCanvas: HTMLCanvasElement;
videoCanvasCtx: CanvasRenderingContext2D;
I set the video's gain to 0 on init, because I don't want any sound to play while the video can't be seen:
silenceVideoAudio() {
  if (this.videoConnectedToAudioContext) {
    return;
  }
  this.audioContext = new AudioContext();
  this.sourceNode = this.audioContext.createMediaElementSource(this.video);
  const gainNode = this.audioContext.createGain();
  gainNode.gain.value = 0; // Ensure audio is silent
  this.sourceNode.connect(gainNode);
  gainNode.connect(this.audioContext.destination);
  this.videoConnectedToAudioContext = true;
}
Once a file is selected, I start the processing and set the MediaRecorder to stop when the video ends. Here's onVideoFileSelected:
onVideoFileSelected(file: File) {
  this.video.src = URL.createObjectURL(file);
  this.silenceVideoAudio();
  this.video.onloadeddata = async () => {
    await this.startVideoProcessing();
  };
  this.video.onended = async () => {
    this.recorder.stop();
  };
}
Here's startVideoProcessing, where I get the audio track, set up the MediaStream and the MediaRecorder, and process the chunks when it's finished. There are no errors on Safari, just 0 bytes:
async startVideoProcessing() {
  // Do some other setup that is unrelated
  let stream: MediaStream;
  if (this.isSafari) {
    stream = new MediaStream([...this.videoCanvas.captureStream(30).getVideoTracks()]);
  } else {
    stream = new MediaStream([...this.videoCanvas.captureStream(30).getVideoTracks(), this.getAudioTrack()]);
  }
  const options = {
    audioBitsPerSecond: 128000,
    videoBitsPerSecond: 2500000,
    mimeType: this.mimeType,
  };
  this.recorder = new MediaRecorder(stream, options);
  this.recorder.ondataavailable = (event) => {
    if (event.data && event.data.size > 0) {
      this.chunks.push(event.data);
    }
  };
  this.recorder.start();
  this.recorder.onerror = (event) => {
    console.error(event);
    throw new Error(event.toString());
  };
  this.recorder.onstop = () => {
    const videoBlob = new Blob(this.chunks, { type: this.mimeType });
    const videoFile = new File([videoBlob], this.mimeType === 'video/mp4' ? 'video-green-screen.mp4' : 'video-green-screen.webm', { type: this.mimeType });
    this.fileSelected.emit(videoFile);
  };
}
As you can see in the isSafari condition above, I'm currently not adding any audio track on Safari, because if I add it the way the else branch does (stream = new MediaStream([...this.videoCanvas.captureStream(30).getVideoTracks(), this.getAudioTrack()]), using the getAudioTrack method to get the track), then this.chunks is always empty in ondataavailable and when the recorder stops. If I don't try to add the audio track, the output has the correct bytes. Adding the audio track like this works on every platform except Safari.
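As an aside on the this.mimeType used above: the recordable container differs per browser (Safari generally produces video/mp4, Chrome and Firefox WebM), so it is safest to probe it at runtime with MediaRecorder.isTypeSupported. A minimal sketch; the function name, candidate list, and preference order are my own assumptions, not from the code above, and the injectable predicate exists only so the selection logic can run outside a browser:

```typescript
// Sketch: pick the first container this browser's MediaRecorder can produce.
// The default predicate uses MediaRecorder.isTypeSupported (browser-only
// global, looked up via globalThis so this also compiles outside the DOM).
function pickRecorderMimeType(
  isSupported: (type: string) => boolean = (type) => {
    const MR = (globalThis as any).MediaRecorder;
    return !!MR && MR.isTypeSupported(type);
  }
): string {
  // Preference order is an assumption; adjust to taste.
  const candidates = ['video/mp4', 'video/webm;codecs=vp9', 'video/webm'];
  return candidates.find(isSupported) ?? '';
}
```

The naming logic in onstop above (mp4 vs. webm file extension) can then key off whatever this returns.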
And here's getAudioTrack:
getAudioTrack() {
  if ((this.video as any).captureStream) {
    this.videoAudioTracks = (this.video as any).captureStream().getAudioTracks()[0];
  } else if ((this.video as any).mozCaptureStream) {
    this.videoAudioTracks = (this.video as any).mozCaptureStream().getAudioTracks()[0];
  } else {
    const destination = this.audioContext.createMediaStreamDestination();
    this.sourceNode.connect(destination);
    this.videoAudioTracks = destination.stream.getAudioTracks()[0];
  }
  return this.videoAudioTracks;
}
I'm testing on Safari iOS 16.2. Can anyone help me change these methods so that adding the audio track to the MediaStream doesn't leave all of the MediaRecorder data empty?

The problem is with the AudioContext. Per the Web Audio spec (https://webaudio.github.io/web-audio-api/#dom-baseaudiocontext-onstatechange):
A newly-created AudioContext will always begin in the suspended state, and a state change event will be fired whenever the state changes to a different state. This event is fired before the complete event is fired.
Safari will not transition this state to running without user interaction (selecting a file is not direct enough). In some cases Safari also blocks playback without user interaction. Read these two sections of this page for the details of when and how Safari handles video playback with and without audio, and with and without user interaction:
https://developer.apple.com/documentation/webkit/delivering_video_content_for_safari/#3030259
https://developer.apple.com/documentation/webkit/delivering_video_content_for_safari/#3030251
In this case, Safari needs user interaction to move the new AudioContext to running. The call that resumes the context or plays the video must happen in the call stack of a button or other element the user interacted with. So instead of doing everything in onVideoFileSelected, in this code the processing happens when the user clicks a button to process the video (at least on mobile or Safari):
videoFile: File;

onVideoFileSelected(file: File) {
  this.videoFile = file;
}

startVideoProcessing() {
  this.video.src = URL.createObjectURL(this.videoFile);
  this.silenceVideoAudio();
  this.video.onloadeddata = async () => {
    await this.videoProcessing();
    this.registerVideoFrameCallback();
  };
  this.video.onended = async () => {
    this.recorder.stop();
  };
}