I am streaming audio via websocket from IBM Watson TTS. I already have the audio playing from the stream with NAudio, but I would like to swap NAudio out for the built-in AudioClip (NAudio is great, but I would like to be compatible with Oculus Lipsync and other plugins that use an AudioClip). My current NAudio solution is below.
The websocket handler appends the received bytes to the MemoryStream:
private MemoryStream ms = new MemoryStream();

websocket.OnMessage += (bytes) =>
{
    // Append the new bytes at the end of the stream, then restore the
    // position so the reader continues from where it left off.
    var pos = ms.Position;
    ms.Position = ms.Length;
    ms.Write(bytes, 0, bytes.Length);
    ms.Position = pos;

    if (!isPlaying)
    {
        StartCoroutine(PlayAudioStream());
    }
};
Playback is currently handled by NAudio; I would like to shift it to an AudioClip if possible:
private IEnumerator PlayAudioStream()
{
    // Watson TTS delivers raw 16-bit mono PCM at 22050 Hz.
    using (var blockAlignedStream = new BlockAlignReductionStream(
        WaveFormatConversionStream.CreatePcmStream(
            new RawSourceWaveStream(ms, new WaveFormat(22050, 16, 1)))))
    {
        var aggregator = new SampleAggregator(blockAlignedStream.ToSampleProvider());
        aggregator.NotificationCount = blockAlignedStream.WaveFormat.SampleRate / 50;

        using (var wo = new WaveOutEvent())
        {
            isPlaying = true;
            wo.Init(aggregator);
            wo.Play();
            while (wo.PlaybackState == PlaybackState.Playing) //&& !disconnected
            {
                yield return new WaitForEndOfFrame();
            }
            aggregator.Reset();
            aggregator = null;
        }
    }
}
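For the AudioClip route, here is a minimal sketch, assuming the websocket delivers raw 16-bit mono PCM at 22050 Hz (matching the WaveFormat above); the class and method names are placeholders, not Watson or Oculus API. Decode the incoming bytes into floats and let a streaming clip created with AudioClip.Create pull them through its PCMReaderCallback:
using System.Collections.Generic;
using UnityEngine;

public class StreamedClipPlayer : MonoBehaviour
{
    const int SampleRate = 22050;
    readonly Queue<float> sampleQueue = new Queue<float>();
    readonly object queueLock = new object();

    void Start()
    {
        var source = gameObject.AddComponent<AudioSource>();
        // lengthSamples is only the size of the rolling buffer, not the total
        // stream length; with stream = true Unity keeps invoking OnAudioRead
        // for as long as the source plays.
        source.clip = AudioClip.Create("WatsonTTS", SampleRate * 2, 1, SampleRate, true, OnAudioRead);
        source.loop = true;
        source.Play();
    }

    // Call this from websocket.OnMessage with the raw PCM bytes.
    public void EnqueuePcm(byte[] bytes)
    {
        lock (queueLock)
        {
            for (int i = 0; i + 1 < bytes.Length; i += 2)
            {
                // Little-endian 16-bit sample scaled to [-1, 1].
                short sample = (short)(bytes[i] | (bytes[i + 1] << 8));
                sampleQueue.Enqueue(sample / 32768f);
            }
        }
    }

    // Runs on the audio thread whenever Unity needs more samples;
    // pads with silence while the websocket is catching up.
    void OnAudioRead(float[] data)
    {
        lock (queueLock)
        {
            for (int i = 0; i < data.Length; i++)
                data[i] = sampleQueue.Count > 0 ? sampleQueue.Dequeue() : 0f;
        }
    }
}
Because the AudioSource now plays a real AudioClip, plugins such as Oculus Lipsync that attach to the AudioSource should be able to analyze the samples as they play.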
I am trying to capture video from the web camera using Unity and HoloLens. I found this example on the Unity page here.
I am pasting the code below. The light on the cam turns on, however it doesn't record. VideoCapture.CreateAsync doesn't create a VideoCapture, so the delegate there is never executed.
I saw this thread, however that setting was already on: in the Player Settings the WebCam and Microphone capabilities are enabled.
What could be the problem?
using UnityEngine;
using System.Collections;
using System.Linq;
using UnityEngine.XR.WSA.WebCam;

public class VideoCaptureExample : MonoBehaviour
{
    static readonly float MaxRecordingTime = 5.0f;

    VideoCapture m_VideoCapture = null;
    float m_stopRecordingTimer = float.MaxValue;

    // Use this for initialization
    void Start()
    {
        StartVideoCaptureTest();
        Debug.Log("Start");
    }

    void Update()
    {
        if (m_VideoCapture == null || !m_VideoCapture.IsRecording)
        {
            return;
        }

        if (Time.time > m_stopRecordingTimer)
        {
            m_VideoCapture.StopRecordingAsync(OnStoppedRecordingVideo);
        }
    }

    void StartVideoCaptureTest()
    {
        Resolution cameraResolution = VideoCapture.SupportedResolutions.OrderByDescending((res) => res.width * res.height).First();
        Debug.Log(cameraResolution);

        float cameraFramerate = VideoCapture.GetSupportedFrameRatesForResolution(cameraResolution).OrderByDescending((fps) => fps).First();
        Debug.Log(cameraFramerate);

        VideoCapture.CreateAsync(false, delegate (VideoCapture videoCapture)
        {
            Debug.Log("NULL");
            if (videoCapture != null)
            {
                m_VideoCapture = videoCapture;
                Debug.Log("Created VideoCapture Instance!");

                CameraParameters cameraParameters = new CameraParameters();
                cameraParameters.hologramOpacity = 0.0f;
                cameraParameters.frameRate = cameraFramerate;
                cameraParameters.cameraResolutionWidth = cameraResolution.width;
                cameraParameters.cameraResolutionHeight = cameraResolution.height;
                cameraParameters.pixelFormat = CapturePixelFormat.BGRA32;

                m_VideoCapture.StartVideoModeAsync(cameraParameters,
                    VideoCapture.AudioState.ApplicationAndMicAudio,
                    OnStartedVideoCaptureMode);
            }
            else
            {
                Debug.LogError("Failed to create VideoCapture Instance!");
            }
        });
    }

    void OnStartedVideoCaptureMode(VideoCapture.VideoCaptureResult result)
    {
        Debug.Log("Started Video Capture Mode!");
        string timeStamp = Time.time.ToString().Replace(".", "").Replace(":", "");
        string filename = string.Format("TestVideo_{0}.mp4", timeStamp);
        string filepath = System.IO.Path.Combine(Application.persistentDataPath, filename);
        filepath = filepath.Replace("/", @"\");
        m_VideoCapture.StartRecordingAsync(filepath, OnStartedRecordingVideo);
    }

    void OnStoppedVideoCaptureMode(VideoCapture.VideoCaptureResult result)
    {
        Debug.Log("Stopped Video Capture Mode!");
    }

    void OnStartedRecordingVideo(VideoCapture.VideoCaptureResult result)
    {
        Debug.Log("Started Recording Video!");
        m_stopRecordingTimer = Time.time + MaxRecordingTime;
    }

    void OnStoppedRecordingVideo(VideoCapture.VideoCaptureResult result)
    {
        Debug.Log("Stopped Recording Video!");
        m_VideoCapture.StopVideoModeAsync(OnStoppedVideoCaptureMode);
    }
}
EDIT:
The problem was that the API doesn't work on the Emulator.
You should try taking a look at this thread, which goes into detail on how to record a video with HoloLens as well as how to take a photo. Also make sure you have the WebCam and Microphone capabilities set; and if you are trying to save the video, add the Videos Library capability as well.
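For context, the callbacks below are wired up by the same CreateAsync call used in the question (a one-line sketch; only the callback name differs):
// Kick off the chain: request a VideoCapture instance and hand the
// result to OnVideoCaptureCreated below.
VideoCapture.CreateAsync(false, OnVideoCaptureCreated);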
OnVideoCaptureCreated:
void OnVideoCaptureCreated (VideoCapture videoCapture)
{
    if (videoCapture != null)
    {
        m_VideoCapture = videoCapture;

        Resolution cameraResolution = VideoCapture.SupportedResolutions.OrderByDescending((res) => res.width * res.height).First();
        float cameraFramerate = VideoCapture.GetSupportedFrameRatesForResolution(cameraResolution).OrderByDescending((fps) => fps).First();

        CameraParameters cameraParameters = new CameraParameters();
        cameraParameters.hologramOpacity = 0.0f;
        cameraParameters.frameRate = cameraFramerate;
        cameraParameters.cameraResolutionWidth = cameraResolution.width;
        cameraParameters.cameraResolutionHeight = cameraResolution.height;
        cameraParameters.pixelFormat = CapturePixelFormat.BGRA32;

        m_VideoCapture.StartVideoModeAsync(cameraParameters,
            VideoCapture.AudioState.None,
            OnStartedVideoCaptureMode);
    }
    else
    {
        Debug.LogError("Failed to create VideoCapture Instance!");
    }
}
OnStartVideoCaptureMode:
void OnStartedVideoCaptureMode(VideoCapture.VideoCaptureResult result)
{
    if (result.success)
    {
        string filename = string.Format("MyVideo_{0}.mp4", Time.time);
        string filepath = System.IO.Path.Combine(Application.persistentDataPath, filename);

        m_VideoCapture.StartRecordingAsync(filepath, OnStartedRecordingVideo);
    }
}
OnStartRecordingVideo:
void OnStartedRecordingVideo(VideoCapture.VideoCaptureResult result)
{
    Debug.Log("Started Recording Video!");
    // We will stop the video from recording via other input such as a timer or a tap, etc.
}
StopRecordingVideo:
// The user has indicated to stop recording
void StopRecordingVideo()
{
    m_VideoCapture.StopRecordingAsync(OnStoppedRecordingVideo);
}
OnStopRecordingVideo:
void OnStoppedRecordingVideo(VideoCapture.VideoCaptureResult result)
{
    Debug.Log("Stopped Recording Video!");
    m_VideoCapture.StopVideoModeAsync(OnStoppedVideoCaptureMode);
}

void OnStoppedVideoCaptureMode(VideoCapture.VideoCaptureResult result)
{
    m_VideoCapture.Dispose();
    m_VideoCapture = null;
}
I'm working with HoloLens, and I'm trying to get the image from the front camera. The only thing I need is to take the image of each frame from that camera and transform it into a byte array.
Follow the Unity example in the documentation; it is well written:
https://docs.unity3d.com/Manual/windowsholographic-photocapture.html
Copied from the Unity documentation at the link above:
using UnityEngine;
using System.Collections;
using System.Linq;
using UnityEngine.XR.WSA.WebCam;

public class PhotoCaptureExample : MonoBehaviour {
    PhotoCapture photoCaptureObject = null;
    Texture2D targetTexture = null;

    // Use this for initialization
    void Start() {
        Resolution cameraResolution = PhotoCapture.SupportedResolutions.OrderByDescending((res) => res.width * res.height).First();
        targetTexture = new Texture2D(cameraResolution.width, cameraResolution.height);

        // Create a PhotoCapture object
        PhotoCapture.CreateAsync(false, delegate (PhotoCapture captureObject) {
            photoCaptureObject = captureObject;

            CameraParameters cameraParameters = new CameraParameters();
            cameraParameters.hologramOpacity = 0.0f;
            cameraParameters.cameraResolutionWidth = cameraResolution.width;
            cameraParameters.cameraResolutionHeight = cameraResolution.height;
            cameraParameters.pixelFormat = CapturePixelFormat.BGRA32;

            // Activate the camera
            photoCaptureObject.StartPhotoModeAsync(cameraParameters, delegate (PhotoCapture.PhotoCaptureResult result) {
                // Take a picture
                photoCaptureObject.TakePhotoAsync(OnCapturedPhotoToMemory);
            });
        });
    }

    void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame photoCaptureFrame) {
        // Copy the raw image data into the target texture
        photoCaptureFrame.UploadImageDataToTexture(targetTexture);

        // Create a GameObject to which the texture can be applied
        GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        Renderer quadRenderer = quad.GetComponent<Renderer>() as Renderer;
        quadRenderer.material = new Material(Shader.Find("Custom/Unlit/UnlitTexture"));

        quad.transform.parent = this.transform;
        quad.transform.localPosition = new Vector3(0.0f, 0.0f, 3.0f);

        quadRenderer.material.SetTexture("_MainTex", targetTexture);

        // Deactivate the camera
        photoCaptureObject.StopPhotoModeAsync(OnStoppedPhotoMode);
    }

    void OnStoppedPhotoMode(PhotoCapture.PhotoCaptureResult result) {
        // Shutdown the photo capture resource
        photoCaptureObject.Dispose();
        photoCaptureObject = null;
    }
}
To get the bytes, replace the following method:
void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame photoCaptureFrame) {
    // Copy the raw image data into the target texture
    photoCaptureFrame.UploadImageDataToTexture(targetTexture);

    // Create a GameObject to which the texture can be applied
    GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
    Renderer quadRenderer = quad.GetComponent<Renderer>() as Renderer;
    quadRenderer.material = new Material(Shader.Find("Custom/Unlit/UnlitTexture"));

    quad.transform.parent = this.transform;
    quad.transform.localPosition = new Vector3(0.0f, 0.0f, 3.0f);

    quadRenderer.material.SetTexture("_MainTex", targetTexture);

    // Deactivate the camera
    photoCaptureObject.StopPhotoModeAsync(OnStoppedPhotoMode);
}
with the method below:
void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame photoCaptureFrame)
{
    // Requires: using System.Collections.Generic;
    List<byte> imageBufferList = new List<byte>();
    photoCaptureFrame.CopyRawImageDataIntoBuffer(imageBufferList);
    var bytesArray = imageBufferList.ToArray();
}
If you are using Unity 2018.1 or 2018.2 (I think also 2017.4), this will NOT work. Unity has a public issue tracker entry for it:
https://issuetracker.unity3d.com/issues/windowsmr-failure-to-take-photo-capture-in-hololens
I created a workaround until Unity fixes the bug:
https://github.com/MSAlshair/HoloLensMediaCapture
A basic sample without using Unity's PhotoCapture as a workaround; more details are in the link above.
You must wrap the code in #if WINDOWS_UWP to be able to use MediaCapture. Ideally you would use PhotoCapture from Unity to avoid this, but until Unity resolves the issue, you can use something like this:
#if WINDOWS_UWP
public async System.Threading.Tasks.Task<byte[]> GetPhotoAsync()
{
    //Get available devices info
    var devices = await Windows.Devices.Enumeration.DeviceInformation.FindAllAsync(
        Windows.Devices.Enumeration.DeviceClass.VideoCapture);
    var numberOfDevices = devices.Count;
    byte[] photoBytes = null;

    //Check if the device has a camera
    if (devices.Count > 0)
    {
        Windows.Media.Capture.MediaCapture mediaCapture = new Windows.Media.Capture.MediaCapture();
        await mediaCapture.InitializeAsync();

        //Get highest available resolution
        var highestResolution = mediaCapture.VideoDeviceController.GetAvailableMediaStreamProperties(
            Windows.Media.Capture.MediaStreamType.Photo).
            Select(item => item as Windows.Media.MediaProperties.ImageEncodingProperties).
            Where(item => item != null).
            OrderByDescending(Resolution => Resolution.Height * Resolution.Width).
            ToList().First();

        using (var photoRandomAccessStream = new Windows.Storage.Streams.InMemoryRandomAccessStream())
        {
            await mediaCapture.CapturePhotoToStreamAsync(highestResolution, photoRandomAccessStream);

            //Convert stream to byte array
            photoBytes = await ConvertFromInMemoryRandomAccessStreamToByteArrayAsync(photoRandomAccessStream);
        }
    }
    else
    {
        System.Diagnostics.Debug.WriteLine("No camera device detected!");
    }

    return photoBytes;
}

public static async System.Threading.Tasks.Task<byte[]> ConvertFromInMemoryRandomAccessStreamToByteArrayAsync(
    Windows.Storage.Streams.InMemoryRandomAccessStream inMemoryRandomAccessStream)
{
    using (var dataReader = new Windows.Storage.Streams.DataReader(inMemoryRandomAccessStream.GetInputStreamAt(0)))
    {
        var bytes = new byte[inMemoryRandomAccessStream.Size];
        await dataReader.LoadAsync((uint)inMemoryRandomAccessStream.Size);
        dataReader.ReadBytes(bytes);
        return bytes;
    }
}
#endif
Do NOT forget to add capabilities to the manifest:
WebCam
Microphone: I am not sure whether it is needed since we are only taking photos, but I added it anyway.
Pictures Library, if you want to save to it
In Player Settings, enable access to the camera, and try again.
I'm trying to load a video via URL, but I keep getting the same error. I'm using Unity 5.3 and the example code from http://docs.unity3d.com/ScriptReference/WWW-movie.html (heavily modified, because the current example doesn't compile).
using UnityEngine;
using System.Collections;

// Make sure we have gui texture and audio source
[RequireComponent (typeof(GUITexture))]
[RequireComponent (typeof(AudioSource))]
public class TestMovie : MonoBehaviour {

    string url = "http://www.unity3d.com/webplayers/Movie/sample.ogg";
    WWW www;

    void Start () {
        // Start download
        www = new WWW(url);
        StartCoroutine(PlayMovie());
    }

    IEnumerator PlayMovie(){
        MovieTexture movieTexture = www.movie;

        // Make sure the movie is ready to start before we start playing
        while (!movieTexture.isReadyToPlay){
            yield return 0;
        }

        GUITexture gt = gameObject.GetComponent<GUITexture>();

        // Initialize gui texture to be 1:1 resolution centered on screen
        gt.texture = movieTexture;
        transform.localScale = Vector3.zero;
        transform.position = new Vector3 (0.5f,0.5f,0f);
        // gt.pixelInset.xMin = -movieTexture.width / 2;
        // gt.pixelInset.xMax = movieTexture.width / 2;
        // gt.pixelInset.yMin = -movieTexture.height / 2;
        // gt.pixelInset.yMax = movieTexture.height / 2;

        // Assign clip to audio source
        // Sync playback with audio
        AudioSource aud = gameObject.GetComponent<AudioSource>();
        aud.clip = movieTexture.audioClip;

        // Play both movie & sound
        movieTexture.Play();
        aud.Play();
    }
}
I added this as a script to the Main Camera in a new scene, and I get this error:
Error: Cannot create FMOD::Sound instance for resource (null), (An invalid parameter was passed to this function. )
UnityEngine.WWW:get_movie()
<PlayMovie>c__Iterator4:MoveNext() (at Assets/TestMovie.cs:20)
UnityEngine.MonoBehaviour:StartCoroutine(IEnumerator)
TestMovie:Start() (at Assets/TestMovie.cs:16)
(Line 20 is MovieTexture movieTexture = www.movie;)
I've been working on this for a while now; it has happened with many files and on both of my systems.
I found a solution! I tested the code with Unity 5.2.x and 5.3.x.
I don't know why, but Unity requires you to first wait until the download is done (isDone == true), and only after grabbing the texture wait until isReadyToPlay == true.
The movie file must be in OGG Video format; some MP4 files don't work.
Check the code:
using UnityEngine;
using UnityEngine.UI;
using System.Collections;

public class MovieTextureStream : MonoBehaviour {
    public Text progressGUI;
    public MeshRenderer targetRender = null;
    public AudioSource targetAudio = null;
    public string URLString = "http://unity3d.com/files/docs/sample.ogg";

    MovieTexture loadedTexture;

    IEnumerator Start() {
        if (targetRender == null) targetRender = GetComponent<MeshRenderer> ();
        if (targetAudio == null) targetAudio = GetComponent<AudioSource> ();

        // First wait for the download to finish...
        WWW www = new WWW (URLString);
        while (www.isDone == false) {
            if (progressGUI != null) progressGUI.text = "Video progress: " + (int)(100.0f * www.progress) + "%";
            yield return 0;
        }

        // ...then wait for the movie to become playable.
        loadedTexture = www.movie;
        while (loadedTexture.isReadyToPlay == false) {
            yield return 0;
        }

        targetRender.material.mainTexture = loadedTexture;
        targetAudio.clip = loadedTexture.audioClip;
        targetAudio.Play ();
        loadedTexture.Play ();
    }
}
I won't provide a solution with WWW.movie, but an alternative that may help you or anyone else.
On iPhone, we didn't find a way to stream video from a server, so we decided to download the video before playing it. Here's how:
string docPath = Application.persistentDataPath + "/" + id + ".mp4";
if (!System.IO.File.Exists(docPath))
{
    WWW www = new WWW(videouUrl);
    while (!www.isDone)
    {
        yield return new WaitForSeconds(1);
        Loading.Instance.Message = "Downloading video : " + (int)(www.progress * 100) + "%";
        if (!string.IsNullOrEmpty(www.error))
            Debug.Log(www.error);
    }

    byte[] data = www.bytes;
    System.IO.File.WriteAllBytes(docPath, data);
}

mediaPlayer.Load(docPath);
onVideoReady();
mediaPlayer.Play();
This is the coroutine used to download the video and write it to the iPhone's file system. Once it's done, you can load and play it.
Hope it helps.
I'm writing a Unity class to capture and play back audio data from a microphone. The playback part works fine (I can hear my voice in the headphones), but I cannot access the audio samples, because pcmreadcallback is never called during playback; it is called only once, inside the createSound method. I think I'm missing some setting. I also tried several OR combinations of the FMOD.MODE flags, but with no luck.
I'm using fmodstudio10510.unitypackage and testing under Windows 7, but it should have full cross-platform support.
Thanks in advance.
Walter
using System;
using UnityEngine;

public class AudioInit : MonoBehaviour {

    FMOD.System lowlevel = null;
    FMOD.Sound snd = null;

    // callback delegates
    FMOD.SOUND_PCMREADCALLBACK pcmreadcallbackPtr = new FMOD.SOUND_PCMREADCALLBACK (pcmreadcallbackFunc);
    FMOD.SOUND_PCMSETPOSCALLBACK pcmsetposcallbackPtr = new FMOD.SOUND_PCMSETPOSCALLBACK (pcmsetposcallbackFunc);

    int driverId;

    void Start () {
        int channels = 1;
        int sampleRate = 8000;
        float recordTime = 1.0f;

        // get low level instance
        FMOD_StudioSystem.instance.System.getLowLevelSystem(out lowlevel);

        // fill sound info struct
        FMOD.CREATESOUNDEXINFO soundInfo = new FMOD.CREATESOUNDEXINFO ();
        soundInfo.cbsize = System.Runtime.InteropServices.Marshal.SizeOf (typeof(FMOD.CREATESOUNDEXINFO));
        soundInfo.length = (uint)(sampleRate * channels * sizeof(byte) * recordTime);
        soundInfo.numchannels = channels;
        soundInfo.defaultfrequency = sampleRate;
        soundInfo.format = FMOD.SOUND_FORMAT.PCM8;
        soundInfo.pcmreadcallback = pcmreadcallbackPtr;
        soundInfo.pcmsetposcallback = pcmsetposcallbackPtr;
        soundInfo.dlsname = IntPtr.Zero;

        // FMOD MODE flags
        FMOD.MODE mode = FMOD.MODE.OPENUSER | FMOD.MODE.LOOP_NORMAL;

        // create sound
        FMOD.RESULT res = lowlevel.createSound((string)null, mode, ref soundInfo, out snd);
        if (res != FMOD.RESULT.OK) {
            Debug.Log ("ERROR snd " + res.ToString ());
            return;
        }

        // get driver
        res = lowlevel.getDriver (out driverId);
        if (res != FMOD.RESULT.OK) {
            Debug.Log ("ERROR getDriver " + res.ToString ());
            return;
        }

        // start record from microphone
        res = lowlevel.recordStart (driverId, snd, true);
        if (res != FMOD.RESULT.OK) {
            Debug.Log ("ERROR recordStart " + res.ToString ());
            return;
        }

        uint pos = 0;
        uint tries = 10;

        // wait for a valid record position
        while ( !(pos > 0) && (tries--) > 0 ) {
            if ( lowlevel.getRecordPosition(driverId, out pos) == FMOD.RESULT.OK ){
                System.Threading.Thread.Sleep(100);
            } else { break; }
        }

        if ( !( pos > 0 )) {
            Debug.Log ("ERROR invalid record position");
            return;
        }

        // start playback
        FMOD.Channel chn;
        res = lowlevel.playSound (snd, new FMOD.ChannelGroup (IntPtr.Zero), false, out chn);
        if (res != FMOD.RESULT.OK) {
            Debug.Log ("ERROR playSound " + res.ToString ());
            return;
        }
    }

    // only called once, during lowlevel.createSound execution
    static FMOD.RESULT pcmreadcallbackFunc (IntPtr sound, IntPtr data, uint len){
        Debug.Log("pcmreadcallback sample size " + len.ToString());
        return FMOD.RESULT.OK;
    }

    // setpos callback (this declaration was missing from the original post)
    static FMOD.RESULT pcmsetposcallbackFunc (IntPtr sound, int subsound, uint position, FMOD.TIMEUNIT postype){
        return FMOD.RESULT.OK;
    }

    // Update is called once per frame
    void Update () {
    }
}
The recording system doesn't go via pcmreadcallback; that's why you aren't getting those callbacks.
To access the microphone data, use Sound::lock and Sound::unlock.
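A minimal sketch of that approach, assuming the same fields as in the question (lowlevel, snd, driverId) plus a hypothetical lastReadPos field; because the sound is PCM8 mono, one sample equals one byte, so record positions and byte offsets coincide:
uint lastReadPos = 0;

// Call from Update() to drain the circular record buffer as it fills.
void ReadNewSamples()
{
    uint recordPos;
    if (lowlevel.getRecordPosition(driverId, out recordPos) != FMOD.RESULT.OK) return;
    if (recordPos == lastReadPos) return;

    // Length of the sound in PCM samples (== bytes for PCM8 mono).
    uint soundLength;
    snd.getLength(out soundLength, FMOD.TIMEUNIT.PCM);

    // Bytes to read, handling wrap-around of the circular record buffer.
    uint toRead = (recordPos >= lastReadPos)
        ? recordPos - lastReadPos
        : soundLength - lastReadPos + recordPos;

    System.IntPtr ptr1, ptr2;
    uint len1, len2;
    if (snd.@lock(lastReadPos, toRead, out ptr1, out ptr2, out len1, out len2) == FMOD.RESULT.OK)
    {
        // ptr2/len2 cover the wrapped part of the buffer, if any.
        byte[] samples = new byte[len1 + len2];
        if (len1 > 0) System.Runtime.InteropServices.Marshal.Copy(ptr1, samples, 0, (int)len1);
        if (len2 > 0) System.Runtime.InteropServices.Marshal.Copy(ptr2, samples, (int)len1, (int)len2);
        snd.unlock(ptr1, ptr2, len1, len2);
        // samples now holds the newly recorded PCM8 data; process as needed.
    }

    lastReadPos = recordPos;
}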
I'm developing a game on iOS with MonoGame. Is it possible to initialize a Texture2D with images downloaded from the internet? If so, are there any samples I can follow? Thanks a lot.
I'm not sure if this maps one-to-one with iOS, but here is how I did it on Android MonoGame. Most of this code came from another post on Stack Overflow, but unfortunately I can't remember which one!
/// <summary>
/// Downloads a texture based on a URL path, and stores it in this sprite's texture.
/// </summary>
/// <param name="URL">The path to the file.</param>
private void DownloadTexture(String URL)
{
    HttpWebRequest request = HttpWebRequest.Create(new Uri(URL)) as HttpWebRequest;
    request.BeginGetResponse((ar) =>
    {
        HttpWebResponse response = request.EndGetResponse(ar) as HttpWebResponse;
        using (Stream stream = response.GetResponseStream())
        {
            using (MemoryStream ms = new MemoryStream())
            {
                // Copy the response into a seekable stream in 1 KB chunks.
                int count = 0;
                do
                {
                    byte[] buf = new byte[1024];
                    count = stream.Read(buf, 0, 1024);
                    ms.Write(buf, 0, count);
                } while (stream.CanRead && count > 0);

                ms.Seek(0, SeekOrigin.Begin);
                /*Texture2D*/ mTexture = Texture2D.FromStream(GameObjectManager.pInstance.pGraphicsDevice, ms); //.PreMultiplyAlpha();
            }
        }
    }, null);
}
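A hypothetical call site to round this out (the URL is a placeholder, and spriteBatch is the usual MonoGame field): the callback above completes on a thread-pool thread, so mTexture is assigned asynchronously and should be null-checked before drawing.
protected override void LoadContent()
{
    base.LoadContent();
    // Fire and forget; mTexture stays null until the response arrives.
    DownloadTexture("http://example.com/picture.png");
}

protected override void Draw(GameTime gameTime)
{
    GraphicsDevice.Clear(Color.CornflowerBlue);
    if (mTexture != null)
    {
        spriteBatch.Begin();
        spriteBatch.Draw(mTexture, Vector2.Zero, Color.White);
        spriteBatch.End();
    }
    base.Draw(gameTime);
}
Depending on the platform, Texture2D.FromStream may also need to run on the main/game thread; if you see graphics errors, marshal the texture creation back there instead of doing it in the callback.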