I am very new to Swift and struggling, so please go easy. I am building a macOS project using Swift 4 in Xcode 10.1.
I am receiving a sysex message from a device using the following code, which I got from the MIDIUtility example project supplied with AudioKit:
func receivedMIDISystemCommand(_ data: [MIDIByte]) {
    if let command = AKMIDISystemCommand(rawValue: data[0]) {
        var newString = "MIDI System Command: \(command) \n"
        var hexArray = [UInt8]()
        for bit in data {
            hexArray.append(bit) // this requires 'public typealias MIDIByte = UInt8'
            newString.append(String("\(bit) "))
        }
        updateText("\(newString) \n")
        updateData(hexArray)
    }
    updateText("received \(data.count) bytes of data \n\n")
}
The sysex I am receiving is very different from what I get when using SysEx Librarian or MIDI Monitor to receive the same sysex. Both of these apps receive 6 packets of 32, 110, 110, 110, 110, and 110 bytes. This is what I expect to get from my MIDI device. My code gets 6 packets of 520, 1300, 20, 1538, 1294, and 1544 bytes. It looks like the data I want is in these packets, but it is split up by a bunch of 0's.
[Screenshot: data as received in MIDI Monitor or SysEx Librarian]
[Screenshot: some of the data as received by my code]
Why is my code receiving so much extra data? Is there a way to filter out the unwanted data, or will I need to figure out how to use something other than AudioKit for my project?
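For what it's worth, the zeros look like padding around the framed sysex bytes. A minimal sketch of a filter (my own assumption about the packet layout, not an AudioKit API) that keeps only the bytes between the sysex start (0xF0) and end (0xF7) markers:
func extractSysEx(from data: [MIDIByte]) -> [MIDIByte]? {
    // Keep only the framed sysex bytes (0xF0 ... 0xF7, inclusive);
    // anything outside the markers is assumed to be padding.
    guard let start = data.firstIndex(of: 0xF0),
          let end = data[start...].firstIndex(of: 0xF7) else {
        return nil
    }
    return Array(data[start...end])
}
Note that a sysex message spanning several packets only carries 0xF0 in the first packet and 0xF7 in the last, so a complete solution would accumulate bytes across calls.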
Related
I'm using the (very cool) AudioKit framework to process audio for a macOS music visualizer app. My audio source ("mic") is iTunes 12 via Rogue Amoeba Loopback.
In the Xcode debug window, I'm seeing the following error message each time I launch my app:
kAudioUnitErr_TooManyFramesToProcess : inFramesToProcess=513, mMaxFramesPerSlice=512
I've gathered from searches that this is probably related to sample rate, but I haven't found a clear description of what this error indicates (or if it even matters). My app is functioning normally, but I'm wondering if this could be affecting efficiency.
EDIT: The error message does not appear if I use Audio MIDI Setup to set the Loopback device output to 44.1kHz. (I set it initially to 48.0kHz to match my other audio devices, which I keep configured to the video standard.)
Keeping Loopback at 44.1kHz is an acceptable solution, but now my question would be: Is it possible to avoid this error even with a 48.0kHz input? (I tried AKSettings.sampleRate = 48000 but that made no difference.) Or can I just safely ignore the error in any case?
AudioKit is initialized thusly:
AKSettings.audioInputEnabled = true
mic = AKMicrophone()
do {
    try mic.setDevice(AudioKit.inputDevices![inputDeviceNumber])
} catch {
    AKLog("Device not set")
}
amplitudeTracker = AKAmplitudeTracker(mic)
AudioKit.output = AKBooster(amplitudeTracker, gain: 0)
do {
    try AudioKit.start()
} catch {
    AKLog("AudioKit did not start")
}
mic.start()
amplitudeTracker?.start()
This line saved my app:
try? AVAudioSession.sharedInstance().setPreferredIOBufferDuration(0.02)
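Note that AVAudioSession is an iOS API and isn't available in a regular macOS target, so that line won't compile there. If you retry the 48 kHz route on macOS, one thing worth checking (an assumption on my part, not documented AudioKit behavior) is that AKSettings.sampleRate is set before the first node is created, since settings applied after node creation may not take effect:
AKSettings.sampleRate = 48_000       // set before any nodes exist
AKSettings.audioInputEnabled = true
mic = AKMicrophone()                 // created after the sample rate is set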
I've been stuck with this problem for the past 2 hours and I'm about to give up. I've Googled a lot and just can't find something that works. I am using the newest version of Xcode.
I want to send a PNG image through Bluetooth Low Energy; the receiver in this case is Bleno. Things I've tried include converting the image to a Base64 string and converting the image to a UInt8 array and sending each entry in the array one by one.
Nothing I do works, so the only "working" code I have is for converting an image to bytes, which is this:
let testImage = UIImage(named: "smallImage")
let imageData = UIImagePNGRepresentation(testImage!)
I already have all the connection code for BLE and am able to send a simple and short string successfully to Bleno. I also know through "peripheral.maximumWriteValueLength" that the maximum amount of bytes I can send at once is 512 bytes, although I can imagine that using Low Energy lowers this maximum. I'm trying to send the data with peripheral.writeValue, which looks like this at the moment (array was the UInt8 array I tried):
peripheral.writeValue(Data(bytes:array), for: char, type: CBCharacteristicWriteType.withResponse)
The error I most often get is Error Domain=CBATTErrorDomain Code=13 "The value's length is invalid.", which I assume is because the data I try to send is more than 512 bytes. I tried sending the data in packages smaller than 512 bytes, but like I said, I just can't get it to work.
In short my question is this: How do I send a PNG image (in multiple parts) through BLE?
Edit: I got something to work, although it is pretty slow because it's not utilising the full 20 bytes per packet:
let buffer: [UInt8] = Array(UIImagePNGRepresentation(testImage!)!)
let start = "I:" + String(buffer.count)
peripheral.writeValue(start.data(using: .utf8)!, for: char, type: CBCharacteristicWriteType.withResponse)
buffer.forEach { b in
    let data = NSData(bytes: [UInt8(b)], length: MemoryLayout<UInt8>.size)
    peripheral.writeValue(data as Data, for: char, type: CBCharacteristicWriteType.withResponse)
}
Thanks in advance!
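A chunked version of the edit above might look like the following sketch. It reuses peripheral, char, and testImage from the question; maximumWriteValueLength(for:) is the CoreBluetooth call that reports the largest payload per write, and the "I:" header is just the question's own convention:
let imageData = UIImagePNGRepresentation(testImage!)!
let chunkSize = peripheral.maximumWriteValueLength(for: .withResponse)

// Announce the total length first, as in the edit above.
let header = "I:\(imageData.count)"
peripheral.writeValue(header.data(using: .utf8)!, for: char, type: .withResponse)

// Send the image in slices of up to chunkSize bytes instead of one byte at a time.
var offset = 0
while offset < imageData.count {
    let end = min(offset + chunkSize, imageData.count)
    peripheral.writeValue(imageData.subdata(in: offset..<end), for: char, type: .withResponse)
    offset = end
}
Strictly speaking, with .withResponse you should wait for peripheral(_:didWriteValueFor:error:) before issuing the next write rather than looping like this.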
So I'm trying to read some Arduino output with the ORSSerialPort lib.
If I use the lib's example code, ORSSerialPortDemo, everything works.
Now the only thing I miss in the demo is how they used SerialPortDemoController.swift in their view controller.
I created a standard project that has the ViewController.swift.
The bridging header and imports are done, and all references compile.
But rather than using a GUI to select the USB port and baud rate, I'd like to set them in the code.
Something like this:
var serial: SerialPortDemoController?
serial = SerialPortDemoController()
serial.path = "dev/cu.usbserial-A6006hPS"
serial.baudRate = 9600
serial.open()
Then all I need is to read from the port. That should already work with the function func serialPort(_ serialPort: ORSSerialPort, didReceive data: Data)?
So in this function I could do something like this:
if let string = NSString(data: data, encoding: String.Encoding.utf8.rawValue) {
    print(string)
}
I have looked around, but nothing seems to work. If somebody could point me in the right direction that would be great.
Thanks!
Do you have the correct serial port setup, i.e. the RTS, DTR, CTS, DSR, and DCD pins?
From: https://www.arduino.cc/en/Serial/Begin
The default is 8 data bits, no parity, one stop bit.
Yes, the port path starts with "/", i.e. "/dev/cu.usbserial-A6006hPS".
From the docs:
http://cocoadocs.org/docsets/ORSSerialPort/1.5.1/
Using ORSSerialPortManager is simple. To get the shared serial port manager:
Objective-C code:
ORSSerialPortManager *portManager = [ORSSerialPortManager sharedSerialPortManager];
To get a list of available ports:
NSArray *availablePorts = portManager.availablePorts;
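Putting that together in Swift, a minimal sketch (assuming the device path from the question, and importing the framework rather than using a bridging header):
import ORSSerial

class SerialReader: NSObject, ORSSerialPortDelegate {
    var port: ORSSerialPort?

    func start() {
        // Note the leading "/" in the path; the initializer is failable.
        guard let port = ORSSerialPort(path: "/dev/cu.usbserial-A6006hPS") else { return }
        port.baudRate = 9600
        port.delegate = self
        port.open()
        self.port = port
    }

    func serialPort(_ serialPort: ORSSerialPort, didReceive data: Data) {
        if let string = String(data: data, encoding: .utf8) {
            print(string)
        }
    }

    // Required by ORSSerialPortDelegate.
    func serialPortWasRemovedFromSystem(_ serialPort: ORSSerialPort) {
        self.port = nil
    }
}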
I have been trying to write a working program that takes in data from a UDP socket and displays it in an edit control box as the data is received (my exposure to C++ is also only about a week :P I have only done embedded C code before). I have a working program that can send and output data on a button click, but I want something that can do it in real time. The aim is to scale this up into a larger GUI program that can send control data to hardware and get responses from it.
I have run into various problems including:
1. The program just not executing my OnReceive function (derived from CAsyncSocket)
2. Getting the OnReceive function to run on a separate thread, so that it can still run after a button click has sent a control packet to the client and the code is waiting for a response in a while loop
3. Not being able to output the data in the edit box (tried using both CEdit and CString)
4. A ReplaceSel error saying that the type char is incompatible with LPCTSTR
My code is based on this codeproject.com tutorial, which is almost exactly what I want, but I get the error in 4.
EDIT: the error in 4 disappears when I change it to a TCHAR, but then it outputs random Chinese characters. The codeproject.com tutorial outputs the correct characters regardless of char or TCHAR declaration. When debugged, my code has type wchar_t instead of type char like the other code.
[Screenshot: Chinese output]
In the working program, echoBuffer[0], the character sent and displayed, was a 1.
UINT ReceiveData(LPVOID pParam)
{
    CTesterDlg *dlg = (CTesterDlg*)pParam;

    AfxSocketInit(NULL);

    CSocket echoServer;
    // Create socket for sending/receiving datagrams
    if (echoServer.Create(12345, SOCK_DGRAM, NULL) == 0)
    {
        AfxMessageBox(_T("Create() failed"));
    }

    for (;;) // Run forever
    {
        // Client address
        SOCKADDR_IN echoClntAddr;
        // Set the size of the in-out parameter
        int clntAddrLen = sizeof(echoClntAddr);
        // Buffer for echo string
        char echoBuffer[ECHOMAX];
        // Block until a message is received from a client; leave room for the terminator
        int recvMsgSize = echoServer.ReceiveFrom(echoBuffer, ECHOMAX - 1, (SOCKADDR*)&echoClntAddr, &clntAddrLen, 0);
        if (recvMsgSize < 0)
        {
            AfxMessageBox(_T("RecvFrom() failed"));
            continue; // don't index the buffer with a negative size
        }
        echoBuffer[recvMsgSize] = '\0';
        dlg->m_edit.ReplaceSel(echoBuffer);
        dlg->m_edit.ReplaceSel(_T("\r\n"));
    }
}
After reading the link that @IInspectable provided about working with strings, and checking the settings differences between the two programs, it became clear that the issue lay in an incorrect conversion to UNICODE. My program does not require it, so I disabled it.
This has cleared up the issue in 4 and provided solutions for 2 and 3.
I also think I know why another instance of my program would not run OnReceive in 1: that file was not being defined by one that was already being run by the program, but that is now irrelevant.
I'm using protobuf-net to serialize and deserialize TCP_Messages.
I've tried all the suggestions I've found here, so I really don't know where the mistake is.
Serialization is done on the server side, and deserialization is done in a client-side application.
Serialize code:
public void MssGetCardPersonalInfo(out RCPersonalInfoRecord ssPersonalInfoObject, out bool ssResult) {
    ssPersonalInfoObject = new RCPersonalInfoRecord(null);

    TCP_Message msg = new TCP_Message(MessageTypes.GetCardPersonalInfo);
    MemoryStream ms = new MemoryStream();
    ProtoBuf.Serializer.Serialize(ms, msg);

    _tcp_Client.Send(ms.ToArray());
    _waitToReadCard.Start();
    _stopWaitHandle.WaitOne();
And the deserialize:
private void tpcServer_OnDataReceived(Object sender, byte[] data, TCPServer.StateObject clientState)
{
    TCP_Message message = new TCP_Message();
    MemoryStream ms = new MemoryStream(data);
    try
    {
        //ms.ToArray();
        //ms.GetBuffer();
        //ms.Position = 0;
        ms.Seek(0, SeekOrigin.Begin);
        message = Serializer.Deserialize<TCP_Message>(ms);
    }
    catch (Exception ex)
    {
        EventLog.WriteEntry(_logSource, "Error deserializing: " + ex.Message, EventLogEntryType.Error, 103);
    }
As you can see, I've tried a bunch of different approaches, now commented out.
I have also tried to deserialize using DeserializeWithLengthPrefix, but it didn't work either.
I'm a bit of a noob at this, so if you could help me I would really appreciate it.
Thanks
The first thing to look at here is: is the data you receive the data you send? Until you can answer "yes" to that, all other questions are moot. It is very easy to get network code wrong and end up reading partial frames, etc. As a crude debugger test:
Debug.WriteLine(Convert.ToBase64String(ms.GetBuffer(), 0, (int)ms.Length));
should work. If the two base-64 strings are not identical, then you aren't working with the same data. This can be because of a range of reasons, including packet splitting and combining. You need to keep in mind that in a stream, what you send is not what you get - at least, not down to the fragment level. You might "send" data in the form of:
one bundle of 20 bytes
one bundle of 10 bytes
but at the receiving end, it would be entirely legitimate to read:
1 byte
22 bytes
7 bytes
All that TCP guarantees is the order and accuracy of the bytes. It says nothing about their breakdown in terms of chunks. When writing network code, there are basically 2 approaches:
have one thread that synchronously reads from a stream and local buffer (doesn't scale well)
async code (very scalable), but accept that you're going to have to do a lot of "do I have a complete frame? if not, append to an input buffer; if so, process any available frame data (could be multiple), then shuffle any incomplete data to the start of the buffer" (see the sketch after this list)
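To make the second approach concrete, the frame-extraction step might look like the following sketch (written in Swift only because the idea is language-agnostic; it assumes a 4-byte little-endian length prefix before each message, which is just one possible framing):
// Pulls every complete length-prefixed frame out of `buffer`, leaving any
// incomplete trailing bytes in place for the next read to complete.
func extractFrames(from buffer: inout [UInt8]) -> [[UInt8]] {
    var frames: [[UInt8]] = []
    while buffer.count >= 4 {
        // Read the 4-byte little-endian length prefix.
        let length = Int(buffer[0]) | Int(buffer[1]) << 8 |
                     Int(buffer[2]) << 16 | Int(buffer[3]) << 24
        guard buffer.count >= 4 + length else { break } // frame incomplete
        frames.append(Array(buffer[4 ..< 4 + length]))
        buffer.removeFirst(4 + length)
    }
    return frames
}
With protobuf-net specifically, the SerializeWithLengthPrefix / DeserializeWithLengthPrefix pair applies this kind of prefix for you, but the sender and receiver must both use it.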