How to send DTMF in Linphone ios SDK - swift

The calling functionality works fine, but sendDtmf doesn't. Here is minimal code to illustrate the situation:
class AnswerCallViewController: UIViewController {
    var call: Call!
    var proxy_cfg: ProxyConfig!
    let coreManager1 = LinphoneCoreManager()
    var lc: Core?
    let coreManager2 = LinphoneCoreManager2()
    var mIterateTimer: Timer?
    var cPtr: OpaquePointer?

    @IBAction func btnsAppend(_ sender: UIButton) {
        let digit1 = sender.currentTitle!
        print("digit1", digit1)
        let cchar = (sender.currentTitle!.cString(using: String.Encoding.utf8)?[0])!
        do {
            try call?.sendDtmf(dtmf: cchar)
        } catch {
            print("DTMF failed because \(error)")
        }
    }
}
It should work, but it always returns an error. The error log is:
liblinphone-warning-linphone_call_send_dtmf(): invalid call, canceling DTMF
and sendDtmf returns -1 on error.
Other Information -
Linphone SDK - Version 5 (compiled with the g729 codec)
Xcode - Version 12.4 (12D4e)
Any help will be appreciated.

Please ensure that your call is defined and that call.state == .StreamsRunning.
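As a sketch of that check (assuming the linphonesw Call API shown in the question; dtmfChar is a hypothetical helper, not part of the SDK):

```swift
// Hypothetical helper: safely map a keypad button title ("0"-"9", "*", "#")
// to the CChar that sendDtmf(dtmf:) expects, instead of force-unwrapping.
func dtmfChar(from title: String?) -> CChar? {
    guard let byte = title?.utf8.first else { return nil }
    return CChar(bitPattern: byte)
}

// Sketch of the call site, mirroring btnsAppend from the question:
// guard let call = call, call.state == .StreamsRunning,
//       let digit = dtmfChar(from: sender.currentTitle) else { return }
// try? call.sendDtmf(dtmf: digit)
```

Guarding on the state avoids the "invalid call" warning, since DTMF can only be sent while media streams are running.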

Related

Xcode Trace/BPT trap: 5

Situation
Hi there,
I am developing an iOS app and while building my project I run into the following error message:
Error: Trace/BPT trap: 5
I didn't find anything online to fix this problem, so I wanted to know, if anyone here might be able to help.
I also had issues with Cocoapods and my Silicon Mac, so I want to list my steps I've tried fixing:
Setup
M1 MacBook Pro, macOS 11.1
Xcode Version 12.4
Cocoapods with Pods for Firebase Swift, Auth, Firestore and Storage
Steps I tried fixing
cmd + shift + k for cleaning the build folder
closing Xcode and opening Terminal using Rosetta
delete the ~/Library/Developer/Xcode/DerivedData folder
pod deintegrate in project directory
delete Podfile.lock, app.xcworkspace, Pods directory
pod install
in app and pods projects build settings setting Excluded Architectures for any iOS Simulator SDK to arm64
setting Build Active Architecture Only to yes
convert Pods Project to Swift 5
build Pods Project
build app project
And then the following error occurs:
Log
Log entry from Merge swiftmodule (x86_64):
https://pastebin.com/MiSKGxB7
(Log way too long, exceeds character limit.)
Code
As the error message indicates, it occurred while trying to serialize the class BaseViewModel. Here's the code from the Base.swift file I wrote containing that class:
import SwiftUI
import Firebase
import FirebaseFirestore
import Combine

protocol BaseModel: Identifiable, Codable {
    var id: String? { get set }
    var collection: String { get }
    init()
}

class BaseViewModel<T: BaseModel>: ObservableObject, Identifiable, Equatable {
    @Published var model: T
    var id: String {
        didSet {
            self.model.id = id
        }
    }
    var cancellables = [AnyCancellable]()
    private var db = Firestore.firestore()

    required init() {
        let model = T.init()
        self.model = model
        self.id = model.id ?? UUID().uuidString
    }

    required init(id: String) {
        var model = T.init()
        model.id = id
        self.model = model
        self.id = id
    }

    init(model: T) {
        self.model = model
        self.id = model.id ?? UUID().uuidString
    }

    static func ==(lhs: BaseViewModel<T>, rhs: BaseViewModel<T>) -> Bool {
        lhs.model.id == rhs.model.id
    }

    func load(completion: @escaping (Bool) -> Void = { finished in }) {
        if let id = model.id {
            self.id = id
            db.collection(model.collection).document(id).getDocument { docSnapshot, error in
                guard let doc = docSnapshot else {
                    print("Error fetching document: \(error!)")
                    return
                }
                do {
                    guard let data = try doc.data(as: T.self) else {
                        print("Document empty \(type(of: self.model)) with id \(id)")
                        return
                    }
                    self.model = data
                    self.loadSubData { finished in
                        if finished {
                            completion(true)
                        }
                    }
                } catch {
                    print(error.localizedDescription)
                }
            }
        }
    }

    func loadSubData(completion: @escaping (Bool) -> Void = { finished in }) {
        fatalError("Must be overridden!")
    }

    func loadDataByIDs<T, S>(from list: [String], appender: @escaping (T) -> Void) where T: BaseViewModel<S>, S: BaseModel {
        for id in list {
            let viewModel = T.init(id: id)
            viewModel.load { finished in
                if finished {
                    appender(viewModel)
                }
            }
        }
    }

    func save() {
        do {
            let _ = try db.collection(model.collection).addDocument(from: model)
        } catch {
            print(error)
        }
    }

    func update() {
        if let id = model.id {
            do {
                try db.collection(model.collection).document(id).setData(from: model)
            } catch {
                print(error.localizedDescription)
            }
        }
    }

    func delete() {
        if let id = model.id {
            db.collection(model.collection).document(id).delete() { error in
                if let error = error {
                    print(error.localizedDescription)
                }
            }
        }
    }
}
I had the same problem; I solved it after updating Quick/Nimble.
I guess some pod projects with x86 files need to be updated to support M1.
For the record, anybody experiencing these odd bugs on M1 should read the Xcode compilation error carefully.
If it names a specific class, it means Xcode can't compile your code, and you should remove the code and try to compile it again line by line.
I know that's very strange, like programming PHP and refreshing a webpage, but I'm sure this type of bug can be related to the platform migration.
In my situation, I had a class that was OK; I started refactoring, and instead of Xcode showing me the compilation errors, it gave this cryptic BPT5. By reading the description inside the IDE carefully, I found that my class was the root cause.
Just revert the code you changed and try to compile again.
Sorry for the late update on that. But in my case it was either CocoaPods in general or the Firebase Pods, which were not compatible with Apple Silicon at that time.
I was just using Swift Package Manager and with that, it worked.
I do not know though, if the problem still exists, because I didn't build another app on the M1 MacBook.
I ran into this issue after I accidentally extracted a view as a variable without noticing. Check your recently committed code to figure it out.

Swift Remote Config: fetchAndActivate does not update local config

I am getting started with RemoteConfig on iOS with Swift and followed the tutorial to get started. I have developer mode enabled and tested the updating of config values via the Firebase console. However, the update values never get synced with the local config values.
Code:
override func viewDidLoad() {
    super.viewDidLoad()
    syncRemoteConfig()
}

fileprivate func syncRemoteConfig() {
    let remoteConfig = RemoteConfig.remoteConfig()
    #if DEBUG
    let settings = RemoteConfigSettings()
    settings.minimumFetchInterval = 0
    remoteConfig.configSettings = settings
    #endif
    remoteConfig.fetchAndActivate { (status, error) in
        let posts = remoteConfig.configValue(forKey: "posts").jsonValue as! [[String: AnyObject]]
        print(posts) // <== Always prints the previous data
        if let error = error {
            print(error.localizedDescription)
        }
        // status always prints status.successUsingPreFetchedData
    }
}
Run pod update to at least Firebase 6.25.0.
It fixed a race condition bug in which the minimumFetchInterval might not have been applied before the fetch.
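Also worth noting: the force-cast in the question's callback (`jsonValue as! [[String: AnyObject]]`) will crash whenever the key is missing or not valid JSON. A defensive decode can be sketched with a pure helper (the helper name is illustrative; at the call site you would feed it the bytes from `remoteConfig.configValue(forKey: "posts").dataValue`):

```swift
import Foundation

// Illustrative helper: decode a Remote Config JSON payload defensively
// instead of force-casting jsonValue. Returns an empty array on any
// missing or malformed payload rather than crashing.
func decodePosts(from data: Data) -> [[String: Any]] {
    let parsed = try? JSONSerialization.jsonObject(with: data)
    return parsed as? [[String: Any]] ?? []
}
```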

Swift - Unable to retrieve CMSensorDataList records

I'm making a Watch app that will record user acceleration. I've used CMSensorRecorder from the CoreMotion Framework to do this.
The flow of the program right now is that the user presses a button on the watch, which triggers acceleration to be recorded for 30 seconds. After this, there is a 6-minute delay (referring to the answer here: watchOS2 - CMSensorRecorder, a delay is needed to read the data), and the acceleration and timestamp data is printed to the console.
Right now I'm getting a "response invalid" and "Error occurred" when running the app. I've added a motion usage description to the info.plist file.
I'm fairly new to Swift and app development, and I fear something's wrong with the way I'm trying to access the data. I've attached the console logs and code below.
Can anybody provide some insight into the messages and how to resolve this? I've searched around but haven't found any cases of this issue before. Cheers.
func recordAcceleration() {
    if CMSensorRecorder.isAccelerometerRecordingAvailable() {
        print("recorder started")
        recorder.recordAccelerometer(forDuration: 30) // forDuration controls how many seconds data is recorded for.
        print("recording done")
    }
}

func getData() {
    if let list = recorder.accelerometerData(from: Date(timeIntervalSinceNow: -400), to: Date()) {
        print("listing data")
        for data in list {
            if let accData = data as? CMRecordedAccelerometerData {
                let accX = accData.acceleration.x
                let timestamp = accData.startDate
                // Do something here.
                print(accX)
                print(timestamp)
            }
        }
    }
}

// Send data to iPhone after time period.
func sendData(dataBlock: CMSensorDataList) {
    WCSession.default.transferUserInfo(["Data": dataBlock])
}

// UI Elements
@IBAction func recordButtonPressed() {
    print("button pressed")
    recordAcceleration()
    // A delay is needed to read the data properly.
    print("delaying 6 mins")
    perform(#selector(callback), with: nil, afterDelay: 6 * 60)
}

@objc func callback() {
    getData()
}

extension CMSensorDataList: Sequence {
    public func makeIterator() -> NSFastEnumerationIterator {
        return NSFastEnumerationIterator(self)
    }
}
Console output:
button pressed
recorder started
2019-03-12 12:12:12.568962+1100 app_name WatchKit Extension[233:5614] [Motion] Warning - invoking recordDataType:forDuration: on main may lead to deadlock.
2019-03-12 12:12:13.102712+1100 app_name WatchKit Extension[233:5614] [SensorRecorder] Response invalid.
recording done
delaying 6 mins
2019-03-12 12:18:13.115955+1100 app_name WatchKit Extension[233:5614] [Motion] Warning - invoking sensorDataFromDate:toDate:forType: on main may lead to deadlock.
2019-03-12 12:18:13.162476+1100 app_name WatchKit Extension[233:5753] [SensorRecorder] Error occurred while trying to retrieve accelerometer records!
I ran your code and did not get the "Response invalid" or "Error occurred". I did get the main thread warnings. So I changed to a background thread and it works fine.
Also, I don't think you need to wait six minutes. I changed it to one minute.
I hope this helps.
let recorder = CMSensorRecorder()

@IBAction func recordAcceleration() {
    if CMSensorRecorder.isAccelerometerRecordingAvailable() {
        print("recorder started")
        DispatchQueue.global(qos: .background).async {
            self.recorder.recordAccelerometer(forDuration: 30)
        }
        perform(#selector(callback), with: nil, afterDelay: 1 * 60)
    }
}

@objc func callback() {
    DispatchQueue.global(qos: .background).async { self.getData() }
}

func getData() {
    print("getData started")
    if let list = recorder.accelerometerData(from: Date(timeIntervalSinceNow: -60), to: Date()) {
        print("listing data")
        for data in list {
            if let accData = data as? CMRecordedAccelerometerData {
                let accX = accData.acceleration.x
                let timestamp = accData.startDate
                // Do something here.
                print(accX)
                print(timestamp)
            }
        }
    }
}

GoogleCast iOS sender, v4, not sending messages

On version 2, the sender app was able to send messages.
func deviceManager(_ deviceManager: GCKDeviceManager!,
                   didConnectToCastApplication applicationMetadata: GCKApplicationMetadata!,
                   sessionID: String!,
                   launchedApplication: Bool) {
    deviceManager.add(self.textChannel)
}
However, the API says that we are now using GCKSessionManager instead of GCKDeviceManager.
The API says I must have a GCKSession add the textChannel, which I did here:
Once the session starts, I add the textChannel (because sessionManager.currentCastSession was nil before the session started).
func sessionManager(_ sessionManager: GCKSessionManager, didStart session: GCKSession) {
    if session.device == connectionQueue {
        connectionQueue = nil
    }
    self.sessionManager!.currentCastSession!.add(textChannel)
    print("")
}
Meanwhile, I send the text message in another function:
let result = self.textChannel.sendTextMessage("\(self.textField.text)", error: &error)
But the result is always false, and the error is always "Channel is not connected or is not registered with a session".
In addition, when I do:
print("isConnected1 \(self.textChannel.isConnected)")
the result is false.
Do you know what other steps I am missing for it to be connected?
Just learned that it was an issue with my namespace. It connects now.
The problem was that the namespace wasn't matching the namespace from my receiver code:
fileprivate lazy var textChannel: TextChannel = {
    return TextChannel(namespace: NAMESPACE)
}()
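One detail worth checking when this happens: custom Cast message namespaces must begin with `urn:x-cast:`, and the sender string must match the receiver's verbatim. A minimal sanity check (this function is illustrative, not part of the Cast SDK):

```swift
// Illustrative check: a custom Cast namespace must start with "urn:x-cast:"
// and carry a non-empty suffix. It must also match the receiver exactly,
// which this local check obviously cannot verify on its own.
func isValidCastNamespace(_ ns: String) -> Bool {
    let prefix = "urn:x-cast:"
    return ns.hasPrefix(prefix) && ns.count > prefix.count
}
```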

kAudioUnitType_MusicEffect as AVAudioUnit

I'd like to use my kAudioUnitType_MusicEffect AU in an AVAudioEngine graph. So I try to call:
[AVAudioUnitMIDIInstrument instantiateWithComponentDescription:desc options:kAudioComponentInstantiation_LoadInProcess completionHandler:
but that just yields a normal AVAudioUnit, so the MIDI selectors (like -[AVAudioUnitMIDIInstrument sendMIDIEvent:data1:data2:]) are unrecognized. It seems AVAudioUnitMIDIInstrument instantiateWithComponentDescription only works with kAudioUnitType_MusicDevice.
Any way to do this? (Note: OS X 10.11)
Make a subclass and call instantiateWithComponentDescription from its init.
Gory details and github project in this blog post
http://www.rockhoppertech.com/blog/multi-timbral-avaudiounitmidiinstrument/#avfoundation
This uses Swift and kAudioUnitSubType_MIDISynth but you can see how to do it.
This works. It's a subclass. You add it to the engine and you route the signal through it.
class MyAVAudioUnitDistortionEffect: AVAudioUnitEffect {
    override init() {
        var description = AudioComponentDescription()
        description.componentType = kAudioUnitType_Effect
        description.componentSubType = kAudioUnitSubType_Distortion
        description.componentManufacturer = kAudioUnitManufacturer_Apple
        description.componentFlags = 0
        description.componentFlagsMask = 0
        super.init(audioComponentDescription: description)
    }

    func setFinalMix(finalMix: Float) {
        let status = AudioUnitSetParameter(
            self.audioUnit,
            AudioUnitPropertyID(kDistortionParam_FinalMix),
            AudioUnitScope(kAudioUnitScope_Global),
            0,
            finalMix,
            0)
        if status != noErr {
            print("error \(status)")
        }
    }
}
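A usage sketch for the subclass above, mirroring "add it to the engine and route the signal through it" (the node names are illustrative; this assumes an AVAudioEngine graph with a player node feeding the effect):

```swift
import AVFoundation

// Illustrative wiring: attach the effect and route a player through it.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let distortion = MyAVAudioUnitDistortionEffect()

engine.attach(player)
engine.attach(distortion)

// player -> distortion -> main mixer
engine.connect(player, to: distortion, format: nil)
engine.connect(distortion, to: engine.mainMixerNode, format: nil)

distortion.setFinalMix(finalMix: 50)
try? engine.start()
```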