I have two projects: an iOS project using iOS Charts that plots the data without any problems, and a Mac companion app that fails when assigning the data to the lineChartView.data object. I also get a warning:
'warning: could not execute support code to read Objective-C class data in the process. This may reduce the quality of type information available.'
Here is my code:
let values = (0..<count).map { (i) -> ChartDataEntry in
    let value = recordingpoints[i]
    return ChartDataEntry(x: Double(i), y: value)
}
let data = LineChartData()
let dataset = LineChartDataSet(values: values, label: "Glucose Trends")
data.addDataSet(dataset)
self.lineChartView.data = data
I cannot find where the error is generated.
if let tracking_id = activeridedata.tracking_id {
    cell.lblTrackingId.text = tracking_id
}
The compiler gives a fatal error while the cell is loading, and it's due to nil data in the model classes. Even after using the if let statement, the app crashes when nil data appears.
If activeridedata may not exist, declare it as an optional type and use it as follows.
if let tracking_id = activeridedata?.tracking_id {
    cell.lblTrackingId.text = tracking_id
}
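To see why the optional chaining matters, here is a minimal self-contained sketch; the ActiveRideData type and its field are hypothetical stand-ins for your model class. The body only runs when both the model object and the field are non-nil, so neither level can crash.

```swift
// Hypothetical stand-in for the real ride model class.
struct ActiveRideData {
    var tracking_id: String?
}

// Both the model itself and the field inside it may be nil.
let activeridedata: ActiveRideData? = ActiveRideData(tracking_id: nil)

// Optional chaining unwraps both levels safely: the body only runs
// when activeridedata exists AND tracking_id is non-nil.
if let tracking_id = activeridedata?.tracking_id {
    print(tracking_id)
} else {
    print("no tracking id yet")   // reached here, no crash
}
```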
I'm trying to create a USDZ object following the Apple tutorial Creating 3D Objects from Photographs. I'm using the new PhotogrammetrySession within this sample project: Photogrammetry Command-Line App.
That's the code:
let inputFolderUrl = URL(fileURLWithPath: "/tmp/MyInputImages/")
let url = URL(fileURLWithPath: "MyObject.usdz")
var request = PhotogrammetrySession.Request.modelFile(url: url,
                                                      detail: .full)
guard let session = try? PhotogrammetrySession(input: inputFolderUrl) else {
    return
}
I'm getting the following error:
2021-06-12 21:53:56.968490+0200 HelloPhotogrammetry[15294:190841] ERROR cv3dapi.pg: Internal codes (1): 4011
2021-06-12 21:53:56.972113+0200 HelloPhotogrammetry[15294:190841] [Photogrammetry] No SfM map found in native output!
2021-06-12 21:53:56.972909+0200 HelloPhotogrammetry[15294:190841] [Photogrammetry] Got error in completion: reconstructionFailed(RealityFoundation.PhotogrammetrySession.Request.modelFile(url: OutputAR.usdz -- file:///Users/jonasdeichelmann/Library/Developer/Xcode/DerivedData/HelloPhotogrammetry-ghttmmgcrrhywqeebbrstrvxoikh/Build/Products/Debug/, detail: RealityFoundation.PhotogrammetrySession.Request.Detail.medium, geometry: nil), "Reconstruction failed!")
Request modelFile(url: OutputAR.usdz -- file:///Users/jonasdeichelmann/Library/Developer/Xcode/DerivedData/HelloPhotogrammetry-ghttmmgcrrhywqeebbrstrvxoikh/Build/Products/Debug/, detail: RealityFoundation.PhotogrammetrySession.Request.Detail.medium, geometry: nil) had an error: reconstructionFailed("Reconstruction failed!")
Processing is complete!
I'm using an M1 iMac with macOS Monterey 12.0 Beta (21A5248p) and Xcode 13.0 beta (13A5154h).
tl;dr: Try another set of images; there is probably something wrong with your set of images.
I've had it work successfully except in one instance, and I received the same error that you are getting. I think for some reason it didn't like the set of photos I took for that particular object. You could try taking just a few photos of another simple object and try again and see if that is the problem with your first run.
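For reference, here is a minimal sketch of driving the session once it is created, based on the structure of Apple's sample; how you wrap the async handling in your command-line app is an assumption, and the session/request setup from the question is presumed to exist.

```swift
import Foundation
import RealityKit

// Sketch: process a request and watch the session's output stream.
// Assumes `session` and `request` are set up as in the question.
func run(session: PhotogrammetrySession,
         request: PhotogrammetrySession.Request) throws {
    // Listen for progress/completion messages from the session.
    Task {
        for try await output in session.outputs {
            switch output {
            case .processingComplete:
                print("Processing is complete!")
            case .requestError(let request, let error):
                print("Request \(request) had an error: \(error)")
            case .requestComplete(let request, _):
                print("Request \(request) finished.")
            default:
                break
            }
        }
    }
    // Kick off the reconstruction.
    try session.process(requests: [request])
}
```

A reconstructionFailed error surfaces through the .requestError case above, which is where the "Reconstruction failed!" message in your log comes from.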
I have created a feature service on ArcGIS online which has approximately 2000 features. Each feature has four fields: name, latitude, longitude and a boolean validation field (true/false). Two custom symbols are used - one for validated features and one for non-validated features.
I have successfully connected to the feature service from my native (xcode/swift) iOS application and the features are displayed properly on top of the basemap.
I have implemented a touch delegate and successfully detect when a feature symbol is tapped. The issue I am having is trying to query (read) the "name" field attribute associated with the symbol that was tapped. I have tried using the code below but have not been able to read the attribute:
func geoView(_ geoView: AGSGeoView, didTapAtScreenPoint screenPoint: CGPoint, mapPoint: AGSPoint) {
    if let activeSelectionQuery = activeSelectionQuery {
        activeSelectionQuery.cancel()
    }
    guard let featureLayer = featureLayer else {
        return
    }
    // tolerance level
    let toleranceInPoints: Double = 12
    // use tolerance to compute the envelope for the query
    let toleranceInMapUnits = toleranceInPoints * viewMap.unitsPerPoint
    let envelope = AGSEnvelope(xMin: mapPoint.x - toleranceInMapUnits,
                               yMin: mapPoint.y - toleranceInMapUnits,
                               xMax: mapPoint.x + toleranceInMapUnits,
                               yMax: mapPoint.y + toleranceInMapUnits,
                               spatialReference: viewMap.map?.spatialReference)
    // create query parameters object
    let queryParams = AGSQueryParameters()
    queryParams.geometry = envelope
    // run the selection query
    activeSelectionQuery = featureLayer.selectFeatures(withQuery: queryParams, mode: .new) { [weak self] (queryResult: AGSFeatureQueryResult?, error: Error?) in
        if let error = error {
            print("error: ", error)
        }
        if let result = queryResult {
            print("\(result.featureEnumerator().allObjects.count) feature(s) selected")
            print("name: ", result.fields)
        }
    }
}
I am using the ArcGIS iOS 100.6 SDK.
Any help in solving this issue would be appreciated.
The featureLayer selection methods merely update the map view display to visually highlight the features.
From the featureLayer, you should get the featureTable and then call query() on that. Note that there are two methods: a simple query() that gets minimal attributes back, or an override on AGSServiceFeatureTable that allows you to specify that you want all fields back. You might need to specify .loadAll on that override to get the name field back. We do it this way to avoid downloading too much information (by default we download just enough to symbolize and label the feature).
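A hedged sketch of what that query could look like inside the tap handler, assuming featureLayer and queryParams are set up as in the question (the "name" key is the field name from the question's feature service):

```swift
// Query the underlying service feature table so all attributes,
// including "name", come back with each feature.
if let serviceTable = featureLayer.featureTable as? AGSServiceFeatureTable {
    serviceTable.queryFeatures(with: queryParams,
                               queryFeatureFields: .loadAll) { result, error in
        if let error = error {
            print("query error: \(error)")
            return
        }
        guard let features = result?.featureEnumerator().allObjects else { return }
        for feature in features {
            // With .loadAll the full attribute set is populated, not just
            // the fields needed for symbolizing and labeling.
            print("name: \(feature.attributes["name"] ?? "<missing>")")
        }
    }
}
```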
I've bumped into an error in a Keras model exported to CoreML, and I haven't been able to figure out where it is coming from or what is causing it. I am working with Xcode 9 + macOS 10.13.5 High Sierra + Keras 2.1.5. I was not able to determine the CoreML version.
The model is an LSTM ("long short term memory") model created in Keras and exported to CoreML. I've successfully linked the model into my iOS app. However when I run the app, I get the following error when trying to generate a prediction from the LSTM.
2018-07-11 11:47:30.159116-0700 mymlmodelapp[32544:191635] [coreml] Different batch numbers for input features.
2018-07-11 11:47:30.159409-0700 mymlmodelapp[32544:191635] [coreml] Failure in resetSizes.
I'm really at a loss at this point. I can't figure out from where the error originates. I tried to set a breakpoint in Xcode to break on Swift errors but the breakpoint is not triggered. I can inspect the error instance by setting a breakpoint at the point at which the error is caught, but I don't see how to get the stack trace for the point at which the error was thrown.
I can't find any documentation about it in CoreML, Keras or Tensorflow. Also I can't find that string in the Keras or Tensorflow source code, or in the CoreML libraries (binaries only, I was not able to find the source code, if you know how to get it, I would be very interested).
So to some extent this is about CoreML + Keras but also this is just a general question about how to figure out, in Xcode, from where errors are coming.
Here's the code in question. Foo_Bar is the name of the input, which was assigned in Keras. The intent is to parse a text field into the values that form the input, feed the input to the LSTM, get the output, and write the output into another text field.
@IBAction func mybuttonAction(_ sender: Any) {
    let s: String? = myinput.text
    if (s != nil && s!.count > 0) {
        guard let input_data = try? MLMultiArray(shape: [1, 4, 2], dataType: .double) else {
            fatalError("Unexpected runtime error. MLMultiArray")
        }
        let w = s!.split(separator: " ")
        let x = w.map { s1 in Double(s1) }
        var x8 = [Double](repeating: 0.0, count: 8)
        for i in 0..<(x.count > 8 ? 8 : x.count) {
            if x[i] != nil {
                x8[i] = x[i]!
            }
        }
        for i in 0..<8 {
            input_data[i] = NSNumber(value: x8[i])
        }
        let mymodel = myLSTM_regressionModel_sos06252018()
        do {
            let lstm_input = myLSTM_regressionModel_sos06252018Input(Foo_Bar: input_data)
            let output_data = try mymodel.prediction(input: lstm_input)
            let y = output_data.featureValue(for: "Predictions")
            let z = y?.multiArrayValue
            myoutput.text = String(z!.count)
        }
        catch {
            myoutput.text = "OOPS: \(error)"
        }
    }
}
EDIT 2018-07-27: I found that the example works by changing shape: [1, 4, 2] to shape: [4, 1, 2]. That doesn't make any sense to me, but as they say, there's no arguing with success. I tried several other permutations, which failed.
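One way to see why both shapes accept the same fill loop: MLMultiArray stores its elements row-major, and subscripting with a single Int (as in `input_data[i] = ...`) walks that flat buffer. Both [1, 4, 2] and [4, 1, 2] hold 8 doubles in the same order; what changes is only how CoreML interprets the axes (for LSTM inputs the leading axes are sequence/batch, which is likely why one shape matched the model and the other triggered the batch-number error). A small pure-Swift illustration of row-major flat indexing:

```swift
// Row-major flat index for a multi-dimensional index, the same layout
// MLMultiArray uses internally.
func flatIndex(_ indices: [Int], shape: [Int]) -> Int {
    var idx = 0
    for (i, dim) in indices.enumerated() {
        idx = idx * shape[i] + dim
    }
    return idx
}

let shapeA = [1, 4, 2]   // shape from the original code
let shapeB = [4, 1, 2]   // shape that worked
// Element [0, 2, 1] under shapeA and [2, 0, 1] under shapeB land on the
// same flat offset, so the raw data written by the loop is identical.
print(flatIndex([0, 2, 1], shape: shapeA))  // 5
print(flatIndex([2, 0, 1], shape: shapeB))  // 5
```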
I found an article describing how to create a plugin using Swift and Cocoa. It uses NSBundle to load the plugin, but NSBundle, as far as I know, is not available in pure Swift (no Cocoa). Is there a way to achieve the same result without using Cocoa?
More info:
In case it's relevant, here is what I want to achieve. I am creating an app in Swift that runs on a Linux server. Users can connect to it using their browser. I want other people to be able to write "plugins" that implement the functionality itself (what users can see and do once they connect), from printing out hello world, through chat programs, to games, without having to worry about the low-level plumbing provided by my app. Some sort of dll that my server application loads and runs.
The solution to this is not trivial, but it's not impossible either. I prefer to use the Swift Package Manager to manage dependencies and Xcode as the IDE. This combination is not perfect, as it needs a lot of tinkering, but there is no other usable free Swift IDE as of now.
You will need to set up two projects; let's call them Plugin (the 3rd-party library) and PluginConsumer (the app that uses other people's plugins). You will also need to decide on an API; for now we will use a simple
TestPluginFunc()
Create Plugin.swift file with TestPluginFunc implementation in your Plugin project:
public func TestPluginFunc() {
print("Hooray!")
}
Set the project to build a framework, not an executable, and build it[1]. You will get a Plugin.framework file which contains your plugin.
Now switch to your PluginConsumer project
Copy Plugin.framework from your Plugin project somewhere where you can easily find it. To actually load the framework and use it:
// we need to define what our plugin function looks like
typealias TestPluginFunc = @convention(c) () -> ()
// and what its name is
let pluginFuncName = "TestPluginFunc"

func loadPlugin() {
    let pluginName = "Plugin"
    let openRes = dlopen("./\(pluginName).framework/\(pluginName)", RTLD_NOW|RTLD_LOCAL)
    if openRes != nil {
        // this is fragile: the mangled name depends on the Swift version
        let symbolName = "_TF\(pluginName.utf8.count)\(pluginName)\(pluginFuncName.utf8.count)\(pluginFuncName)FT_T_"
        let sym = dlsym(openRes, symbolName)
        if sym != nil {
            // here we load the func from the framework based on the name we constructed in "symbolName"
            let f: TestPluginFunc = unsafeBitCast(sym, to: TestPluginFunc.self)
            // and now all we need to do is execute our plugin function
            f()
        } else {
            print("Error loading ./\(pluginName).framework/\(pluginName). Symbol \(symbolName) not found.")
            dlclose(openRes)
        }
    } else {
        print("error opening lib")
    }
}
If done correctly, you should see "Hooray!" being printed to your log.
There is a lot of room for improvement. The first thing you should do is replace the Plugin.framework string with a parameter, preferably using some file library (I am using PerfectLib). Another thing to look at is defining the plugin API in your PluginConsumer project as a protocol or base class, creating a framework out of that, importing that framework in your plugin project, and basing your implementation on that protocol/base class. I am trying to figure out exactly how to do that, and I will update this post if I manage to do it properly.
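The protocol idea could be sketched like this; the protocol and class names are hypothetical, and in practice both PluginConsumer and each plugin would import them from a shared framework:

```swift
// Hypothetical shared API that both the host app and plugins import.
public protocol GamePlugin {
    var name: String { get }
    // Called with the text the user typed; returns what to display.
    func handle(input: String) -> String
}

// A plugin implementation in the Plugin project might look like:
public final class HelloPlugin: GamePlugin {
    public init() {}
    public var name: String { "Hello" }
    public func handle(input: String) -> String {
        return "Hello, \(input)!"
    }
}
```

The host can then talk to any plugin through the protocol without knowing the concrete type, which avoids relying on mangled function names for anything beyond a single factory entry point.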
[1]: I usually do this by creating a Package.swift file and generating an Xcode project out of it using swift package generate-xcodeproj. If your project doesn't contain main.swift, Xcode will create a framework instead of an executable.
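For reference, a hypothetical Package.swift for the Plugin project might look like this; the .dynamic library type asks SwiftPM to produce a shared library that dlopen can load (names and paths are assumptions):

```swift
// swift-tools-version:5.5
// Hypothetical manifest for the Plugin project.
import PackageDescription

let package = Package(
    name: "Plugin",
    products: [
        // .dynamic produces a shared library suitable for dlopen.
        .library(name: "Plugin", type: .dynamic, targets: ["Plugin"])
    ],
    targets: [
        .target(name: "Plugin", path: "Sources/Plugin")
    ]
)
```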
What you will want to do is create a folder your program will look in; let's say it's called 'plugins'. It should build a list of names from the files in there, then iterate through them, passing parameters to the files, getting the output, and making use of that in some way.
Activating a program and getting output:
func runCommand(cmd: String, args: String...) -> (output: [String], error: [String], exitCode: Int32) {
    var output: [String] = []
    var error: [String] = []

    let task = Process()
    task.launchPath = cmd
    task.arguments = args

    let outpipe = Pipe()
    task.standardOutput = outpipe
    let errpipe = Pipe()
    task.standardError = errpipe

    task.launch()

    let outdata = outpipe.fileHandleForReading.readDataToEndOfFile()
    if var string = String(data: outdata, encoding: .utf8) {
        string = string.trimmingCharacters(in: .newlines)
        output = string.components(separatedBy: "\n")
    }

    let errdata = errpipe.fileHandleForReading.readDataToEndOfFile()
    if var string = String(data: errdata, encoding: .utf8) {
        string = string.trimmingCharacters(in: .newlines)
        error = string.components(separatedBy: "\n")
    }

    task.waitUntilExit()
    let status = task.terminationStatus
    return (output, error, status)
}
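As a usage sketch, you might invoke it like this; the /bin/echo path is an assumption about your environment, and runCommand refers to the function above:

```swift
// Uses the runCommand function defined above.
let result = runCommand(cmd: "/bin/echo", args: "hello", "plugins")
print(result.output)     // ["hello plugins"]
print(result.exitCode)   // 0
```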
Here is how a Swift plugin would accept arguments:
for arg in CommandLine.arguments.dropFirst() {
    switch arg {
    case "1":
        print("1")
    case "2":
        print("2")
    default:
        print("3")
    }
}
So once you have the program and plugin communicating you just have to add handling in your program based on the output so the plugins output can do something meaningful. Without cocoa libraries this seems the way to go, though if you use C there are a couple of other options available there as well. Hope this helps.