I am trying to load an .rcproject file from a local directory. My goal is to load it from a URL and then display it.
If I load it like this:
let modelScene = try? Entity.loadAnchor(named: "Experience")
everything works fine.
But if I do this:
let url = URL(fileURLWithPath: "./Experience")
or
let url = URL(fileURLWithPath: "./Experience.rcproject")
and
let modelScene = try? Entity.loadAnchor(contentsOf: url, withName: "Experience")
or
let modelScene = try? Entity.loadAnchor(contentsOf: url)
I get the following error:
// [Pipeline] Failed to open scene 'Experience -- file:///'.
I have no idea what the issue is here. Does anyone have an idea what I can try?
My deployment target is iOS 14.4.
According to the Apple docs, it should work like this, right?
The loadAnchor(contentsOf:withName:) type method was originally designed for the .usd, .usda, .usdc, .usdz and .reality file formats. However, the official documentation now says it works only for Reality files. You can read about it here.
public static func loadAnchor(contentsOf url: URL,
                              withName resourceName: String?) throws -> AnchorEntity
And here's the definition from the framework's documentation:
Supported file formats are USD or Reality. In order to identify a resource across a network session, the resource needs to have a unique name. This name is set using resourceName. All participants in the network session need to load the resource and assign the same resourceName.
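As a sketch (assuming you have exported your Reality Composer scene as Experience.reality and placed it somewhere on disk, e.g. the Documents directory after a download), loading with a full file URL instead of a relative path like "./Experience" might look like this:

```swift
import RealityKit

// Hypothetical setup: "Experience.reality" was exported from Reality Composer
// and copied into the app's Documents directory (e.g. after a download).
let documents = FileManager.default.urls(for: .documentDirectory,
                                         in: .userDomainMask)[0]
let realityFileURL = documents.appendingPathComponent("Experience.reality")

do {
    // A full file URL is required; a relative path like "./Experience"
    // resolves against file:/// and fails with "[Pipeline] Failed to open scene".
    let anchor = try Entity.loadAnchor(contentsOf: realityFileURL)
    arView.scene.addAnchor(anchor)   // arView: an existing ARView in your view hierarchy
} catch {
    print("Failed to load anchor: \(error)")
}
```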
Related
I have a short video on my local machine. I am trying to open it in QuickTime Player using NSWorkspace.
let stringUrl = "~/Desktop/myrec.mov"
let url = URL(string: stringUrl)!
let config = NSWorkspace.OpenConfiguration()
config.activates = true
NSWorkspace.shared.open(
    [url],
    withApplicationAt: URL(string: "~/Applications/QuickTime Player")!,
    configuration: config
)
The error I am getting is:
Fatal error: Unexpectedly found nil while unwrapping an Optional value: file cap/cap.swift
Which has to do with withApplicationAt being the incorrect URL and returning nil.
I'm sure withApplicationAtURL is wrong, but I have no idea how to find the url for the quicktime player app.
I am very new to Swift and am having trouble reading the docs, especially since they don't seem up to date (e.g. my compiler says openFile is deprecated and led me to open).
I'm also not sure if this is the correct/best way to go about accomplishing what I am trying to do (open a small local video on quicktime player).
Any recommendations, tips, advice, or answers would be greatly appreciated!
Try this. Works on my Mac:
let stringUrl = "/Users/whoever/Desktop/myrec.mov"
let url = URL(fileURLWithPath: stringUrl)
let config = NSWorkspace.OpenConfiguration()
config.activates = true
NSWorkspace.shared.open(
    [url],
    withApplicationAt: URL(fileURLWithPath: "/System/Applications/QuickTime Player.app"),
    configuration: config
)
Please note that I replaced the URL initializer with the correct one. Also note that I swapped the QuickTime Player path for the one Finder gives me (right-click the app while holding the Option key to copy its path).
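The distinction between the two initializers is worth spelling out: URL(string:) expects a well-formed URL string (an unescaped space makes it return nil, which is what triggered the force-unwrap crash), while URL(fileURLWithPath:) accepts an ordinary filesystem path. A minimal illustration:

```swift
import Foundation

// An unescaped space is invalid in a URL string, so this returns nil --
// force-unwrapping it is what crashed the original code.
let invalid = URL(string: "~/Applications/QuickTime Player")

// fileURLWithPath accepts plain paths (spaces included) and produces
// a proper file:// URL. Note that it does NOT expand "~" for you.
let appURL = URL(fileURLWithPath: "/System/Applications/QuickTime Player.app")

print(invalid as Any)       // nil
print(appURL.isFileURL)     // true
print(appURL.scheme ?? "")  // "file"
```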
I am creating an app based on a neural network, and the Core ML model size is around 150 MB. So, obviously, I can't ship it within the app.
To overcome this issue, I found this article, which mentions that you can download and compile a Core ML model on device.
I did that and downloaded it to my device, but the problem is that I cannot make predictions the way the original model does. The original model takes a UIImage as input, but the compiled MLModel expects an MLFeatureProvider. Can anyone explain how I can do the type conversion so I can use the model as before?
do {
    let compiledUrl = try MLModel.compileModel(at: modelUrl)
    let model = try MLModel(contentsOf: compiledUrl)
    debugPrint("Model compiled \(model.modelDescription)")
    // model.prediction(from: MLFeatureProvider) // Problem
    // It should be like this:
    // guard let prediction = try? model.prediction(image: pixelBuffer!) else {
    //     return
    // }
} catch {
    debugPrint("Error while compiling \(error.localizedDescription)")
}
When you add an mlmodel file to your project, Xcode automatically generates a source file for you. That's why you were able to write model.prediction(image: ...) before.
If you compile your mlmodel at runtime then you don't have that special source file and you need to call the MLModel API yourself.
The easiest solution here is to add the mlmodel file to your project, copy-paste the automatically generated source file into a new source file, and use that with the mlmodel you compile at runtime. (After you've copied the generated source, you can remove the mlmodel again from your Xcode project.)
Also, if your model is 150MB, you may want to consider making a small version of it by choosing an architecture that is more suitable for mobile. (Not VGG16, which it seems you're currently using.)
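If you'd rather call the MLModel API directly instead of copy-pasting the generated source, a sketch might look like the following. The feature names "image" and "classLabel" are assumptions; inspect model.modelDescription to find the actual input/output names of your model:

```swift
import CoreML

// Hypothetical feature names -- check model.modelDescription for the
// real input/output names of your compiled model.
func predict(model: MLModel, pixelBuffer: CVPixelBuffer) throws -> String? {
    // Wrap the raw pixel buffer in the generic feature-provider API
    // that MLModel.prediction(from:) expects.
    let input = try MLDictionaryFeatureProvider(
        dictionary: ["image": MLFeatureValue(pixelBuffer: pixelBuffer)]
    )
    let output = try model.prediction(from: input)
    return output.featureValue(for: "classLabel")?.stringValue
}
```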
guard let raterOutput = try? regressionModel.prediction(from: RegressorFeatureProviderInput(
    feature1: 3.4,
    feature2: 4.5))
else { return 0 }
return Double(truncating: NSNumber(value: RegressorFeatureProviderOutput(features: raterOutput).isSaved))
Adding to what @Matthijs Hollemans said:
let url = try! MLModel.compileModel(at: URL(fileURLWithPath: model))
visionModel = try! VNCoreMLModel(for: MLModel(contentsOf: url))
I am trying to access a file I saved from my Today extension. In my Today extension I did this to save the file:
func startRecording() {
    let audioFilename = getDocumentsDirectory().appendingPathComponent("recording.m4a")

    let settings = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 12000,
        AVNumberOfChannelsKey: 1,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]

    do {
        audioRecorder = try AVAudioRecorder(url: audioFilename, settings: settings)
        audioRecorder.delegate = self
        audioRecorder.record()
        recordButton.setTitle("Stop", for: .normal)
    } catch {
        finishRecording(success: false)
    }
}

func getDocumentsDirectory() -> URL {
    let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
    let documentsDirectory = paths[0]
    return documentsDirectory
}
I then tried to get the data for my AVAudioPlayer in the main part of the project using this code:
let path = Bundle.main.path(forResource: "recording.m4a", ofType:nil)!
let url = URL(fileURLWithPath: path)
However, it gave the error: unexpectedly found nil while unwrapping an Optional value.
Thanks for the help.
Your extension saves the file to its document directory and your app code is looking for the file in the app bundle. The app bundle only contains the resources that are distributed with the app. You'll need to delve into the file system.
However, there's another problem. The extension and containing app don't share a documents directory. They each have their own container for writing data local to themselves. If you want to share data between them, it's a little more work. In summary:
Create an app group identifier for the app and the extension to share.
Use FileManager.containerURL(forSecurityApplicationGroupIdentifier:) to get the file URL for the shared container directory.
From the container URL, append the file name.
In the extension, you'll set up the AVAudioRecorder as usual and start recording.
In the main app, you'll want to use the NSFileCoordinator API to ensure that only one process is writing to the file at a time. Hopefully, the AVAudioRecorder uses the NSFileCoordinator API internally, although I didn't immediately find confirmation of this.
For more details about shared containers, see this blog post.
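A sketch of the first three steps (the group identifier "group.com.example.myapp" is a placeholder; use the one you configure in both targets' App Groups capability):

```swift
import Foundation

// Placeholder identifier -- it must match the App Group enabled for
// BOTH the app target and the extension target.
let groupID = "group.com.example.myapp"

func sharedRecordingURL() -> URL? {
    // Returns nil if the app group is not configured for this target.
    guard let container = FileManager.default.containerURL(
        forSecurityApplicationGroupIdentifier: groupID) else { return nil }
    return container.appendingPathComponent("recording.m4a")
}
```

The extension would record to this URL instead of its own Documents directory, and the main app would read from the same URL.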
I just tried the same - record audio from a Today Extension. The code looks sooo familiar, so I'm taking a wild guess: you want to capture voice and send the file to the Google Speech API, correct?
Nonetheless, I think we're hitting the restrictions of extensions: judging by https://developer.apple.com/library/content/qa/qa1872/_index.html extensions cannot record audio. The article was written for iOS 8, but I don't believe Apple ever lifted the restriction. Please correct me if I'm wrong, since I keep running into problems doing what the OP does myself.
By the way, check the result of audioRecorder.record(). It might be false and indicate that the capture never started (that's my current error).
I'm trying to play a video using Swift; however, the line of code below gives me an error:
'init(URL:)' has been renamed to 'init(url:)'
let player = AVPlayer(URL: NSURL(fileURLWithPath: path) as URL)
How can I modify this line to get rid of the error?
Thanks
Try this:
if let player = Bundle.main.url(forResource: "The name of your file", withExtension: "mp4") {
    if NSWorkspace.shared().open(player) {
    }
}
Be sure to substitute the name of your file and file type. You may want to put it in a function and call that function.
I know it's a totally different way of doing it, but it's always worked for me. I recommend trying it.
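For reference, the error message itself points at the direct fix: the Swift 3 renaming of AVPlayer's initializer (and the bridging of NSURL to URL) means the original line can simply become:

```swift
import AVFoundation

// Swift 3+ spelling: lowercase `url:` label, and URL(fileURLWithPath:)
// replaces the NSURL(fileURLWithPath:) + `as URL` cast.
let path = "/path/to/video.mp4"   // placeholder path
let player = AVPlayer(url: URL(fileURLWithPath: path))
```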
I use VLCKit in Swift, so I created a customized video player view,
and I have external subtitles for movies from links. I read the files from the server and convert them to a string:
do {
    let text = try NSString(contentsOfURL: NSURL(string: self.subtitleUrl)!, encoding: NSUTF8StringEncoding)
    self.mediaPlayer.openVideoSubTitlesFromFile(text as String)
} catch {
    print("Error")
}
and I called the function named "openVideoSubTitlesFromFile" on the player, but it's not working.
Can anyone give me a solution?
This method (which will be deprecated in the next major release of VLCKit) only accepts local file paths, not remote URLs. You need to download the subtitles file, cache it locally, and provide the path of the stored file to the method. Additionally, you may only use this method after playback has started.
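A sketch of that download-then-pass-a-path flow (the URLSession download and the caches-directory location are my assumptions; the VLCKit call mirrors the method name used above):

```swift
import Foundation

// Download the remote subtitles, cache them locally, then hand the
// local *path* (not the file contents) to the player.
func loadSubtitles(from remoteURL: URL, completion: @escaping (String?) -> Void) {
    URLSession.shared.downloadTask(with: remoteURL) { tempURL, _, error in
        guard let tempURL = tempURL, error == nil else { completion(nil); return }
        let caches = FileManager.default.urls(for: .cachesDirectory,
                                              in: .userDomainMask)[0]
        let localURL = caches.appendingPathComponent("subtitles.srt")
        try? FileManager.default.removeItem(at: localURL)
        do {
            try FileManager.default.moveItem(at: tempURL, to: localURL)
            completion(localURL.path)
        } catch {
            completion(nil)
        }
    }.resume()
}

// Usage (only after mediaPlayer.play() has been called):
// loadSubtitles(from: URL(string: self.subtitleUrl)!) { path in
//     guard let path = path else { return }
//     DispatchQueue.main.async {
//         self.mediaPlayer.openVideoSubTitlesFromFile(path)
//     }
// }
```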