LiDAR: export ARReferenceObject as .obj - Swift

Since LiDAR is built into the newest 2020+ iOS devices (iPhone 12 Pro, iPad Pro), ARKit has more possibilities than ever, including support for exporting to .obj.
Here is the code for exporting an ARReferenceObject to an .arobject file:
guard let testRun = self.testRun,
      let object = testRun.referenceObject,
      let name = object.name
else {
    print("Error: Missing scanned object.")
    return
}
let documentURL = FileManager.default.temporaryDirectory
    .appendingPathComponent(name + ".arobject")
DispatchQueue.global().async {
    do {
        try object.export(to: documentURL,
                          previewImage: testRun.previewImage)
    } catch {
        fatalError("Failed to save the file to \(documentURL)")
    }
}
How do you export as .obj?

The sparse point cloud contained in an .arobject file can't be exported as 3D geometry, so the answer is: no.


Extract Images from Zip Archive

My starting point:
I get a zip archive with some compressed images from an API. I want to show the pictures in a slideshow.
The problem:
After I successfully downloaded the zip archive as Data from URLSession.shared.dataTask and decompressed it with let decompressedData = try (data as NSData).decompressed(using: .zlib), I got no further.
I know that I can use UIImage(data: data) to display an image from data.
But how do I get the individual data for the images from the decompressed data?
Thank you in advance,
Lucas
PS: If I haven't explained it clearly or if you need further details, please just ask
Never mind, I found the solution myself:
guard let archive = Archive(data: data, accessMode: .read) else {
    return
}
for entry in archive {
    var extractedData = Data()
    do {
        _ = try archive.extract(entry) { extractedData.append($0) }
        if let image = UIImage(data: extractedData) {
            self.images.append(image)
        }
    } catch {
        print(error.localizedDescription)
    }
}
Zip Foundation is required for this to work. Thanks #jn-pdx for pointing me in the right direction. After a little bit of digging I found a way to create an Archive from in-memory data.
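For context, here is a minimal end-to-end sketch of how the downloaded Data might feed that loop. It assumes ZIPFoundation is linked; the loadSlideshow(from:completion:) helper name is illustrative, and the separate zlib decompression step from the question isn't needed because the archive's entries are decompressed as they are extracted:
import Foundation
import UIKit
import ZIPFoundation

// Illustrative helper: download a zip archive and extract every image it contains, in memory.
func loadSlideshow(from url: URL, completion: @escaping ([UIImage]) -> Void) {
    URLSession.shared.dataTask(with: url) { data, _, error in
        guard let data = data, error == nil,
              let archive = Archive(data: data, accessMode: .read) else {
            completion([])
            return
        }
        var images: [UIImage] = []
        for entry in archive {
            var extractedData = Data()
            // The consumer closure receives the entry's contents in chunks; collect them.
            _ = try? archive.extract(entry) { extractedData.append($0) }
            if let image = UIImage(data: extractedData) {
                images.append(image)
            }
        }
        DispatchQueue.main.async { completion(images) }
    }.resume()
}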

How to take a screenshot of your entire screen

I have created a menu-based macOS app in Xcode using Swift. One of the buttons I have created is intended to capture a screenshot and save the file to a specific location.
I have found sources explaining how to do this on iOS, but I'm looking for macOS functionality. The article Programmatically Screenshot | Swift 3, macOS has responses that have gotten me close, but I think some of it is deprecated.
How can I implement this in an app developed for macOS with Xcode and Swift 5?
Here is the code for the function:
@objc func TakeScreenshot(_ sender: Any) {
    func CreateTimeStamp() -> Int32 {
        return Int32(Date().timeIntervalSince1970)
    }

    var displayCount: UInt32 = 0
    var result = CGGetActiveDisplayList(0, nil, &displayCount)
    if result != CGError.success {
        print("error: \(result)")
        return
    }

    let allocated = Int(displayCount)
    let activeDisplays = UnsafeMutablePointer<CGDirectDisplayID>.allocate(capacity: allocated)
    result = CGGetActiveDisplayList(displayCount, activeDisplays, &displayCount)
    if result != CGError.success {
        print("error: \(result)")
        return
    }

    for i in 1...displayCount {
        let unixTimestamp = CreateTimeStamp()
        let fileUrl = URL(fileURLWithPath: "~/Documents" + "\(unixTimestamp)" + "_" + "\(i)" + ".jpg", isDirectory: true)
        let screenShot: CGImage = CGDisplayCreateImage(activeDisplays[Int(i - 1)])!
        let bitmapRep = NSBitmapImageRep(cgImage: screenShot)
        let jpegData = bitmapRep.representation(using: NSBitmapImageRep.FileType.jpeg, properties: [:])!
        do {
            try jpegData.write(to: fileUrl, options: .atomic)
        } catch {
            print("error: \(error)")
        }
    }
}
menu.addItem(NSMenuItem(title: "Take Screenshot",
                        action: #selector(AppDelegate.TakeScreenshot(_:)),
                        keyEquivalent: ""))
The second portion of code is the menu item that is a button. I want this button to take a screenshot of the screen and then save the file to a location I specify.
I get an error when I use the button in my application.
The error has to do with saving the file: you are constructing the URL incorrectly.
First, expanding ~ to the home directory is generally a shell feature, not an OS feature. The underlying file path APIs (e.g. open()) just treat that as a normal character in a path. Foundation does support expanding ~ in path strings (not URLs), but you have to specifically request it with expandingTildeInPath. It's never automatic and it's never meaningful in URLs.
Next, I suspect you were trying to build a URL to a file within the Documents directory. However, you did not put a path separator (/) between the name of the directory and the name of the file. In other words, you constructed ~/Documents10989439875_1.jpg, not ~/Documents/10989439875_1.jpg.
You should use FileManager().urls(for:.downloadsDirectory, in:.userDomainMask)[0] to get the URL to the Downloads folder and then append a path component to that using appendingPathComponent(_:isDirectory:).
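For example, a small sketch of that suggestion (the screenshotURL(timestamp:displayIndex:) helper is illustrative, not part of the original code):
import Foundation

// Resolve the destination folder via FileManager and append the file name as a real
// path component instead of concatenating strings onto "~".
func screenshotURL(timestamp: Int32, displayIndex: UInt32) -> URL {
    let downloads = FileManager.default.urls(for: .downloadsDirectory, in: .userDomainMask)[0]
    return downloads.appendingPathComponent("\(timestamp)_\(displayIndex).jpg", isDirectory: false)
}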

iOS 13 AVAsset: mediaOptions on AVAssetCache returning empty array for audio and captions

We have an app that downloads assets and media selections using .aggregateAssetDownloadTask. In iOS 12.4 and below we have been able to access the caption tracks when offline using the asset's assetCache. On an iOS 13 device, the assetCache for the captions is now empty even though they have been downloaded. We can get the captions if we directly access the asset's tracks using asset.mediaSelectionGroup(forMediaCharacteristic: characteristic).
We access the tracks like this:
guard let asset = currentItem.asset as? AVURLAsset else { return [] }
let cache = asset.assetCache
if let group = asset.mediaSelectionGroup(forMediaCharacteristic: characteristic),
   let languages = cache?.mediaSelectionOptions(in: group) {
    if medium == .captions {
        return languages
            .filter({ $0.hasMediaCharacteristic(AVMediaCharacteristic.containsOnlyForcedSubtitles) == false })
            .compactMap({ $0 as Language })
    } else {
        return languages
    }
}
We've filed a radar with Apple, but we're wondering if anyone else has run into this issue, and if so, what workaround you're using. We're hesitant to access the mediaSelectionGroup on the asset directly.
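For what it's worth, here is a hedged sketch of the fallback described above (the availableOptions(for:characteristic:) helper is illustrative and not a confirmed fix):
import AVFoundation

// Illustrative fallback: prefer the downloaded options reported by the asset cache, and
// only fall back to the media selection group's own options when the cache comes back empty.
func availableOptions(for asset: AVURLAsset,
                      characteristic: AVMediaCharacteristic) -> [AVMediaSelectionOption] {
    guard let group = asset.mediaSelectionGroup(forMediaCharacteristic: characteristic) else {
        return []
    }
    let cachedOptions = asset.assetCache?.mediaSelectionOptions(in: group) ?? []
    let options = cachedOptions.isEmpty ? group.options : cachedOptions
    return options.filter { !$0.hasMediaCharacteristic(.containsOnlyForcedSubtitles) }
}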

Cannot load default library in Metal using Swift

For my project (being compiled as a framework) I have a file ops.metal:
kernel void add(device float *lhs    [[ buffer(0) ]],
                device float *rhs    [[ buffer(1) ]],
                device float *result [[ buffer(2) ]],
                uint id              [[ thread_position_in_grid ]])
{
    result[id] = lhs[id] + rhs[id];
}
and the following Swift code:
@available(OSX 10.11, *)
public class MTLContext {
    var device: MTLDevice!
    var commandQueue: MTLCommandQueue!
    var library: MTLLibrary!
    var commandBuffer: MTLCommandBuffer
    var commandEncoder: MTLComputeCommandEncoder

    init() {
        if let defaultDevice = MTLCreateSystemDefaultDevice() {
            device = defaultDevice
            print("device created")
        } else {
            print("Metal is not supported")
        }
        commandQueue = device.makeCommandQueue()
        library = device.newDefaultLibrary()
        if let defaultLibrary = device.newDefaultLibrary() {
            library = defaultLibrary
        } else {
            print("could not load default library")
        }
        commandBuffer = commandQueue.makeCommandBuffer()
        commandEncoder = commandBuffer.makeComputeCommandEncoder()
    }

    deinit {
        commandEncoder.endEncoding()
    }
}
When I try to create an instance of MTLContext in a unit test, the device is created, but the default library cannot be loaded ("could not load default library"). I've checked that the compiled framework has a default.metallib in Resources (a missing default.metallib being the most common reason given for newDefaultLibrary() returning nil).
Unfortunately I haven't been able to find any working examples that create compute kernels in a Metal shader file (there are a few examples using the performance shaders, but those don't need to make kernels in a shader file).
Any suggestions would be greatly appreciated!
newDefaultLibrary() loads from the main bundle of the currently running application. It doesn't search any embedded frameworks or other locations for libraries.
If you want to use a metallib that was compiled into an embedded framework, the easiest thing to do is to get a reference to its containing Bundle and ask for the default library of that bundle instead:
let frameworkBundle = Bundle(for: SomeClassFromMyShaderFramework.self)
guard let defaultLibrary = try? device.makeDefaultLibrary(bundle: frameworkBundle) else {
    fatalError("Could not load default library from specified bundle")
}
This does require that you have at least one publicly-visible class in the framework containing your shaders, but that can be as simple as declaring an empty class strictly for the purpose of doing the bundle look-up:
public class SomeClassFromMyShaderFramework {}
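As a follow-up usage sketch (not part of the original answer; it assumes a device of type MTLDevice and the defaultLibrary loaded above), the add kernel can then be turned into a compute pipeline:
guard let addFunction = defaultLibrary.makeFunction(name: "add") else {
    fatalError("Kernel 'add' not found in the framework's library")
}
do {
    // The pipeline state is what a MTLComputeCommandEncoder dispatches against.
    let pipelineState = try device.makeComputePipelineState(function: addFunction)
    print("Created pipeline, thread execution width: \(pipelineState.threadExecutionWidth)")
} catch {
    print("Could not create compute pipeline state: \(error)")
}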

Using Xcode 7/Swift 2 writeToPath to a Resources file: the call does not fail but no data is written

I am using Xcode 7.3.1 and Swift 2.0, with the following code sample:
func writeToResourcesDataDir() {
    if let path = NSBundle.mainBundle().pathForResource("TestData", ofType: ".json") {
        let str = "Test String"
        do {
            try str.writeToFile(path, atomically: false, encoding: NSUTF8StringEncoding)
            print("writeToFile successful")
        } catch {
            print("writeToFile failed")
        }
    } else {
        print("Path does not exist")
    }
}
Running under Xcode in the simulator, I see the "writeToFile successful" message. But, also using the simulator, I can display the TestData file in the Resources directory and it does not contain the string. I also used a terminal window on the Mac to look at the files in the Resources directory, and the TestData file is empty (0 bytes). I know I am in the correct Resources directory because there is another file in the directory that has correct data and is used by other parts of the program.
I have spent several days now looking at other google entries about data from writeToFile not working and I have tried out every fix or things to try I have found.
Can anyone help?
I added code to accept the boolean return from the call to writeToFile and it returns false. I'm not sure why false is returned while the catch isn't invoked. I am also not sure how to get the error code that goes with this writeToFile in Swift 2.0.
I am also wondering if this is a write-permissions problem. Should I be using the Documents directory instead of the Data directory?
Try something like this. This is Swift 2.3 and Xcode 8.
let filename = "yourjsonfile"
let documentDirectoryURL = try! NSFileManager.defaultManager().URLForDirectory(.DocumentDirectory, inDomain: .UserDomainMask, appropriateForURL: nil, create: true)
let filePath = documentDirectoryURL.URLByAppendingPathComponent(filename)
let fileExist = filePath?.checkResourceIsReachableAndReturnError(nil)
if fileExist == true {
    print("Found file")
} else {
    print("File not found")
}
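And, as a hedged sketch in the same Swift 2.3 style (the TestData.json file name is illustrative), writing the string into the writable Documents directory instead of the read-only app bundle might look like this:
let str = "Test String"
do {
    // The app bundle's Resources directory is read-only at runtime; Documents is writable.
    let documentDirectoryURL = try NSFileManager.defaultManager()
        .URLForDirectory(.DocumentDirectory, inDomain: .UserDomainMask, appropriateForURL: nil, create: true)
    if let fileURL = documentDirectoryURL.URLByAppendingPathComponent("TestData.json") {
        try str.writeToURL(fileURL, atomically: true, encoding: NSUTF8StringEncoding)
        print("writeToFile successful: \(fileURL)")
    }
} catch {
    print("writeToFile failed: \(error)")
}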