Swift UIPasteboard not copying PNG

My problem is really odd. In the simulator the PNG copies to the clipboard fine and I can paste the image into the Contacts app. But when I put the app on the phone, the PNG is not copied to the clipboard.
let img = UIImage(named: "myimage")
let data = NSData(data: UIImagePNGRepresentation(img) )
UIPasteboard.generalPasteboard().setData(data, forPasteboardType: "public.png")
That's the code I'm using, but like I said, it does not copy to the clipboard. I'm using this code within the context of a keyboard, although that shouldn't matter when copying to the clipboard. If anyone has any ideas, please let me know. Thanks in advance! This is my first app in Swift and my first iOS app, so I don't have the seasoned experience to know whether this is a Swift issue or something I'm just missing. =\

Make sure the code runs fine in your host app (not a keyboard extension app).
For example, check whether the image read back from the pasteboard has the same resolution:
// the pasteboard is nil if full access is not granted
let pbWrapped: UIPasteboard? = UIPasteboard.generalPasteboard()
if let pb = pbWrapped {
    var type = UIPasteboardTypeListImage[0] as! String
    if (count(type) > 0) && (image != nil) {
        pb.setData(UIImagePNGRepresentation(image), forPasteboardType: type)
        var readDataWrapped: NSData? = pb.dataForPasteboardType(type)
        if let readData = readDataWrapped {
            var readImage = UIImage(data: readData, scale: 2)
            println("\(image) == \(pb.image) == \(readImage)")
        }
    }
}
If the pasteboard object is nil in your keyboard extension, it means you haven't granted the keyboard full access: Copying and pasting an image into a text box in the simulator

I believe you can use this line to do what you want (not able to test it out right now):
let image = UIImage(named: "myimage.png")
UIPasteboard.generalPasteboard().image = image;
Hopefully that works, I'm a little rusty with UIPasteboard.

There are lots of bugs and quirks in the UIPasteboard class, so I'm not surprised you're having trouble with something that so obviously is supposed to work. The documentation isn't that helpful either, to be honest. But try this; it worked for me on a physical device, and it's different from the methods above, which are supposed to work but evidently don't for a number of people.
guard let imagePath = NSBundle.mainBundle().pathForResource("OliviaWilde", ofType: "jpg") else { return }
guard let imageData = NSData(contentsOfFile: imagePath) else { return }
let pasteboard = UIPasteboard.generalPasteboard()
pasteboard.setData(imageData, forPasteboardType: "public.jpeg")
You can use either "public.jpeg" or "public.png" even if the source file is a .jpg; it still works. I think it only changes the format of the thing that gets pasted.
Also, did you try adding the file extension in your first line of code where you create the UIImage? That might make it work too.
Evidently use of this class is temperamental, not just in this use case. So even though we're doing the same thing, the only difference in this code is that we're creating the NSData from a file path rather than from a UIImage. Let me know if that works for you.

Ensure that RequestsOpenAccess is set to YES under NSExtension > NSExtensionAttributes in the extension's Info.plist.
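As an extra sanity check (my sketch, not from the answers above, written in current Swift): from iOS 11 you can ask the input view controller directly whether full access has been granted before touching the pasteboard; on older systems the nil-pasteboard check shown earlier is the usual workaround.
import UIKit

class KeyboardViewController: UIInputViewController {

    // Hypothetical helper: copy a bundled PNG to the general pasteboard,
    // but only when the keyboard actually has full access (hasFullAccess is iOS 11+).
    func copyImageToPasteboard() {
        guard hasFullAccess else {
            print("Full access not granted - cannot write to the pasteboard")
            return
        }
        guard let image = UIImage(named: "myimage"),
              let data = image.pngData() else { return }
        UIPasteboard.general.setData(data, forPasteboardType: "public.png")
    }
}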


How to create an Apple watchOS 5 complication?

I've never worked with watchOS 5 and want to develop a horizontal complication (modular large) for Apple Watch, like "Heart Rate". The idea is that I would display heart rate data in a different way. Right now I want to deploy the complication to a development watch.
I have created a new project with the "complication" checkbox ticked. I see that this added a complications controller with timeline configuration placeholders.
There is also a storyboard with a bunch of empty screens. I'm not sure how much effort I need to put into an Apple Watch app before I can deploy it. I see this Apple doc, but it does not describe how to lay out my complication. Some sections seem to have missing links.
Can I provide one style of complication only (large horizontal, i.e. modular large)?
Do I need to provide any iPhone app content beyond managing the complication logic, or can I get away without having a view controller?
Do I control the appearance of my complication by adding something to the assets folder (it has a bunch of graphic slots)?
Sorry for the complete beginner questions; I have not seen a project focusing specifically on a horizontal complication for watchOS 5.
You should be able to deploy it immediately, though it won't do anything. Have a look at the WWDC video explaining how to create a complication: video
You can't lay out the complication yourself; you choose from a set of templates that you fill with data (see the sketch below). The screens you are seeing are for your watch app, not the complication.
You don't have to support all complication styles.
The complication logic is part of your WatchKit extension, so technically you don't need anything in the iOS companion app; I'm not sure how much functionality you have to provide to get past app review, though.
Adding your graphics to the asset catalog won't do anything by itself; you have to reference them when configuring the templates.
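To make "fill a template with data" concrete, here is a minimal sketch (mine, not the answerer's code) of providing the current timeline entry for a modular large complication inside your CLKComplicationDataSource; the text values are placeholders:
import ClockKit

// Sketch: current timeline entry for a modular large complication.
func getCurrentTimelineEntry(for complication: CLKComplication,
                             withHandler handler: @escaping (CLKComplicationTimelineEntry?) -> Void) {
    guard complication.family == .modularLarge else {
        handler(nil)
        return
    }
    let template = CLKComplicationTemplateModularLargeStandardBody()
    template.headerTextProvider = CLKSimpleTextProvider(text: "Heart Rate")
    template.body1TextProvider = CLKSimpleTextProvider(text: "72 BPM")   // placeholder value
    handler(CLKComplicationTimelineEntry(date: Date(), complicationTemplate: template))
}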
Here's an example by Apple of how to communicate with the Apple Watch app. You need to painstakingly read the README about 25 times to get all the app group identifiers changed in that project.
Your main phone app's assets are not visible to the watch app.
Your watch storyboard assets go in the WatchKit App target.
Your programmatically accessed assets go in the WatchKit Extension target.
Original answers:
Can I provide one style of complication only (large horizontal - modular large)? YES
Do I need to provide any iPhone app content beyond managing the complication logic, or can I get away without having a view controller? YES - watch apps have computation limits imposed on them
Do I control the appearance of my complication by adding something to the assets folder (it has a bunch of graphic slots)? See below - it's both the assets folder and the placeholders
Modify the example above to create a placeholder image that is displayed on the watch (when you are selecting a complication while editing the watch face):
func getPlaceholderTemplate(for complication: CLKComplication, withHandler handler: @escaping (CLKComplicationTemplate?) -> Void) {
    // Pass the template to ClockKit.
    if complication.family == .graphicRectangular {
        // Display a placeholder string on the body.
        let template = CLKComplicationTemplateGraphicRectangularLargeImage()
        template.textProvider = CLKSimpleTextProvider(text: "---")
        let image = UIImage(named: "imageFromWatchExtensionAssets") ?? UIImage()
        template.imageProvider = CLKFullColorImageProvider(fullColorImage: image)
        // Pass the entry to ClockKit.
        handler(template)
    } else {
        handler(nil)
        return
    }
}
Sending small packets to the watch (this will not send images!):
import HealthKit
import WatchConnectivity

func updateHeartRate(with sample: HKQuantitySample) {
    let context: [String: Any] = ["title": "String from phone"]
    do {
        try WCSession.default.updateApplicationContext(context)
    } catch {
        print("Failed to transmit app context")
    }
}
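On the watch side, a minimal counterpart (again my sketch, not part of the original answer) implements the WCSessionDelegate callback that receives the application context; SessionHandler is a hypothetical name:
import WatchConnectivity

// Sketch: receiving side in the WatchKit extension.
class SessionHandler: NSObject, WCSessionDelegate {
    func session(_ session: WCSession,
                 activationDidCompleteWith activationState: WCSessionActivationState,
                 error: Error?) {}

    func session(_ session: WCSession, didReceiveApplicationContext applicationContext: [String: Any]) {
        // "title" matches the key sent from the phone above.
        if let title = applicationContext["title"] as? String {
            print("Received title: \(title)")
        }
    }
}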
Transferring images and files:
func uploadImage(_ image: UIImage, name: String, title: String = "") {
    let data: Data? = UIImagePNGRepresentation(image)
    do {
        let fileManager = FileManager.default
        let cachesDirectory = try fileManager.url(for: .cachesDirectory,
                                                  in: .userDomainMask,
                                                  appropriateFor: nil,
                                                  create: true)
        let fileURL = cachesDirectory.appendingPathComponent("\(name).png")
        if fileManager.fileExists(atPath: fileURL.path) {
            try fileManager.removeItem(at: fileURL)
        }
        try data?.write(to: fileURL, options: .atomic)
        if WCSession.default.activationState != .activated {
            print("session not activated")
        }
        // fileTransfer is a stored property elsewhere (e.g. to observe progress)
        fileTransfer = WCSession.default.transferFile(fileURL, metadata: ["name": name, "title": title])
    } catch {
        print(error)
    }
    print("Completed transfer \(name)")
}
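And a rough sketch of the receiving end of transferFile on the watch (extending the hypothetical SessionHandler above); the delegate hands you a temporary URL that is deleted when the callback returns, so copy the file out first:
import WatchConnectivity

extension SessionHandler {
    func session(_ session: WCSession, didReceive file: WCSessionFile) {
        let destination = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0]
            .appendingPathComponent(file.fileURL.lastPathComponent)
        // The incoming file is removed after this method returns, so copy it now.
        try? FileManager.default.removeItem(at: destination)
        try? FileManager.default.copyItem(at: file.fileURL, to: destination)
        print("Received \(file.metadata?["name"] ?? "unknown")")
    }
}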

NSView to PDF and PNG: Why is the outcome so different?

I am trying to save an NSView to a PNG.
I start with the NSView and then call dataWithPDF for the PDF, or cacheDisplay for the PNG. The code for both looks like this:
guard view.lockFocusIfCanDraw() else {
    assert(false)
    return
}
let pdfData = view.dataWithPDF(inside: rect)
guard let imgData = view.bitmapImageRepForCachingDisplay(in: rect) else {
    assert(false)
    return
}
view.cacheDisplay(in: rect, to: imgData)
view.unlockFocus()
try pdfData.write(to: pdfName, options: .atomic)
let pngData = imgData.representation(using: .png, properties: [:])
try pngData!.write(to: pngName, options: .atomic)
So far, so good. However, the two outputs are very different.
PDF (correct!)
And this is the PNG output. As you can see, the subviews aren't included. The arrows are drawn as part of the view itself.
Why is the outcome so different?
Many thanks in advance!
OK, I found the answer. Thanks to View Debugging I saw that the subviews use a layer (self.wantsLayer = true), and layers don't find their way into the PNG, although they do into the PDF. Not sure whether this is a bug or a feature, but now I can fix the PNG output.
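One way to work around it (a sketch of mine, assuming the layer-backed subviews are indeed the cause) is to render the view's layer tree into the bitmap rep yourself instead of relying on cacheDisplay:
import AppKit

// Sketch: capture a layer-backed NSView (including layer-backed subviews)
// by rendering its layer tree into a bitmap rep. Assumes the view is laid out
// and has already been displayed at least once.
func pngData(of view: NSView) -> Data? {
    guard let rep = view.bitmapImageRepForCachingDisplay(in: view.bounds),
          let context = NSGraphicsContext(bitmapImageRep: rep) else { return nil }
    view.layer?.render(in: context.cgContext)   // renders the whole layer tree
    return rep.representation(using: .png, properties: [:])
}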
Why is the outcome so different?
Trying your code with a different view with subviews (I obviously don't have your view) works as expected and the PNG is fine. So it has to be something to do with your views, but I can make no suggestion as to what. However...
As you've got valid PDF data you can generate your PNG from that using something like:
let captured = NSImage(data: pdfData)
let rep = NSBitmapImageRep(data: (captured?.tiffRepresentation)!)
let pngData = rep?.representation(using: NSPNGFileType, properties: [:])
(that is Swift 3, hence NSPNGFileType rather than .png)
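For anyone on Swift 4 or later, the same idea would look roughly like this (my untested translation, reusing the pdfData and pngName from the question's snippet):
// Swift 4+: rebuild the PNG from the (correct) PDF data.
if let captured = NSImage(data: pdfData),
   let tiff = captured.tiffRepresentation,
   let rep = NSBitmapImageRep(data: tiff),
   let pngData = rep.representation(using: .png, properties: [:]) {
    try? pngData.write(to: pngName, options: .atomic)
}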
This of course doesn't solve whatever problem you have, it avoids it :-) You should really figure out why your views are failing and treat this as a temporary band aid (assuming it works for you...).
HTH

iOS 11: [ImageManager] Unable to load image data

After updating to iOS 11, photo assets now load slowly and I get this message in the console:
[ImageManager] Unable to load image data,
/var/mobile/Media/DCIM/103APPLE/IMG_3064.JPG
I use a static function to load the image:
class func getAssetImage(asset: PHAsset, size: CGSize = CGSize.zero) -> UIImage? {
    let manager = PHImageManager.default()
    let option = PHImageRequestOptions()
    option.isSynchronous = true
    var assetImage: UIImage!
    var scaleSize = size
    if size == CGSize.zero {
        scaleSize = CGSize(width: asset.pixelWidth, height: asset.pixelHeight)
    }
    manager.requestImage(for: asset, targetSize: scaleSize, contentMode: .aspectFit, options: option) { (image, _) in
        if let image = image {
            assetImage = image
        }
    }
    if assetImage == nil {
        manager.requestImageData(for: asset, options: option, resultHandler: { (data, _, orientation, _) in
            if let data = data {
                if let image = UIImage(data: data) {
                    assetImage = image
                }
            }
        })
    }
    return assetImage
}
Requesting the image for an asset usually succeeds, but it prints this message. If I use the requestImageData function only, there is no such message, but photos taken with the Apple camera lose their orientation, and I get even more issues when loading a large number of images (I use an image slideshow in my app).
Apple always sucks when it comes to updates; maybe someone has a solution for how to fix this? It even fails to load an asset when there is a big list of them in the user's camera roll. Switching to requestImageData is not an option for me, as it frequently returns nil data now.
I would like to point out that I call this function only once. It is not used in a UITableView etc. I use other code for thumbnails, with a globally initialised manager and options, so the assets are definitely not nil.
I call this function only when the user taps a certain thumbnail.
When the gallery has around 5,000 photos, maybe the connection to the assets is just overloaded and later it can't handle the request and crashes?
So many questions.
Hey, I was having the warning as well, and here is what worked for me: replacing
CGSize(width: asset.pixelWidth, height: asset.pixelHeight)
with
PHImageManagerMaximumSize in the requestImage call
removed the warning log 🎉
Hope this helps.
I had the same problem. This did not completely solve it, but it definitely helped:
option.isNetworkAccessAllowed = true
This helps only on devices where the Optimise iPhone Storage option for the Photos app has been turned on.
Your code has some serious issues. You are saying .isSynchronous = true without stepping into a background thread to do the fetch. That is illegal and is what is causing the slowness. Plus, you are asking for a targetSize without also saying .resizeMode = .exact, which means you are getting much bigger images than you are asking for.
However, the warning you're seeing is irrelevant and can be ignored. It in no way signals a failure of image delivery; it seems to be just some internal message that has trickled up to the console by mistake.
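To illustrate that advice (my sketch, not the answerer's code): move the synchronous request onto a background queue and ask for exact resizing, delivering the result back on the main queue:
import Photos
import UIKit

// Sketch: synchronous PHImageManager request done off the main thread,
// with .exact resizing so the returned image matches targetSize.
func fetchImage(for asset: PHAsset, targetSize: CGSize, completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.isSynchronous = true
    options.resizeMode = .exact
    options.isNetworkAccessAllowed = true   // needed when the library is iCloud-optimised
    DispatchQueue.global(qos: .userInitiated).async {
        PHImageManager.default().requestImage(for: asset,
                                              targetSize: targetSize,
                                              contentMode: .aspectFit,
                                              options: options) { image, _ in
            DispatchQueue.main.async { completion(image) }
        }
    }
}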
This seems to be a bug in iOS 11, but I found I could work around it by setting the synchronous option to false. I reworked my code to deal with the async delivery. You could probably use sync(execute:) for a quick fix.
Also, I believe the problem only occurred with photos delivered by iCloud sharing.
You can try the requestImageData method with the following options. This worked for me on iOS 11.2 (both on device and in the simulator).
let options = PHImageRequestOptions()
options.deliveryMode = .highQualityFormat
options.resizeMode = .exact
options.isSynchronous = true
PHImageManager.default().requestImageData(for: asset, options: options, resultHandler: { (data, dataUTI, orientation, info) in
    // handle data / dataUTI here
})

How to send audio file with image and caption in iMessage app for iOS 10?

I am creating an iMessage app and am trying to send an audio or video file to the other user.
A video file works and looks fine, but it's not working as expected with an audio file.
My current code is:
let destinationFilename = mp3FileNames[i]
let destinationURL = docDirectoryURL.appendingPathComponent(destinationFilename)
if let conversation = activeConversation {
    let layout = MSMessageTemplateLayout()
    layout.image = UIImage(named: "audio-x-generic-icon")
    layout.mediaFileURL = destinationURL
    layout.caption = selectedSongObj.name
    let message = MSMessage()
    message.layout = layout
    message.url = URL(string: "emptyURL")
    conversation.insert(message, completionHandler: nil)
    return
}
It looks like layout.mediaFileURL = destinationURL is not adding any file to the message.
When I try to send the file with the above code, it looks like shown below:
It looks fine, but there is no audio to play. However, if I try it this way:
let destinationFilename = mp3FileNames[i]
let destinationURL = docDirectoryURL.appendingPathComponent(destinationFilename)
if let conversation = activeConversation {
    conversation.insertAttachment(destinationURL!, withAlternateFilename: nil, completionHandler: nil)
    return
}
And the result with the above code is:
I can play the audio for that message because it's actually there. But the problem with that message is that I can't attach any image or caption to it.
How can I attach an image and an audio file to the same message?
And, if possible, can I add a GIF instead of an image?
Any help would be much appreciated. Thank you.
It's not necessary to use a GIF; iMessage extensions also support the PNG and JPEG image formats. The recommended image size is 300x300 points at @3x scale.
If the MSMessageTemplateLayout's image property has a non-nil value, then the
mediaFileURL property is ignored. So you can't send an image and an audio file at the same time. Docs
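Given that constraint, here is a rough sketch (mine, reusing the destinationURL, selectedSongObj and activeConversation from the question) of the audio-playable variant, which simply leaves image unset:
// Sketch: audio-only template layout. Because layout.image stays nil,
// mediaFileURL is honoured and the audio bubble is playable.
let layout = MSMessageTemplateLayout()
layout.mediaFileURL = destinationURL
layout.caption = selectedSongObj.name   // caption as in the original snippet
let message = MSMessage()
message.layout = layout
activeConversation?.insert(message, completionHandler: nil)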

Sharing screenshots in share sheets

I am making an arcade-style game, and when the player loses I give them the option to share their score via the iOS share sheet. What I want to know is: how can I have them share a screenshot taken right when they die, along with some text? I already know how to make it so that they share text, but I want the screenshot as well. I set it up like this so that the game takes a screenshot right when the player dies:
func screenShotMethod() {
    // Create the UIImage
    UIGraphicsBeginImageContext(view!.frame.size)
    view!.layer.renderInContext(UIGraphicsGetCurrentContext())
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    // Save it to the camera roll
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
    println("screenshot")
}
Then I call this function in the game-over sequence like this:
if gameOver == 0 {
    gameOver = 1
    screenShotMethod()
    movingObjects.speed = 0
    movingObjects.removeFromParent()
    backgroundMusicPlayer.stop()
}
Now what I want to be able to do is access this screenshot so that it can be used in the sharing option, but have it deleted as soon as the player hits replay if they don't share that score. Right now I have sharing set up like this:
if shareButton.containsPoint(location) {
    UIGraphicsBeginImageContext(view!.frame.size)
    view!.layer.renderInContext(UIGraphicsGetCurrentContext())
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    // Save it to the camera roll
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
    println("screenshot")
    var postImage = UIImage(named: "\(image)")
    socialShare(sharingText: "I just got \(score) points in Deez Nuts! Bet you can't beat that! #DeezNuts", sharingImage: UIImage(named: "\(postImage)"), sharingURL: NSURL(string: "http://itunes.apple.com/app/"))
}
Please be specific and straightforward, because I am new to developing apps. Also, I am using Swift, if you didn't already notice. Thank you very much.
You just need to delete this line
var postImage = UIImage(named: "\(image)")
Because image is already a UIImage, just pass sharingImage: image:
socialShare(sharingText: "I just got \(score) points in Deez Nuts! Bet you can't beat that! #DeezNuts", sharingImage: image , sharingURL: NSURL(string: "http://itunes.apple.com/app/")!)
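For completeness, a socialShare helper like the one being called here (the name and parameters come from the question; the body is my sketch in current Swift, and GameScene stands in for the asker's scene class) is usually just a thin wrapper around UIActivityViewController:
import SpriteKit
import UIKit

// Hypothetical implementation of the question's socialShare helper, written as a
// method of the SKScene (the question itself uses Swift 1-era APIs such as NSURL).
extension GameScene {
    func socialShare(sharingText: String?, sharingImage: UIImage?, sharingURL: URL?) {
        var items: [Any] = []
        if let text = sharingText { items.append(text) }
        if let image = sharingImage { items.append(image) }
        if let url = sharingURL { items.append(url) }
        let activityVC = UIActivityViewController(activityItems: items, applicationActivities: nil)
        // SKScene.view is the SKView presenting the scene; hop to its view controller to present.
        view?.window?.rootViewController?.present(activityVC, animated: true, completion: nil)
    }
}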