I am trying to show an activity indicator while creating a CSV file, but it does not show. I am guessing I should use dispatch_async somehow, but I can't figure out how to do this in Swift 3.
var activityIndicator = UIActivityIndicatorView(activityIndicatorStyle: UIActivityIndicatorViewStyle.gray)
override func viewDidLoad() {
super.viewDidLoad()
// activity indicator
activityIndicator = UIActivityIndicatorView(frame: CGRect(x: 100 ,y: 200,width: 50,height: 50)) as UIActivityIndicatorView
activityIndicator.hidesWhenStopped = true
activityIndicator.activityIndicatorViewStyle = UIActivityIndicatorViewStyle.gray
activityIndicator.center = self.view.center
self.view.addSubview(activityIndicator)
}
func writeToCsv() {
self.activityIndicator.startAnimating() // start the animation
let fileName = "events.csv"
let path = NSURL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent(fileName)
var csvText = self.name! + "\n"
csvText += "Date,Start time,End time\n"
// create rest of comma-separated string
for event in self.events! {
let newLine = "\(event.date),\(event.startTime),\(event.endTime)\n"
csvText.append(newLine)
}
// write to csv
do {
try csvText.write(to: path!, atomically: true, encoding: String.Encoding.utf8)
} catch {
print("Failed to create file")
print(error)
}
// create and present view controller with send options
let vc = UIActivityViewController(activityItems: [path as Any], applicationActivities: [])
self.present(vc, animated: true, completion: nil)
self.activityIndicator.stopAnimating() // stop the animation
}
It's a bit hard to answer this without more context about your view setup. First of all, make sure your activity indicator is visible without calling the writeToCsv method, so you know your view hierarchy is correct (i.e. it could be hidden behind some other subview).
Next, in Swift 3 the Dispatch API has been changed to a newer form. I'm not sure whether on OS X you get the raw libdispatch Swift wrapper, but in any case you access it like this:
Background default queue:
DispatchQueue.global(qos: DispatchQoS.QoSClass.default).async { /* code */ }
Main thread:
DispatchQueue.main.async { /* Mainthread code ( UIKit stuff ) */ }
Your own custom queue for CSV generation blocks:
let queue = DispatchQueue(label: "csvgenerator.queue")
queue.async { /* code */ }
Now, for your startAnimating / stopAnimating calls, make sure you call UIKit-related code from the main thread to prevent weird glitches and/or crashes.
Namely:
DispatchQueue.main.async {
self.activityIndicator?.startAnimating()
}
Another good idea might be to use NSOperationQueue instead. It uses GCD internally, I believe, but it integrates very well into iOS and might make some of the dispatching a lot easier to implement. I always use GCD myself, but I have never really had long queues of work that needed to be done. One of the advantages of NSOperationQueue is that it is a lot more user friendly when it comes to cancelling queued blocks.
There is an interesting session video about NSOperationQueue by Dave DeLong in the WWDC app: WWDC Videos 2015.
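For illustration, cancelling work with the Swift 3 OperationQueue API (the NSOperationQueue naming belongs to Swift 2 / Objective-C) looks roughly like this; the CSV work itself is just a placeholder comment:
import Foundation
let queue = OperationQueue()
let operation = BlockOperation {
    // build the CSV here; a long-running block should check for cancellation periodically
}
queue.addOperation(operation)
// later, e.g. if the user leaves the screen before the work finishes:
// cancel() prevents the operation from starting if it hasn't started yet
operation.cancel()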
A small change I'd make to your writeToCsv method:
guard let path = NSURL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent(fileName) else {
// Should throw an error here, or whatever is handy for your app
return
}
Try to avoid forced unwrapping wherever possible.
In methods like this you can, for instance, add throws to the function definition so you can use try without a do/catch block, while also being able to throw errors from your guard statement; whatever calls writeToCsv can then catch the error and display it to the user more easily.
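Putting those pieces together, here is a minimal sketch of writeToCsv with the CSV generation moved to a background queue and the UIKit calls kept on the main thread. It assumes the name and events optionals from your question; generateCsv is a hypothetical helper introduced here just to keep the throwing file work separate:
enum CsvError: Error { case badPath }

func writeToCsv() {
    activityIndicator.startAnimating() // UIKit, called from the main thread

    DispatchQueue.global(qos: .userInitiated).async {
        do {
            let path = try self.generateCsv() // heavy string/file work off the main thread
            DispatchQueue.main.async {
                self.activityIndicator.stopAnimating()
                let vc = UIActivityViewController(activityItems: [path], applicationActivities: [])
                self.present(vc, animated: true, completion: nil)
            }
        } catch {
            DispatchQueue.main.async {
                self.activityIndicator.stopAnimating()
                print("Failed to create file: \(error)")
            }
        }
    }
}

// Hypothetical helper: builds the CSV and throws instead of force unwrapping.
private func generateCsv() throws -> URL {
    guard let path = NSURL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("events.csv") else {
        throw CsvError.badPath
    }
    var csvText = (name ?? "") + "\nDate,Start time,End time\n"
    for event in events ?? [] {
        csvText.append("\(event.date),\(event.startTime),\(event.endTime)\n")
    }
    try csvText.write(to: path, atomically: true, encoding: .utf8)
    return path
}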
Related
I tried to do it like this, but it does not work; the text is not copied:
if let urlScheme = URL(string: "instagram-stories://share") {
if UIApplication.shared.canOpenURL(urlScheme) {
let imageData: Data = UIImage(systemName:"pencil.circle.fill")!.pngData()!
let items:[String: Any] = ["public.utf8-plain-text": "text","com.instagram.sharedSticker.backgroundImage": imageData]
UIPasteboard.general.setItems([items])
UIApplication.shared.open(urlScheme, options: [:], completionHandler: nil)
}
}
I would really appreciate any advice
2 things I can think of:
First, I am not sure the data below in your dictionary can be properly handled by the pasteboard:
let items:[String: Any] = ["public.utf8-plain-text": "text","com.instagram.sharedSticker.backgroundImage": imageData]
Next, it seems that the act of sharing causes the data in the pasteboard to be lost, so I can offer the solution of putting valid data into the pasteboard (I am using a string as an example; you can use something else) from the completion handler of your sharing action. Something like this might solve it:
UIApplication.shared.open(urlScheme, options: [:]) { (_) in
UIPasteboard.general.string =
"click on the screen until the paste button appears: https://google.com"
}
EDIT
It seems your setup was right. On reading the docs, IG Stories should handle the paste automatically: it checks the pasteboard when you execute the instagram-stories://share URL scheme and performs the paste programmatically, which is why the pasteboard gets cleared.
Maybe because the image you chose is black on the black Instagram background it looks like nothing is shared, but with a proper image the result seems fine.
The other thing I noticed after reading their docs is that they no longer allow you to set captions; I cannot find the public.utf8-plain-text key anymore.
Another idea I can offer for sharing text is to convert the text into an image and add it as a sticker, since the sticker layer sits on top of the background image layer.
You can find multiple ways to convert text to an image; the exact approach is not central to your solution, but here is one way I used.
So bringing the code together, I have this:
// Just an example to convert text to UIImage
// from https://stackoverflow.com/a/54991797/1619193
extension String {
/// Generates a `UIImage` instance from this string using a specified
/// attributes and size.
///
/// - Parameters:
/// - attributes: to draw this string with. Default is `nil`.
/// - size: of the image to return.
/// - Returns: a `UIImage` instance from this string using a specified
/// attributes and size, or `nil` if the operation fails.
func image(withAttributes attributes: [NSAttributedString.Key: Any]? = nil, size: CGSize? = nil) -> UIImage? {
let size = size ?? (self as NSString).size(withAttributes: attributes)
return UIGraphicsImageRenderer(size: size).image { _ in
(self as NSString).draw(in: CGRect(origin: .zero, size: size),
withAttributes: attributes)
}
}
}
// Then inside some function of yours
func someFunction() {
if let urlScheme = URL(string: "instagram-stories://share") {
if UIApplication.shared.canOpenURL(urlScheme) {
let imageData: Data = UIImage(named: "bg")!.pngData()!
let textImage: Data = "Shawn Test".image(withAttributes: [.foregroundColor: UIColor.red,
.font: UIFont.systemFont(ofSize: 30.0)],
size: CGSize(width: 300.0, height: 80.0))!.pngData()!
let items = ["com.instagram.sharedSticker.stickerImage": textImage,
"com.instagram.sharedSticker.backgroundImage": imageData]
UIPasteboard.general.setItems([items])
UIApplication.shared.open(urlScheme, options: [:], completionHandler: nil)
}
}
}
I then see this in IG stories with correct background and text as sticker which can be moved.
Only downside of using the sticker is you cannot edit the text in Instagram.
Based on my research, it looks like the only workaround to have a text/link copied to the pasteboard when the IG Story is opened is to use:
UIPasteboard.general.string = "your link here"
but you need to do it with a delay - like:
UIApplication.shared.open(instagramStoryURL, options: [:]) { success in
DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
UIPasteboard.general.string = "your link here"
}
}
to try to be sure that it won't override:
UIPasteboard.general.items
that contains, for example, "com.instagram.sharedSticker.stickerImage"
Also, please be careful with the delay, as iOS has privacy restrictions on copying data to UIPasteboard when the app is in the background (based on my tests we have less than 1 second to do that).
It means that you could try to copy the link this way:
override func viewDidLoad() {
super.viewDidLoad()
NotificationCenter.default.addObserver(self, selector: #selector(appMovedToBackground), name: UIApplication.willResignActiveNotification, object: nil)
}
@objc func appMovedToBackground() {
DispatchQueue.main.asyncAfter(deadline: .now() + 0.75) {
UIPasteboard.general.string = "your link here"
}
}
Anyway, there is one obvious inconvenience.
The first time you call the instagram-stories://share API, you face a system popup asking for permission to open Instagram and another to allow pasting the data.
In that case we lose, for example, the com.instagram.sharedSticker.stickerImage data, because after the delay it is overridden by UIPasteboard.general.string.
But we can make this expected for users with a UI/UX solution that shows instructions or a guide.
This problem is caused by user interface interactions such as showing the title bar while in fullscreen. That question's answer provides a solution, but not how to implement that solution.
The solution is to render on a background thread. The issue is that Apple's sample code is written to cover a lot of scenarios, so most of it would be extraneous here, and I can't follow it well enough to adapt it, so using Apple's code as-is isn't an option. How would I make a simple Swift Metal game use a background thread, as concisely as possible?
Take this, for example:
class ViewController: NSViewController {
var MetalView: MTKView {
return view as! MTKView
}
var Device: MTLDevice = MTLCreateSystemDefaultDevice()!
override func viewDidLoad() {
super.viewDidLoad()
MetalView.delegate = self
MetalView.device = Device
MetalView.colorPixelFormat = .bgra8Unorm_srgb
Device = MetalView.device
//setup code
}
}
extension ViewController: MTKViewDelegate {
func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {
}
func draw(in view: MTKView) {
//drawing code
}
}
That is the start of a basic Metal game. What would that code look like if it were rendering on a background thread?
To fix the bug with showing the title bar in Metal, I need to render on a background thread, but how do I actually do that?
I've noticed this answer suggests manually redrawing 60 times a second, presumably from a loop on a background thread, but that doesn't seem like a clean way to fix it. Is there a cleaner way?
The main trick in getting this to work seems to be setting up the CVDisplayLink. This is awkward in Swift, but doable. After some work I was able to modify the "Game" template in Xcode to use a custom view backed by CAMetalLayer instead of MTKView, and a CVDisplayLink to render in the background, as suggested in the sample code you linked — see below.
Edit Oct 22:
The approach mentioned in this thread seems to work just fine: still using an MTKView, but drawing it manually from the display link callback. Specifically I was able to follow these steps:
Create a new macOS Game project in Xcode.
Modify GameViewController to add a CVDisplayLink, similar to below (see this question for more on using CVDisplayLink from Swift). Start the display link in viewWillAppear and stop it in viewWillDisappear.
Set mtkView.isPaused = true in viewDidLoad to disable automatic rendering, and instead explicitly call mtkView.draw() from the display link callback.
The full content of my modified GameViewController.swift is available here.
I didn't review the Renderer class for thread safety, so I can't be sure no more changes are required, but this should get you up and running.
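In case it helps, here is a rough sketch of what that modified GameViewController might look like, assuming the Renderer class and MTKView-backed view generated by Xcode's Game template; the display link plumbing mirrors the older CAMetalLayer implementation further down:
import Cocoa
import MetalKit
import CoreVideo

class GameViewController: NSViewController {

    var renderer: Renderer!          // the Renderer class generated by the Game template
    var mtkView: MTKView!
    var displayLink: CVDisplayLink?

    override func viewDidLoad() {
        super.viewDidLoad()

        guard let mtkView = self.view as? MTKView else {
            fatalError("View of GameViewController is not an MTKView")
        }
        self.mtkView = mtkView
        mtkView.device = MTLCreateSystemDefaultDevice()

        guard let newRenderer = Renderer(metalKitView: mtkView) else {
            fatalError("Renderer cannot be initialized")
        }
        renderer = newRenderer
        renderer.mtkView(mtkView, drawableSizeWillChange: mtkView.drawableSize)
        mtkView.delegate = renderer

        mtkView.isPaused = true              // stop MTKView's own render loop
        mtkView.enableSetNeedsDisplay = false
    }

    override func viewWillAppear() {
        super.viewWillAppear()
        CVDisplayLinkCreateWithActiveCGDisplays(&displayLink)
        guard let displayLink = displayLink else { return }
        // A non-capturing closure, so it can be passed as the C callback.
        CVDisplayLinkSetOutputCallback(displayLink, { _, _, _, _, _, context -> CVReturn in
            let controller = Unmanaged<GameViewController>.fromOpaque(context!).takeUnretainedValue()
            controller.mtkView.draw()        // explicit draw from the display link's thread
            return kCVReturnSuccess
        }, Unmanaged.passUnretained(self).toOpaque())
        CVDisplayLinkStart(displayLink)
    }

    override func viewWillDisappear() {
        super.viewWillDisappear()
        if let displayLink = displayLink {
            CVDisplayLinkStop(displayLink)
        }
    }
}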
Older implementation with CAMetalLayer instead of MTKView:
This is just a proof of concept and I can't guarantee it's the best way to do everything. You might find these articles helpful too:
I didn't try this idea, but given how much convenience MTKView generally provides over CAMetalLayer, it might be worth giving it a shot:
https://developer.apple.com/forums/thread/89241?answerId=268384022#268384022
Is drawing to an MTKView or CAMetalLayer required to take place on the main thread? and https://developer.apple.com/documentation/quartzcore/cametallayer/1478157-presentswithtransaction
class MyMetalView: NSView {
var displayLink: CVDisplayLink?
var metalLayer: CAMetalLayer!
override init(frame frameRect: NSRect) {
super.init(frame: frameRect)
setupMetalLayer()
}
required init?(coder: NSCoder) {
super.init(coder: coder)
setupMetalLayer()
}
override func makeBackingLayer() -> CALayer {
return CAMetalLayer()
}
func setupMetalLayer() {
wantsLayer = true
metalLayer = layer as! CAMetalLayer?
metalLayer.device = MTLCreateSystemDefaultDevice()!
// ...other configuration of the metalLayer...
}
// handle display link callback at 60fps
static let _outputCallback: CVDisplayLinkOutputCallback = { (displayLink, inNow, inOutputTime, flagsIn, flagsOut, context) -> CVReturn in
// convert opaque context pointer back into a reference to our view
let view = Unmanaged<MyMetalView>.fromOpaque(context!).takeUnretainedValue()
/*** render something into view.metalLayer here! ***/
return kCVReturnSuccess
}
override func viewDidMoveToWindow() {
super.viewDidMoveToWindow()
guard CVDisplayLinkCreateWithActiveCGDisplays(&displayLink) == kCVReturnSuccess,
let displayLink = displayLink
else {
fatalError("unable to create display link")
}
// pass a reference to this view as an opaque pointer
guard CVDisplayLinkSetOutputCallback(displayLink, MyMetalView._outputCallback, Unmanaged<MyMetalView>.passUnretained(self).toOpaque()) == kCVReturnSuccess else {
fatalError("unable to configure output callback")
}
guard CVDisplayLinkStart(displayLink) == kCVReturnSuccess else {
fatalError("unable to start display link")
}
}
deinit {
if let displayLink = displayLink {
CVDisplayLinkStop(displayLink)
}
}
}
I call the function like this:
func tabBarController(_ tabBarController: UITabBarController, shouldSelect viewController: UIViewController) -> Bool {
let index = viewControllers?.index(of: viewController)
if index == 2 {
let layout = UICollectionViewFlowLayout()
let photoSelectorController = PhotoSelectorController(collectionViewLayout: layout)
let navController = UINavigationController(rootViewController: photoSelectorController)
present(navController, animated: true, completion: nil)
return false }
return true
}
Photos not showing on first time
I have all of the right things asking for permission and everything.
I then call for the images with these functions. It works, but only the second time I hit the button, after canceling a post.
I'm not sure how to get the images from the library on the first call.
After that it works like a charm, but users have been telling me it isn't a good experience if they have to try twice.
I'm trying to reduce friction in the app usage.
It should show the pictures right after the user allows the app access to them so they can post, but I'm not sure what I'm doing wrong that keeps the pictures from showing as soon as someone grants access.
var selectedImage: UIImage?
var images = [UIImage]()
var assets = [PHAsset]()
fileprivate func assetsFetchOptions() -> PHFetchOptions {
let fetchOptions = PHFetchOptions()
fetchOptions.fetchLimit = 100
let sortDescriptor = NSSortDescriptor(key: "creationDate", ascending: false)
fetchOptions.sortDescriptors = [sortDescriptor]
return fetchOptions
}
fileprivate func fetchPhotos() {
let allPhotos = PHAsset.fetchAssets(with: .image, options: assetsFetchOptions())
DispatchQueue.global(qos: .background).async {
allPhotos.enumerateObjects { (asset, count, stop) in
print(asset)
let imageManager = PHImageManager.default()
let targetSize = CGSize(width: 200, height: 200)
let options = PHImageRequestOptions()
options.isSynchronous = true
imageManager.requestImage(for: asset, targetSize: targetSize, contentMode: .aspectFit, options: options, resultHandler: { (image, info) in
if let image = image {
self.images.append(image)
self.assets.append(asset)
if self.selectedImage == nil {
self.selectedImage = image
}
}
if count == allPhotos.count - 1 {
DispatchQueue.main.async {
self.collectionView?.reloadData()
}
}
})
}
}
}
If you fetchAssets before the user grants privacy access to your app, you'll get a PHFetchResult that's empty.
However, if before making that fetch you register as a photo library observer, you'll get a photoLibraryDidChange callback as soon as the user approves privacy access for the app... from that callback you can access an updated version of your original fetch result (see changeDetails(for:)) that has all of the assets your fetch should have found. Then you can tell your UI to update and display those assets. (This is how Apple's canonical PhotoKit example code works.)
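A rough sketch of that observer approach, assuming your PhotoSelectorController keeps the fetch result in an allPhotos property (a name introduced here for illustration) and reuses the assetsFetchOptions() helper from your question:
import UIKit
import Photos

class PhotoSelectorController: UICollectionViewController, PHPhotoLibraryChangeObserver {

    var allPhotos: PHFetchResult<PHAsset>?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Register before fetching, so we hear about the result changing
        // once the user grants photo library access.
        PHPhotoLibrary.shared().register(self)
        allPhotos = PHAsset.fetchAssets(with: .image, options: assetsFetchOptions())
    }

    deinit {
        PHPhotoLibrary.shared().unregisterChangeObserver(self)
    }

    func photoLibraryDidChange(_ changeInstance: PHChange) {
        guard let fetchResult = allPhotos,
              let changes = changeInstance.changeDetails(for: fetchResult) else { return }
        // Swap in the updated fetch result and refresh the UI on the main thread,
        // since this callback can arrive on a background queue.
        allPhotos = changes.fetchResultAfterChanges
        DispatchQueue.main.async {
            self.collectionView?.reloadData()
        }
    }
}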
Also, once you have a populated fetch result, please don't request thumbnails for the whole thing the way you're doing.
Users commonly have photo libraries with tens of thousands of assets, many of which are in iCloud and not on the local device. If you synchronously get all thumbnails, you'll take forever, use tons of memory and CPU resources, and generate all kinds of network traffic (slowing things down even more) for resources your user may never see.
PhotoKit is designed to allow easy use in conjunction with UI elements like UICollectionView. A collection view only loads cells that are currently (or soon to be) on screen, even if you've told it you have zillions of items in your collection — similarly, you can request thumbnails only for assets that are visible in your collection view. Wherever you have your per-cell UI setup logic is where you should have your PHImageManager request. (Again, this is what the canonical PhotoKit example code does.)
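In practice that means dropping the images array and requesting each thumbnail where the cell is configured. A sketch, assuming the allPhotos property from the snippet above, a "cellId" reuse identifier, and a hypothetical PhotoCell class with an imageView:
let imageManager = PHImageManager.default()
let thumbnailSize = CGSize(width: 200, height: 200)

override func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "cellId", for: indexPath) as! PhotoCell
    guard let asset = allPhotos?.object(at: indexPath.item) else { return cell }
    // Asynchronous by default; the handler may run twice (a degraded image first, then the final one).
    // Production code should also guard against the cell having been reused in the meantime.
    imageManager.requestImage(for: asset, targetSize: thumbnailSize, contentMode: .aspectFit, options: nil) { image, _ in
        cell.imageView.image = image
    }
    return cell
}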
You can optimize even further by "preheating" the thumbnail fetch/generation process for assets that are soon to be onscreen. And then by managing your "preheating" to cancel such work in progress when further UI updates (e.g. fast scrolling of large collection) make it unnecessary. PHCachingImageManager does this. (And yet again, it's what the canonical Apple sample does. Actually, that sample's a bit out of date, and as such does more work than it needs to on this front — it does its own calculation of what cells are just outside the scroll rect, but since iOS 10 the UICollectionViewDataSourcePrefetching protocol manages that for you.)
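And a sketch of the preheating idea with PHCachingImageManager plus UICollectionViewDataSourcePrefetching, again reusing the hypothetical allPhotos and thumbnailSize names from above; cachingImageManager would be a PHCachingImageManager() property on the controller, and collectionView?.prefetchDataSource = self would be set in viewDidLoad:
extension PhotoSelectorController: UICollectionViewDataSourcePrefetching {

    func collectionView(_ collectionView: UICollectionView, prefetchItemsAt indexPaths: [IndexPath]) {
        guard let allPhotos = allPhotos else { return }
        let assets = indexPaths.map { allPhotos.object(at: $0.item) }
        // Warm up thumbnails for cells that are about to scroll on screen.
        cachingImageManager.startCachingImages(for: assets, targetSize: thumbnailSize, contentMode: .aspectFit, options: nil)
    }

    func collectionView(_ collectionView: UICollectionView, cancelPrefetchingForItemsAt indexPaths: [IndexPath]) {
        guard let allPhotos = allPhotos else { return }
        let assets = indexPaths.map { allPhotos.object(at: $0.item) }
        // Drop work for cells that scrolled away before they were ever shown.
        cachingImageManager.stopCachingImages(for: assets, targetSize: thumbnailSize, contentMode: .aspectFit, options: nil)
    }
}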
So I know MKMapViewDelegate has mapViewDidFinishRenderingMap(_:fullyRendered:), but I have no clue how I can use that function to only add my mapView to the view hierarchy when it finishes rendering.
Right now, even with the DispatchGroup .enter() and .leave(), the view.addSubview(mapView) is getting called before it finishes rendering, which results in my MKPolylines not showing up on the map when it loads.
How can I make sure that the map finishes rendering before view.addSubview(mapView) is called?
@Koen: basically, this is what my viewDidLoad() is doing:
override func viewDidLoad() {
super.viewDidLoad()
mapView = MKMapView()
let leftMargin:CGFloat = 0
let topMargin:CGFloat = 0
let mapWidth:CGFloat = view.frame.size.width
let mapHeight:CGFloat = view.frame.size.height
mapView?.frame = CGRect(x: leftMargin, y: topMargin, width: mapWidth, height: mapHeight)
mapView?.mapType = MKMapType.standard
mapView?.isZoomEnabled = true
mapView?.isScrollEnabled = true
mapView?.delegate = self
mapView?.showsScale = true
// I have this to make sure they run in order only after the previous block is finished because of dependencies
let group = DispatchGroup()
group.enter()
DispatchQueue.main.async {
// gets the routes as polylines and adds to mapView
self.fetchRoutes()
group.leave()
}
group.notify(queue: .main) {
group.enter()
DispatchQueue.main.async {
// depending on the route data, adds the stops as annotations to mapView
self.fetchAnnotations()
group.leave()
}
group.notify(queue: .main) {
// the idea was that once both route and stops are added to the mapView, display mapView to screen
self.view.addSubview(mapView)
}
}
}
Issue was solved by moving all of my data fetching code into a single function and using DispatchGroup to only call addSubview(mapView) after the fetching finishes.
Was not able to find out the root cause, but the issue was solved.
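For anyone who lands here with the same problem, a rough sketch of that structure follows. It assumes fetchRoutes and fetchAnnotations take completion handlers, which is what lets the group wait for the actual fetching work rather than just the dispatch call:
func loadMapData() {
    let group = DispatchGroup()

    group.enter()
    fetchRoutes {                 // adds the route polylines to mapView, then calls back
        self.fetchAnnotations {   // depends on the route data; adds the stop annotations
            group.leave()
        }
    }

    group.notify(queue: .main) { [weak self] in
        guard let self = self, let mapView = self.mapView else { return }
        self.view.addSubview(mapView)   // only after both fetches have finished
    }
}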
I am trying to take a picture every 2 seconds by using a while loop, but when I try this the screen freezes.
This is the function that takes the photo:
func didPressTakePhoto(){
if let videoConnection = stillImageOutput?.connectionWithMediaType(AVMediaTypeVideo){
videoConnection.videoOrientation = AVCaptureVideoOrientation.Portrait
stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: {
(sampleBuffer, error) in
if sampleBuffer != nil {
let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
let dataProvider = CGDataProviderCreateWithCFData(imageData)
let cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, .RenderingIntentDefault)
let image = UIImage(CGImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.Right)
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
//Adds every image taken to an array each time the while loop loops which will then be used to create a timelapse.
self.images.append(image)
}
})
}
}
To take the pictures I have a button that runs this function in a while loop while a variable called count is equal to 0; when the end button is pressed, the variable is set to 1, so the while loop ends.
This is what the startPictureButton action looks like:
@IBAction func TakeScreanshotClick(sender: AnyObject) {
TipsView.hidden = true
XBtnTips.hidden = true
self.takePictureBtn.hidden = true
self.stopBtn.hidden = false
controls.hidden = true
ExitBtn.hidden = true
PressedLbl.text = "Started"
print("started")
while count == 0{
didPressTakePhoto()
print(images)
pressed = pressed + 1
PressedLbl.text = "\(pressed)"
print(pressed)
sleep(2)
}
}
But when I run this and start the timelapse, the screen looks frozen.
Does anyone know how to stop the freeze from happening, while still adding each image taken to an array so that I can turn it into a video?
The problem is that the method that processes clicks on the button (TakeScreanshotClick method) is run on the UI thread. So, if this method never exits, the UI thread gets stuck in it, and the UI freezes.
In order to avoid it, you can run your loop on the background thread (read about NSOperation and NSOperationQueue). Occasionally you might need to dispatch something from the background thread to the UI thread (for instance, commands for UI updates).
UPDATE: Apple has really great documentation (the best I've seen so far). Have a look at this: Apple Concurrency Programming Guide.
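A minimal sketch of that separation, written with the Swift 3 GCD API (in Swift 2 the equivalent would be dispatch_async onto a global queue); count, pressed, PressedLbl and didPressTakePhoto are the names from the question:
@IBAction func TakeScreanshotClick(sender: AnyObject) {
    // ...hide/show your views as before...
    DispatchQueue.global(qos: .userInitiated).async {
        while self.count == 0 {
            self.didPressTakePhoto()
            self.pressed += 1
            DispatchQueue.main.async {
                self.PressedLbl.text = "\(self.pressed)"   // UI updates back on the main thread
            }
            sleep(2)   // now blocks only this background thread, not the UI
        }
    }
}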
You are calling the sleep command on the main UI thread, thus freezing all other activity.
Also, I can't see where you set count = 1? Wouldn't the while loop continue forever?