I am a new cocos2d-x developer using version 2.2.6. I want to share an image on Gmail and Facebook. I have looked at many sites and suggestions, but I am still not clear about how sharing works.
We can use the UIActivityViewController to create a sharing dialog and use some CCRenderTexture code to take a screenshot. When the share button is tapped, a number of options for sharing are available depending on what applications the user has available on their phone.
You can create a sharing popup using the following code:
func openShareDialog() {
    let scene = CCDirector.sharedDirector().runningScene
    let node: AnyObject = scene.children[0]
    let screenshot = takeScreenshotWithNode(node as! CCNode)
    let sharedText = "Text"
    let itemsToShare: [AnyObject] = [screenshot, sharedText]
    let excludedActivities = [UIActivityTypeAssignToContact,
                              UIActivityTypeAddToReadingList,
                              UIActivityTypePostToTencentWeibo]
    let controller = UIActivityViewController(activityItems: itemsToShare, applicationActivities: nil)
    controller.excludedActivityTypes = excludedActivities
    UIApplication.sharedApplication().keyWindow?.rootViewController?.presentViewController(controller, animated: true, completion: nil)
}
Then we can take a screenshot using a CCRenderTexture:
func takeScreenshotWithNode(node: CCNode) -> UIImage {
    CCDirector.sharedDirector().nextDeltaTimeZero = true
    let viewSize = CCDirector.sharedDirector().viewSize()
    let rtx = CCRenderTexture(width: Int32(viewSize.width), height: Int32(viewSize.height))
    rtx.begin()
    node.visit()
    rtx.end()
    return rtx.getUIImage()
}
You can find more information on the UIActivityViewController in Apple's docs.
You might find it handy to start by reading the docs here.
There are additional detailed instructions on the cocos2d-x.org site, including ones for iOS.
Related
I have a project that involves a few USDZ files for the augmented reality features embedded in the app. While this works great, and we're really happy with how it performs, we'd like to remove the built-in share button of the QLPreviewController. Subclassing the object doesn't have any effect, and trying to hide the rightBarButtonItem on the controller returned in the delegate method still shows the button when a file is selected. The implementation of USDZ + QLPreviewController we're using is pretty basic. Is there a way around this issue?
func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
    return 1
}

func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
    let url = Bundle.main.url(forResource: models[selectedObject], withExtension: "usdz")!
    controller.navigationItem.rightBarButtonItems = nil // <- no effect
    return url as QLPreviewItem
}
@IBAction func userDidSelectARExperience(_ sender: Any) {
    let previewController = QLPreviewController()
    previewController.dataSource = self
    previewController.delegate = self
    present(previewController, animated: true)
}
This is the official answer from Apple.
Use ARQuickLookPreviewItem instead of QLPreviewItem. And set its canonicalWebPageURL to a URL (can be any URL).
func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
    guard let path = Bundle.main.path(forResource: "Experience", ofType: "usdz") else {
        fatalError("Couldn't find the supported input file.")
    }
    let url = URL(fileURLWithPath: path)
    if #available(iOS 13.0, *) {
        let item = ARQuickLookPreviewItem(fileAt: url)
        item.canonicalWebPageURL = URL(string: "http://www.google.com")
        return item
    }
    return url as QLPreviewItem
}
The version check is optional.
My approach is to add the QLPreviewController as a child view controller and add its view as a subview. Here, container is a UIView in the storyboard.
let preview = QLPreviewController()
preview.dataSource = self
addChild(preview) // proper containment needs addChild before adding the subview
preview.view.frame = CGRect(origin: CGPoint(x: 0, y: -45), size: CGSize(width: container.frame.size.width, height: container.frame.size.height + 45))
container.addSubview(preview.view)
preview.didMove(toParent: self)
The y offset of the frame's origin and the size may vary. This ensures the AR QuickLook view is the same size as the UIView and hides the buttons (unfortunately, all of them) at the same time.
Instead of returning QLPreviewItem, use ARQuickLookPreviewItem which conforms to this protocol.
https://developer.apple.com/documentation/arkit/arquicklookpreviewitem
Then, assign a url that you would want to share (that will appear in share sheet) in canonicalWebPageURL property. By default, this property shares the file url (in this case, the USDZ file url). Doing so would not expose your file URL(s).
TLDR: I don't think you can.
I haven't seen any of the WWDC sessions even mention this, and I can't seem to find any supporting developer documentation. I'm pretty sure the point of the ARKit QLPreviewController is that you don't have to do any actual coding on the AR side. I can see the appeal of this and of customisation in general; however, I'd suggest instead looking at some of the other ARKit projects that Apple has released and attempting to re-create those from the ground up, as opposed to stripping this apart.
Please advise if this changes as I'd like to do something similar, especially within Safari.
I couldn't get to the share button at all to hide or disable it, and spent days trying. In the end I settled on a rather unprofessional workaround: add the QLPreviewController as a subview of a view controller, then place a button or image view with my company logo on top of the share button. It stays there all the time, even when the top bar hides in full-screen AR mode. Not a clean solution, but it works.
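A rough sketch of that overlay workaround, assuming the preview is embedded as a child view controller; the logo asset name and the frame values are placeholders you would need to tune per device and orientation:

let preview = QLPreviewController()
preview.dataSource = self

// Embed the preview full screen as a child view controller.
addChild(preview)
preview.view.frame = view.bounds
view.addSubview(preview.view)
preview.didMove(toParent: self)

// Cover the spot where the share button normally appears with a logo.
// "company_logo" and the frame are illustrative guesses.
let logoView = UIImageView(image: UIImage(named: "company_logo"))
logoView.frame = CGRect(x: view.bounds.width - 64, y: view.safeAreaInsets.top, width: 56, height: 44)
view.addSubview(logoView)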
I know that AVCaptureSession.Preset.photo has a video output size of 750/1000 and a photo output size of 3024/4034.
Is there a way to capture the photo output without using capturePhoto?
I tried to capture from "didOutput", but that is the video output, so I get the 750/1000 size.
Help me please..
You can use a UIImagePickerController with sourceType.camera. This opens a pre built camera from apple with which you are able to take pictures.
This would be the function:
func camera() {
    if UIImagePickerController.isSourceTypeAvailable(.camera) {
        let myCameraController = UIImagePickerController()
        myCameraController.delegate = self
        myCameraController.sourceType = .camera
        self.present(myCameraController, animated: true, completion: nil)
    }
}
To use UIImagePickerController, your class has to conform to UIImagePickerControllerDelegate and UINavigationControllerDelegate, and you call this function in viewDidLoad.
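The captured photo then comes back through the delegate callback. A minimal sketch, assuming your view controller is called CameraViewController (a placeholder name):

extension CameraViewController: UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        // The full-resolution photo taken with the built-in camera UI.
        if let photo = info[.originalImage] as? UIImage {
            print("Captured photo size: \(photo.size)")
        }
        picker.dismiss(animated: true, completion: nil)
    }

    func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
        picker.dismiss(animated: true, completion: nil)
    }
}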
Hope this solves your problem.
I call the function like this:
func tabBarController(_ tabBarController: UITabBarController, shouldSelect viewController: UIViewController) -> Bool {
    let index = viewControllers?.index(of: viewController)
    if index == 2 {
        let layout = UICollectionViewFlowLayout()
        let photoSelectorController = PhotoSelectorController(collectionViewLayout: layout)
        let navController = UINavigationController(rootViewController: photoSelectorController)
        present(navController, animated: true, completion: nil)
        return false
    }
    return true
}
Photos not showing on first time
I have all the right permission prompts set up and everything.
I then fetch the images with these functions. It works, but only the second time I hit the button, after canceling a post.
I'm not sure how to get the images from the library on the first call.
After that it works like a charm, but users have been telling me it isn't a good experience if they have to try twice.
I'm trying to reduce friction in the app usage.
It should show the pictures right after the user allows the app access to them so they can post, but I'm not sure what I'm doing wrong, since the pictures don't appear as soon as someone grants access.
var selectedImage: UIImage?
var images = [UIImage]()
var assets = [PHAsset]()

fileprivate func assetsFetchOptions() -> PHFetchOptions {
    let fetchOptions = PHFetchOptions()
    fetchOptions.fetchLimit = 100
    let sortDescriptor = NSSortDescriptor(key: "creationDate", ascending: false)
    fetchOptions.sortDescriptors = [sortDescriptor]
    return fetchOptions
}

fileprivate func fetchPhotos() {
    let allPhotos = PHAsset.fetchAssets(with: .image, options: assetsFetchOptions())
    DispatchQueue.global(qos: .background).async {
        allPhotos.enumerateObjects { (asset, count, stop) in
            print(asset)
            let imageManager = PHImageManager.default()
            let targetSize = CGSize(width: 200, height: 200)
            let options = PHImageRequestOptions()
            options.isSynchronous = true
            imageManager.requestImage(for: asset, targetSize: targetSize, contentMode: .aspectFit, options: options, resultHandler: { (image, info) in
                if let image = image {
                    self.images.append(image)
                    self.assets.append(asset)
                    if self.selectedImage == nil {
                        self.selectedImage = image
                    }
                }
                if count == allPhotos.count - 1 {
                    DispatchQueue.main.async {
                        self.collectionView?.reloadData()
                    }
                }
            })
        }
    }
}
If you fetchAssets before the user grants privacy access to your app, you'll get a PHFetchResult that's empty.
However, if before making that fetch you register as a photo library observer, you'll get a photoLibraryDidChange callback as soon as the user approves privacy access for the app... from that callback you can access an updated version of your original fetch result (see changeDetails(for:)) that has all of the assets your fetch should have found. Then you can tell your UI to update and display those assets. (This is how Apple's canonical PhotoKit example code works.)
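A minimal sketch of that observer pattern, assuming you keep the fetch result in a property (allPhotosResult and startObservingPhotoLibrary are placeholder names):

import Photos

// Assumed stored property on the controller:
// var allPhotosResult: PHFetchResult<PHAsset>?

extension PhotoSelectorController: PHPhotoLibraryChangeObserver {
    func startObservingPhotoLibrary() {
        // Register before (or together with) the first fetch, so the callback
        // fires as soon as the user grants access.
        PHPhotoLibrary.shared().register(self)
        allPhotosResult = PHAsset.fetchAssets(with: .image, options: assetsFetchOptions())
    }

    func photoLibraryDidChange(_ changeInstance: PHChange) {
        guard let result = allPhotosResult,
              let changes = changeInstance.changeDetails(for: result) else { return }
        DispatchQueue.main.async {
            // The updated fetch result now contains the assets the first fetch missed.
            self.allPhotosResult = changes.fetchResultAfterChanges
            self.collectionView?.reloadData()
        }
    }
}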
Also, once you have a populated fetch result, please don't request thumbnails for the whole thing the way you're doing.
Users commonly have photo libraries with tens of thousands of assets, many of which are in iCloud and not on the local device. If you synchronously get all thumbnails, you'll take forever, use tons of memory and CPU resources, and generate all kinds of network traffic (slowing things down even more) for resources your user may never see.
PhotoKit is designed to allow easy use in conjunction with UI elements like UICollectionView. A collection view only loads cells that are currently (or soon to be) on screen, even if you've told it you have zillions of items in your collection — similarly, you can request thumbnails only for assets that are visible in your collection view. Wherever you have your per-cell UI setup logic is where you should have your PHImageManager request. (Again, this is what the canonical PhotoKit example code does.)
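Concretely, that means moving the PHImageManager call into cellForItemAt, roughly like this (PhotoCell and its imageView are placeholder names, and allPhotosResult is the stored fetch result from the sketch above):

override func collectionView(_ collectionView: UICollectionView,
                             cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "PhotoCell",
                                                  for: indexPath) as! PhotoCell
    guard let asset = allPhotosResult?.object(at: indexPath.item) else { return cell }

    // Asynchronous request (no isSynchronous), only for the cell being configured.
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: CGSize(width: 200, height: 200),
                                          contentMode: .aspectFit,
                                          options: nil) { image, _ in
        // In production, check that the cell still represents this asset before
        // assigning, since the result can arrive after the cell has been reused.
        cell.imageView.image = image
    }
    return cell
}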
You can optimize even further by "preheating" the thumbnail fetch/generation process for assets that are soon to be onscreen. And then by managing your "preheating" to cancel such work in progress when further UI updates (e.g. fast scrolling of large collection) make it unnecessary. PHCachingImageManager does this. (And yet again, it's what the canonical Apple sample does. Actually, that sample's a bit out of date, and as such does more work than it needs to on this front — it does its own calculation of what cells are just outside the scroll rect, but since iOS 10 the UICollectionViewDataSourcePrefetching protocol manages that for you.)
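A sketch of that preheating idea with PHCachingImageManager plus the prefetching protocol; it assumes the same allPhotosResult property, a cachingImageManager property on the controller, and that you set collectionView.prefetchDataSource = self:

extension PhotoSelectorController: UICollectionViewDataSourcePrefetching {
    // Assumed stored property on the controller:
    // let cachingImageManager = PHCachingImageManager()

    func collectionView(_ collectionView: UICollectionView, prefetchItemsAt indexPaths: [IndexPath]) {
        guard let result = allPhotosResult else { return }
        let assets = indexPaths.map { result.object(at: $0.item) }
        // Start generating thumbnails for items that are about to come on screen.
        cachingImageManager.startCachingImages(for: assets,
                                               targetSize: CGSize(width: 200, height: 200),
                                               contentMode: .aspectFit,
                                               options: nil)
    }

    func collectionView(_ collectionView: UICollectionView, cancelPrefetchingForItemsAt indexPaths: [IndexPath]) {
        guard let result = allPhotosResult else { return }
        let assets = indexPaths.map { result.object(at: $0.item) }
        // Cancel work that became unnecessary, e.g. after fast scrolling past these items.
        cachingImageManager.stopCachingImages(for: assets,
                                              targetSize: CGSize(width: 200, height: 200),
                                              contentMode: .aspectFit,
                                              options: nil)
    }
}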
I want to prevent taking screenshots of a page in my app.
How do I do it programmatically so that screenshots cannot be taken?
I found code to detect a screenshot. Can the screenshot be deleted as soon as it is taken?
let mainQueue = NSOperationQueue.mainQueue()
NSNotificationCenter.defaultCenter().addObserverForName(UIApplicationUserDidTakeScreenshotNotification,
object: nil,
queue: mainQueue) { notification in
// executes after screenshot
}
There is no way to prevent screenshots, but you can detect screen recording and react (for example by hiding your content) with this code.
func detectScreenRecording(action: @escaping () -> ()) {
    let mainQueue = OperationQueue.main
    NotificationCenter.default.addObserver(forName: UIScreen.capturedDidChangeNotification, object: nil, queue: mainQueue) { notification in
        // executes whenever the screen capture (recording/mirroring) state changes
        action()
    }
}

// Call in viewWillAppear
detectScreenRecording {
    print(UIScreen.main.isCaptured)
    if UIScreen.main.isCaptured {
        // your view-hiding code
        print("self.toHide()")
    } else {
        // self.sceneDelegate?.window?.isHidden = false
        // your view-showing code
        print("self.toShow()")
    }
}
There is absolutely no way to completely prevent the user from taking a screenshot while your app is running, because you cannot silently delete photos from the user's photo library. It would be a security issue if apps could freely access their users' photos.
However, there are ways to partially prevent screenshots, as described here: Prevent screen capture in an iOS app
Technically that is possible, via the Photos framework, the docs for which can be found here.
Example code can be found here.
However, this will ask the user's permission first, and then again to confirm deletion; so possibly not the ideal solution. Unfortunately this is as good as it gets as Apple has the Camera Roll fairly locked down.
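A hedged sketch of what that deletion could look like, triggered from the screenshot notification above. It assumes photo library authorization has already been granted and that the screenshot has been written to the library by the time it runs; the user still gets the system confirmation dialog:

import Photos

func deleteLatestScreenshot() {
    // Fetch the most recently created image asset (assumed to be the screenshot).
    let options = PHFetchOptions()
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    options.fetchLimit = 1
    guard let latest = PHAsset.fetchAssets(with: .image, options: options).firstObject else { return }

    PHPhotoLibrary.shared().performChanges({
        // Triggers the system confirmation prompt before actually deleting.
        PHAssetChangeRequest.deleteAssets([latest] as NSArray)
    }, completionHandler: { success, error in
        print("Deleted screenshot: \(success), error: \(String(describing: error))")
    })
}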
You cannot prevent the user from taking a screenshot; however, you can hide the content when a screenshot is taken. Use this code to do so:
extension UIView {
    func hideContentOnScreenCapture() {
        DispatchQueue.main.async {
            // A secure text field's layer is excluded from screenshots and recordings,
            // so re-parenting this view's layer under it hides the content in captures.
            let field = UITextField()
            field.isSecureTextEntry = true
            self.addSubview(field)
            field.centerYAnchor.constraint(equalTo: self.centerYAnchor).isActive = true
            field.centerXAnchor.constraint(equalTo: self.centerXAnchor).isActive = true
            self.layer.superlayer?.addSublayer(field.layer)
            field.layer.sublayers?.first?.addSublayer(self.layer)
        }
    }
}
Usage:
yourView.hideContentOnScreenCapture()
I am trying to figure out how to create a user interactive post or tweet kind of like SoundCloud's here below:
The portion highlighted in yellow is the part that interests me because, as far as I can tell, when it comes to UIActivityViewController (which is what SoundCloud uses for this) the only objects that work for sharing are images and strings.
Furthermore, if you were to tap the portion highlighted in yellow, this screen would pop up on Twitter:
HOW DO THEY DO THAT!? THEY HAVE A PAUSE BUTTON AND EVERYTHING!
This is my attempt to do that...
func displayShareSheet(shareContent:String) {
let someView:CustomView = CustomView() // CustomView is a subclass of UIView
let activityViewController = UIActivityViewController(activityItems: [someView], applicationActivities: nil)
presentViewController(activityViewController, animated: true, completion: {})
}
...which doesn't work. The UIActivityViewController sheet pops up with no share options indicated.
I understand that some may consider this a broad question but if you could at least point me in the right direction I would be very grateful. Thank You.
This works. For the full list of sharing destinations run it on your device and not the simulator. The simulator gives you a smaller list.
func createActivityController() -> UIActivityViewController {
let someText:String = textView.text
let google = NSURL(string:"http://google.com/")!
// let's add a String and an NSURL
let activityViewController = UIActivityViewController(
activityItems: [someText, google],
applicationActivities: nil)
activityViewController.completionHandler = {(activityType, completed:Bool) in
if !completed {
print("cancelled")
return
}
if activityType == UIActivityTypePostToTwitter {
print("twitter")
}
if activityType == UIActivityTypeMail {
print("mail")
}
}
// you can specify these if you'd like.
// activityViewController.excludedActivityTypes = [
// UIActivityTypePostToTwitter,
// UIActivityTypePostToFacebook,
// UIActivityTypePostToWeibo,
// UIActivityTypeMessage,
// UIActivityTypeMail,
// UIActivityTypePrint,
// UIActivityTypeCopyToPasteboard,
// UIActivityTypeAssignToContact,
// UIActivityTypeSaveToCameraRoll,
// UIActivityTypeAddToReadingList,
// UIActivityTypePostToFlickr,
// UIActivityTypePostToVimeo,
// UIActivityTypePostToTencentWeibo
// ]
return activityViewController
}
For the first part, it's only a play button image; tapping it pops up the player view.
Second: you can do this by just presenting a new view controller. Use any popup pod (CWPOP, for example) or whatever suits you. As for Twitter, I'm not sure, but they're probably doing their own thing; you just build a normal view that has the play button and everything, as sketched below.
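A minimal sketch of that "present your own popup with a play button" idea; PlayerPopupViewController is a hypothetical class you would build yourself:

// Hypothetical popup hosting your artwork, title, play/pause button, etc.
let playerPopup = PlayerPopupViewController()
playerPopup.modalPresentationStyle = .overCurrentContext // keep the feed visible behind it
playerPopup.modalTransitionStyle = .crossDissolve
present(playerPopup, animated: true, completion: nil)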
They used to do it better and let you play the music while going through tweets; that was way better, to me at least.