My goal is to convert a block of text that is on the screen into an image and save it to the user's gallery (or downloads folder, or anywhere on their phone that gives them access to it).
From searching through the many threads on this subject, it seems the best way is to take a screenshot with the "camera".
However, the manual tells me to use Application.CaptureScreenshot, but Unity says that is deprecated.
All I can find when searching for alternatives are paid assets (which seems dodgy for something that should be so simple) and more examples using Application.CaptureScreenshot.
ScreenCapture.CaptureScreenshot is what you are looking for now.
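For reference, here is a minimal sketch of how it can be used; the class name, file name, and the persistentDataPath note are my own additions, and getting the PNG from there into the device gallery still needs a platform-specific step or plugin:

using System.IO;
using UnityEngine;

public class ScreenshotSaver : MonoBehaviour
{
    // Call this from a UI button (or anywhere else) to capture the current frame.
    public void TakeScreenshot()
    {
        string fileName = "capture_" + System.DateTime.Now.Ticks + ".png";

        // On iOS/Android the PNG is written relative to Application.persistentDataPath;
        // on desktop it lands next to the project/executable unless you pass a full path.
        ScreenCapture.CaptureScreenshot(fileName);

        // CaptureScreenshot completes asynchronously, so the file appears a frame or so later.
        Debug.Log("Screenshot will be saved to " +
                  Path.Combine(Application.persistentDataPath, fileName));
    }
}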
I am receiving iPhone photos from the Sendblue messaging service and need to determine if they’re vertical or horizontal. This seems like it should be really easy but is giving me trouble. They’re in a CDN (link to example photo).
Any solution would be good; ideally something simple that doesn't require another app. It seems like this should be much easier than it is, so I'm not sure what I'm missing.
So far I've tried this post and the CLI that resulted from it, but when I try to add it, it doesn't show up in my apps.
I also tried the “mallabe” Zapier app, but it’s saying the photo isn't “publicly accessible”. This confuses me as the photo seems accessible to anybody I send it to.
AirEXIF is another solution I'm looking into; I just applied to use their app, so I'm waiting to hear back on that front (there hasn't been much activity from them recently, so I'm hoping they're still around).
I've started to work with Ionic, just to get some experience with it. So far I both like and hate it.
I hate it because it's confusing sometimes. Like in this case: what should I use to take pictures, @ionic-native/camera or @ionic-native/media-capture? The examples I found use the first one, but I see that the second one provides more information about the picture after taking it (such as width/height).
So - which should I use? And if I'm supposed to use the first one, how can I retrieve the image information (e.g. width/height) afterwards?
Sorry if my question is stupid, but I really can't find good information regarding this topic.
Long story short: Use whatever plugin you want.
Both plugins have an ionic-native wrapper available (cordova-plugin-media-capture and cordova-plugin-camera). If the media-capture plugin fits your needs better, you should use that one. Both are maintained by Apache, so they should be of high quality.
So what's the difference then?
The media-capture plugin simply offers more functionality than the camera one: with cordova-plugin-media-capture you can capture audio, video and photos, whereas with cordova-plugin-camera you can only capture single photos.
Facebook recently announced the introduction of messenger codes which can be used to add new contacts and, more importantly, communicate directly with businesses and business pages (which is why I'm interested in it).
It took me ages to find, but at the bottom left of the Messages tab on my Facebook page I have the option to download my code in three different sizes: clicking the disc opens a modal window where you can click the Download button and choose a 300, 600 or 1000px PNG file.
NOTE: While they are PNG files, the background is not transparent, which seems like a bit of an oversight to me, but hey ho, that's what Photoshop is for, I guess.
The problem is that while I can download my code, I can't find any way to test it on printed materials (or even electronically at the moment!). The scanning feature doesn't seem to have been rolled out for me yet (I tried re-installing the Messenger app to see if I got a newer version, but that didn't work), nor for anyone I know (I'm in the UK). The codes are bespoke to Messenger, so they can't be scanned or tested with any other app.
I'm probably too far ahead of the game, but is there any way I can test whether my code scans correctly, or anywhere I can go to find out? I would like to use it on some promotional material that is likely to be long-term and that I don't want to have to update in the near future (several years, by which time these codes will likely be more commonplace).
I also need to know what the redundancy is like. For example, the high-redundancy QR codes I generate can have up to 30% of the code covered and still be scannable, which is great for design purposes. I can't find any official documentation for these codes at all yet, let alone what is required, what the spec is, etc.
I know the most likely option is 'sit and wait' but I really would rather not if possible. I've never been very patient...
Thanks
UPDATE: My Messenger app has now been updated so I can test, but I'm leaving this here in case anyone knows of another way to test - if someone doesn't have Messenger on their phone, for example.
I have some basic questions that I couldn't figure out after searching for quite some time now. All the tutorials and guides I've come across have the code already set up, so I can't find them of much use. For instance, the Friend Smash example has the code so integrated with buttons and other scripts that I can't use it in my own game.
I used Parse to upload my game and test it on Facebook, and it's working fine, but I want to add social features to it (login, share, scores, etc.).
Picking up the code snippets that I'm supposed to use either gives me errors (undefined variables, etc.) or messes things up (I get a broken, non-working login window on top of the running game inside Unity, while nothing happens if I build and run it).
Where am I supposed to put the code? For example:
https://developers.facebook.com/docs/unity/reference/current/FB.Init
In the "Example" part, where am I supposed to put this line?
FB.Init(SetInit, OnHideUnity);
Same thing here:
https://developers.facebook.com/docs/unity/reference/current/FB.Feed
If I use the "Example" code as it is I get errors.
Do I have to use specific names for the scripts?
This is my first time uploading a game and trying to add social features to it so these questions may seem simple, but I couldn't find an answer anywhere.
You should create your own class instead of the "Example" class and put all your Facebook-specific functions in it. Then invoke those methods from your own components, buttons, triggers, whatever.
It is also up to you to take care of pausing the game while the social functions are running (login, share, etc.).
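As a rough sketch of what such a class might look like (the class name FacebookManager and the Time.timeScale pause are just illustrative choices; the FB calls are the ones from the SDK docs linked above):

using Facebook.Unity;
using UnityEngine;

// Attach this to an empty GameObject in your first scene.
public class FacebookManager : MonoBehaviour
{
    private void Awake()
    {
        if (!FB.IsInitialized)
        {
            // SetInit and OnHideUnity are just method names; call them whatever you like.
            FB.Init(SetInit, OnHideUnity);
        }
        else
        {
            FB.ActivateApp();
        }
    }

    private void SetInit()
    {
        if (FB.IsInitialized)
        {
            FB.ActivateApp();
        }
        else
        {
            Debug.Log("Failed to initialize the Facebook SDK");
        }
    }

    // Called when a Facebook dialog (login, share, etc.) covers the game and again when it closes.
    private void OnHideUnity(bool isGameShown)
    {
        // One simple way to pause: freeze time while the dialog is open.
        Time.timeScale = isGameShown ? 1f : 0f;
    }
}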
I've read a number of posts on Apple's forums and a number on the Cycling '74 forums (with my own questions scattered around both), and nobody seems to be able to help me.
I used Max/MSP to write a 'patch' that takes samples and generates music. I'm going to release it as an album similar to Brian Eno's Thursday Afternoon, but wanted to make it available to people so they can have the music last for more than the hour a CD can hold.
What I don't know, and can't figure out, is HOW. It looks just like a regular OS X app, and the only difference I see in the directory structure is that my Max/MSP-made application has extra .framework folders as well as the objects I use (which I guess are similar to 'functions' in JScript). I've looked at the package contents of both OS X files and unpacked .ipa files from the App Store. Being so similar, I would imagine it'd be pretty easy.
Where do I start? Has anybody on this forum done this? Thanks for your time!
[edit] - I just wanted to let you know I've discovered RJDJ, an iOS app that allows users to create 'scenes' in Pure Data (Pd) and load them into the RJDJ program. I'd rather not go this route.
[edit2] - OK, I agree that it's very different, especially having 4 (I could cut it down to 3) additional frameworks that aren't part of the SDK. But I've been thinking: I can add a JavaScript object inside my program, or make a special new object using C (an object in Max is sort of like a class in JS, I think). Is there anything in these languages that would be able to convert a simple 'touch' to a 'mouse click' in my app?
My application is very, very simple: basically just samples played at randomly generated time intervals, with a 'conductor' to bring in/out the groups the samples are drawn from (piano, FX, etc.). The user just clicks the 'start' button and off it goes, so the .nib file I would need to create is very simple. In my head it seems like the .ipa package and the OS X .app both contain Unix executables, and as long as those are basically the same it should work, right?
Max6 has been released.
A new object/concept named gen~ is available.
From what I've discussed with the C74 devs, I know gen~ WILL provide its source code as output. The code produced by the gen~ object could be usable in any other framework; basically, it will be C++.
So it would really open A LOT of possibilities: Max becoming a real graphical framework producing output that can be used in the programming world.
It would save time for some parts of the code.
As far as I can see from poking around at the Cycling '74 site and forums, there's currently no Max engine available for iOS. libpd is probably your best bet, really. (I'd note that the Inception app uses this Pure Data engine with a custom interface and it works very well.)
Unfortunately, OS X and iOS apps are completely different under the hood. Outwardly they look similar (e.g. you've noted the .app extension), but the internals are completely different.