I am trying to create a feature (for a macOS 11+ application) where I pick a pixel's color from the screen (e.g. like the picker in Digital Colour Meter or any other color picker). However, I am having a hard time finding a way to do that.
I tried to "reverse engineer" some applications from AppStore and when I want to pick the pixel, the apps ask me if I want to give permission to record the screen, so I guess that they are "recording the screen" and they exact the pixel from a movie/picture that they record on the spot. However, this solutions does seems a little bit overkill and I think there should be a better way (a good example is Digital Colour Meter that does not ask for permissions).
Do you think this is achievable in a reasonably easy and clean manner? Also, if not, is my guess about "recording the screen" and capturing a pixel from within the clip (in real time, of course) correct?
NSColorSampler was introduced in macOS 10.15 (Catalina). You can use it to sample a color from the screen using the system's built-in color picking interface. It does not require screen-recording permissions.
It's mentioned around the 5-minute mark in WWDC 2019, Session 210, What's New in AppKit for macOS.
I would probably wrap it and take a hybrid approach where I use the old methods (with the screen-recording permission prompt) on older macOS versions, and NSColorSampler on newer versions.
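For illustration, here is a minimal Swift sketch of what using the sampler might look like (the PixelPicker wrapper name is just illustrative; the handler receives nil if the user cancels):

```swift
import AppKit

final class PixelPicker {
    // Keep a strong reference while the sampling UI is up.
    private let sampler = NSColorSampler()

    // Minimal sketch (macOS 10.15+): shows the system colour-sampling loupe.
    // No screen-recording permission is needed.
    func pick() {
        sampler.show { pickedColor in
            // pickedColor is nil if the user cancels (e.g. presses Escape).
            guard let color = pickedColor?.usingColorSpace(.sRGB) else { return }
            print("Picked:", color.redComponent, color.greenComponent, color.blueComponent)
        }
    }
}
```

Since you target macOS 11+, you could call NSColorSampler unconditionally; the hybrid approach only matters if you also support pre-Catalina systems.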
Related
I built my Unity game through Xcode onto an iOS device for testing, and the colors look duller and more pastel-like than in the Game view in the editor. The exact same thing has also happened on a modern Android device. How can I make it so that the colors seen in the built game better reflect the colors in the editor?
EDIT: I have sent a file to my phone and found out that the phone displays colors differently from my monitor. I'm sorry if I'm asking for quite a bit here, but... is there any way to make the game look the way I intend it to? The fact that I never see exactly how my game will turn out seems kinda... awkward. Thanks.
I'm not sure if you already tried this, but you might want to use Unity Remote for your iOS device, so you can check the color differences in real time instead of having to build it every time.
But other than what the comments said, it sounds like you should try adjusting your monitor's display settings for better color correlation between the built game and the one in the editor.
I need to merge 5 monitors in XNA (something like Eyefinity).
I have two graphics cards (HD 5450), which have DisplayPort connectors, of course, and
5 flat monitors with a resolution of 1024x768 each.
I need to merge/group these monitors in XNA, because I want to go fullscreen across all 5 monitors
(fullscreen over multiple monitors).
I just need Visual Studio to detect one graphics device with a resolution of 5120x768.
How should I modify GraphicsDeviceManager / GraphicsAdapter to make this work?
I can't use Eyefinity, because I have two graphics cards, so I'm trying to do "my own Eyefinity" in XNA.
In my app, I have 5 models divided across 5 viewports, each offset by 1024 px.
Or, how should I make it look like fullscreen? I don't want the window border to be visible, and I want it in the middle of the screen - how do I center it?
Thanks for any answers.
To be honest, this is going to be difficult, if not impossible, to do using XNA. You'd have to go so far outside of what the XNA framework provides that there would be little benefit to even using XNA at that point.
Here's a great thread on the App Hub forums talking about different ways of potentially hacking around the XNA framework to achieve multiple monitor fullscreen using XNA.
http://forums.create.msdn.com/forums/p/5562/571993.aspx
As you can see, no one really had any great suggestions, and by the time you were done you were basically programming at such a low level that you might as well be using C++ and DirectX - which is exactly what I would recommend to you.
http://msdn.microsoft.com/en-us/library/windows/desktop/bb206364(v=vs.85).aspx
Using DirectX, you'll get a game/application running fullscreen on a multiple-monitor setup much faster and without having to hack your way into it.
While in graduate school for biomedical engineering, I developed a device we called a "vein finder". It was a simple device that was good enough for our engineering school to patent.
I think it would be very, very easy to use the iPhone camera to develop an iPhone app whereby MDs/nurses/EMTs could easily identify peripheral veins that are not visible to the naked eye. This would be invaluable in starting IVs and giving bedside medications. The vein finder was especially helpful for patients in shock who had poor venous filling and therefore didn't have veins that "popped into view" with a tourniquet.
It would require using the iPhone light at specific wavelengths... anybody have any idea if that is possible?
I don't think you have any control over the flash LED that illuminates images; you can only turn it on and off. I also doubt you would be able to get the specific wavelengths you need.
My suggestion would be to look into building a peripheral light device which plugs in to the headphone jack for power and has the necessary functions for emitting light at different wavelengths. That way, you would be able to get the exact result you require.
You may also need to look into the camera itself, as it may not be able to capture light at the wavelengths you require. Hope that helps!
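For reference, this is roughly all the public torch API exposes: on/off (and, on later iOS versions, a brightness level), with no control over the emitted spectrum. A hedged Swift sketch (the setTorch function name is illustrative):

```swift
import AVFoundation

// Illustrative sketch: the torch API only switches the LED on or off -
// there is no way to choose the emitted wavelength.
func setTorch(on: Bool) {
    guard let device = AVCaptureDevice.default(for: .video), device.hasTorch else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = on ? .on : .off
        device.unlockForConfiguration()
    } catch {
        print("Could not configure torch: \(error)")
    }
}
```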
Do you mean overlaying veins over the picture on the camera? It would be easy if the veins are always in the same place... What would your imagined procedure be for using this app?
EDIT: You cannot manipulate the iPhone light for different frequencies; you can only turn it on and off. You would need to get a separate external light for that.
To work around the fact that you can't set the wavelength of the built-in light, you should use an external light along with the iOS device.
Apologies for massive necro!
The LED light of the iPhone is not the problem - most white light sources contain IR or near-IR (in fact, they are made up of the entire visible spectrum). Daylight works very well too.
The problem is the receiver. Most cameras, including smartphone cameras, have an IR filter - thus cutting out the useful part of the spectrum. All we need is a digital camera with the IR filter removed that has a viewfinder (where you'll see the veins).
Does anyone know what will happen with existing apps when they run on the iPhone 4.0 in terms of the new screen resolution? I am assuming that, just like when developing for the iPad, there should be no hard-coded screen resolutions in your code.
I'd also like advice on the best way of writing robust code that works well on any device. For instance, detecting the screen resolution is not enough - on the iPad the screen is physically bigger, so you can display more items on it. On the new iPhone the screen is the same physical size but higher resolution, so the likely thing is that you won't want to display more items, just higher-resolution versions of them.
Any help would be useful,
Regards
Dave
EDIT: I have read the other similar posts; I guess what I really would like to know is the recommended way to write code for all App Store devices in a robust way so that they a) all work and b) make the best use of each device.
UIKit has been redesigned so that old apps just work unmodified on the iPhone 4. There are then several things you can do, some programmatic and some by just adding higher-resolution images to your app bundle.
Firstly, and most simply, you can include new double-resolution images in your app bundle with an @2x suffix in the name, i.e. Event.png as well as Event@2x.png. [UIImage imageNamed:] will automatically look for a file with this suffix when running at the higher resolution.
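In modern Swift the equivalent lookup might look like the sketch below (Event is just the example file name from above):

```swift
import UIKit

// Illustrative sketch: the same call picks the right file automatically.
// With Event.png and Event@2x.png both in the bundle, a Retina device
// loads the @2x file; the image's size is still reported in points.
let icon = UIImage(named: "Event")
print(icon?.size as Any, icon?.scale as Any)  // scale is 2.0 when the @2x file was used
```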
All the other UIKit stuff now uses points instead of pixels, so for both old and new apps the full screen is still 320 x 480 points. This pretty much means everything will work, including touches, etc., although they may now return fractions of a point.
The only real gotcha so far seems to be if you use CGBitmapContextCreate, as this works in pixels and requires some jiggery-pokery.
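A hedged sketch of the kind of adjustment involved (shown in Swift; the function name and the use of the screen scale are illustrative, not the only way to do it):

```swift
import UIKit

// Illustrative sketch of the CGBitmapContextCreate gotcha: bitmap contexts
// are sized in pixels, so multiply your point size by the screen scale
// (2.0 on a Retina display) and scale the CTM to keep drawing in points.
func makeBitmapContext(pointSize: CGSize) -> CGContext? {
    let scale = UIScreen.main.scale
    let widthPx = Int(pointSize.width * scale)
    let heightPx = Int(pointSize.height * scale)

    guard let context = CGContext(
        data: nil,
        width: widthPx,
        height: heightPx,
        bitsPerComponent: 8,
        bytesPerRow: 0,                      // let Core Graphics pick the row stride
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue
    ) else { return nil }

    // Scale so that subsequent drawing can keep using point coordinates.
    context.scaleBy(x: scale, y: scale)
    return context
}
```

For image contexts specifically, UIGraphicsBeginImageContextWithOptions with a scale of 0 applies the screen's scale for you, which avoids most of this by hand.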
I'd like to replicate the visual style of the Springboard's delete badge when you want to delete an application. I've gotten it pretty close, but it's not quite right, and I get the feeling that Apple isn't rendering these on the fly, but rather has a set image that they use.
I was wondering if anyone has done this before, or has such an image, or anything, really :)
I assume you're talking about the little x that is pinned to the top left of the app icon?
As far as I know, it's a PNG that is stored on the iPhone, but of course you have no legal way of getting at the filesystem.
You could always take a screenshot of the iPhone and hack it in Photoshop, but Apple mightn't like that.