I want to make an item with any number of pins insertable into a board area. I'm currently using the VRTK library and haven't been able to accomplish this with an Interactable Object and a Snap Zone.
Are there any other ways to achieve this effect? How can I make a single element with three pins snap into three snap zones at once? I mean something like the image below, where the red dots are the snap zones and the green pins are the parts that need to be placed into them.
[Image: a board area with three red snap-zone dots and an object with three green pins]
I have tried reading the AVFoundation documentation and I can't find anything about colors. Is it possible to change the flashlight color to red, green, or blue? I think I have seen colored-torch apps.
Would it be very hard to write your own code to control the color hue or lumen output of the flashlight?
While there are apps that get around this by displaying a brightly colored screen, the flash on the iPhone has only a single LED element.
You do, via AVCaptureDevice, have the ability to change the intensity of the torch on some models. Take a look at the torchLevel property.
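By way of illustration, here is a minimal Swift sketch of dimming the torch; it changes intensity only, not color. Note that torchLevel itself is read-only, so the level is set with setTorchModeOn(level:):

```swift
import AVFoundation

// Minimal sketch: assumes a device whose camera has a torch.
func setTorch(level: Float) {
    guard let device = AVCaptureDevice.default(for: .video),
          device.hasTorch else { return }
    do {
        try device.lockForConfiguration()
        // The level must be in (0.0, 1.0]; clamp to what the
        // hardware currently allows.
        try device.setTorchModeOn(level: min(level, AVCaptureDevice.maxAvailableTorchLevel))
        device.unlockForConfiguration()
    } catch {
        print("Torch configuration failed: \(error)")
    }
}
```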
You could always try an easy hardware hack.
Does anybody know what the fastest processing times are for the BeagleBone Black and the Raspberry Pi? By processing time, I mean the time to read a really fast input signal across any two of their input pins.
Context:
I am building a small particle detector, and the PMT output, which I will connect directly to the BeagleBone Black or a Raspberry Pi for processing, is 3.3V and ~40ns wide. I am concerned that these signals will be too fast for these micro-computers to even detect, and I can't seem to find that information anywhere.
Thanks in advance.
The BeagleBone Black has programmable real-time units (PRUs) that run at 200 MHz (5ns per instruction). Since a minimal poll-and-count loop is on the order of three instructions (about 15ns), with these you will be able to reliably detect and count pulses >=15ns wide that are at least 30ns or so apart.
See this presentation. It's really quite awesome: http://beagleboard.org/pru
The Raspberry Pi doesn't have anything similar with real-time/reliability guarantees. I don't know whether there's a clever way you could get it to do what you want.
I have a question about the Raspberry Pi camera. I am using OpenCV on a Raspberry Pi 2 to make a line follower for a robot.
Basically the idea is to find the direction of a line in the image using derivatives and color segmentation.
However, I've found some strange behaviour when I compare the results from an ordinary PC webcam and the Pi camera. The algorithm works well on the PC webcam, and the direction indicator sits spot on the line. On the Pi camera there is a strange scale and offset which I don't understand.
On both platforms I have tried both cap.set(CV_CAP_PROP_FRAME_WIDTH/HEIGHT) to rescale the image and the resize function. Both of them still produce the strange offset. I use the circle(...) and line(...) methods in OpenCV to overlay the line and circles on the captured image.
Could anyone help explain this behaviour? See the links below for a visual comparison.
picam
webcam
Regards
I couldn't add the pictures directly because of Stack Exchange's policies, so I had to provide links instead.
I eventually discovered the solution to the problem: it involved changing the order of the taps of a derivative filter between the Windows and Linux versions of the program. Exactly why this is the case is a mystery to me; it may involve differences in compiler optimization (Visual Studio 2013 vs g++ 4.6.3), or maybe a silly error on my part.
On the PC I use {1 0 -1} filter taps; on the Raspberry Pi 2 I have to use {-1 0 1} instead.
The filter runs on an S8 (-128..127) image, so there is no issue of wraparound.
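To illustrate what the tap order changes, here is a minimal, hypothetical sketch in plain Swift (not the actual OpenCV code) applying a 3-tap derivative kernel to a signed 8-bit row. Reversing the taps simply negates the response, which flips the computed gradient direction:

```swift
// Hypothetical 3-tap derivative over a signed 8-bit row (illustration only).
func derivative(_ row: [Int8], taps: [Int]) -> [Int] {
    guard row.count >= 3, taps.count == 3 else { return [] }
    return (1..<row.count - 1).map { i in
        taps[0] * Int(row[i - 1]) + taps[1] * Int(row[i]) + taps[2] * Int(row[i + 1])
    }
}

let edge: [Int8] = [0, 0, 50, 100, 100]    // a rising edge
print(derivative(edge, taps: [1, 0, -1]))  // [-50, -100, -50]
print(derivative(edge, taps: [-1, 0, 1]))  // [50, 100, 50]
```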
At any rate, I consider the issue closed.
Overview
The background color of my iPhone app in the simulator (iMac) looks different from the color on the device (iPhone 3GS).
EDIT (following section has been added)
The following are all different:
storyboard color (xib file)
simulator color
device color
I suppose I should go with how it looks on the device.
Questions
Is this a common problem other developers face, and is there a way (a systematic procedure) to match the colors?
Will the color look different on different versions of the iPhone (3GS / 4 / 4S), or will they all show the same color?
Am I missing something; is there a specific color profile I should use?
Is there something like a rule of thumb where RGB values vary by a certain percentage?
On the iPhone 4 and 4S, does the color match the simulator? (I don't have an iPhone 4 or 4S, so I am not sure.)
Credit goes to @jtbandes for suggesting sending screenshots, which led to the solution.
I am just answering the question for completeness.
Steps I followed:
Take a screenshot of the image in the storyboard.
Take a screenshot of the image on the device (mail it or use Photo Stream to get it back to your Mac).
Use the color picker (part of the Mac OS color panel) to pick the same spot on both screenshots.
Note down the RGB values (shown in the Mac OS color panel) of the spots chosen in step 3.
Compare both RGB values and see the difference.
Add the RGB offset to match the color.
My RGB offset (not to be followed blindly)
Based on my experience, I added the following RGB values to get the color I wanted. It is only a rough guide, but it worked for me (a small code sketch follows the list):
Red +12
Green +19
Blue +16
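For illustration, here is a minimal Swift sketch of applying such an offset. The offsetBy helper is hypothetical, and the +12/+19/+16 values are my device-specific numbers, not universal constants:

```swift
import UIKit

extension UIColor {
    // Hypothetical helper: nudge a color by per-channel offsets on a 0-255 scale.
    func offsetBy(red dr: CGFloat, green dg: CGFloat, blue db: CGFloat) -> UIColor {
        var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
        getRed(&r, green: &g, blue: &b, alpha: &a)
        return UIColor(red: min(max(r + dr / 255, 0), 1),
                       green: min(max(g + dg / 255, 0), 1),
                       blue: min(max(b + db / 255, 0), 1),
                       alpha: a)
    }
}

// The empirical offsets above: R +12, G +19, B +16.
let adjusted = UIColor(red: 0.5, green: 0.5, blue: 0.5, alpha: 1.0)
    .offsetBy(red: 12, green: 19, blue: 16)
```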
Different angles (best to keep it horizontal)
Holding the phone at different angles also gives different shades; keeping it horizontal gave the color I wanted.
As others have pointed out, this is an issue of Color Spaces.
Two tasks need to be performed:
Get the source color using the sRGB color space.
Set the color in Interface Builder, using the sRGB color space.
Task 1
You can use Apple's Digital Color Meter app to sample the required color, using the Display in sRGB option, but the output is very limited.
Another option is to use an app like SipIt (available free on the App Store, last I checked). Make sure you are sampling in the correct color space by setting:
Preferences -> General -> Color Profiles -> sRGB
You can also set your output format (e.g. hex string).
Task 2
In Interface Builder, open the Colors window, choose the second pane, and choose "RGB Sliders". Then click on the cog icon to choose the sRGB color profile.
Once done, paste your color value into the Hex Color # field.
You're done! Your colors should now match.
I am not affiliated with SipIt in any way. But I've been using it for a few years and it is very handy. I'd recommend it to any designer.
The current (as of 3/17/14) CGColorSpace reference includes the following text, which says that sRGB is the native device color space for iOS.
Color Spaces and iOS: iOS does not support ColorSync, so all assets should be provided in the native device color space: sRGB.
It's about color spaces.
Make sure you are picking the color from an sRGB color space, and replicating its RGBA values in Interface Builder by selecting sRGB IEC61966-2.1 from the dropdown menu.
If you pick a color in one color space, and generate it in a different color space, it will look different, in spite of having replicated the same RGBA values.
Also, bear in mind that when you generate a color in code using UIColor(red:green:blue:alpha:), it is created in the sRGB color space by default. That's why it's so convenient to use this one (instead of Adobe RGB, P3, or any other). Besides, sRGB is currently the standard most used in the industry.
Fun fact: the UIColor initializer I mentioned above has created colors in the sRGB space since Xcode 8. Before that, it used to create them in the Generic RGB space instead, which is another possible source of confusion.
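To make the distinction concrete, here is a small sketch (the component values are arbitrary examples):

```swift
import UIKit

// Interpreted as sRGB components (the default since Xcode 8 / iOS 10):
let srgb = UIColor(red: 0.0, green: 0.5, blue: 1.0, alpha: 1.0)

// The same component values interpreted in the wide-gamut Display P3 space
// yield a visibly different color on wide-gamut screens:
let p3 = UIColor(displayP3Red: 0.0, green: 0.5, blue: 1.0, alpha: 1.0)
```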
Here is a resource that expands on the topic: https://medium.com/@volbap/working-efficiently-with-colors-in-xcode-bc4c58b16f9a
The iPhone does use color space management, so if you want a more "scientific" solution you can create your own color space, for example with CGColorSpaceCreateCalibratedRGB. It's at the Core Graphics level, though.
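As a rough sketch of working at that level (using the simpler named-sRGB constructor for brevity; CGColorSpaceCreateCalibratedRGB instead takes explicit white point, black point, gamma, and matrix parameters):

```swift
import CoreGraphics

// Build a color in an explicitly chosen color space at the Core Graphics level.
if let space = CGColorSpace(name: CGColorSpace.sRGB),
   let color = CGColor(colorSpace: space, components: [0.0, 0.5, 1.0, 1.0]) {
    print(color)  // components are R, G, B, alpha in the sRGB space
}
```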
A better way to convert RGB values to sRGB is to use the Digital Color Meter tool on the Mac: simply check "Display in sRGB",
then use those sRGB values in your code instead of the raw RGB values.
Using this tool you can also read out the Generic RGB color and set it directly in Xcode as RGB values.
One more simple solution when working with image files in your app. Firstly, set your color space in Xcode to sRGB IEC61966-2.1 and apply your colors. Then, before you import any pictures to your app, match your images to sRGB IEC61966-2.1 profile using ColorSync Utility on Mac.
Worked perfectly for me and my background now perfectly matches my imported images. Hope this helps someone.
I had a similar problem, especially with the greens in my project. My issue turned out to be that some of my assets had been created in Illustrator with the color mode set to CMYK instead of RGB. I'm not sure whether this was your issue, but it might be helpful for those who stumble onto this question in the future. I changed Illustrator's "Document Color Mode" to RGB and all is well.
You can use this web page to compare the different sets of colors in Xcode.
For example, the Generic and Device colors differ slightly.