Is it possible to detect if the hardware display has completed the process of switching between display modes? - operating-system

The reason I ask is because I just bought a new LCD that takes approximately 5 seconds to change between display modes, such as from 1920x1080x32bpp to 1280x800x32bpp. Does a programmatic solution exist to detect if the display is ready for video output?

I believe the answer is 'no'. Once the display adapter starts transmitting a signal at the new resolution and refresh rate, it has no way of knowing how long the display takes to process the change internally and start showing an image. This is a hardware limitation of both DVI and VGA signals, not a software one.

Related

PsychToolbox SSVEP

I want to display 'SSVEP' flickering lines on the monitor using PsychToolbox.
However, when I made a 6 Hz flickering line, it sometimes disappeared: I guess there is some conflict between the 6 Hz stimulus presentation and the monitor refresh rate (60 Hz).
How can I solve this problem?
If you could provide the code, it would be most appreciated.
In principle, you should wait for "vsync" (just before displaying a new 60-Hz frame), and then "flip" to the other (already filled) screen buffer.
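A minimal C++ sketch of that scheduling, assuming a 60 Hz display and a 6 Hz target; waitForVsync() and presentBuffer() are hypothetical stubs for whatever your graphics layer provides (in PsychToolbox that role is played by Screen('Flip')). The point is that the line toggles every 5 refreshes and you flip exactly once per refresh:

    // Vsync-locked flicker scheduling sketch (not PsychToolbox code).
    #include <cstdio>

    void waitForVsync() { /* block here until the next 60 Hz retrace */ }
    void presentBuffer(bool lineVisible) {
        std::printf("frame presented, line %s\n", lineVisible ? "on" : "off");
    }

    int main() {
        const int refreshHz = 60;
        const int flickerHz = 6;
        const int framesPerCycle = refreshHz / flickerHz; // 10 frames per on/off cycle
        const int framesPerHalf  = framesPerCycle / 2;    // visible 5 frames, hidden 5 frames

        for (int frame = 0; frame < 2 * refreshHz; ++frame) {  // two seconds of frames
            bool lineVisible = (frame % framesPerCycle) < framesPerHalf;
            // Draw the stimulus into the back buffer here, then...
            waitForVsync();              // ...wait for the retrace...
            presentBuffer(lineVisible);  // ...and flip exactly once per refresh.
        }
        return 0;
    }

Because 60 is an integer multiple of 2 x 6, each half-period is exactly 5 frames. Missing a flip deadline (drawing taking longer than one refresh) is the usual way such a stimulus ends up stuttering or momentarily disappearing.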

RFB Protocol: Send incremental framebuffer update

I am writing a RFB server which is able to communicate with RFB Client. The main question is as follows:
Currently I am able to capture and send the entire screen to the client in RAW format. The speed is damn slow for this. Also, the client is sending incremental as false. I want to know:
a) What's the best approach for the server to detect that the screen has changed?
b) How to send only the changed screen information to the client?
I know this query asks for a lot of information, but my main point is to understand the LOGIC that enables the server to send ONLY incremental updates and to DETECT that the screen has changed.
I am working in C/C++
Here is a simple logic you can use to find incremental updates (a rough C++ sketch of the loop follows the list):
1.) Store the last known framebuffer as referenceBuffer.
2.) Whenever the client requests an incremental framebuffer, do step 3.
3.) Capture the current framebuffer and compare it with referenceBuffer.
4.) If both framebuffers are identical, keep capturing the current framebuffer and comparing it with referenceBuffer until a change occurs (preferably in a separate thread; you may sleep a few milliseconds between comparisons if the thread is too heavy).
5.) The comparison can be a simple bit compare; then find the bounding rectangle that encompasses all the changes (you can send the client only this one rectangle).
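A rough C++ sketch of steps 3 to 5 under some assumptions: the framebuffer is a flat array of 32-bit pixels, and the capture itself (not shown) is whatever your platform provides. The names findDirtyRect and referenceBuffer are just illustrative:

    // Compare the current capture against the reference buffer and compute the
    // bounding rectangle of everything that changed (steps 3-5 above).
    #include <algorithm>
    #include <cstdint>
    #include <vector>

    struct Rect { int x = 0, y = 0, w = 0, h = 0; };

    // Returns true and fills 'dirty' if any pixel differs; also refreshes the reference (step 1).
    bool findDirtyRect(const std::vector<uint32_t>& current,
                       std::vector<uint32_t>& referenceBuffer,
                       int width, int height, Rect& dirty) {
        int minX = width, minY = height, maxX = -1, maxY = -1;
        for (int y = 0; y < height; ++y) {
            for (int x = 0; x < width; ++x) {
                if (current[y * width + x] != referenceBuffer[y * width + x]) {
                    minX = std::min(minX, x); maxX = std::max(maxX, x);
                    minY = std::min(minY, y); maxY = std::max(maxY, y);
                }
            }
        }
        if (maxX < 0) return false;            // nothing changed, keep polling (step 4)
        dirty = { minX, minY, maxX - minX + 1, maxY - minY + 1 };
        referenceBuffer = current;             // the new reference for the next request
        return true;
    }

Sending one bounding rectangle keeps the logic simple; when changes are scattered, splitting into several smaller rectangles (or tiles, as the next answer suggests) saves bandwidth.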
This depends on the system you are working with and how you are grabbing the screen. There are many ways you can accomplish this. Most involve keeping a buffer of what you have sent to the client and comparing it to what is currently on the screen.
The most naive and simplest-to-code example (I think) is to simply hold the screen in little buffers. Let's say you have a 1024 x 1024 pixel framebuffer representing the screen. In addition to holding one 1024 x 1024 buffer of the screen, you keep 256 buffers of size 64 x 64.
Every time you update your big buffer, you compare it to your little buffers. If a section changed, you copy it into the corresponding little buffer and send it to the client; a sketch of this is shown below.
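A sketch of that tile scheme in C++, assuming 32-bit pixels and a placeholder sendTileToClient() standing in for your encoder/transport; the sizes are the ones from the example above, and each tile vector is expected to be pre-sized to 64 x 64 entries:

    // Keep one 64x64 copy per tile and resend a tile whenever the live
    // framebuffer no longer matches it.
    #include <cstdint>
    #include <cstring>
    #include <vector>

    constexpr int kScreen = 1024;
    constexpr int kTile = 64;
    constexpr int kTilesPerSide = kScreen / kTile;   // 16 x 16 = 256 tiles

    void sendTileToClient(int tileX, int tileY, const uint32_t* tile) { /* encode and send */ }

    // 'screen' is the current 1024x1024 framebuffer; 'tiles' holds the last-sent
    // copy of each 64x64 block, indexed [tileY * kTilesPerSide + tileX].
    void updateTiles(const std::vector<uint32_t>& screen,
                     std::vector<std::vector<uint32_t>>& tiles) {
        for (int ty = 0; ty < kTilesPerSide; ++ty) {
            for (int tx = 0; tx < kTilesPerSide; ++tx) {
                std::vector<uint32_t>& tile = tiles[ty * kTilesPerSide + tx];
                bool changed = false;
                for (int row = 0; row < kTile; ++row) {
                    const uint32_t* src = &screen[(ty * kTile + row) * kScreen + tx * kTile];
                    if (std::memcmp(src, &tile[row * kTile], kTile * sizeof(uint32_t)) != 0) {
                        changed = true;
                        std::memcpy(&tile[row * kTile], src, kTile * sizeof(uint32_t));
                    }
                }
                if (changed) sendTileToClient(tx, ty, tile.data());
            }
        }
    }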
How to detect whether the screen has changed is covered in other answers.
A client will usually send a FramebufferUpdateRequest with incremental = false for the first update, because it wants to draw the whole screen to start with.
Once it has received a rectangle that covers the whole screen, it should then switch to asking only for incremental updates.
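For reference, RFC 6143 defines FramebufferUpdateRequest as message type 3 followed by an incremental flag and big-endian x, y, width, height fields. A minimal C++ sketch of dispatching on that flag; readExact(), sendFullUpdate() and sendIncrementalUpdate() are placeholder hooks, not library calls:

    // Handle a FramebufferUpdateRequest (RFC 6143, message type 3).
    #include <cstddef>
    #include <cstdint>
    #include <cstring>

    void readExact(uint8_t* buf, size_t len) { std::memset(buf, 0, len); /* replace with a real blocking socket read */ }
    void sendFullUpdate() { /* send a FramebufferUpdate covering the whole screen */ }
    void sendIncrementalUpdate(uint16_t x, uint16_t y, uint16_t w, uint16_t h) { /* send only changed rectangles */ }

    static uint16_t be16(const uint8_t* p) { return static_cast<uint16_t>((p[0] << 8) | p[1]); }

    void handleFramebufferUpdateRequest() {
        uint8_t msg[9];               // incremental(1) + x(2) + y(2) + width(2) + height(2)
        readExact(msg, sizeof(msg));  // the message-type byte (3) was already consumed by the dispatcher
        bool incremental = msg[0] != 0;
        uint16_t x = be16(&msg[1]), y = be16(&msg[3]);
        uint16_t w = be16(&msg[5]), h = be16(&msg[7]);

        if (!incremental)
            sendFullUpdate();                   // first request: the client wants the whole screen
        else
            sendIncrementalUpdate(x, y, w, h);  // later requests: only what changed in that region
    }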

Response time for a UIButton

I have developed a test for iPod/iPhone (with MonoTouch, if that is relevant) that measures reaction time. But I need to take into consideration the time between touching the screen and the actual triggering of the button event. Is there any documentation on that?
It is already very hard, bordering on impossible, to get predictable interrupt latency even on real-time operating systems.
But on the iPhone? IMHO impossible. A capacitive touchscreen is not ideal for getting results that are exactly the same for each body and location. And if Mail.app decides to poll for e-mail just at the moment you touch the screen, there will be a bigger delay.
But to make one thing clear: we are speaking about a few microseconds or even less than that.
If you want accurate results you shouldn't use an iPhone. But I guess your app will be some kind of game, so nobody cares if your result is 0.01 seconds off. I wouldn't show results as 0.381829191 seconds, though; that fakes an accuracy you'll never get on any smartphone.
What is the lowest reaction time you got in your app?
The time between an actual touch and the system registering it will be negligible.
One key thing: if you are detecting the press using a control event like TouchUpInside, consider using the TouchDown event instead, because TouchUpInside will not fire until the user's finger leaves the screen.

Getting Framerate Performance on iPhone

I've just come off the PSP, where performance testing was easy. You just turned off vsync and printed out the frame rate, then changed something and saw whether the frame rate went up or down...
Is there any way to do the same thing on the iPhone? How do you turn vsync off? The Instruments tool is next to useless; its chief problem is that running it adversely affects the performance of the app! Also, the frame rate it reports is extremely sporadic.
I don't want any fancy tool that reports call trees and time spent in each function. I just want an unrestricted frame rate and some way to see what it is. Is there a high-precision counter you can use on the iPhone? Something like QueryPerformanceCounter on Windows?
Also, is there any way to somehow KILL background processes so you know they can't affect the performance, perhaps solving the sporadic frame rate problem?
Profile your app with Instruments and use the Core Animation instrument. It gives a frame rate.
You're taking the try-something-and-measure approach. Very indirect. It's easy to tell exactly what is taking the time; it doesn't depend on what else is going on and doesn't require learning a new tool. All you need is a debugger that you can interrupt.
You can't kill background processes on the iPhone. That would make it possible for a buggy or malicious app to interfere with the phone function and the needs of all other functions on the iPhone are subordinated to the phone.
Try QuartzDebug or OpenGL Profiler.
Use Instruments to get the frame rate.
To do this, run Profile on your app (click and hold the Run button in Xcode and choose Profile). Make sure you are running your app on a device. Choose the OpenGL ES Analysis template. Look at the data display under Core Animation frames per second.
You want to aim for 60fps.

Motion detection using iPhone

I saw at least 6 apps in the App Store that take photos when they detect motion (i.e. a kind of spy stuff). Does anybody know the general way to do such a thing using the iPhone SDK?
I guess those apps take photos every X seconds and compare the current image with the previous one to determine if there is any difference (read: "motion"). Any better ideas?
Thank you!
You could probably also use the microphone to detect noise. That's actually how many security system motion detectors work - but they listen in on ultrasonic sound waves. The success of this greatly depends on the iPhone's mic sensitivity and what sort of API access you have to the signal. If the mic's not sensitive enough, listening for regular human-hearing-range noise might be good enough for your needs (although this isn't "true" motion-detection).
As for images - look into using some sort of string-edit-distance algorithm, but for images. Something that takes a picture every X amount of time, and compares it to the previous image taken. If the images are too different (edit distance too big), then the alarm sounds. This will account for slow changes in daylight, and will probably work better than taking a single reference image at the beginning of the surveillance period and then comparing all other images to that.
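A minimal C++ sketch of that frame-differencing idea, assuming two grayscale frames of equal size; the threshold values are arbitrary starting points, not tuned numbers:

    // Count pixels whose intensity changed by more than a per-pixel threshold
    // between the previous and current frame; flag motion if enough pixels moved.
    #include <cstdint>
    #include <cstdlib>
    #include <vector>

    bool framesDiffer(const std::vector<uint8_t>& previous,
                      const std::vector<uint8_t>& current,
                      int pixelThreshold = 25,        // per-pixel intensity delta (0-255)
                      double changedFraction = 0.02)  // >= 2% of pixels changed => motion
    {
        std::size_t changed = 0;
        for (std::size_t i = 0; i < current.size(); ++i)
            if (std::abs(int(current[i]) - int(previous[i])) > pixelThreshold)
                ++changed;
        return changed > changedFraction * current.size();
    }

Comparing each frame against the previous one, rather than against a single reference image, is what lets slow daylight changes pass without triggering the alarm.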
If you combine these two methods (image and sound), it may get you what you need.
You could have the phone detect light changes using the ambient light sensor at the top front of the phone. I just don't know how you would access that part of the phone.
I think you've about got it figured out: the phone probably keeps images where the delta between image B and image A is over some predefined threshold.
You'd have to find an image library written in Objective-C in order to do the analysis.
I have this application. I wrote a library for Delphi 10 years ago but the analysis is the same.
The point is to divide the whole screen into a matrix, e.g. 25x25 cells, and compute an average color for each cell. After that, compare the R, G, B (or H, S, V) of the average color from one picture to the next and, if the difference is more than a set threshold, you have motion.
In my application I use a fragment shader to show motion in real time. Any questions, feel free to ask ;)
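A rough C++ sketch of that grid-average comparison (not the original Delphi code), assuming an interleaved RGB byte buffer and frames at least 25 pixels on each side:

    // Split the frame into GRID x GRID cells, average each cell's RGB, and
    // compare the averages of two frames against a threshold.
    #include <cmath>
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    constexpr int GRID = 25;

    struct CellAvg { double r = 0, g = 0, b = 0; };

    // 'rgb' is a width*height*3 interleaved buffer (R, G, B per pixel).
    std::vector<CellAvg> cellAverages(const std::vector<uint8_t>& rgb, int width, int height) {
        std::vector<CellAvg> cells(GRID * GRID);
        std::vector<int> counts(GRID * GRID, 0);
        for (int y = 0; y < height; ++y) {
            for (int x = 0; x < width; ++x) {
                int cell = (y * GRID / height) * GRID + (x * GRID / width);
                std::size_t i = (std::size_t(y) * width + x) * 3;
                cells[cell].r += rgb[i]; cells[cell].g += rgb[i + 1]; cells[cell].b += rgb[i + 2];
                ++counts[cell];
            }
        }
        for (int c = 0; c < GRID * GRID; ++c) {
            cells[c].r /= counts[c]; cells[c].g /= counts[c]; cells[c].b /= counts[c];
        }
        return cells;
    }

    // Motion if any cell's average color moved by more than 'threshold' on any channel.
    bool hasMotion(const std::vector<CellAvg>& a, const std::vector<CellAvg>& b, double threshold = 15.0) {
        for (std::size_t c = 0; c < a.size(); ++c)
            if (std::fabs(a[c].r - b[c].r) > threshold ||
                std::fabs(a[c].g - b[c].g) > threshold ||
                std::fabs(a[c].b - b[c].b) > threshold)
                return true;
        return false;
    }

Averaging per cell makes the comparison cheap and tolerant of sensor noise; the 15.0 threshold is just an example and should be tuned for your camera and lighting.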