How can I work with the touch screen in the ili9341 MicroPython library? I haven't found much on this topic in any resource I've searched!
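The ili9341 driver itself only talks to the display controller; on most of these modules the touch panel is a separate resistive XPT2046 chip that shares the SPI bus but has its own chip-select pin, so it's read with its own small driver. Here is a minimal MicroPython sketch of polling the XPT2046 directly; the pin numbers are placeholder assumptions for a typical ESP32 wiring, not part of the ili9341 library:

```python
# Minimal sketch: polling an XPT2046 resistive touch controller in MicroPython.
# Pin numbers are placeholders -- adjust them to your board and wiring.
import time
from machine import Pin, SPI

spi = SPI(1, baudrate=1000000, sck=Pin(14), mosi=Pin(13), miso=Pin(12))
cs = Pin(15, Pin.OUT, value=1)  # touch chip-select (often labelled T_CS)

def read_channel(cmd):
    """Send one XPT2046 command byte and return the 12-bit conversion."""
    tx = bytearray([cmd, 0x00, 0x00])
    rx = bytearray(3)
    cs(0)
    spi.write_readinto(tx, rx)
    cs(1)
    return ((rx[1] << 8) | rx[2]) >> 3  # keep the 12 useful bits

def read_touch():
    """Return raw (x, y) ADC readings; scale/calibrate them to your display."""
    return read_channel(0xD0), read_channel(0x90)  # X, Y position commands

while True:
    print(read_touch())
    time.sleep_ms(100)
```

Many boards also break out a T_IRQ pin that goes low while the panel is pressed, which you can watch with a pin interrupt instead of polling.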
I bought a new micro:bit v2 board and want to add it to Scratch as an extension. I followed the two steps from https://scratch.mit.edu/microbit. Step 1 is OK: the LED display of my micro:bit board flashes the 5 characters "zepiv", but step 2 failed.
Scratch Link is running, and Bluetooth is enabled.
The OS version is macOS Big Sur 11.4 (Mac mini, late 2014), and the Bluetooth LMP version is 4.0 (0x6).
The weird thing is that the board isn't visible to my Mac, but it is to my Android cell phone.
Is this a problem with my computer? Could anyone help me? Thanks in advance.
Good question. I had a lot of trouble with the same issue when connecting my LEGO MINDSTORMS EV3 to Scratch. If you want to connect a device easily, a good idea is to have it paired via Bluetooth already. To pair it with a Mac:
Open the System Preferences app
Select Bluetooth
Switch your micro:bit into pairing mode:
Hold down buttons A and B on the front of your micro:bit together. The front is the side with two buttons and the LED display. Keep the two buttons held down. Don’t let go of them yet!
While still holding down buttons A and B, press and then release the reset button on the back of the micro:bit. Keep holding down buttons A and B.
You should see “PAIRING MODE!” start to scroll across the micro:bit display. When you see this message start to appear you can release buttons A and B.
Eventually you’ll see a strange pattern on your micro:bit display. This is like your micro:bit’s signature. Other people’s micro:bits will probably display a different pattern.
(I found these instructions on the micro:bit support site)
In the Bluetooth menu, look for your micro:bit device and select Pair
After the device has paired, go back to Scratch with Scratch Link running, and attempt to connect to the device again.
This worked for me when I connected my EV3 device and I hope it helps you.
I'm building an OpenVR app for SteamVR to assist with seated play (my room is small, so my tracking area isn't ideal). My app pretty much just adjusts the play-area height when I hold the grip button and "scroll" on the touchpad, so that I can reach objects that are too low or too high at variable heights. (I tried "OpenVR Advanced Settings", but its key-binding options are limited to simple button presses, so I decided to make my own version.)
I'd like to prevent touchpad input from being sent to the game while the grip button is held, so that moving on the touchpad doesn't cause movement in-game. Is this possible at all?
I'm assuming it's not possible, but wondering whether anyone has had any experience with this.
After your clarification in the comments, the answer is no: you cannot "eat up" device inputs in an application. I usually work on OpenVR drivers, and there, once you submit a device input and/or any other event, it is available to anything that expects pose-update events; event subscribers cannot stop others from receiving those events.
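For reference, this is what the broadcast model looks like from an application's side. A minimal sketch of the event-polling pattern, assuming the community pyopenvr bindings (pip install openvr): you can observe every button event, but there is no call that consumes or suppresses an event for other listeners.

```python
# Sketch of OpenVR event polling via the pyopenvr bindings. Every polling
# application sees the same event stream; there is no API to "eat" an event
# so that the running game stops receiving it.
import time
import openvr

openvr.init(openvr.VRApplication_Background)
vr_system = openvr.VRSystem()
event = openvr.VREvent_t()

try:
    while True:
        # pollNextEvent returns True while queued events remain.
        while vr_system.pollNextEvent(event):
            if (event.eventType == openvr.VREvent_ButtonPress
                    and event.data.controller.button == openvr.k_EButton_Grip):
                print("grip pressed on device", event.trackedDeviceIndex)
                # Nothing here can stop the game from seeing the same press.
        time.sleep(0.01)
finally:
    openvr.shutdown()
```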
However, there might be a workaround (if it's still an issue). I know of at least one application that can do what you want: OVR Toolkit. When its overlay is active and you try to click something in the overlay, the game running in parallel does not receive the input; however, that only happens while the OVR Toolkit overlay surface is receiving input. It may be a built-in OpenVR overlay feature that needs nothing on your part, or it may be something the developer has to implement; I don't really have a way to test this right now.
Sadly, OVR Toolkit is not open source, but there is an open-source toolkit for Unity for making overlays that might be the solution you're looking for; it can be found here.
I am working on taking a snapshot from my device camera. The front camera is working well, but the rear camera is not up to the mark: the image shows up upside down. I don't know why it is behaving strangely.
Can anyone help me out?
Cheers!!
It's difficult to come up with a simple answer for your problem; it depends on whether you run it on iOS or Android. In general, you need to adjust (rotate and flip) the resulting image based on the mirror and rotation values in the WebCamTexture.
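(Unity exposes those as WebCamTexture.videoRotationAngle and WebCamTexture.videoVerticallyMirrored.) As a language-agnostic illustration of the correction itself, here is the same rotate-and-flip logic sketched in Python with Pillow; the function and file names are just for illustration:

```python
# Illustration of the rotate-and-flip correction described above, using
# Pillow. In Unity the two inputs would come from
# WebCamTexture.videoRotationAngle (degrees) and
# WebCamTexture.videoVerticallyMirrored (bool).
from PIL import Image

def correct_orientation(img, rotation_angle, vertically_mirrored):
    """Undo the rotation and mirroring the camera reports for a frame."""
    if vertically_mirrored:
        img = img.transpose(Image.FLIP_TOP_BOTTOM)
    if rotation_angle % 360:
        # Rotate opposite to the reported angle to bring the image upright.
        img = img.rotate(-rotation_angle, expand=True)
    return img

# Example: a rear camera that reports a 180-degree rotation (upside down).
frame = Image.open("frame.png")
correct_orientation(frame, 180, False).save("frame_upright.png")
```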
There is a toolkit available that helps with solving some of these issues that you can take a look at. It's called Camera Capture Kit and is available on the Asset Store ( https://www.assetstore.unity3d.com/en/#!/content/56673 ). Since the source is included for both iOS and Android, it might be a good foundation for helping you tweak and solve the issue in your game, or maybe it will just be a plug-and-play solution for your needs. It might literally save you weeks in the long run, since there is a lot of tweaking involved in getting camera capture right in Unity. Furthermore, there is functionality to enable and disable the flashlight, which might come in handy as well.
There is also a demo app included which is quite similar to what you have already been doing on your screens.
Cheers!
I want to track my physical movement on an iPhone within my office, using my office map. The map will show all the rooms of my office. For example, if I move from the administration block to another block, I have to move an icon from the administration block to that block on the map. In other words, the icon should move as I move within the office. Is this possible with the iPhone SDK? Any help, please?
I can't really see how this can be done; most offices have terrible GPS reception. Thus, using GPS is out of the question; also, it would not be precise enough.
You could try using Bluetooth, although it is very limited in iOS and you would have to place Bluetooth dongles everywhere.
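If you do try the Bluetooth route, the usual approach is to place beacons at known positions and estimate proximity from signal strength. Here is a rough sketch of the standard log-distance path-loss estimate; the constants and beacon names are assumptions you would have to calibrate for your building, and indoor multipath makes the result quite noisy:

```python
# Rough sketch of RSSI-based proximity estimation for Bluetooth beacons.
# measured_power (expected RSSI at 1 m) and n (path-loss exponent) are
# assumptions that must be calibrated per beacon and environment.
def estimate_distance_m(rssi, measured_power=-59.0, n=2.0):
    """Log-distance model: d = 10 ** ((P_1m - RSSI) / (10 * n))."""
    return 10 ** ((measured_power - rssi) / (10 * n))

def locate(readings, beacon_rooms):
    """Return the room of the beacon with the shortest estimated distance.

    readings: {beacon_id: latest RSSI}, beacon_rooms: {beacon_id: room name}.
    """
    nearest = min(readings, key=lambda b: estimate_distance_m(readings[b]))
    return beacon_rooms[nearest]

# Example with two hypothetical beacons, one per office block.
rooms = {"b1": "administration block", "b2": "second block"}
print(locate({"b1": -48, "b2": -71}, rooms))  # -> "administration block"
```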
My guess is it can't be done with the precision that you need.
I'd like to create an iPhone app that supports tracing of arbitrary shapes using your finger (with accuracy detection). I have seen references to an Apple sample app called "GestureMatch" that supposedly implemented exactly that, but it was removed from the SDK at some point and I cannot find the source anywhere via Google. Does anyone know of a current official sample that demonstrates tracing like this? Or any solid suggestions on other resources to look at? I've done some iPhone programming, but not really anything with the graphics APIs or custom handling of touch gestures, so I'm not sure where to start.
If you're on 3.1.3 firmware you can use the touchesBegan, touchesMoved, and touchesEnded methods. If you were to do an iPad app on 3.2, you'd have access to gesture recognizers such as UIPanGestureRecognizer, which provides the same basic functionality but also gives you some extra information.
The problem here is that they will not give you a smooth line without some extra work on your part, but these are the basic ways to handle finger tracking.
Unfortunately I don't have any examples to give you, but check out the stuff I mentioned in the developer documentation. You should be able to at least get started from that.
I'm uncertain if gesture recognizers are available in 4.0. Might be worth checking out.
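On the "extra work" for getting a smooth line: the usual approach is to buffer the raw points from touchesMoved and run them through some smoothing before drawing, and accuracy detection can be as simple as measuring how far the trace strays from the target shape. A platform-agnostic sketch in Python, just to show the shape of the logic; the same idea ports directly to the touch callbacks above:

```python
# Platform-agnostic sketch: smoothing a raw finger trace and scoring it
# against a target shape. On iOS the points would come from
# touchesBegan/touchesMoved/touchesEnded; Python is used only to
# illustrate the logic.
import math

def smooth(points, window=2):
    """Moving-average smoothing over a list of (x, y) samples."""
    out = []
    for i in range(len(points)):
        lo, hi = max(0, i - window), min(len(points), i + window + 1)
        xs, ys = zip(*points[lo:hi])
        out.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return out

def accuracy(trace, target):
    """Mean distance from each traced point to the nearest target point
    (lower means a closer trace)."""
    return sum(min(math.dist(p, q) for q in target) for p in trace) / len(trace)

# Example: a shaky horizontal stroke scored against a straight line.
raw = [(0, 0), (1, 2), (2, -1), (3, 1), (4, 0)]
line = [(x, 0) for x in range(5)]
print(accuracy(smooth(raw), line))
```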