Unexpected high 3G data usage on iPhone with Mac [closed]

It nearly killed me: 600 MB in 20 minutes. I had enabled the 'Personal Hotspot' feature on my iPhone (as a Wi-Fi hotspot) so my Mac could surf the internet through it. During those 20 minutes I was just writing code in Xcode; no other application was running. So where did all the data go? Could it have anything to do with an automatic system update or an Xcode component update? Thanks, I just want to find the cause before it costs me more money...

Yes, that is possible. You may have downloaded the iOS 6.0 Documentation Set or the Mac OS X 10.6 Core Library. This sometimes happens after installing or updating Xcode: when new updates are found, they are downloaded automatically.
To check, go to Xcode -> Preferences -> Downloads -> Components and see whether any components were downloaded this way.
Hope this answers your question.

This is just my opinion. Your iPhone creates a Wi-Fi hotspot, so your Mac sees a Wi-Fi connection, not a 3G network. Maybe the Mac isn't smart enough to realize that the Wi-Fi "modem" it's connected to is actually on 3G, so it keeps running background updates. That said, I've never had this problem when using an iPhone as a modem; it may also depend on the OS version. I'm on 10.8.2.
Use Activity Monitor to track which apps are doing network I/O, and install a local firewall app such as Little Snitch to block certain apps (or hosts) while you're tethered to the iPhone.

Related

Beacon detection using smart watch [closed]

I've just started working with beacons. Can you tell me one thing: is it possible to detect a beacon from a watch without a phone? On which smartwatches is this possible?
Apple Watch's watchOS doesn't expose the iBeacon or Bluetooth LE APIs that would allow it to detect beacons without the phone's assistance. The only way for now is to do the scanning on the iPhone and relay the scan results to your Watch app, as sketched below.
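Here is a minimal sketch of that iPhone-side approach (not from the original answer): range beacons with CoreLocation and forward the results over WatchConnectivity. The beacon UUID and region identifier are placeholders.

import CoreLocation
import WatchConnectivity

class BeaconRelay: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    // Hypothetical beacon UUID/identifier -- replace with your own.
    private let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "B9407F30-F5F8-466E-AFF9-25556B57FE6D")!,
        identifier: "example-region")

    func start() {
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        locationManager.startRangingBeacons(in: region)
    }

    // Called roughly once per second with the beacons currently in range.
    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        // Assumes WCSession has already been activated elsewhere.
        guard WCSession.default.isReachable else { return }
        let payload = beacons.map {
            ["major": $0.major, "minor": $0.minor, "proximity": $0.proximity.rawValue]
        }
        WCSession.default.sendMessage(["beacons": payload],
                                      replyHandler: nil,
                                      errorHandler: nil)
    }
}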
Android Wear, on the other hand, exposes the same Bluetooth LE API that regular Android does [1]. This means that, if the watch itself supports BLE scanning (I'd assume most of them do), you can use the android.bluetooth.le API [2] to detect beacons without the phone's assistance.
[1] Well, for most intents and purposes, Android Wear is just a regular Android.
[2] Or any beacon library that utilizes this API, so that you don't have to write all the beacon discovery and parsing yourself. Android Beacon Library is probably the most widely used one. Since you tagged your question estimote, I assume you have Estimote Beacons—if that's correct, then Estimote Android SDK should also work for you. Note: I haven't tested either of the SDKs with Android Wear, so can't confirm 100% they'll work.
I'm currently using Estimote Beacons with their SDK and an Android smartwatch to develop my app.
I had some issues with Android Wear, because the smartwatch reads the signal from the beacons only every 5 seconds, despite my app being set to much shorter intervals (on the smartphone I had no problem). In my case this is a major problem.
I hope this helps.
It is possible with Android Wear (with some limitations) and turns out not to be possible with Apple Watch. See this experiment: http://elekslabs.com/2015/09/nearables-wearables-connecting-beacons-with-smartwatches.html

MixerHostAudio Bluetooth

I'm using MixerHostAudio to listen to what I'm saying at the same moment I'm talking.
I can even use it with my Apple TV and it works.
My question is: can I speak and, at the same moment, hear what I'm saying on a Bluetooth device? It seems not, because I can't select my Bluetooth device while my app is open.
Thanks.
Good Question. I had a similar one and spent hours on the phone with Apple trying to get it answered. Unfortunately iOS 7 does not allow AVAudioSession to have input and output configured independently. The closest you can get is using the MultiRoute Category, but it does not currently support Bluetooth devices. Submit a bug report to Apple and hopefully they will add it to future versions of iOS.
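For reference, here is a minimal sketch (modern Swift API, not the iOS 7-era Objective-C calls) of the closest configuration that is possible: routing audio to a Bluetooth HFP device with the PlayAndRecord category. Note that the allowBluetooth option moves both input and output to the headset; they still cannot be routed independently.

import AVFoundation

func configureBluetoothMonitoring() throws {
    let session = AVAudioSession.sharedInstance()
    // .allowBluetooth routes BOTH input and output to a paired HFP headset.
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.allowBluetooth])
    try session.setActive(true)
    // session.currentRoute should now list the Bluetooth device for both
    // inputs and outputs, if one is connected.
}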

iOS detect WiFi hotspots or Bluetooth Devices [duplicate]

Possible Duplicate:
iOS detect WiFi hotspots or Bluetooth Devices
I know that switching on Wi-Fi or Bluetooth without the user's permission is not possible (it falls under private APIs). I don't want to go there.
Assuming Wi-Fi or Bluetooth is already switched on, is it possible to detect all Wi-Fi hotspots around my device (iPhone or iPad)? Same for Bluetooth?
I want to do this in Xcode 4.x with iOS SDK 4.3.
The question this is a potential duplicate of doesn't really provide an adequate answer in my mind, so here we go...
iOS has very limited WiFi options available to developers. What you can get: the current hotspot SSID and some other relevant data using CNCopyCurrentNetworkInfo. What you can't get: information about other access points that you may be in range of.
(...you can get this data through a private API call, but then your app won't make it on the app store).
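For example, here is a minimal Swift sketch of that call (the original question targeted iOS SDK 4.3 and Objective-C; on recent iOS versions this also requires location permission and the "Access WiFi Information" entitlement):

import SystemConfiguration.CaptiveNetwork

func currentSSID() -> String? {
    guard let interfaces = CNCopySupportedInterfaces() as? [String] else { return nil }
    for interface in interfaces {
        if let info = CNCopyCurrentNetworkInfo(interface as CFString) as? [String: AnyObject],
           let ssid = info[kCNNetworkInfoKeySSID as String] as? String {
            // Only the currently joined network -- never other hotspots in range.
            return ssid
        }
    }
    return nil
}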
Bluetooth is also fairly limited, but the question Deepak linked to you above actually provides more relevant information on that front.

Embedded computer vision platforms [closed]

I am planning to start a computer vision based project on a smartphone platform.
I know iPhone and Android have OpenCV support. I am interested in your experience with the level of integration, support, and ease of building good apps on either platform.
I also want to consider Windows Phone 7 (and Zune) as a platform. Are there any computer vision libraries or good development tools for it (does AForge.NET work, or is there another good option)?
Also, can you suggest some popular augmented reality apps that use cutting-edge technology? (I am aware of Pranav Mistry's SixthSense.)
Thanks in advance!
Unfortunately on Windows Phone 7 there is no access to the camera's data stream currently. Instead you are limited to launching the camera app, letting the user take a picture, then receiving that picture data.
This is a much requested feature so could change at any time.
Check out WordLens for cutting edge AR.
I've built AR and other computer vision apps on iOS using OpenCV and found it to be a solid platform. If nothing else, it provides a fast, robust set of libraries for matrix mathematics as well as optimised versions of many common vision algorithms, from feature extraction to 3D reconstruction. It's pretty much a de facto standard, so there's a great support community out there too, and I'd definitely recommend it.
From a development point of view, I tend to write command line OpenCV apps on my Mac and then, when debugged and running, I look at moving them to iOS. This shortens the test/debug cycle (as I don't have to worry about deploying and debugging under iOS) as well as allowing me to focus on the problem at hand rather than vagaries of the mobile device.
It also makes it straightforward to move on to Android platforms, though I've found the Java wrappers for OpenCV on Android to be less great. Either way, focus on getting your core algorithm and processing pipeline working on the desktop in (say) C++, then move it to a mobile device, wrapping it in the necessary native code format.
Most probably you will use deep learning. If so, you need to optimize everything: distill large networks into smaller ones, use mobile DL frameworks such as TensorFlow Lite, etc.
You also need to consider inference time, which depends on the hardware.
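As an illustration only (assuming the TensorFlowLiteSwift library; the model name and tensor layout are placeholders), on-device inference boils down to something like this:

import TensorFlowLite
import Foundation

func runModel(on inputData: Data) throws -> Data {
    // "model.tflite" is a placeholder bundled model.
    let modelPath = Bundle.main.path(forResource: "model", ofType: "tflite")!
    let interpreter = try Interpreter(modelPath: modelPath)
    try interpreter.allocateTensors()              // reserve input/output buffers
    try interpreter.copy(inputData, toInputAt: 0)  // feed the preprocessed input
    try interpreter.invoke()                       // run inference
    return try interpreter.output(at: 0).data      // raw output tensor bytes
}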

Direct3D over Remote Desktop [closed]

How can I get Direct3D to work over a Remote Desktop connection? I am using Windows XP Professional.
3D acceleration does not work on Remote Desktop or in Virtual PC/Server.
Software-rendered OpenGL works on both Remote Desktop and Virtual PC/Server.
Actually you can use D3D9 over Remote Desktop in two different ways. I have done and tested both.
The best way (this runs fast) is to use SwiftShader, a software implementation of D3D9. It should run at any color depth. https://opensource.google/projects/swiftshader
The second (note: this is very slow) is to install the DirectX SDK on the remote computer. Your app must create a D3D9 reference device, and the remote computer must be set to 16-bit color.
It works for me if I start the 3D program first on the local machine and then take over the session using Remote Desktop.
You may use VirtualGL for this purpose, if you like OpenGL.
Maybe you already knew this, but it doesn't look like this is a supported scenario. See "Remote Desktop Sharing Disables Direct3D Functionality" and "Is it too much to ask to have ONE good image display API in Windows?"
You may want to look at WPF if you have a choice.
According to this article, Direct3D is possible (but dog slow) when the box you're remoting into is running Vista.
http://www.virtualdub.org/blog/pivot/entry.php?id=208
I haven't verified this and cannot personally vouch for whether or not it REALLY works.
In addition to Tim's answer: in the WPF Futures talk at PDC, a member of the D3D team mentioned that D3D 10 under Windows 7 would work with remoting and would be remoted by primitives (which leads me to believe the client doing the remoting handles the rendering). They don't give much information, but it's touched on in the Q&A section of the WPF Futures talk (PC07), which you can check out on microsoftpdc.com.
It doesn't solve the D3D9-on-XP question, but Remote Desktop with D3D10 under Windows 7 sounds a little better. :)
I have tested this and it does work if the server is running Vista.