Direct3D over Remote Desktop [closed] - windows-xp

How can I get Direct3D to work over a Remote Desktop connection? I am using Windows XP Professional.

3D acceleration does not work on Remote Desktop or in Virtual PC/Server.
Software-rendered OpenGL works on both Remote Desktop and Virtual PC/Server.

Actually, you can use D3D9 over Remote Desktop in two different ways. I have done and tested both.
The best way (and the fast one) is to use SwiftShader, a software implementation of D3D9. It should run at any color depth. https://opensource.google/projects/swiftshader
The second way (note: this is very slow) is to install the DirectX SDK on the remote computer. Your app must create a D3D9 reference device, and the remote computer must be set to 16-bit color.

It works for me if I start the 3D program on the local machine first and then connect to the session over Remote Desktop.

You may use VirtualGL for this purpose, if you like OpenGL.

Maybe you already knew this, but it doesn't look like this is a supported scenario.
See
Remote Desktop Sharing Disables Direct3D Functionality
and
Is it too much to ask to have ONE good image display API in Windows?
You may want to look at WPF if you have a choice.

According to this article, Direct3D is possible (but dog slow) when the box you're remoting into is running Vista.
http://www.virtualdub.org/blog/pivot/entry.php?id=208
I haven't verified this and cannot personally vouch for whether or not it REALLY works.

In addition to Tim's answer, in the WPF Futures talk at PDC a member of the D3D team mentioned that D3D 10 under Windows 7 would work with remoting and will be remoted by primitives (which leads me to believe that the client doing the remoting would handle the rendering). They don't give much information, but it's touched on in the Q&A section of the WPF Futures talk (PC07) which you can check out on microsoftpdc.com.
It doesn't solve the D3D9-on-XP question, but Remote Desktop with D3D10 under Windows 7 sounds a little better. :)

I have tested this and it does work if the server is running Vista.

Related

What microcontrollers don't require a USB to UART driver to interface with computer? [closed]

I will be teaching a remote middle-school summer camp next year that involves microcontroller programming. I've chosen to avoid Arduino microcontrollers for the camp and use a MicroPython-compatible microcontroller instead. One board I am interested in using is the Lolin V 1.1.0 with an ESP-WROOM-32 chip, since it is cheap and reliable.
One downside I've noticed with this MCU is that a VCP driver needs to be installed in order to establish a USB-to-UART connection (it contains a CP210x chip). I'm not sure how many operating systems ship with a VCP driver by default, so I tested whether the microcontroller could be detected on three different laptops. On two MacBook Airs, the MCU was located without installing a driver. On a Windows computer, a connection was also established, though it took about 10 minutes (not sure why). Finally, on my friend's European MacBook, the microcontroller wasn't detected until I installed the driver.
I'm expecting around 300 students, and I don't want anyone to be unable to connect to their microcontroller; I'd also rather not include a driver installation step, since it can be tedious.
Does anyone know how big companies that teach microcontroller programming to beginners, such as micro:bit, Raspberry Pi, or Arduino, deal with this problem? How do they ensure the microcontrollers are recognized by the computer without having students go through the tedious process of installing a driver?
What percentage of operating systems already have a CP210x VCP driver installed?
Are there any microcontrollers that don't require a USB-to-UART driver installation at all?
Thanks!
Ryan
It usually depends not so much on the choice of microcontroller as on the USB-UART bridge chip on the board. There aren't many of those to choose from: SiLabs' CP210x and FTDI's FT232/FT2232 are widely used families, so you can't really go wrong with them. Windows installs drivers for both automatically and seamlessly, and Linux has support built into the kernel in all major distributions. I don't have experience with Macs; I'm a little surprised you had to do anything manually. Driver support for FTDI devices tends to be a little better across the board.
If you're interested in ESP32 and have a little more budget than for the Lolin, feel free to try the ESP-WROVER-KIT which has the FTDI FT2232HL chip and is generally much more feature-rich.
The only one I am aware of whose driver is pre-installed in Windows is the Microchip PIC (e.g. PIC16F1454), but a PIC is not the best device for teaching. You have to run an application on the device that uses USB VID:PID 04D8:000A.
I don't have a machine to test it, but a Microchip MCP2200 (04D8:00DF) may well behave the same.
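Since the driver question mostly comes down to the bridge chip's USB VID:PID, a setup-checker script for the class could flag which students will need a manual driver install. Here is a minimal sketch in Python; the table of IDs covers the chips mentioned above, but the "needs manual driver" flags are illustrative assumptions and should be verified per OS.

```python
# Hypothetical setup-checker helper: identify a USB-UART bridge chip by its
# USB VID:PID pair. The needs_manual_driver flags below are assumptions for
# illustration only - verify them against the OSes your students actually use.
KNOWN_BRIDGES = {
    (0x10C4, 0xEA60): ("Silicon Labs CP210x", True),   # e.g. many ESP32 boards
    (0x0403, 0x6001): ("FTDI FT232R", False),
    (0x0403, 0x6010): ("FTDI FT2232", False),
    (0x04D8, 0x000A): ("Microchip PIC CDC", False),    # driver ships with Windows
}

def identify_bridge(vid, pid):
    """Return (chip name, needs_manual_driver) for a known bridge, else None."""
    return KNOWN_BRIDGES.get((vid, pid))
```

In practice you would feed this from an enumeration library such as pyserial's port listing, which reports VID and PID for each connected serial device.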

Unexpected high 3g data usage on iPhone with mac [closed]

It almost killed me: 600 MB in 20 minutes. I had the 'Personal Hotspot' function enabled on my iPhone (as a Wi-Fi hotspot) so my Mac could reach the internet through it. During those 20 minutes I was just writing code in Xcode; no other application was running. So where did all the data go? Could it have anything to do with automatic system updates or Xcode component updates? Thanks, I just want to find out where the problem is before I spend more money...
Yes, that is possible. Maybe you downloaded the iOS 6.0 Documentation Set or the Mac OS X 10.6 Core Library. This sometimes happens after installing or updating Xcode: when it finds new updates, it downloads them automatically.
To check this, go to Xcode -> Preferences -> Downloads -> Components and see whether any components were downloaded this way.
Hope this answers your question.
This is just my opinion. Your iPhone creates a Wi-Fi hotspot, so your Mac is connected to Wi-Fi, not to the 3G network directly. Maybe Macs are not smart enough to recognize that the Wi-Fi "modem" they are connected to is actually on 3G, so they keep doing background updates. That said, I never had such a problem with an iPhone as a modem; it may also depend on the OS version. I have 10.8.2.
Use Activity Monitor to track which apps are doing networking, and install a local firewall app like Little Snitch to block certain apps (or hosts) while you are on iPhone networking.

iOS devices as web server [closed]

I saw that there are several apps on the App Store that allow other computers to make an HTTP connection to an iPhone/iPad to transfer files. It seems like a web server is running on the iOS device. I'm just curious how this is done and what classes are used.
Thanks.
Just display the device's IP address, open a listening socket in an app running on the iOS device, and implement the HTTP protocol. There are several third-party libraries that can do most of the heavy lifting for you:
CocoaHTTPServer or iPhoneHTTPServer3, or SimpleWebSocketServer, or MultithreadedHTTPServer3
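The core idea behind all of these libraries is the same: accept a connection, parse the request line, and write back an HTTP response. A minimal sketch of that request/response step (shown in Python for brevity; on iOS the libraries above do this in Objective-C, plus connection handling, file serving, and much more):

```python
# Minimal sketch of the HTTP request/response step a file-transfer app performs.
# Real servers also handle headers, keep-alive, chunking, and file I/O.
def build_response(request: bytes) -> bytes:
    """Answer a raw HTTP request with a minimal HTTP/1.1 response."""
    try:
        # The request line is the first CRLF-terminated line: "GET /path HTTP/1.1"
        method, path, _version = request.split(b"\r\n", 1)[0].split(b" ")
    except ValueError:
        return b"HTTP/1.1 400 Bad Request\r\n\r\n"
    if method != b"GET":
        return b"HTTP/1.1 405 Method Not Allowed\r\n\r\n"
    body = b"Hello from the device: " + path
    return (b"HTTP/1.1 200 OK\r\n"
            b"Content-Type: text/plain\r\n"
            b"Content-Length: " + str(len(body)).encode() + b"\r\n"
            b"\r\n" + body)
```

You would call this from an accept loop on a listening socket; the path component is where a file-transfer app would map the request onto its document directory.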
You can use GCDWebServer
It's a modern web server for iOS and macOS based on Grand Central Dispatch.
As answered before, the best choice is to use a third-party library. There are mainly two libraries that get the job done: CocoaHTTPServer and MongooseDaemon.
Both have an Objective-C API, but MongooseDaemon is just a wrapper around the Mongoose HTTP server written in plain C, whereas CocoaHTTPServer is written entirely in Objective-C.
We decided to go with CocoaHTTPServer for a few simple reasons:
Even a simple property like the HTTP server's document directory does not exist in MongooseDaemon; you have to change a #define in an included source file to override the default, which points to NSHomeDirectory().
As of now, the MongooseDaemon library produces warnings about deprecated methods used in the Objective-C wrapper.
CocoaHTTPServer supports things like Bonjour and WebDAV, whereas Mongoose just delivers the basics.
CocoaHTTPServer comes with many examples, ranging from a simple HTTP server to password-protected, SSL/TLS, and WebDAV servers.
CocoaHTTPServer uses GCD to enable multithreading.
MongooseDaemon is also a good choice.
https://github.com/face/MongooseDaemon

Can I use matlab to program a remote control car? [closed]

I'm trying to do a high school project where I want to create a remote-control car with mecanum wheels using MATLAB. Would anyone know if this is possible and how it is done? I have ordered MATLAB and it is being shipped, so I can start experimenting with it soon.
MATLAB can communicate with external peripherals using an RS-232 serial port. (That kind of port, usually found on older computers, can be added to newer computers using a USB adapter.) You'll want to build or find a radio control system that can use that interface to connect to the computer, and then you will need to write a MATLAB program to send the correct commands in response to user input, sensors, etc.
Typically you need the following:
A computer with MATLAB and a GPIB or some other kind of I/O port that can interface with your actuators (in your case you probably have two: one for steering and one for moving backward and forward).
An I/O device that MATLAB can connect to. Typically this is a serial port, as idealmachine said, but you can get USB-to-serial or Ethernet-to-serial adapters that make the device look like a serial device if it doesn't have an RS-232 connector. Make sure that device can properly interface with your actuators; you may need the Instrument Control Toolbox.
You must find an I/O device (GPIB or something else; National Instruments has a large variety, ranging from as easy as plugging in a USB device to plugging a PCI card into a PC) that can read from and write to your actuators. Make sure you:
Have the proper device drivers for your I/O device on your PC
Understand what kind of signals your actuators will accept
The fact that you are using mecanum wheels matters less than setting up your interface to those wheels. This undertaking is not simple and may be out of the scope of your high-school courses, unless you have some really great teachers, of course. The project will probably cost at least $300-400 for the I/O devices, your R/C car, MATLAB, and the actuation devices (if you need different ones or have to modify the existing equipment on the R/C car), assuming you already have a computer you can use. Doing this wirelessly is just one more complication to your system: start off wired, then once you get the hang of it move up to wireless.
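One concrete piece of the mecanum-wheel part is the mixing math: turning a desired motion (strafe, forward, rotate) into four wheel speeds. A sketch of the standard mixing formula (shown in Python for illustration; the same few lines translate directly to MATLAB):

```python
# Standard mecanum-drive mixing: map a desired body motion to four wheel
# speeds. Wheel order and signs assume the usual X-roller configuration;
# check them against your actual chassis before wiring anything up.
def mecanum_wheel_speeds(vx, vy, wz):
    """vx = strafe right, vy = forward, wz = rotate clockwise.
    Returns (front-left, front-right, rear-left, rear-right) speeds."""
    return (vy + vx + wz,   # front-left
            vy - vx - wz,   # front-right
            vy - vx + wz,   # rear-left
            vy + vx - wz)   # rear-right
```

For example, pure forward motion drives all four wheels equally, while pure strafing drives the diagonals in opposite directions; the resulting speeds would then be scaled and sent to your actuators over the serial link.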
Well, you can always use an Arduino. The Arduinos are connected to transceivers that act as master and slave at the same time; you control your car from MATLAB, send code to the Arduino, and watch the magic happen.
Am I sure?
Yes; I am building one now (though with automatic response), and initial tests worked.
So to answer your question: it is possible.

Embedded computer vision platforms [closed]

I am planning to start a computer-vision-based project on a smartphone platform.
I know iPhone and Android have OpenCV support. I am interested in hearing about your experience with the level of integration, support, and ease of building good apps on either platform.
I also want to consider Windows Phone 7 (and Zune) as a platform. Are there any computer vision libraries or good development tools for it (does AForge.NET work, or is there another good suggestion)?
Can you also suggest some popular augmented reality apps that use cutting-edge technology? (I am aware of Pranav Mistry's SixthSense.)
Thanks in advance!
Unfortunately on Windows Phone 7 there is no access to the camera's data stream currently. Instead you are limited to launching the camera app, letting the user take a picture, then receiving that picture data.
This is a much-requested feature, so it could change at any time.
Check out WordLens for cutting edge AR.
I've built AR and other computer vision apps on iOS using OpenCV and found it to be a solid platform. If nothing else, it provides you with a fast, robust set of libraries for matrix mathematics as well as optimised versions of some of the more common vision algorithms, from feature extraction to 3D reconstruction. It's pretty much a de facto standard, so there's a great support community out there too, and I'd definitely recommend it.
From a development point of view, I tend to write command line OpenCV apps on my Mac and then, when debugged and running, I look at moving them to iOS. This shortens the test/debug cycle (as I don't have to worry about deploying and debugging under iOS) as well as allowing me to focus on the problem at hand rather than vagaries of the mobile device.
It also makes it straightforward to move to Android platforms, though I've found the Java wrappers for OpenCV on Android to be less polished. Again, the key is to focus on getting your core algorithm and processing pipeline working on the desktop in (say) C++, then move it to a mobile device, wrapping it in the necessary native code format.
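The desktop-first workflow described above can be illustrated with a toy pipeline stage: prototype an image operation in plain code, verify it against known inputs, and only then port it to OpenCV or native mobile code. A minimal sketch (in Python for brevity; the grayscale image is just a list of lists, no libraries assumed):

```python
# Toy illustration of the desktop-first workflow: prototype a pipeline stage
# (here, a 3x3 box blur on a grayscale image) in plain code and verify it
# against known inputs before porting to OpenCV / native mobile code.
def box_blur(img):
    """3x3 box blur with integer averaging; edge pixels are copied unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]          # start from a copy, so edges survive
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(img[y + dy][x + dx]
                            for dy in (-1, 0, 1)
                            for dx in (-1, 0, 1)) // 9
    return out
```

Once the reference version behaves correctly on small hand-made inputs, the port to an optimised OpenCV call (or native code on the device) can be checked against it output-for-output.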
Most probably you will use deep learning. If so, you need to optimize everything: compress deep networks into smaller ones, use mobile DL frameworks such as TensorFlow Lite, and so on.
You also need to consider inference time, which depends on the hardware.