How do I open two instances of my Unity app full-screen on two different monitors (one on each)?
According to Unity's command-line documentation (https://docs.unity3d.com/Manual/CommandLineArguments.html), all I should have to do to open my Unity application on a specific monitor is pass the -adapter N argument. I have tried this, but it always loads onto the Main Display monitor. Here are the commands I use to open the application twice on separate monitors:
start CCC.exe -force-d3d9 -adapter 1
start CCC.exe -force-d3d9 -adapter 2
I noticed in another article that the -adapter option only works with the d3d9 renderer. Also, I'm using an NVIDIA® GeForce® GTX 1070 with 8 GB GDDR5, with dual monitors plugged into two of its HDMI ports.
Has anyone got this working? Much appreciated!
Drew
Apparently, when you use extended displays on Windows, both monitors act as a single adapter; I'm not sure what the intended use of the -adapter switch is.
Alternatively, you could try an external solution: create a program (in .NET, for instance) that launches both instances and then moves one of them to the other monitor. Check this guide to see how you can move a window belonging to another program.
You probably want to use borderless windowed mode, so the instances appear fullscreen while still being movable to the desired positions.
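The window-moving part of that external program could also be sketched in Python with ctypes rather than .NET. This is a sketch under assumptions: it assumes Windows, that the Unity player's window title is "CCC" (a placeholder; use your app's actual title), and that each instance runs in borderless windowed mode so it can be repositioned.

```python
import sys

def target_rect(monitor_rect):
    """Return (x, y, width, height) filling the given monitor rectangle.

    monitor_rect is (left, top, right, bottom), the convention Windows
    uses when reporting monitor geometry.
    """
    left, top, right, bottom = monitor_rect
    return (left, top, right - left, bottom - top)

if sys.platform == "win32":
    import ctypes
    user32 = ctypes.windll.user32

    def move_to_monitor(window_title, monitor_rect):
        # Find the window by its exact title (hypothetical title "CCC").
        hwnd = user32.FindWindowW(None, window_title)
        if not hwnd:
            raise RuntimeError("window not found: " + window_title)
        x, y, w, h = target_rect(monitor_rect)
        # SWP_NOZORDER (0x0004): keep z-order, just reposition and resize.
        user32.SetWindowPos(hwnd, None, x, y, w, h, 0x0004)
```

For a second monitor to the right of a 1920x1080 primary display, you would pass something like `move_to_monitor("CCC", (1920, 0, 3840, 1080))` after the second instance has started.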
Ultimately, here's a paid solution that does all of this.
I'm using a GUI application framework (EGT) on an ATMEL/Microchip SAMA5D4. The framework features DRM/KMS and X11 backends.
I've looked at using tslib to calibrate a resistive touchscreen for the device, but due to EGT limitations it looks like I'm going to have to use libinput for the moment.
Is there a calibration mechanism (an equivalent of tslib) available for libinput? I've looked at xlibinput_calibrator and it seems like it could be a solution, but I'll have to sort out the dependencies in the Yocto build.
Thanks,
For anyone looking at this: xlibinput_calibrator requires X11 and so was not an option.
I eventually ended up using a startup script to prevent EGT from capturing raw events from the touchscreen (it deleted /dev/event0 or similar) and instead used the calibrated source from ts_input.
I want to integrate two scripts into one script.
The scripts are for the SHT10 and MAX31855 sensors. Both make use of software SPI.
The SHT10 script uses GPIO.BOARD and the MAX31855 script uses GPIO.BCM.
The problem is that I get the error "ValueError: A different mode has already been set." I don't know how to resolve this because the sensors use different libraries; I think the problem is in those libraries.
Is there an easy solution for this problem? Running the scripts separately, there is no problem.
RPi.GPIO only allows one numbering mode per process, so calling GPIO.setmode(GPIO.BCM) in one part of the script and GPIO.setmode(GPIO.BOARD) in the other will always raise that error. Pick one mode for the combined script and translate the other library's pin numbers into that scheme.
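One way to do the translation is a small BOARD-to-BCM lookup, so the whole combined script can run under GPIO.setmode(GPIO.BCM) and the SHT10 library can be configured (or patched) with translated pin numbers. This is a sketch: the mapping below covers the 26-pin header subset shared by all Raspberry Pi models (physical pin 13 maps to GPIO 27 on Rev 2 and later boards), and which pins your sensors actually use depends on your wiring.

```python
# Physical (BOARD) pin number -> BCM GPIO number, for the original
# 26-pin header common to all Raspberry Pi models (Rev 2+ numbering).
BOARD_TO_BCM = {
    3: 2, 5: 3, 7: 4, 8: 14, 10: 15, 11: 17, 12: 18, 13: 27,
    15: 22, 16: 23, 18: 24, 19: 10, 21: 9, 22: 25, 23: 11,
    24: 8, 26: 7,
}

def board_to_bcm(pin):
    """Translate a physical (BOARD) pin number to its BCM GPIO number."""
    try:
        return BOARD_TO_BCM[pin]
    except KeyError:
        raise ValueError(
            "pin %d is not a GPIO pin on the 26-pin header" % pin)
```

For example, if the SHT10 script was wired with data on physical pin 11 and clock on physical pin 12, you would initialise it with `board_to_bcm(11)` (GPIO 17) and `board_to_bcm(12)` (GPIO 18) in the BCM-mode combined script.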
I have followed the Multiplayer Shootout showcase (https://docs.unrealengine.com/latest/INT/Resources/Showcases/BlueprintMultiplayer/index.html) and tried to replicate the sessions part for my own project. I can create a (LAN) session, see other sessions, and join one. My problem is that, for some reason, the correct map opens, but the actors do not replicate. If I simply open the map for 2 players, they replicate without any issues. Is there something else I should do to enable replication when using sessions? Thank you!
After further testing, the program gave me the following error: LogNet:Warning: Travel Failure: [LoadMapFailure]: Failed to load package '/Game/Maps/UEDPIE_2_Arena2'. I found out this was because I was testing the game sessions inside the editor. After I tried testing in Standalone Game mode, everything worked fine.
I have a MATLAB application deployed using Builder JA and incorporated into a larger Java-based web application. It was built on a Windows machine that has actual MATLAB on it, and worked fine when I tested it there. I've since deployed the application onto a Linux server that has only the MCR. I can run the application via the web page, but the resulting graphs display only the graphics and not the text (title, axis labels, etc.). This happens both when I use WebFigure(gcf) and when I use figToImStream(gcf, jpg), so I don't think it's an issue with any one format. The issue seems to be in the hardcopy.p function, since the server logs show an error:
{Warning: Failed to draw text string}
{> In /usr/local/MATLAB/MATLAB_Compiler_Runtime/v717/toolbox/matlab/graphics/hardcopy.p>hardcopy at 28
In compiler/private/hardcopyOutput at 58
In figToImStream at 73
In Gaussian_WBfigures_jpg at 635}
I've seen some things that suggest that this is an issue of Matlab looking for a font that isn't there, and some that suggest that this is an issue with the renderer. Does anyone have a solution for this?
Try plotting your labels with different fonts or interpreters. There have been reports of bugs in MATLAB when printing with different interpreters, e.g. http://www.mathworks.com/support/bugreports/398506 and http://www.mathworks.com/support/bugreports/309380. For example:
figure
text(0.5,0.5,'testa','Fontname','Arial')
text(0.5,0.6,'testa','Fontname','Times')
text(0.5,0.7,'testa','Fontname','Times','Interpreter','Tex')
text(0.5,0.8,'testa','Fontname','Times','Interpreter','Latex')
I have an AIR application that takes command-line arguments via onInvoke. All is good, but I cannot figure out how to print some status messages back to the user (to stdout / console, so to speak). Is it possible?
Even a default log file for traces would be fine, but I can't find any info about it anywhere. Do I need to create my own log file? Now that'd be silly.
Take a look at CommandProxy. It is a low-level wrapper around your AIR application that lets you send commands from AS3 back to the proxy for communicating with the underlying OS. You should be able to add a means of writing to the command line via such a method.
I don't think that is possible, but I'm not completely sure though.
There is a flashlog.txt file which you can configure so that all trace() statements are logged to it. Check this post, http://www.digitalflipbook.com/archives/2005/07/trace_from_the.php, for more info on how to set it up. It is aimed at logging from the browser, but I'm pretty sure it should also work from an AIR app.
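For reference, flashlog.txt is only written by the debugger version of the runtime, and as I recall it's enabled with an mm.cfg file in your home directory along these lines (the linked post covers the exact file location on each OS):

```
ErrorReportingEnable=1
TraceOutputFileEnable=1
```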
Additionally, you could use SOS MAX from Powerflasher to log to an external console through an XML socket.
By default, trace() will output to stdout when you run the app through ADL (the AIR Debug Launcher).
Your AIR application is one, big trace window if you want it to be.