MDT 2013: how to deploy a single OS to multiple PC models at once

We have multiple PC models and have set up an MDT 2013 + WDS server, which we use for the New Computer and Refresh (reinstall) scenarios. So far all new PCs have been the same model, so we have had no issues, and for the reinstall scenario we only deploy one or two PCs at a time.
But now we are trying to reinstall four different PC models with one OS, Windows 7 x64: 1. Intel H61-MS, 2. ASUS M5A78, 3. Mercury PIH81, 4. Gigabyte.
I have imported drivers for all four models into the Out-of-Box Drivers node of the Deployment Workbench, and I have imported the OS, the applications, everything. What I don't know is how to deploy to all of them at once so that MDT distributes drivers based on %MODEL%. One more drawback: on a few of the PCs, checking the model with "wmic computersystem get model" returns "To Be Filled By O.E.M" instead of a model name.
Can anyone please suggest how to perform the same OS deployment across different makes and models?
Thanks in advance.

I built the current MDT environment for the company I work for, and we have over 30 different models of machines. Using this method of driver delivery, you should be able to deploy multiple models at the same time. It works for both Windows 7 and Windows 10:
http://techgenix.com/Deploying-Windows-7-Part26/
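The gist of that method: organize Out-of-Box Drivers into one folder per model, then have CustomSettings.ini inject only the folder matching the variables MDT gathers on each machine. A minimal sketch of the relevant CustomSettings.ini entries, assuming a folder layout of Windows 7 x64\<model>; the folder names and the %Product% fallback for boards that report "To Be Filled By O.E.M." are illustrative, not taken from the article, and section names must match the exact model string the hardware reports:

    [Settings]
    Priority=Model, Default

    [To Be Filled By O.E.M.]
    ; Boards with an unpopulated SMBIOS model string: fall back to the
    ; baseboard product name, which MDT also gathers as %Product%.
    DriverGroup001=Windows 7 x64\%Product%

    [Default]
    ; Inject only drivers from the folder matching the detected model.
    DriverGroup001=Windows 7 x64\%Model%
    DriverSelectionProfile=Nothing

With Priority=Model, a machine whose model string matches a section name takes DriverGroup001 from that section; every other machine falls through to [Default].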

Related

TwinCAT: Running on isolated cores failed

I was trying to activate my configuration on my local PC, but it failed. I tried:
Isolating one or two cores on my PC (under SYSTEM > Real-Time, then rebooting the PC) and running the PLC tasks on those cores. When I do this I get the following error:
'TwinCAT System' (10000): Sending ams command >> Init4\RTime: Start Interrupt: Ticker started >> AdsWarning: 4118 (0x1016, RTIME: startup of isolated CPU fails!) << failed!
I then tried to run it on the normal Windows-dedicated CPUs (so none of the cores were isolated). When I activated the configuration (with Virtualization enabled in the BIOS) I got the following error message:
Setting TwinCAT in Run Mode with KB4056894 is not possible
Uninstall KB4056894
or
Activate a solution using only isolated cores
I could not find KB4056894 installed on my PC. Any other solution?
I'm using TwinCAT 3 Build 4022.14 under Windows 10.
From Beckhoff support:
According to the error note, the Microsoft patch for spectre/meltdown
is installed on your PC. Normally, the TC3 should work with this patch
when using isolated cores…
However, since version TC3 Build 4022.16, this problem is solved.
I installed 4022.22 and everything worked.
I just want to share my experience with this error and how I solved it. In the Real-Time menu I set the CPU cores to 1 shared and 3 isolated (my CPU has 4 cores), applied the setting on the target, and it asked for a reboot. After the reboot the error was gone and I was able to run my code.

Trying to get a Raspberry Pi 3 to write and modify data in a PostgreSQL database on a separate server

As a side project I have been interested in energy consumption. I have written and run a program on a Raspberry Pi 3 that uses external hardware to gather data over Ethernet using Modbus TCP.
Within my program there is a data-logging feature that creates and saves a CSV file with the values collected for that day. Every day at midnight a new CSV file is created and stamped with the new day's date. The CSV file is saved locally on the Raspberry Pi, and as it runs headless I've had to set up a cron job to move the files onto a thumb drive so I can view and assess them.
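For reference, the nightly sweep is a single crontab entry; something like the following (the paths are illustrative, not my exact ones):

    # Just after midnight, move the finished CSV logs onto the mounted thumb drive
    5 0 * * * mv /home/pi/logs/*.csv /media/usb/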
The modification I am attempting: I currently have a PostgreSQL database on a separate server, and I am trying to get the Raspberry Pi to connect to that database and populate it with data as soon as the Pi has recorded it.
I have searched the internet, both this site and many others, but most of what I have found are tutorials and guides on setting the Raspberry Pi up as a PostgreSQL server, which is not what I want to achieve.
Any advice and help is greatly appreciated.
Carl
Update: the programming language I am using is Python 3.
I would investigate using ssh. You can transfer the CSV and then invoke a script on the target machine, wait for a cron job, or just call psql at the remote command line. That avoids the need to set up a database client on the Pi, open the firewall on port 5432, configure pg_hba.conf, and so on.
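A minimal sketch of that ssh approach in Python 3 (which the update says the program uses); the host, file paths, database, and table names are placeholders, and it assumes key-based ssh authentication plus an existing table whose columns match the CSV layout:

    #!/usr/bin/env python3
    """Push the day's CSV to the database host over ssh, then load it with psql."""
    import subprocess

    CSV = "/home/pi/logs/2018-06-01.csv"   # file produced by the data logger
    REMOTE = "carl@dbserver"               # placeholder host, key-based ssh auth
    REMOTE_CSV = "/tmp/energy.csv"

    # 1. Copy the CSV to the database server.
    subprocess.run(["scp", CSV, REMOTE + ":" + REMOTE_CSV], check=True)

    # 2. Load it server-side with psql's \copy (the "readings" table must
    #    already exist with columns matching the CSV layout).
    subprocess.run(
        ["ssh", REMOTE,
         "psql -d energy -c \"\\copy readings FROM '" + REMOTE_CSV + "' CSV HEADER\""],
        check=True,
    )

Called from the existing midnight cron job right after the new file is written, this lands each day's data in the database; run it more often if the data needs to be closer to real time.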

Two master instances on same database

I want to use PostgreSQL on Windows Server 2012 R2 for one of our projects, which requires 24/7 uptime.
I would like to ask the community whether I can have two master instances on two different servers, A and B, both working on the same database located on shared file storage on the LAN. Only the master instance on server A would normally be online; when it goes offline for some reason, a PowerShell script would (I suppose) recognize that the PostgreSQL service has stopped and start the service on server B. The same script would continuously check that only one service across servers A and B is running, to avoid conflicts.
I'd like to ask whether this is possible, or whether there is a better approach for my configuration.
(I can't use replication, because when server A shuts down, server B is in read-only mode, which I don't want.)
If you manage to start two instances of PostgreSQL on the same data directory, serious data corruption will happen.
Normally there is a postmaster.pid file that prevents that, but a PostgreSQL server process on a different machine that accesses the same file system will happily unlink that after spewing some log messages, thinking it was left behind from a crash.
So you are really walking on thin ice with a solution like that.
One other issue that you didn't think of is the script that is supposed to check whether the server is still running. What if that script fails because, for example, the network connection between the two servers is down, but the server is still up and running happily? Such a “split brain” scenario will cause data corruption with your setup.
Another word of caution: since you seem to be using Windows (PowerShell?), you are probably envisioning a CIFS file system when you talk about shared storage. A Windows “network share” is not a reliable file system; last time I checked, it did not honor _commit.
Creating a reliable failover cluster is harder than you think, and I'd recommend that you check existing solutions before you try to roll your own.

Will a PowerShell script developed on Windows 7 run on Windows Server 2008 R2?

I have developed a large PowerShell script that has been refined on a Windows 7 64-bit box, and now I intend to run it on Windows Server 2008 R2. Assuming the PowerShell versions are the same, will there be any major syntax issues between Windows 7 and Server 2008 R2?
The script checks a lot of WMI and registry keys like GWmi Win32_NetworkLoginProfile and Get-Itemproperty -Path Registry::HKLM\Software\Microsoft\"Windows NT"\CurrentVersion\winlogon\
Most PowerShell information is geared toward managing servers, so I assume I will be safe, but I want to see if you all can help me learn some lessons before I start banging my head against the wall.
Thanks
There are no syntax differences between PowerShell on Windows 7 and PowerShell on Windows Server 2008 R2. You may, however, encounter differences in which services, WMI classes, and registry keys exist.
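Since the differences are about what exists rather than about syntax, it can help to guard each lookup so the script degrades gracefully. A small sketch along those lines, using the two examples from the question (the warning text is illustrative):

    # Guard registry and WMI lookups so the script degrades gracefully
    # when a key or class is missing on the target OS.
    $winlogon = 'Registry::HKLM\Software\Microsoft\Windows NT\CurrentVersion\Winlogon'
    if (Test-Path $winlogon) {
        Get-ItemProperty -Path $winlogon
    } else {
        Write-Warning "Winlogon key not found on this host"
    }

    if (Get-WmiObject -List -Class Win32_NetworkLoginProfile) {
        Get-WmiObject Win32_NetworkLoginProfile
    } else {
        Write-Warning "Win32_NetworkLoginProfile class not available"
    }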
First, test it on a virtual machine to see whether it works. Then try it on the physical machine. If something fails, adapt the code to the specific registry keys that exist there.
The short answer is yes. I run PowerShell v4 on both my desktop and one of my servers running 2008. Be sure to import the correct modules (if any) and allow RPC (and WinRM) through your firewall if applicable. One note: depending on what you run against the server, commands and functions are only as good as the version you run against (even when invoked remotely). I ran into this problem because I scripted in v4 while my firm is almost entirely on v2. Enable -Verbose error output, and test in virtual machines or on a loaner laptop (this is what I did). Good luck!

Is WinDbg's vertarget command always accurate?

I wonder because running it on a client's minidump reports a different Windows version than the client repeatedly told me she had, and the version reported happens to be exactly the version I'm running WinDbg on.
So, can vertarget always be trusted (and clients not)? Or can the information it relies on be absent with some dump-generation options, in which case it reports the version WinDbg is currently running on, or perhaps some default that happens to coincide with my OS version?
I'm using WinDbg 6.12.
In all my cases so far, vertarget has been correct and the customer/client was mistaken. vertarget is one of the commands I run on every dump, precisely to check whether the dump contains what I need.
But perhaps things can potentially go wrong here as well, so let's evaluate some options:
vertarget also reports the debug session time and the system uptime. Do those match your system? Reboot your machine to get a low uptime and check again: is it still your PC's uptime?
vertarget also reports the number of CPUs. Does that number match yours?
Get a virtual machine that does not have your OS, e.g. one from Modern.IE (Microsoft). Copy WinDbg and the dump into the VM and check the output of vertarget again.
WinDbg 6.12 is a bit old. Do newer versions (6.2.9200 / 6.3.9600, or even 10.0) report the same information, or has a bug been fixed in the meantime?
And even check some other information:
Is it a dump of the correct application? Use | (pipe)
Is it a dump of the version you are expecting? Use lm vm <exename>
Does it have the flags which can be expected for the method used for taking the dump? Use .dumpdebug.
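For reference, vertarget output for a user-mode dump from a Windows 7 x64 box looks roughly like this (the values below are made up for illustration); the OS version, CPU count, debug session time, and system uptime mentioned above all appear in it:

    0:000> vertarget
    Windows 7 Version 7601 (Service Pack 1) MP (4 procs) Free x64
    Product: WinNt, suite: SingleUserTS
    kernel32.dll version: 6.1.7601.17514
    Debug session time: Mon Jun  4 10:15:32.000 2018 (UTC + 2:00)
    System Uptime: 3 days 4:12:57.406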
Other than that, I observe (not representative) that many client-OS dumps (Windows 7, 8, 8.1) have all the latest service packs installed, while administrators seem to follow the "never change a running system" approach for server OSs (Windows Server 2012, R2). So it might just be a coincidence.