Using a script to install a Windows Store app (PowerShell)

Requirement: install a Windows Store app without requiring the user to navigate to the Store and click Install. Perhaps a batch file, perhaps a PowerShell script, perhaps something else.
This is not a side-loading question; this is about public Windows Store apps.
Scenario 1: Maybe my company has a new app in the Store that I want to push out to every single employee without requiring them to navigate to the Store and click Install.
Scenario 2: Maybe my company has just subscribed to an online CRM (or something) and I want to push out the CRM client to every single employee, again without a trip to the Store.
Scenario 3: Maybe my company is hiring new employees and preparing new computers. In their first-time login script (or something) I want to ensure they have the apps important to my business, without requiring them to navigate to the Store and click Install (perhaps several times).
Scenario 4: Maybe my company is heavily virtualized and we provision new VMs all the time. The VMs perform fine, but bandwidth is our problem. To streamline the user experience, users log on and watch as the VM prepares itself for them by downloading and installing Windows Store apps.
Please don't pick on the scenarios, I am just trying to give a possible use case.
Complication: I have been told (by people who know this sort of thing) that there is no built-in API to accomplish this. But we are developers; nobody dares tell us something is impossible. If there isn't a built-in API, how could a network administrator or a developer on a team solve this problem? I realize this question is somewhat brainstorming, but it gets asked over and over, and I would like to provide a resource for others who might be considering the same scenarios.
Hey, perhaps this is easy. Please share.

We have SCCM in our environment, and some PowerShell scripts are deployed in C:\Windows\CCM\SignedScripts that may be worth investigating. They are not SCCM specific. The most relevant of the three is C:\Windows\CCM\SignedScripts\installwindows8app.ps1. The script just passes parameters to Add-AppxPackage, though I am not sure how it would get the path to the .appx in the Microsoft Store.
You can get the location of installed apps on a model machine with (Get-AppxPackage -Name "*").InstallLocation, but then you would need to repackage, store, deploy, and maintain them -- not really the solution you were looking for.
Between investigating how SCCM does it with these scripts and digging into the installed apps, maybe someone will run across something.
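For reference, a minimal sketch of the Add-AppxPackage approach, assuming you have already obtained the .appx some other way (the UNC path and package name below are hypothetical placeholders):

```powershell
# Sketch: install a packaged app for the current user via Add-AppxPackage.
# The path and package name are placeholders -- you still have to source
# the .appx yourself (e.g. by repackaging from a model machine).
$package = "\\fileserver\deploy\ContosoCRM.appx"

# Only install if the app is not already present for this user.
if (-not (Get-AppxPackage -Name "ContosoCRM*")) {
    Add-AppxPackage -Path $package
}
```

Note that Add-AppxPackage installs per user; provisioning an app for all users on a machine goes through DISM's Add-AppxProvisionedPackage instead.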

Almost 8 years into the future and we are getting closer to an answer!
Recent versions of Windows 10 now come with "winget", and it has gained some Microsoft Store support.
It seems hit and miss as to which apps I can install. Using Spotify as an example:
winget search "Spotify"
It's probably best to install via ID instead, though:
winget install 9NCBCSZSJRSB
Since it stops and asks whether you agree to the terms and conditions, you can answer automatically via:
echo Y | winget install 9NCBCSZSJRSB
As one of my references states: "What’s interesting is that if you have the Microsoft Store open at the same time as running winget install, you’ll see the install progress updating in real time in both the command line window and the Store GUI."
There is plenty left to be desired with this answer, since most automated installs will probably just run into the error "Verifying/Requesting package acquisition failed: no store account found". But if you're able to run it as the logged-in user, you might have more luck. I'd love to see this tool mature so it can actually accomplish all the scenarios you listed; as Microsoft updates it, we can update this answer accordingly.
References and notes:
There's also the unlisted option --scope user or --scope machine found via: https://aka.ms/winget-settings
This page was helpful in my discovery of the tool: https://petri.com/how-to-programmatically-install-microsoft-store-apps-using-windows-package-manager
If you want to change and compile winget on your own, the source is here: https://github.com/microsoft/winget-cli/
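To tie this back to the question's scenarios, a first-logon script could loop over a list of Store product IDs. This is only a sketch: it assumes a winget build that supports the --accept-* flags (which replace the echo Y trick), and the ID list contains just the Spotify example from above plus whatever you add.

```powershell
# Sketch: install a list of Microsoft Store apps at first logon.
# 9NCBCSZSJRSB is the Spotify example from above; add your own IDs.
$apps = @("9NCBCSZSJRSB")

foreach ($id in $apps) {
    # The --accept-*-agreements flags answer the terms prompt
    # non-interactively instead of piping "Y" to stdin.
    winget install --id $id --source msstore `
        --accept-package-agreements --accept-source-agreements
}
```

As noted above, this still runs into the "no store account found" error in many automated contexts, so it works best when run in the user's own session.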

Isn't this what Intune is for? http://www.microsoft.com/en-us/server-cloud/products/windows-intune/default.aspx#fbid=CFXRSOlwIM2

Can I use pagespeed insights for my localhost website or offline?
Yes.
Use the "Lighthouse" tab in your Google Chrome dev tools.
This is a great starter tutorial on how to do that:
https://www.youtube.com/watch?v=5fLW5Q5ODiE
Edit: user izogfif pointed out that the "Audit" tab was replaced by "Lighthouse".
An alternative way to run Lighthouse
Although this is an old question, there is an alternative way to run Lighthouse (the engine behind PageSpeed Insights) locally that may be useful to people in some circumstances.
You can install the Lighthouse Command Line Interface (CLI) locally on your machine quite easily.
This gives you some significant advantages over using the "Lighthouse" tab in Developer tools.
Automation
Firstly you can automate it. You could have it run on every significant change / commit to check you haven't broken something.
Or, if you want to check every page on your website, you can automate that; very useful if you have hundreds of pages.
Storing results
Secondly you get the full JSON response (or CSV or HTML report, your choice) so you can store some (or all) of the audit results to a database for each page and see if any pages are performing poorly or whether you are improving or ruining your page performance.
Customisation
You can also set your own parameters when running tests.
For example I like to set my "cpuSlowdownMultiplier" very high (8 or 10) as I have a decent CPU and I want to catch any bottlenecks / long tasks that I may miss on slower devices. This is great for making you realise how sloppy your (my!) JavaScript is!
You can also pass headers, set cookies (slightly difficult at the moment but something they are working on) etc. before a run.
You can even use --disable-storage-reset to see how the site responds on a subsequent page visit where the user has already cached images etc. (you can do this in the Lighthouse tab in Developer tools so maybe not that good a reason).
Because you get the raw timings data you can also set your own criteria if you want.
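As a concrete example of the customisation described above, a CLI run looks roughly like this (the URL and output path are placeholders; the flag names match recent Lighthouse versions but are worth checking against `lighthouse --help` on your install):

```shell
# Sketch: headless Lighthouse run with a custom CPU slowdown,
# saving the full JSON report for later storage or analysis.
npx lighthouse https://example.com \
  --throttling.cpuSlowdownMultiplier=8 \
  --output=json \
  --output-path=./report.json \
  --chrome-flags="--headless"
```

From there you can parse report.json and store whichever audit scores you care about per page.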
Puppeteer
The icing on the cake is that you can use puppeteer (or similar) to automate complex tasks.
Let's say you want to check a page that is only accessible when you have logged in: use Puppeteer to log in and then run Lighthouse.
So which should I use?
I would advocate for the CLI if you are going to test regularly / want to automate testing, the Developer tools version for quick and dirty checks / infrequent testing.
Personally it took me about an hour to install and get used to Lighthouse, but I also had to install and learn how to use the Node.js (npm) command line to install Lighthouse into my project (yes, I am a slow learner!).
If I didn't have to learn that, probably 5 minutes to install and run your first test.
It is actually really simple to use the CLI once you have it installed.
The only downside is that you need to update it every few months, which is automatic in the browser. Even then that is a positive to me, as an older version might be useful if you are comparing results over time.
Oh and you can run it on remote sites as well, so you can test the production site automatically from your own machine (useful if you are located a long way from the PSI server and want an idea of how your site performs in your local community).
This is also really useful if you have a staging server that only allows whitelisted IP addresses and you want to test there (yet again, this can be done with the Developer tools Lighthouse tab, but the CLI is useful for bulk testing etc.).

Install solutions like ClickOnce

Are there other things out there like ClickOnce, but that use the actual application files?
I am finding the signing and packaging process very difficult to pass off to others.
I have tools that can make an MSI, but the deploy process of ClickOnce is very useful.
Is there something out there that works like ClickOnce but uses an MSI?
NOTE: Whatever I use must not require the help of my system admins. They are in a different part of my company, and rolling out my releases via them would delay things too much. (This means that using a Group Policy is not feasible.)
InstallShield's Premier and Professional editions have a feature that checks for updates before each execution, just like a ClickOnce application, which sounds like just what you are asking for. However, it is not cheap.
http://www.flexerasoftware.com/products/software-installation/installshield-software-installer/tab/features

What is the best way to setup a development & production environment for a PHP/MySQL app?

I've been developing a web app locally on my MAMP computer for the last few months. Now I am ready to launch it while continuing to add enhancements and fixes. So I am wondering what a good way is to set up a development AND production server in order to efficiently manage updates, prevent overwrites, and seamlessly add other developers into the workflow. I also want something with a minimal learning curve for me. Personally, for whatever reason, I've never been able to fully grasp version control systems like Git or SVN, so I am hoping for an easier solution until I am able to invest more into the business.
As I see it, the options that I have are:
Spend more time learning Git before launching. And hoping that I don't break anything while further developing my app.
Buy two hosting accounts: one for Dev and one for Prod, where only I can deploy into Prod. I suppose I'd have to keep track, in a spreadsheet, of all the modified files deemed ready for deployment.
Edit files directly over FTP (no Dev server).
Are there any other options that you can recommend? I've heard that there are some new types of Web Hosting companies that can do the heavy lifting...
While I have personally had good experiences using SVN/Git for multi-developer websites, I can understand your reluctance to start relying on something you are not entirely familiar with. Unfortunately, I do believe that is your best option, but failing that, you might try using subdomains. My former employer would create a test area on the disk and point beta.thedomainname.com at it. When bug fixes or upgrades were complete and verified to be working in the beta directory, the entire directory would be copied over to the live domain. Not the most elegant solution, but it worked, and it is certainly cheaper than buying two hosting accounts.
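The promote-beta-to-live step described above can be sketched as a small shell script. Everything here is illustrative: the temp directories stand in for your real beta and live docroots, and the timestamped rollback copy is an addition of mine, not part of the original workflow.

```shell
#!/bin/sh
# Sketch: promote a verified beta directory over the live docroot.
# Demo stand-ins for /var/www/beta and the live docroot:
BETA_DIR=$(mktemp -d)
LIVE_DIR=$(mktemp -d)/live

# Pretend the beta area holds the verified new release.
echo "v2" > "$BETA_DIR/index.php"

# Keep a timestamped rollback copy of the current live site, if any.
if [ -d "$LIVE_DIR" ]; then
  cp -a "$LIVE_DIR" "$LIVE_DIR.bak.$(date +%Y%m%d%H%M%S)"
fi

# Copy the entire beta tree over the live tree.
mkdir -p "$LIVE_DIR"
cp -a "$BETA_DIR/." "$LIVE_DIR/"
```

In practice you would hard-code the two docroot paths and run the script only after sign-off on the beta subdomain.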

Deploying .EXE to network drive? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
What are the problems with deploying an .EXE to a network drive and having users execute the .EXE over the network?
The advantage is that upgrades only need to be made to the one location. What are the disadvantages?
I would instead consider creating an MSI (http://en.wikipedia.org/wiki/Windows_Installer) file for your application and a Group Policy to facilitate distribution throughout your company (http://support.microsoft.com/kb/816102).
There are a number of freeware MSI tools. Good ones that come to mind are http://www.advancedinstaller.com/ and http://wix.codeplex.com/
The EXE is one thing, but you also need to consider any DLLs and other shared resources that may be associated with the app.
Some DLLs may be shipped with the EXE - you'd have to put those on the remote drive with the EXE, which would cause additional network traffic if it needed to use them.
Other DLLs may be part of Windows, but there could be versioning issues if your workstations have different versions of Windows, or even different service packs or patches, while all running a common version of the app.
And what about licensing? Does the app's license actually allow you to install it on a network drive - many software companies are very specific about this sort of thing, so you need to really be careful if you don't want to get caught out.
In short, it sounds like a good idea to get a quick win for your deployment management, but it probably causes far more issues than it solves.
If you really want to go down this path, you maybe should consider alternatives like remote desktop (eg Citrix or Terminal Server) or something like that - there are much better ways of achieving your goals than just sticking everything on a network drive.
One problem is file locking. In a Windows environment, if a user executes the application directly from a network share, the application's files are locked. This prevents the application from being updated to a newer version while someone still has it open.
You can work around this by disabling the network share before updating the app and then re-enabling it.
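On a reasonably recent Windows file server, that workaround might look like the following sketch (the share name and paths are placeholders; it assumes the SmbShare cmdlets available since Windows Server 2012):

```powershell
# Sketch: take the share offline, swap the binary, then re-share it.
Remove-SmbShare -Name "AppShare" -Force
Copy-Item "D:\builds\MyApp.exe" -Destination "D:\apps\MyApp.exe" -Force
New-SmbShare -Name "AppShare" -Path "D:\apps" -ReadAccess "Everyone"
```

Users with the app open will lose their session when the share disappears, so this is best done in a maintenance window.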
If you write your application using an Object Capability Security model, as defined in Mark S. Miller's Ph.D. thesis, Robust Composition: Towards a Unified Approach to Access Control and Concurrency Control, then you will not have any security drawbacks.
On the other hand, the "disadvantage" is that you must now manage access control via the object graph. The application should only have access to whatever permissions you give it. As some have already mentioned, Windows has a basic protection policy which locks the application files and thus prevents anyone from modifying the EXE until the application instance(s) is closed.
Really, the key issue here is you have to ask yourself what authority the program and its component parts should have. If it requires local user permission, then you will either have to design around that or give the program permission.
Understanding the implications of this, and doing it well, is not an easy task.
For our program we decided against a shared EXE. We thought it would be harder to support (IT needs to kill user sessions to unlock files before updates, users won't know where the EXE is on the network, share/network file permissions need to be modified by IT, etc.) and that we should emulate the behavior of other programs where possible (client software is normally installed on the clients).
The main disadvantage would be the network drive being unavailable.
Beyond that, the language the EXE is written in (which you didn't specify) matters, as .NET has some security restrictions when running from a network drive.
It depends on what the application does. My application would be problematic as an over-the-network deployment because the configuration files it uses are all in the same folder as the EXE, or in a subfolder. If every user ran it off the network, they could potentially modify the configuration files and screw things up for everyone else.
Thankfully, my app is only going to be deployed on separate workstations. :)
They might not have all the files your app needs installed. If they don't, you'll need to create a setup. If they do and it works and everyone's drives are mapped correctly, you should be fine.
I run a vendor's app like this at work. They didn't design for it, but it works without an issue. I have all the shortcuts pointing to the UNC path. This particular app doesn't use files in the EXE directory, so file locking isn't an issue. It's also hooked up to SQL Server for the data, so the data store isn't an issue either. (It would be a major problem if the app used a local SQLite, Access, or some other file-based DB.)
If your app is a .NET app, this WILL NOT work without some major modifications to each machine's security settings, which is probably a bad idea anyway. If you're talking about a .NET app, you should use ClickOnce. I use it for a few apps at work as well, and it's great and easy to use.
The problem is there isn't a definitive answer to your question, just a bunch of "it depends" qualifications. The big issues, AFAIK, are using local files for data storage, be they text files or databases. It is awesome for updates, though, which is why the app mentioned above is run like this.
This is perfectly doable. Be sure to set the "Run from CD-ROM" flag (the linker's /SWAPRUN option, I believe) when compiling -- this prevents the image from being backed directly by the binary, so you can upgrade it while people are running it. I am not running Windows at the moment, so I can't check, but you may be able to set this flag for DLLs, too.
One problem with doing this is that if your program associates itself with files, when the network changes and computers are renamed everybody's PC starts to run like a dog. Explorer has a tendency to query these things at funny times.
Another more serious problem is that if somebody accidentally deploys a broken version, it's not just the early adopters who get stuffed!
For an easy life, personally I recommend XCOPY deployment...
For .NET applications, we have observed BadImageFormatException, which we have come to believe is caused by network glitches (or computers losing network connectivity at key moments, for example over WiFi) while reading the EXE or DLL files.
IMHO this is a really bad design decision. We have a third party application in our company which is designed exactly like this.
In order for the program to run properly, it requires full sharing on that folder. In this case the worst part was that the program had the freaking DATABASE in the same shared folder (yeah, I was shocked too when I found out)! It didn't take long until someone wiped every file that was not in use from that folder, including the database, of course :)
I really recommend a client-server approach, even if you have to buy/build a smart installer with auto-update features to overcome deployment issues.

Deployment in an agile environment

In the past my development team has mostly done waterfall development against an existing application, and deployments were only really done towards the end of a release. That would normally result in TEST, UAT, and PROD releases, usually only three to five releases in a two-month cycle.
A release was an MSI installer, deployed via Group Policy.
We have now moved to a more agile methodology and require releases at least once per day for testing, some times more often.
The application is a VB6 app, and the MSI was taking care of COM registrations for us; users do not have elevated privileges on their machines.
Does anyone have any better solutions for rapid deployment?
We have considered batch/scripted installs of the MSI, or doing COM registrations per file, both using CPAU for elevated privileges, and ClickOnce. Neither of these have been tested yet.
Edit: Thanks for suggestions.
To clarify, my pain point is that the MSI build/deployment process takes a long time: it can take up to two hours to get a new build onto the testers' desktops. The testers do not have admin rights on their machines (and will not get them), so I am looking for a better solution.
I have played around with ClickOnce, using a .NET wrapper which starts up the application and has all the OCX/DLL VB6 assemblies as isolated dependencies, but it has issues finding all the assemblies when it starts up (or messages to that effect).
CruiseControl and NAnt are probably your best bet for builds with flexible output. But rapid deployment?
My concern is that you are looking at the daily builds in the wrong way. The dailies do NOT need to be widely deployed. In fact, QA and development are the only ones who should care about the builds on a day-to-day basis. Even then, the devs shouldn't be out of sync ;).
The customer team should only receive builds at the end of an iteration. That is where you show them what you have done, they provide feedback, and you move forward from there. Giving them daily builds could cause a vicious thrashing that would kill your velocity.
All that being said, a nice deployment package might be good for QA. But again, it depends on how in step they are with your development iterations. My experience, right or wrong, is that QA is one iteration back, testing the deliverables from the last iteration. From that point of view, they should be testing with the last "stable" release as well.
Is this something you can do in a virtual machine? You could securely give your testers admin rights on the virtualized system and most virtualization software has some form of versioning so you can roll back to a "good" state if something goes wrong. I've found it to be very useful for testing.
I'd recommend ClickOnce with the option to update on execution. That way only people using the software receive and install the updates.
You could try registry-free COM. See this other question. ActiveX EXEs still have to be registered though.
EDIT: to clarify, using registry-free COM means the OCX/DLL components you mention don't need to be registered. Nor do any OCX/DLL components they use. You can just copy the whole application directory onto a tester's machine and it will work straightaway.
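For illustration, a registry-free COM manifest placed next to the EXE looks roughly like this. The file name, CLSID, and ProgID are placeholders; you would generate the real entries from your own components:

```xml
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<!-- MyApp.exe.manifest: declares COM classes instead of registering them -->
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <assemblyIdentity name="MyApp" version="1.0.0.0" type="win32"/>
  <file name="MyControl.ocx">
    <comClass clsid="{00000000-0000-0000-0000-000000000000}"
              progid="MyControl.Widget"
              threadingModel="Apartment"/>
  </file>
</assembly>
```

With this in place, copying the application directory (EXE, manifest, and OCX/DLLs) to a tester's machine is the whole deployment.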
If I understand your question correctly, you need admin rights to install your product. I see three options:
1) Don't install to the testers' desktops. Get some scratch testing machines (as dmo suggested, VMware might help) that you can safely give them admin rights on. This may mean giving them a test domain and their own group policy to edit.
2) Build a variant that doesn't require MSI installation, and can be executed directly. Obviously your testers would not be testing the deployment and installation process with this variant, but they could perform other tests of the product's functionality. I don't know if this is possible with your product; it would certainly be work.
3) Take your agile medicine: "[prefer] responding to change over following a plan". That is, if denying admin rights to your testers is interfering with their ability to do their jobs efficiently, then challenge the organization to give them admin rights. (from experience, this will mean shifting to #1, but it might be the best way to make the case). If they are expected to test the product, how can they not even be allowed to install it in the same way a customer would?
If the MSI deployment is taking velocity out of agile testing, then you should test MSI deployment less regularly.
Use XCOPY deployment wherever possible, using .local for COM components. This can be a problem with third party components. As third party components are pretty stable, you should be able to build a custom MSI for these, install them once and be done with it.
You should try an automated build/deploy process or script that you can manually run. Try Teamcity or CruiseControl. Good luck!
I'm not sure just precisely what your pain point is.
You specifically mention registration of VB6 COM objects. Does the installer sometimes fail because of that?
Is it that the installer works but people don't know to install the new build so they are more often than not reporting bugs on an old build?
If the former, then I suspect the problem is that VB6 was very likely to play fruit-basket turnover with the GUIDs when rebuilding the solution. Try recreating your public interfaces in MIDL and have your VB6 classes implement those interfaces.
If the latter, then try Microsoft's SMS product (no, it has nothing to do with cell phones). If all the users aren't on the same domain, then you will have to build an "auto update" feature into your product. Here is a third-party offering that I've heard of but never used.
I'm using SetupBuilder (http://setupbuilder.com/products_setupbuilder_pro.htm) for all my builds. Very extensible. Excellent support.
Not sure exactly if it fits your needs, but this kind of post on the forums, "Installing as a limited account user (non-admin) on XP" (http://www.lindersoft.com/forums/showthread.php?t=11891&highlight=admin+rights), makes me think it might be.