How to download a pushed Odoo release and upload it to GitHub

To give a bit of background on what's going on: a few months back, the company I work for purchased the Odoo ERP system and had a third-party company make a couple of custom modifications to it. After development, the developers pushed the code onto a GCP VM instance. However, my company never actually used this release. Six months later, management wants to remove the existing GCP VM and transfer the Odoo source code to GitHub to save on infrastructure costs. But I have absolutely no idea how to do this, as I don't come from an infrastructure background. I can see a VM sitting in GCP with the specification below.
Machine type: custom (8 vCPUs, 16 GB memory)
Reservation: Automatically choose
CPU platform: Intel Haswell
Zone: us-central1-a
Operating system: Ubuntu
It would be really helpful if someone could point me in the right direction.

GitHub only hosts source code; you can't run your Odoo instance from there. You could migrate to odoo.sh (and the code to GitHub).
You need access to your modifications, the source code, and the database files. Then you can upload your source to GitHub. After that, buy an odoo.sh instance and upload your database backup there.
As for how to get the files off a GCE instance, the documentation covers that:
https://cloud.google.com/compute/docs/instances/transfer-files
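For the file transfer and the GitHub upload, something like the following should work from any machine with the Cloud SDK and git installed. This is a minimal sketch: the VM name (my-odoo-vm), source path (/opt/odoo), database name (odoo_db) and repository URL are all assumptions you'll need to replace, and the database backup belongs in odoo.sh or another store, not in the Git repository.

    # Copy the Odoo source (and custom addons) off the VM; zone taken from the question.
    gcloud compute scp --recurse --zone=us-central1-a my-odoo-vm:/opt/odoo ./odoo-src

    # Dump the PostgreSQL database on the VM, then copy the backup down too.
    gcloud compute ssh my-odoo-vm --zone=us-central1-a --command="pg_dump -Fc odoo_db > /tmp/odoo_db.backup"
    gcloud compute scp --zone=us-central1-a my-odoo-vm:/tmp/odoo_db.backup ./odoo_db.backup

    # Push the source to a new (private) GitHub repository.
    cd odoo-src
    git init
    git add .
    git commit -m "Import Odoo source from GCP VM"
    git remote add origin https://github.com/your-org/odoo-custom.git
    git push -u origin main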

Related

Should XAMPP be installed on an actual physical Server?

Is XAMPP just meant for testing and setting up virtual servers? (That's what the wiki says.)
Can it be installed on an actual physical server? Do developers actually do that?
I'm a little confused, because if that were true, why would anyone install a virtual server on a physical server? It's like trying to run Excel on VirtualBox.
XAMPP simulates a typical stack used for web development on a local machine. If you have access to an actual physical server, you would typically install things like the web server (such as Apache) and MySQL on the server itself. The developers of XAMPP consider it more of a development tool due to certain features being disabled to make dev easier.
Virtualisation in servers is used because the actual physical machines are very powerful and so are idling a large amount of the time. Putting those resources to use by creating multiple virtual servers on top of the host reduces cost and increases operational throughput.
Virtual servers and Docker can also be used to test against different environments at the same time, or to test beta software for future releases. On machines with 6 or 8 cores there are plenty of resources to run more than one machine, virtual or as a Docker container, so that you can use, for example, different databases without them interfering with each other.
Besides, physical hardware costs money to buy and to maintain.
Lastly, virtual machines and Docker containers are just files that you can simply copy to make a backup. A real machine is a little more work to back up.
But don't use XAMPP on a real machine that is exposed to the world. There are far too many security risks in the standard configuration.

What could happen if I install all server roles on Windows Azure Pack: Web Sites on one machine?

At work, I'm in the process of installing Windows Azure Pack: Web Sites in a VMWare ESXi lab environment. I have little available RAM and hard drive space on the ESXi.
I originally thought I would be able to do this without spending too many resources. The Azure Pack Express variant is advertised as if it only requires one machine with 8 GB of RAM. However, after completing the first installation, I discovered that the Azure Pack: Web Sites extension requires no fewer than 7 different server roles installed on 7 different machines, each running Windows Server 2012 R2. I need a separate Cloud Controller, Cloud Management Server, Cloud Front End, Cloud Publisher, Cloud Shared Web Worker, Cloud Reserved Web Worker and Cloud File Server.
I have no way of freeing up that many resources. The installation guide for Windows Azure Pack "advises" me to use separate VMs for each role, but it doesn't say explicitly that anything else won't work. Is it because multiple server roles on one machine will strain resources, or is it because the roles are incompatible and will make the system malfunction? In my case, the Azure Pack will only be used for penetration testing by a single user, so I imagine resources should not be a problem.
I'm not a web administrator, and I'm in over my head on this task. If anyone could give me some advice before I proceed on this, that would be much appreciated.
TLDR: Will there be a critical conflict if I install seven server roles on one machine, or will there just be a strain on resources?

Best practice for deploying a Windows service

I'm looking for best practice in continuous delivery of windows services.
Currently we have a set of PowerShell scripts that uninstall, reboot, and install updates, but error handling is tricky. We are reviewing System Center, but are there any other options available for deploying a Windows service?
We've been using Presto since Dec 2011, and have done over 1,000 deployments. Most of what we deploy are Windows services.
What's nice is that we set up our apps and servers in Presto, then we can repeatedly deploy, to any server (or multiple servers at once), by just hitting a button. Presto will copy our official release binaries, update all of the items in our app config files, create and start the service, etc...
So, if you have an application that has 30 manual steps to deploying it, you can enter these steps in Presto, then it's done automatically for you after that.
It's worth a look: http://presto.codeplex.com/
Your most basic, generally accepted option comes from this thread, which basically links to a Microsoft support article on creating an installer for the Windows service.
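If you stick with PowerShell, the core of such a deploy script can stay quite small; the hard part, as the question notes, is error handling. Here is a minimal sketch, where the service name, drop location and paths are hypothetical, and retries/rollback are deliberately omitted:

    $svc  = "MyWindowsService"                              # hypothetical service name
    $src  = "\\buildserver\drops\MyWindowsService\latest"   # assumed release drop
    $dest = "C:\Services\MyWindowsService"

    # Stop and remove any existing instance of the service.
    if (Get-Service -Name $svc -ErrorAction SilentlyContinue) {
        Stop-Service -Name $svc -Force
        sc.exe delete $svc | Out-Null
    }

    # Copy the new binaries, then recreate and start the service.
    New-Item -ItemType Directory -Force -Path $dest | Out-Null
    Copy-Item -Path "$src\*" -Destination $dest -Recurse -Force
    New-Service -Name $svc -BinaryPathName "$dest\MyWindowsService.exe" -StartupType Automatic
    Start-Service -Name $svc

Note that after sc.exe delete, Windows can briefly hold the service in a "marked for deletion" state if anything still has a handle open, which is exactly the kind of edge case that makes the error handling tricky.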

Automating deployment of .NET Web App to Azure & SQL Azure & Azure Blobs

Nutshell :
I'd like some help / info / resources regarding setting up a Team Build, MSBuild, TFS 2010 automated deployment of my Web App to Azure (inc. all the DB bits).
Ideally I'd like a process that I can fire off from the VS2010 Team Explorer UI ("Queue New Build"), then just keep an eye on its progress, freeing me to work on something else, with options to delve into the logs for any issues, and confidence that my process is robust, complete and totally non-manual, i.e.:
Backs up all my live data (SQL Azure and Azure Blobs)
Deploys any DB schema changes (contained in my DB project)
Deploys any data changes to my core data (e.g. config data etc - which I have in my Post-Deployment scripts)
Does things sensibly (e.g. using compression for deployment packages to save time & bandwidth)
Covers my silly backside (e.g. seamlessly rolling back failed changes)
Keeps the app 100% running during deployments (failed or successful), e.g. sessions left intact, minimal chance of data loss
Keeps detailed logs of the process's progress at each stage, for fixing any issues
Keeps everything that should be source controlled... well, source controlled
Background / Dream / Goal :
At my last FT job we had a pretty sweet automated deployment setup for our hosted web apps, using CC.NET (to manage the process), CCTray and the CC.NET web UI to monitor and control, code generation (CodeSmith + NetTiers templates for data access & entities), MSBuild, VS Database Projects, a few .bat scripts, and some handy utilities like PsExec etc. to help out with little bits and pieces.
I didn't set it up, but have some experience managing it, dealing with issues etc.
It was (98% of the time) a lovely experience to deploy. You'd make sure TFS was up to date, double-click CCTray, right-click on a project and then click "Force Build", sit back and watch Green => Yellow => Green.
Great !!
Current Situation :
I run my own Micro ISV, and my main project is an App on Azure (in Beta).
I'd very much like to replicate the kind of deployment experience I had before - I'm even considering moving out of Azure to dedicated servers - just because I know I can setup an automated deployment system there.
My main sticking point is the DB bits; it seems like a nightmare. Maybe I'm missing some great free tool or library which would do the job. I really hope so, but I could also really do with someone experienced in this pointing me in the right direction for a "best practice" solution that wraps up all the little bits neatly.
I have scoured Google, read and read, burnt hours and hours, but what I seem to find at the moment is half-solutions: not quite right for my project and needs, based on expensive tools I can't afford (near-$0 budget), or plain over my head and a bit incomprehensible and scary.
Now I'm NO Sys Admin, but with enough time I can generally work out what I need to do for these sort of things.
However, I don't have ANY time right now, and the success of my whole project really depends on me being able to cut out the horrendous 40min+ manual deployment wastage I currently endure.
I want to be able to get some user feedback, find a bug, or code an improvement and confidently just fire off a deploy and crack on with something else.
The extra issues with development for Azure in its current state (as opposed to dedicated servers), and the currently fairly poor tooling support from MS (I know there are lots of improvements coming, but I need something right now), have left me swimming in a sea of "I don't know"s and "I'm not sure"s, and it tends to end in one big:
"I give up + a manual deploy for almost an hour + a little sobbing inside as my dream of deployment heaven dies once again" :(
But I know people out there more proficient, knowledgeable and experienced than myself have cracked this one for themselves. I just can’t seem to find the info I am sure is out there.
So if anyone has some good resources, tips, links, comments, or opinions on this, I'd love to hear.
Details Of My Setup :
The app is up and running in Azure (which is in beta, partly due to not having the auto-deploy setup), running in a Production slot. I haven't bothered with a staging slot, as some issues with subdomains / DNS / the auto-generated URL have made that look painful / not feasible.
Azure / App :
The app is:
1 Web Role
- ASP.NET 4
- MVC 2
- EF 4
- SQL Azure
- Azure Blob Storage
1 Worker Role - this runs some scheduled tasks, and works with the same DB and Blob Storage
- SQL Azure
- Azure Blob Storage
The 2 roles communicate via the Azure queue system (or will do shortly)
Locally :
Datacenter 08 (DC) + Hyper-V
- VM for TFS 2010
- VM for a Linux firewall
Dev Box 1 (Win 7)
- VS 2010 / VS 08
- SQL 08 R2 / 05
Dev Laptop 2
- as above.
I tend to run these together all the time (so I never need to stop to wait for anything) with the FANTASTIC free tool Synergy to bind Keyboard, Mouse, Clipboard together.
Some Of The Stuff I've Read :
I have read what I have found, and some of it is great stuff, so I am also posting these links here to help others struggling with this, but none of it quite seems to do the trick. Or maybe I don't get the trick; maybe I'm missing something?
http://deploytoazure.codeplex.com/
How do I manage and publish a database with my MVC2 application on Azure?
How can I automate the "generate scripts" task in SQL Server Management Studio 2008?
http://www.koltovich.com/blog/DeployingAzureProjectFromTFS2010BuildServer.aspx
http://msdn.microsoft.com/en-us/library/ff803365.aspx
http://msdn.microsoft.com/en-us/library/gg432988.aspx
http://www.jimzimmerman.com/blog/2010/03/16/Deploying+An+Azure+Project+Using+TFS+2010.aspx
http://archive.msdn.microsoft.com/azurecmdlets
http://selfpacedazure.web.officelive.com/Documents/Windows%20Azure%20Platform%20Articles%20from%20the%20Trenches.pdf
http://msdn.microsoft.com/en-us/library/gg651132.aspx
http://social.technet.microsoft.com/wiki/contents/articles/overview-of-tools-to-use-with-sql-azure.aspx
http://msdn.microsoft.com/en-us/library/ms178078.aspx
http://blog.syntaxc4.net/post/2011/05/13/Continuous-Integration-in-the-Cloud.aspx
http://blog.syntaxc4.net/post/2009/12/31/Synchronizing-a-Local-Database-with-the-Cloud-using-SQL-Azure-Sync-Framework.aspx
http://social.technet.microsoft.com/wiki/contents/articles/developing-and-deploying-with-sql-azure.aspx
http://blogs.msdn.com/b/tomholl/archive/2011/02/23/using-msbuild-to-deploy-to-multiple-windows-azure-environments.aspx
http://www.scarydba.com/2011/04/25/sql-azure-deployments/
Disclaimer / Forum Abuse Minimisation Blurb :
Like I say, I am NO sys admin, I am NO script magician, and NO CI guru; I am a simple-minded web dev, so please, please be nice if it's mindlessly easy to you, or if I'm stoopidly missing the point. I don't mean to be all "Does You Haz the codes?", but I've basically spent 6 months dreaming that one day soon someone will post a nice, clear, simple blog entry with an "idiot's guide" that solves all my woes, and an hour later I'm in deployment heaven again. But I am still waiting (or Googling badly), and it's breaking my little developer's heart :(
P.S. I promise that If I get a good answer here I'll do my bit for the fantastic SO community and spend at least 8 hours scouring for questions I might be able to help with and contributing back.
Great.
It seems the new SQL Server (code-named Denali), along with the new SQL Server Developer Tools (code-named Juneau), and specifically the 2.0 release of DAC projects, may well have filled the gap between development and deployment to SQL Azure.
The new v2.0 of the DAC framework expands the set of supported objects to full support of SQL Azure schema objects and data types across all DAC services: extract, deploy, and upgrade
From SQL Azure Import/Export
Also see :
Bob Beauchemin's blog post suggesting schema upgrades on SQL Azure are now supported with the new DACImportExportCli.exe utility.
Other suggestions that the new DAC 2.0 solves the major issues with upgrade deployments to SQL Azure.
And it looks like it should all run side by side with my current setup. I'll check it out and update here on progress. Brilliant.
For the database deployments I use Red Gate's SQL Compare, which works well with Azure. There is a command-line edition which can be used as part of an automated build process. Regarding keeping the site always running: deploy to staging, so the production site is never down; once deployed, you can swap staging over to production.
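For the staging-to-production swap, the Windows Azure PowerShell cmdlets (successors to the azurecmdlets linked above) can drive the whole deploy-then-swap cycle. A hedged sketch using the classic service management cmdlets, where the service name, package/config paths and label are all hypothetical:

    # Authenticate once with a downloaded .publishsettings file (assumed filename).
    Import-AzurePublishSettingsFile .\mysubscription.publishsettings

    # Deploy the new build to the Staging slot.
    New-AzureDeployment -ServiceName "myapp" -Slot Staging `
        -Package ".\MyApp.cspkg" -Configuration ".\ServiceConfiguration.cscfg" -Label "Build 123"

    # Once staging checks out, VIP-swap it into Production (near-zero downtime).
    Move-AzureDeployment -ServiceName "myapp"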

What strategy do you use to sync your code when working from home

At my work I currently have my development environment inside a Virtual Machine. When I need to do work from home I copy my VM and any databases I need onto a laptop drive sized external USB drive. After about 10 minutes of copying I put the drive in my pocket and head home, copy back the VM and databases onto my personal computer and I'm ready to work. I follow the same steps to take the work back with me.
So if I count the total amount of time I spend waiting around for files to finish copying in order for me to take work home and bring it back again, it comes to around 40 minutes! I do have a VPN connection to my work from home (providing the internet is up at both sites) and a decent internet speed (8mbits down/?up) but I find Remote Desktoping into my work machine laggy enough for me to want to work on my VM directly.
So in looking at what other options I have or how I could improve my existing option I'm interested in what strategy you use or recommend to do work at home and keeping your code/environment in sync.
EDIT: I'd prefer an option where I don't have to commit my changes into version control before I leave work - as I like to make meaningful descriptive comments in my commits, committing would take longer than just copying my VM onto a portable drive! lol Also I'd prefer a solution where my dev environment stays in sync too. Having said that I'm still very interested in your own solutions even if they don't exactly solve my problem as best as I'd like. :)
A distributed / decentralized version control system will suit your needs: Git, Bazaar, Mercurial, darcs... you have plenty of alternatives.
Use version control software like SVN, SourceOffSite, etc. You just have to check in all your changes and get the latest changes when you want to sync.
Or you can use Windows Live Sync -> https://sync.live.com/foldersharetolivesync.aspx
Hasn't anyone recommended rsync? Use an rsync client to send only the diffs between files; applying those diffs brings your files up to date. For the smallest file transfer it's probably the best option.
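For example (the host and paths are hypothetical, and this assumes an rsync client on both ends, e.g. via Cygwin/cwRsync on Windows):

    # Pull the day's work from the office machine over the VPN:
    rsync -avz --delete user@work-machine:~/projects/ ~/projects/

    # Push it back before the next office day:
    rsync -avz --delete ~/projects/ user@work-machine:~/projects/

The -z flag compresses in transit and --delete keeps the two trees identical, so only changed blocks cross the wire.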
I simply use an external portable notebook drive and do all my work on that. All my PCs have it mapped to the same drive letter, so there's no copying anything. I've not attempted to run VMs this way, but I don't see any reason it shouldn't simply work.
I use Dropbox.
We use Citrix and then I do a remote desktop connection to my PC at work. It is not the fastest solution in the world, but it does eliminate the problem of keeping two or more workstations up-to-date.
Here is a solution I use.
Set up a VPN between the office network and the laptop.
Install the VisualSVN Server
Load all your projects into source control.
When at the office I check out a project, work on it and then check it in. When at home or around the world I connect to the office via VPN, check out my project, do my thing then check it in. Via the VPN connection I can also RDP to my dev boxes and or servers.
Hope this helps. Good luck!
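The day-to-day loop is then just standard Subversion commands (the repository URL and project name here are hypothetical):

    svn checkout https://svn.example.com/svn/MyProject/trunk MyProject   # first time only
    svn update                              # pull the latest changes before starting work
    svn commit -m "Describe what changed"   # push your work back when done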
I either connect remotely to the office SVN, or VPN in and remote desktop my dev or desktop machine and carry on working. It's very rare I sync any files, but when I do it's usually with DropBox (although you can't really do that with large files).
Write a program that synchronizes all your data over the internet and then shuts down your computer. At the end of the day you launch it and go home, and by the time you get home all the data is already there.
We work with a distributed team, so it is vital everyone has easy and secure access to the code repository. For this, we use SVN over SSL/HTTPS. It works great: reliable and secure.
Depending on the VM software you are using, why don't you set up 2 different VM disks: keep your user profile/dev files on one disk, and the OS and other programs that rarely change on the other.
This way you can probably get away with only having to copy the larger disk image when you've installed something new and end up only copying a single virtual disk containing your work.
Just set up an SVN server at home, forward the port on your router, and get on with your life. rsync is also a good, fast solution; just remember to use it over SSH.
I had a similar problem. But fortunately we had a source control server (TFS) configured, so I would work only from the local virtual machines stored on my external drive and then check the required files into TFS as and when required.
You haven't specified the OS and virtualization system, but if you're working with VM images that can be mounted, e.g. Xen on Linux, then you could mount the image and sync it via rsync.
I connect to the office network and download the latest version from SVN, and I use the dev MySQL server, so I am just like another computer on the office network.
I imagine that most of the time spent copying involves the database. Is that right? If so, can you not simply connect to your work DB from home using your VPN connection?
You would still copy your source files (or use a source code control system as others have suggested), but this would only take a fraction of the time.
If all you need is the virtual machine from your work computer, then you could mount a remote share (using NFS or SMB) where your virtual machine files are stored and run the virtual machine from there. This should be faster than using Remote Desktop.
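On Windows that could look like the following (the server and share names are hypothetical):

    # Map the share holding the VM images as a persistent drive letter.
    New-PSDrive -Name V -PSProvider FileSystem -Root "\\fileserver\vm-images" -Persist

    # Then point your VM software at the image, e.g. V:\DevVM\DevVM.vhd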
I also use DropBox, and that is key because it is important to keep it simple.
It is generally better if you can have some type of remote desktop ability, because this will allow you to use a standard workstation configuration, and it will allow for consistent connection to network resources (database server, business servers like workflow, etc).
Working offline, in my opinion, is ok for certain tasks, but overall there are obstacles for systems which connect to other resources (unless you plan to move those resources to your home box).
It was a problem for me too. So, the company bought me a laptop, and I do my work on it, at home or anywhere else.
I have a setup where a folder on one machine is synced to a folder on another machine; any changes to the contents on one machine are also made on the other machine within a minute.
So you could sync the top level folder of your work files, and have then sync to your home machine. What I like about this is that syncing is completely transparent. As far as the user experience goes, I'm simply using the file system. No external app to interact with.
I use Live Sync from Microsoft for this. You'll need to create a Windows Live ID to use the system. It works for Windows and Macs.
Dropbox and Microsoft's Live Sync are good options that have already been mentioned. My personal favorite is Live Mesh, also from Microsoft. The one great feature that puts it above the other two, in my mind, is the ability to specify which folders get synched on which computers, and where the folders are located. So, for example, I synch my Visual Studio 2005/Projects folder between my work machine and my dev box at home, and I synch Visual Studio 2008/Projects between my side gig VM and my home dev box.
I have a MacBook with all my dev software on it; when I go to work, I start it in FireWire target disk mode and plug it into my work Mac Pro with the fast processor, LAN connection, big monitor, etc. This way I never have to leave my user folder, but I have access to all the software and hardware available at work.
Why don't you just use version control? A DVCS?
Here's a (very simple) tutorial on DVCS for Windows users:
http://codicesoftware.blogspot.com/2010/03/distributed-development-for-windows.html
Some ideas:
Use network storage (with SSD cache if speed is a concern), either for your code or to host your VM.
Separate data and OS into two virtual disks in your VM.
Google Drive, OneDrive, Dropbox, etc.
If you use Visual Studio (Code), try the Live Share extension.
Dockerize your environment. Alternatively, keep a bash script of all the setup you did, so you can almost one-click reinstall your dev environment anywhere.
Use a second version control system covering your whole work directory. Commit and push everything before switching environments, then pull and hard-reset to that commit on the other machine (see the sketch below).
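A minimal sketch of that last idea with Git (the "wip" branch name is just an example):

    # Before leaving machine A: commit and push everything, even half-done work.
    git add -A
    git commit -m "WIP: end of day"
    git push origin wip

    # On machine B: fetch and hard-reset the working copy to that commit.
    git fetch origin
    git checkout wip
    git reset --hard origin/wip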