I have this challenge. I am the DevOps engineer (and also a software engineer) on a team where, a few months back, the developers moved from a central Oracle DB to running the DB on a CentOS VM on their individual laptops. The move away from a central DB was meant to reduce dependency on the DBAs and to eliminate issues that stemmed from inconsistent data.
The plan for keeping the database synchronized across the team was that each person would share their change scripts with everyone else. The problem is that we use Skype for communication (we just set up Slack but have yet to start using it fully), and although people sometimes post the text of DB change scripts, those messages can be missed. The other problem is that some developers forget to post their changes at all. Further, new releases are deployed to Production without ever being deployed to the Test and Demo environments.
This has posed a serious challenge for us, especially for me, since I recently became responsible for ensuring that our Demo deployments stay in sync with the Production deployments.
Most of the synchronization issues come down to the database being out of sync because of missing change scripts or missing DB objects. Oracle is our DB of choice.
A typical deployment to the Demo environment is a very painful process: we test the application, and as issues surface due to missing table columns, functions, or stored procedures, we have to hunt down the missing DB objects, apply them to the DB, and then continue until all issues are resolved.
How can I solve this problem to make deployments smooth, painless, and less time-consuming? Would migrating our applications to Docker help with the DB synchronization issues and the associated lack of developer discipline? What process can we put in place to improve in this area?
Thank you very much in advance for your help.
Have a look at http://www.dbmaestro.com
I strongly recommend joining the live demo session.
DBmaestro TeamWork can help you merge changes from multiple DBs into a single shared DB and safely move changes from one environment to another.
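Whatever tool you end up with, the underlying process that fixes this is fairly simple: every schema change becomes a numbered script committed to source control, and every environment keeps a record of which scripts it has already applied. Below is a rough sketch of that idea in Python with cx_Oracle; the directory layout, table name, and connection details are placeholders, not part of any particular product.

```python
import os
import cx_Oracle  # assumes the Oracle client libraries are installed

# Placeholder credentials/DSN -- substitute your own environment's values.
conn = cx_Oracle.connect("app_user", "app_password", "localhost/ORCLPDB1")
cur = conn.cursor()

# One-time setup: a changelog table recording which scripts have run here.
cur.execute("""
    BEGIN
        EXECUTE IMMEDIATE 'CREATE TABLE schema_changelog (
            script_name VARCHAR2(200) PRIMARY KEY,
            applied_on  DATE DEFAULT SYSDATE)';
    EXCEPTION
        WHEN OTHERS THEN
            IF SQLCODE != -955 THEN RAISE; END IF;  -- ORA-00955: table already exists
    END;""")

cur.execute("SELECT script_name FROM schema_changelog")
already_applied = {row[0] for row in cur}

# Apply, in order, every numbered script this environment has not seen yet.
# (Naive split on ';' -- fine for simple DDL scripts, not for PL/SQL bodies.)
for script in sorted(os.listdir("db/changes")):   # e.g. 0042_add_invoice_due_date.sql
    if not script.endswith(".sql") or script in already_applied:
        continue
    with open(os.path.join("db/changes", script)) as f:
        for statement in f.read().split(";"):
            if statement.strip():
                cur.execute(statement)
    cur.execute("INSERT INTO schema_changelog (script_name) VALUES (:1)", [script])
    conn.commit()
    print("applied", script)
```

Run the same runner against every environment (dev laptops, Test, Demo, Production) and "what is missing here?" becomes a query against the changelog table instead of a debugging session.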
Danny
We're planning to go with DNN + 2sxc for an upcoming project for our team.
Normally, when it comes to a CMS, I fly solo, but in a corporate .NET or Java environment it's all about team collaboration, source control, Azure, deployments, etc.
With our upcoming project we’re taking one of our main sites (C#/asp.net/razor) and converting it to DNN.
However, I'm currently unsure how to approach a CMS in a team development environment.
In the development phase, we'll have some guys doing styling, others creating reusable 2sxc content templates, and others building the actual pages, all at the same time, on the same website. In terms of Git/Visual Studio, I'm not sure how this will actually work, especially in relation to the DB. This question obviously applies to all CMSes (not just DNN) in a shared development environment.
What is the best practice to do this?
So I prefer to do most development locally, in my own instance (local IIS and local DB), with each individual project (module, theme/skin) in a separate repository. This keeps the risk of me breaking someone else's work, or someone else causing me pain, minimal.
You can use a tool like PolyDeploy to automate deployment from repository check-ins into that upper environment, requiring individuals to check code into the repository when they're ready to deploy to a test/UAT/prod-type environment.
Where it gets tricky, for sure, is content. I would typically handle that in a test/UAT environment that ultimately gets pushed to production once it is finalized.
I NEVER source control the DNN instance itself; that's just asking for pain.
This can be quite challenging, especially since some parts are user data (which shouldn't be re-deployed to development) and other parts are development artifacts.
There is a minimal guide to this here: https://docs.2sxc.org/abyss/enterprise-development/index.html
I am being asked to extend our production/QA database to include an additional schema reserved for testing. My gut keeps telling me this will lead to no good.
The reasoning I've been given is that it avoids spinning up an additional RDS instance, which will cut costs and increase efficiency. I proposed running these tests on a local instance, or even a micro EC2 instance. Both suggestions were shot down due to the complexity and what I felt were other nonsense reasons.
Before I push back, I am wondering if others have done this with some success. My experience with test databases is that environments should mimic one another as closely as possible, and that each environment should be isolated.
My questions are:
Is a multi-tenant schema the way to go for this? Or is there another shared schema method?
Has anyone heard of running a multi-tenant schema to support both production and testing in the same database?
If so, where might I look for inspiration, examples or how-tos?
What are some of the benefits/pitfalls of taking this approach?
Our small team of 2 people recently got upgraded to 5, which means we should introduce a bit more infrastructure around our project in order to work together efficiently. It's a university research project.
How much administrative effort is it to run
GitLab
Jenkins
A release server
on a rented machine, compared to a SaaS solution, e.g. GitHub with Travis?
Unfortunately, nobody in our (quite academic) team has practical experience with that. I know the setup can be done fast, but how time-consuming is it to keep everything running? Are there other concerns we might be missing?
Of course we would like to mainly work on the project itself, but since the tool stack keeps growing over time, we are not sure whether a SaaS solution is what we need.
With these inputs, be ready for one of the five of you to spend anywhere from one to several weeks on it.
Instead, you might want to check which of the leading hosting companies provide free services for open-source or educational projects.
I've been using MongoDB for about a year now, though not nearly to its full potential.
I've been developing new software out of anyone's eyes except my own, and I've enjoyed the database's flexibility to the fullest, making major structural changes to the data on the fly.
Now that I'm at the point where I have production server(s) and 3 development servers, I'm having a real problem with changing data structures and keeping them in sync.
Theoretically, the development servers should always have the most current data from production. In a structured database, if I rename something, I can just run a compare tool and make the corresponding change in production after a pull. In MongoDB, this can become incredibly difficult: there could be hundreds of changes from document to document, let alone from database to database.
I've been reviewing my ~/.dbshell file to get a feel for the changes I've made, but what about changes made within the program itself? Or configuration database changes?
Are there tools or procedures around to make this easier?
I've spent hours on Google researching how others do it. I came across Mongeez, but it's more manual and tedious than I need. In the past, I've just done a mongodump and mongorestore inside a git directory to transport data, but these snapshots are too rigid. I read a few blog posts about moving new data from production to development, but nothing about pushing updated development documents to production. I could write a comparison script, but I feel like that's reinventing the wheel. There has to be a better way.
TL;DR: What are some ways to version NoSQL data, both new entries and changed data, between environments?
I had a similar problem/experience while managing a few production Mongo machines for about a year.
Two quick pieces of advice:
WiredPrairie is right. Version your documents, and that will allow you to migrate in a casual, relaxed manner. I wish we had done that up front; it's one of my biggest regrets.
We used Groovy to connect and do our schema/data changes, and I loved it. The language is easy to learn and it works great with JSON. My practice was to back up the collections I'd be operating on, write the scripts in dev, run them, and if I messed up, restore the backed-up collections. I'd iterate until I got the scripts perfect and then repeat the process in production.
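To make that concrete, here is roughly what such a migration script can look like, sketched in Python with pymongo instead of Groovy. The database, collection, and field names are invented for illustration, and it assumes documents already carry a schemaVersion field, as WiredPrairie suggests.

```python
from datetime import datetime, timezone
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # hypothetical dev instance
db = client["myapp"]                               # database/collection names are made up

# 1. Back up the collection being touched so a botched run can be restored.
backup_name = "users_backup_" + datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
db["users"].aggregate([{"$out": backup_name}])

# 2. Rewrite every document still on schema version 1; here the (made-up)
#    change splits a legacy "name" field into firstName/lastName.
for doc in db["users"].find({"schemaVersion": 1}):
    first, _, last = doc.get("name", "").partition(" ")
    db["users"].update_one(
        {"_id": doc["_id"]},
        {
            "$set": {"firstName": first, "lastName": last, "schemaVersion": 2},
            "$unset": {"name": ""},
        },
    )
```

If the script turns out to be wrong, restore the collection from the backup copy, iterate in dev until it's right, and only then run it against production, exactly as described above.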
I'm leading a small software development team (4 people), and have just broken ground on a source-controlled SQL Server 2008 database project, with isolated development databases for each developer. I'm still implementing this one step at a time, but I'm envisioning each developer having their own database, with a naming scheme something like <ProjectName>_DEVELOPMENT_<TFSUserName>. This was all recommended per the MSDN articles I've been reading, but someone let me know if that sounds way off.
Anyway, we have a shared application solution that we've been developing for some time. In the past, we had no database version control, and just modified our database directly from SQL Server Management Studio when new reference data needed to be populated, or when we were testing functionality -- one change immediately affected everyone else.
So with this new change, I'm wondering what the best way would be to have each person connect to their isolated development database from the application solution. Prior to isolated databases, our connection to the database was specified in our application's web.config as a connection string. If we're each going to have our own database, the only way I can see it working is for each developer to set their connection string in their local solution to point to their personal database. But changing the web.config will check out that file in the solution, so developers will always have to specifically uncheck that file when checking in application changes to the baseline. Is there a less clunky way for each developer to use their isolated database when doing application testing?
I recommend that you not make the database names username-specific. Instead, give the database the same name for every developer and always reference it via localhost (localhost\<ProjectName>_DEVELOPMENT). Then the same connection string will work for every developer.
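For example, every developer's web.config could carry the identical connection string, so the file never needs a per-person change; the names below are illustrative only, and the Data Source would differ if you run a named instance.

```xml
<connectionStrings>
  <!-- Same on every developer machine; database and key names are illustrative. -->
  <add name="AppDb"
       connectionString="Data Source=localhost;Initial Catalog=MyProject_DEVELOPMENT;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```

The point is simply that the string is identical everywhere, so web.config never gets checked out just to swap databases.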
MSDN's suggestion to use username-specific databases makes more sense for a shared development server; it's definitely not ideal for a per-developer local setup.