We have an ASP.NET Web API application using Entity Framework 6.1.3 and SQL Server 2012. Everything works fine locally with no performance issues, and it also runs very well on a server of ours that runs Windows Server 2012.
When we tried Azure as a cloud platform, we signed up for the free trial and deployed our application, but the performance was very poor, with some queries taking 4-7 minutes. As a side note, our own server is connected to the internet and can be accessed from outside our company's internal network, and performance is still not an issue there; we only faced this problem on Azure.
Any help would be appreciated; we want to invest in Azure, but we fear we will face the same issues we had on our trial subscription.
You are probably using the Basic tier, which has 5 DTUs. Roughly speaking, that means the database can process about 5 transactions per second. Azure offers tiers from Basic up to P11, ranging from 5 DTUs to 1750 DTUs. Additionally, choosing the right region for the data center will help with network speed.
Select "Single database" in the menu to see the tiers and their prices. You can see how many DTUs you are using in the portal, and that way you can choose the right price/performance tier for you. For example, if usage is consistently higher than 80%, increase the tier; if it is below 60%, you can keep the current tier.
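If you prefer checking utilization from a query rather than the portal, Azure SQL Database exposes recent resource statistics in the sys.dm_db_resource_stats view. Here is a minimal sketch using Python and pyodbc (the connection string values are placeholders, and treating the DTU percentage as the maximum of the CPU, data I/O, and log-write percentages is an approximation):

```python
import pyodbc

# Placeholder connection string -- replace server, database, and credentials.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=tcp:yourserver.database.windows.net,1433;"
    "DATABASE=yourdb;UID=youruser;PWD=yourpassword"
)

# sys.dm_db_resource_stats keeps roughly the last hour of history in 15-second slices.
sql = """
SELECT end_time,
       avg_cpu_percent,
       avg_data_io_percent,
       avg_log_write_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;
"""

for row in conn.cursor().execute(sql):
    # Approximate DTU %: the highest of the three resource dimensions.
    dtu_pct = max(row.avg_cpu_percent, row.avg_data_io_percent, row.avg_log_write_percent)
    print(row.end_time, f"~{dtu_pct:.1f}% DTU")
```

If this regularly shows values near 100%, the tier is the bottleneck rather than the application or Entity Framework.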
On Db2 v11.1, how do we set up notifications for the DBA team if there is a hang or slowness situation during off-shift working hours?
The answer depends on the external monitoring and alerting solution you deployed, and how you configure that tooling in your environment.
This application layer tooling is not built into Db2-LUW, although APIs exist in Db2-LUW for such tooling to get the data it needs in order to operate.
IBM and several third parties offer solutions for real-time monitoring and alerting in this space. Many cover app servers, web servers, database layers, networks, and operating-system layers, and offer different degrees of alerting configurability. Many have a plugin-style architecture with plugins for Db2-LUW monitoring. Do not use Stack Overflow for product recommendations, however.
As for "slowness", this is usually only meaningful to measure at the application layer, in terms of response times and other metrics.
For database hangs, IBM offers a db2-hang_detect script that tooling can orchestrate; it requires careful interpretation and even more careful testing.
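Whatever tooling you choose, it ultimately pulls its data from the Db2 monitoring interfaces. As a minimal illustration only (assuming the ibm_db Python driver, a placeholder DSN, and that the SYSIBMADM.MON_CURRENT_SQL administrative view is available in your Db2 11.1 environment; verify the column names against your documentation), a simple polling job could flag long-running statements and hand them to your alerting channel:

```python
import ibm_db

# Placeholder connection details -- replace with your own.
conn = ibm_db.connect(
    "DATABASE=SAMPLE;HOSTNAME=dbhost;PORT=50000;PROTOCOL=TCPIP;UID=dbuser;PWD=secret",
    "", ""
)

# Statements running longer than this many seconds are treated as "slow".
THRESHOLD_SEC = 300

sql = """
SELECT APPLICATION_HANDLE, ELAPSED_TIME_SEC, SUBSTR(STMT_TEXT, 1, 200) AS STMT_TEXT
FROM SYSIBMADM.MON_CURRENT_SQL
WHERE ELAPSED_TIME_SEC > ?
ORDER BY ELAPSED_TIME_SEC DESC
"""

stmt = ibm_db.prepare(conn, sql)
ibm_db.bind_param(stmt, 1, THRESHOLD_SEC)
ibm_db.execute(stmt)

row = ibm_db.fetch_assoc(stmt)
while row:
    # Real tooling would raise an alert (email, pager, chat webhook) instead of printing.
    print(row["APPLICATION_HANDLE"], row["ELAPSED_TIME_SEC"], row["STMT_TEXT"])
    row = ibm_db.fetch_assoc(stmt)

ibm_db.close(conn)
```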
I am starting to use MongoDB and I am developing my first project with it. I cannot predict the volume of clients and usage it is going to receive, but I want to build it from the beginning to handle high volume.
I have heard about clusters and I saw the demonstrations on the official MongoDB website.
And here is my question (split into smaller sub-questions):
Are clusters different servers, or are they just pieces of one big server?
Maybe this seems a bit unrelated, but how do Facebook or other huge databases handle their data across countries? I mean, they have users from Asia and from America. Surely there are different servers, so how does the system know how to host, aggregate, and deliver from the right server? Is it automatic, or is it a tool that a third party supplies to such large databases?
If I am using clusters, can I still just insert the data into the database and Mongo will distribute it across the cluster on its own, or do I have to do that manually?
I have a cloud VPS. Should I continue working with this for Mongo, or should I really consider AWS / Google Cloud Platform / etc.?
And another important thing: I'm from Israel, and the clouds I mentioned above are probably hosted in Europe at best, or even farther away.
That will probably cause high latency, won't it?
Thanks.
I have a challenge. I am the DevOps engineer and a software engineer on a team where, months back, the developers moved from having a central Oracle DB to having the DB on a CentOS VM on their individual laptops. The move away from a central DB was meant to reduce dependency on the DBAs and to eliminate issues that stemmed from inconsistent data.
The plan for sharing the database and keeping it synchronized across the team was that each person would share change scripts with everyone. The problem is that we use Skype for communication (we just set up Slack but have yet to start using it fully), and although people sometimes post the text of DB change scripts, some of them get missed. The other problem is that some developers forget to post their changes. Further, new releases are deployed to Production without being deployed to the Test and Demo environments.
This has posed a serious challenge for us, especially for me, since I recently became responsible for ensuring that our Demo deployments are in sync with the Production deployments.
Most of the synchronization issues come down to the database being out of sync due to missing change scripts or missing DB objects. Oracle is our DB of choice.
A typical deployment to the Demo environment is a very painful process: we test the application, and as issues occur due to missing DB table columns, functions, or stored procedures, we have to track down the missing DB objects, apply them to the DB, and continue until all issues are resolved.
How can I solve this problem to ensure smooth, painless and less time-consuming deployments? Can migrating our applications to Docker help with the DB synchronization issues and the associated lack of discipline of the developers? What process can we put into place to improve in this area?
Thank you very much in advance for your help.
Have a look at http://www.dbmaestro.com
I strongly recommend joining the live demo session.
DBmaestro TeamWork can help you merge the changes from multiple DBs into a single shared DB and safely move the changes from one environment to the other.
Danny
Can someone give me some options for how I can connect a PostgreSQL database to Power BI?
So far, I have used Power BI Desktop and drivers to connect to my local database. I then published the data to Power BI for users to access and set up a daily refresh schedule with a Personal Gateway installed. This worked fine.
My issue is that my users now want refreshes every 30 minutes instead of daily and Power BI only allows 8 refreshes per day. This seems like it would require a live connection. My only Windows machine is quite weak and I live across the world from my end-users, so my only option is to set up a remote server.
I have an Azure Linux VM, which I would prefer to use, but Power BI does not work on Linux as far as I can tell.
My ETL pipelines and database are all based on PostgreSQL, and I do not want to switch over to MS SQL or the Azure database product if I can avoid it.
Should I create a Windows-based VM on Azure, install PostgreSQL there, and then replicate the required tables for Power BI to visualize? What is the best setup? I did not see any option on the Power BI website to connect live to Postgres, so I am a bit concerned.
This is an old question, so you've probably figured out a workaround, but just to confirm:
No, Power BI does not offer a live connection to PostgreSQL at the moment. You can see the current list of what Power BI does live connect to here: https://powerbi.microsoft.com/en-us/documentation/powerbi-refresh-data/#live-connections-and-directquery-to-on-premises-data-sources
If a live connection to PostgreSQL is important to you, I would recommend posting an idea at https://ideas.powerbi.com/ (or up-voting someone else's idea - though I don't see one right now). Microsoft does review these ideas. I'd also recommend sharing the link here, so others searching for how to do this can up-vote the same idea.
In the meantime, a couple of different workarounds:
Even though you can't automate refreshes as often as you'd like, you can do additional manual refreshes. You can initiate the refresh yourself, or you can suggest end-users click the refresh button to get the latest data.
If you don't want to manually refresh, you could look into a 3rd party tool such as Power Update (http://poweronbi.com/power-update-features/). I've never used it before, but it can refresh a Power BI Desktop file and publish it up to the service. This would have the same effect as a manual refresh, but automated.
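If you are comfortable scripting, the Power BI REST API also exposes a dataset refresh endpoint you can call yourself, although refreshes triggered this way still count against the plan's daily refresh limits as far as I know. A minimal sketch, assuming the requests library, an Azure AD access token you have already acquired, and placeholder workspace and dataset IDs:

```python
import requests

# Placeholder IDs and token -- replace with your own workspace (group), dataset,
# and an Azure AD access token that has dataset write permission.
GROUP_ID = "00000000-0000-0000-0000-000000000000"
DATASET_ID = "11111111-1111-1111-1111-111111111111"
ACCESS_TOKEN = "eyJ..."

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)

# POSTing to the refreshes endpoint asks the service to start a refresh now.
resp = requests.post(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()
print("Refresh requested, status:", resp.status_code)  # 202 Accepted on success
```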
Note: This question was also asked (and answered) here: https://community.powerbi.com/t5/Integrations-with-Files-and/DirectQuery-for-PostgreSQL-Gateways-on-Linux/td-p/103418.
Since the August 2019 release of Power BI, there is now a DirectQuery connection for Postgres.
https://powerbi.microsoft.com/en-us/blog/power-bi-desktop-august-2019-feature-summary/
For any future viewers of this question: I'm working on building and maintaining a custom connector for exactly this purpose. So far I've been able to access most features except those which require datetime adds or diffs. We do have this working in our production environment with Postgres 11 via an enterprise gateway.
Repo:
https://github.com/sgoley/DirectQuery-for-ODBC-in-PowerBI
Please feel free to reach out to me if you'd like to help me resolve any outstanding bugs or just learn more.
A how-to is available on Medium here:
https://medium.com/just-readr-the-instructions/directquery-with-postgres-from-powerbi-desktop-f3d8c4dc5e15
Edit: As of August 2019 release, PowerBI will be supporting Direct Query in the native PostgreSQL connector:
https://powerbi.microsoft.com/en-us/blog/power-bi-desktop-august-2019-feature-summary/#postgresql
I'm a CS student, just exploring the SCM space. While doing my own research I came across many different hosted solutions (GitHub obviously, Lighthouse, YouTrack, TeamCity, etc.) - do you think it is actually reasonable to try to host a (commercial, closed source) project entirely in the cloud?
Let's say I'd host code on GitHub, use Jira or Lighthouse for issue tracking, God knows what other hosted PM solution (Basecamp?) and build using EC2 (I can put Hudson or TeamCity on it and use appropriate EC2 plugins for these products to get more computing power when needed).
Is the EC2 bill going to kill me (compared to self-hosted solutions)? Do you think "the cloud" is still not reliable enough?
This is the way we work at our company. Our version control system (Git), agile planning, ticket system/bug tracker, and wiki are hosted at http://www.assembla.com for $49/month for 40 users with private repositories ( https://www.assembla.com/plans ), and we have a micro instance on Amazon AWS EC2 where Jenkins, Nexus, Sonar, and some internal tools run for free the first year; after that you should expect to spend around $80/month for the same service.
So it costs $129/month for a full cloud solution for a small company (40 users max): reliable, with a good release train of new features from our service providers, and with a low maintenance footprint for us.
Compared to self-hosted, it's not really expensive when you consider the following costs:
- price of your server (let's say $1000)
- electricity bills (let's say $30/month for 100% uptime)
- cost of configuration (to get the same quality as Assembla, for example) and maintenance (let's say 0.5 man-days per month at $500/day in France)
The cost comes to about $363/month.
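As a quick sanity check of that figure, it appears to assume the server cost is spread over one year (that amortization period is my assumption, not stated above):

```python
# Rough self-hosted monthly cost, assuming the $1000 server is written off over 12 months.
server_per_month = 1000 / 12        # ~83 $/month
electricity_per_month = 30          # $/month
maintenance_per_month = 0.5 * 500   # 0.5 man-days/month at $500/day = 250 $/month

total = server_per_month + electricity_per_month + maintenance_per_month
print(round(total))  # ~363 $/month
```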
This may look a bit biased, but in the end it's what we experienced.
Regards,
Xavier
There is no problem with using the cloud for hosting, and many large companies do so already. I think Netflix recently moved entirely to EC2. Our whole business runs on EC2, and it's been relatively good so far.
The EC2 bill is up to you to manage -- cloud is all about granular billing for services, and the more you consume the more you pay (we sell a tool that helps with cost controls: http://LabSlice.com). Your biggest cost will usually be CPU power, so stick to the Micro/Small instances until you've got a handle on costs.
It's interesting that people question the reliability of the cloud, as the underlying premise is actually to provide more reliability than businesses could afford themselves (high scalability, immediate availability of hardware, monitoring, load balancing, etc.).
You can make use of an AWS free account and host your application. If you exceed the free account usage limit, you will be charged for whatever extra you have used.
Regarding cloud reliability, every big firm, such as Amazon, Microsoft, IBM, and HP, is moving towards the cloud because they have found it reliable, cost-effective, and green.
Given you're a student and assuming you're looking to spend little money, a lot of Git and SVN hosting providers offer free hosting for students, or free accounts if you're a small team with minimal storage requirements. Check out Codesion's student offering, for example (disclaimer: I work for Codesion). This plan also comes with Trac / Bugzilla for your PM requirements. I wouldn't be concerned about security and reliability, for the same reason that Simon points out above.
As for CI on EC2: this is probably your best bet since you pay by the hour for each instance that is running. I'd recommend using Amazon's API to fire up an instance each time Hudson needs to perform a build, store the results of the build on more permanent storage, and shut the instance down when finished. If you're doing a lot of CI builds, it might be better to just keep the instance running, but that will of course cost you more.
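As a minimal sketch of that fire-up/shut-down idea, using today's boto3 Python SDK rather than whichever API client your CI plugin provides (the AMI ID, key name, and region here are placeholders):

```python
import boto3

# Placeholder region and build-agent AMI -- replace with your own values.
ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch a throwaway build agent when a build is queued.
resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # pre-baked image with the build tools installed
    InstanceType="t3.small",
    MinCount=1,
    MaxCount=1,
    KeyName="ci-key",
)
instance_id = resp["Instances"][0]["InstanceId"]

# ... the CI server runs the build on this instance and copies artifacts
# to durable storage (e.g. S3) ...

# Terminate the instance when the build is finished so it stops billing.
ec2.terminate_instances(InstanceIds=[instance_id])
```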