PostgreSQL server not shown on Azure application map

I'm trying to use Application Insights to monitor an application composed of different microservices in an AKS (Azure Kubernetes Services) cluster.
As AKS does not support the auto-instrumentation scenario, I had to instrument my JS/.NET services myself with the dedicated libraries.
And this works fine, I can see my different microservices on an application map.
But I can't see my database server among the dependencies, as in the documentation's example, even though those dependencies should be collected automatically, as stated in the dependency documentation.
I'm using Azure Database for PostgreSQL - Flexible Server. Is this normal? Is it because I am using PostgreSQL instead of SQL Server? Is it related to the fact that I'm using Npgsql instead of SqlClient?
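If the PostgreSQL calls aren't picked up automatically, one workaround is to report the dependency telemetry yourself so the database node still shows up on the application map. A minimal sketch, assuming the Node.js applicationinsights SDK and the pg driver; the connection strings, server host and query are placeholders:

```typescript
// Hypothetical sketch: manually reporting a PostgreSQL call as a dependency
// with the Node.js "applicationinsights" SDK, so the database appears on the
// application map even when auto-collection misses it. The connection
// strings, host and query below are placeholders.
import * as appInsights from "applicationinsights";
import { Client } from "pg";

appInsights.setup(process.env.APPLICATIONINSIGHTS_CONNECTION_STRING).start();
const telemetry = appInsights.defaultClient;

async function getOrders(): Promise<unknown[]> {
  const pg = new Client({ connectionString: process.env.PG_CONNECTION_STRING });
  await pg.connect();

  const start = Date.now();
  let success = true;
  try {
    const res = await pg.query("SELECT * FROM orders LIMIT 10");
    return res.rows;
  } catch (err) {
    success = false;
    throw err;
  } finally {
    // The "target" is what the application map uses to draw the database node.
    telemetry.trackDependency({
      target: "myserver.postgres.database.azure.com", // placeholder host
      name: "SELECT orders",
      data: "SELECT * FROM orders LIMIT 10",
      duration: Date.now() - start,
      resultCode: success ? 0 : 1,
      success,
      dependencyTypeName: "postgresql",
    });
    await pg.end();
  }
}
```

The .NET services could do the equivalent with TelemetryClient.TrackDependency if the Npgsql calls aren't collected automatically there either.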

Related

How to deploy a next.js + mongo app to AWS (or any other service like G Cloud)?

I have some experience developing in JS but almost none in DevOps, and there's a lot of documentation but I don't really know where to start.
I built a Next.js app (both frontend and backend) connected to MongoDB. They run fine locally using docker-compose. Now I would like to deploy them to AWS, also because I need to store files needed by the application on S3.
What services do I typically need? Should I deploy my app to EC2, use AWS Amplify, or another service like Google Cloud?
Can I deploy my images to EC2 just as they are, including Mongo? Or should I, for example, deploy only Next.js and connect it to a managed MongoDB, which I suppose is an additional cost?
I know it is a pretty generic question; if you can just point me to the tools I need to manage the whole deployment process, I'll find out how to use them. Currently all the code (including the Dockerfile and docker-compose.yml) is on GitHub.
This is probably not the perfect answer since the question is very general and AWS provides a lot of features but I'll give it a go.
For the JS app you could use AWS Elastic Beanstalk, which makes it easy to set up web applications because it creates all the resources (EC2 instances, load balancers, etc.) for you. Since you're new to AWS, you can check this service out instead of creating EC2 instances manually. Even if you use Elastic Beanstalk, you will still have access to the EC2 instances and other resources it creates, and you'll get exposure to the various services that can help speed up your application.
For the files/images, S3 would be a great choice. However, depending on how frequently the data is accessed, I would look at the different S3 storage classes as well as backup options.
As for your DB, MongoDB would work, but you'd need to run it on an EC2 instance and maintain it yourself. AWS has managed database options, such as DynamoDB, but it all depends on the tools you require, your budget, etc.
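Whichever option you choose, it helps to read the MongoDB connection string from an environment variable so the same code works against the local docker-compose container now and a managed MongoDB later. A minimal sketch, assuming the official mongodb driver; the variable names and database name are placeholders:

```typescript
// Hypothetical sketch: read the MongoDB connection string from an environment
// variable so the same Next.js code works against the local docker-compose
// container and a managed MongoDB later. MONGODB_URI and MONGODB_DB are
// placeholder variable names.
import { MongoClient, Db } from "mongodb";

const uri = process.env.MONGODB_URI ?? "mongodb://localhost:27017";
const client = new MongoClient(uri);

let db: Db | null = null;

export async function getDb(): Promise<Db> {
  // Reuse a single connection across API route invocations.
  if (!db) {
    await client.connect();
    db = client.db(process.env.MONGODB_DB ?? "app");
  }
  return db;
}
```

Locally, MONGODB_URI would point at the docker-compose service; on AWS it would point at the managed instance, with no code changes.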

How can Spring Cloud Dataflow Server use new tables (with a custom prefix) created for Spring Batch and Spring Cloud Task?

I have created the Spring Cloud Task tables (i.e. TASK_EXECUTION, TASK_TASK_BATCH) with the prefix MYTASK_ and the Spring Batch tables with the prefix MYBATCH_ in an Oracle database.
The default tables are also present in the same schema; they were created automatically or by another teammate.
I have bound my Oracle database service to the SCDF server deployed on PCF.
How can I tell my Spring Cloud Dataflow server to use the tables created with my prefix to render data on the Dataflow server dashboard?
Currently, the SCDF dashboard uses the tables with the default prefix to render data. That works fine, but I want it to use my tables for the SCDF dashboard screens instead.
I am using Dataflow server version 1.7.3, deployed on PCF using a manifest.yml.
There's an open story to add this enhancement via spring-cloud/spring-cloud-dataflow#2048.
Feel free to consider contributing or share use-case details in the issue.
Currently, spring-cloud-dataflow and spring-cloud-skipper use Flyway to manage the database schemas, and it's not possible to prefix the table names. Trying to support this would add too much complexity, and I'm not even sure it would be possible.

Is it possible to launch a NoSQL cluster with DynamoDB locally (downloadable) and not in Amazon AWS?

I am not very familiar with DynamoDB, and I would like to launch a NoSQL database with local DynamoDB (the downloadable version), not hosted on Amazon AWS. I would appreciate it if someone could let me know whether it is possible to create such a cluster, i.e. whether the downloadable version of DynamoDB can be clustered locally.
You can very easily run DynamoDB locally, but it only supports running a single instance—not a cluster. It's intended to be used for local testing/debugging.
DynamoDB is provided as a hosted service. There is no DynamoDB code that you can download and install to act as a host or service provider yourself.
As part of the SDKs for many languages, the AWS team developed wrappers that let you run a local version of DynamoDB to test your code. These wrappers respect the DynamoDB API contract, so you can code against the DynamoDB interface and get responses as if it were hosted in the AWS environment. But you can't host a database or serve data as a service using these solutions.
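In practice the local version boils down to pointing the standard SDK at a local endpoint. A minimal sketch, assuming the AWS SDK v3 for JavaScript and DynamoDB Local running on its default port 8000; the region and credentials are dummy placeholders:

```typescript
// Hypothetical sketch: pointing the standard AWS SDK (v3 for JavaScript) at a
// locally running DynamoDB instance. The endpoint, region and credentials are
// placeholders; DynamoDB Local listens on port 8000 by default.
import { DynamoDBClient, ListTablesCommand } from "@aws-sdk/client-dynamodb";

const client = new DynamoDBClient({
  endpoint: "http://localhost:8000", // local instance instead of AWS
  region: "local",
  credentials: { accessKeyId: "dummy", secretAccessKey: "dummy" },
});

async function main(): Promise<void> {
  const tables = await client.send(new ListTablesCommand({}));
  console.log(tables.TableNames);
}

main().catch(console.error);
```

Removing the endpoint override (and supplying real credentials) makes the same code talk to the hosted service.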

RESTful services and MySQL deployment in the cloud

I have developed RESTful services with ASP.NET Web API 2.0 and MySQL.
What are my options to deploy this to the cloud? I don't want a complete EC2 instance or an Azure virtual machine.
Are there any cloud platform services where I can get only an IIS server and a MySQL database?
See below for good links on Azure and AWS options. Since you mention IIS, Azure may be your best bet. Keep in mind you should try to keep your API and DB in the same cloud data center to improve performance and reduce ingress and egress costs.
From an Azure perspective:
Take a look at their MySQL as a service offering (in preview)
And then you can host your code in a couple of ways.
ASP.NET in an App Service
An Azure Function
Using a combination of the above you can leverage PaaS and avoid having to manage your own VMs.
Further, look into using a consumption plan to pay only for what you use.
From an AWS perspective:
Use Amazon RDS (MySQL)
Use Lambda to host your API
Again, you won't need to manage servers here either; a sketch of this pattern follows below.
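As an illustration of that Lambda + RDS (MySQL) pattern, here is a minimal sketch in TypeScript with the mysql2 driver. The question's stack is ASP.NET, so this is only meant to show the shape of the serverless approach; the host, credentials and query are placeholders read from environment variables:

```typescript
// Hypothetical sketch of the Lambda + RDS (MySQL) pattern described above,
// shown in TypeScript rather than ASP.NET purely to illustrate the idea.
// Host, credentials, table and column names are placeholders.
import { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";
import { createPool } from "mysql2/promise";

// Create the connection pool outside the handler so warm invocations reuse it.
const pool = createPool({
  host: process.env.DB_HOST, // RDS endpoint
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME,
  connectionLimit: 2,
});

export const handler = async (
  _event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  const [rows] = await pool.query("SELECT id, name FROM products LIMIT 10");
  return {
    statusCode: 200,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(rows),
  };
};
```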

Does Azure support things like MongoDB and Redis?

Can you use MongoDB and Redis/Memcached with Azure?
I'm guessing no, but I just want to make sure.
It turns out they do support things other than .NET; are they using Linux servers then?
You can very easily run MongoDB in Windows Azure. I presented this at MongoSV - video here.
EDIT: In December 2011, 10gen published their official MongoDB+Azure code on github. This contains a project for replica-sets, as well as a demo ASP.NET MVC application (taken from the Windows Azure Platform Training Kit) that uses a replica set for its storage.
Standalone servers are straightforward, except that you have to deal with scale-out: you can't have multiple instances of a standalone server running simultaneously, so you'll need to plan for this, either by taking all but one out of the load balancer or by only launching mongod if you can acquire the Cloud Drive lock.
Replica sets are doable, as I demonstrated at MongoSV. However, I didn't cover the intricacies of gracefully shutting down a replica set to ensure zero data loss.
You can run memcached as well - see David Aiken's post about this. Note: now that the AppFabric Cache service is live, you should look into the pros/cons of using that over memcached. Cost-wise, AppFabric Cache should cost much less, as you don't have to pay for role instances to host your cache. More info about AppFabric Cache here.
You now also have the option of running Redis in Windows Azure on Linux virtual machines! In the case of Redis, this allows you to use the "official" build instead of the "unsupported" Windows build. For MongoDB, the choices seem equally valid: running on Linux virtual machines, on "plain" Windows virtual machines, or using 10gen's package to run on "managed" VMs (Cloud Services).
FYI, there's now a Redis installer for Windows Azure available from MS Open Tech (my team). Here's a tutorial on how to use it: http://ossonazure.interoperabilitybridges.com/articles/how-to-deploy-redis-to-windows-azure-using-the-command-line-tool
[UPDATE] Azure now supports MongoDB and Redis.
http://azure.microsoft.com/blog/2014/04/22/announcing-new-mongodb-instances-on-microsoft-azure/
http://azure.microsoft.com/en-us/services/cache/
In the Azure Store you can now select Redis Cloud as an add-on.
Here's the Azure Store description:
"Redis Cloud is a fully-managed cloud service for hosting and running Redis in a highly-available and scalable manner, with predictable and stable top performance. Tell us how much memory you need and get started instantly with your new Redis database."
PUBLISHED DATE 3/31/2014
You can access the store by selecting the "New" button in the Azure portal, then "Store". I have yet to use it, but it looks promising.
Azure now has a first-party Redis service, currently in preview:
http://azure.microsoft.com/en-us/documentation/articles/cache-dotnet-how-to-use-azure-redis-cache/
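The first-party service speaks the standard Redis protocol, so an ordinary Redis client works against it. A minimal sketch, assuming the Node.js redis client and the TLS endpoint on port 6380; the host name and access key are placeholders:

```typescript
// Hypothetical sketch: connecting to an Azure Redis Cache instance with the
// standard "redis" client for Node.js. REDIS_HOST and REDIS_ACCESS_KEY are
// placeholders; Azure exposes the TLS endpoint on port 6380.
import { createClient } from "redis";

const client = createClient({
  url: `rediss://${process.env.REDIS_HOST}:6380`, // rediss:// = TLS
  password: process.env.REDIS_ACCESS_KEY,
});

async function main(): Promise<void> {
  await client.connect();
  await client.set("greeting", "hello from azure");
  console.log(await client.get("greeting"));
  await client.quit();
}

main().catch(console.error);
```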