Google Cloud SQL: Change to second generation possible?

I have a working system using the Google App Engine including a Cloud SQL Database instance of the first generation. Is it possible to switch the database instance to the second generation?
Thanks!

No, it is not possible to change the type of an existing instance; a new instance must be created.

On top of the previous answer: you can use the Export/Import functions, available by default, to transfer the data to the new instance.
Note that a lot of work on your code may be required if you move to the second generation.
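
For context, the Export function writes an ordinary SQL dump to a Cloud Storage bucket, and the Import function replays it on the new instance; conceptually the transferred file is just plain DDL and DML along these lines (the table and data here are purely illustrative):

-- Illustrative only: the kind of plain SQL an Export dump contains,
-- which Import then replays on the new second-generation instance.
CREATE TABLE customers (
  id   INT PRIMARY KEY,
  name VARCHAR(100) NOT NULL
);

INSERT INTO customers (id, name) VALUES
  (1, 'Alice'),
  (2, 'Bob');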

Related

Updating Application data in a Qlik Sense Application through Extension

Is it possible to change the application data from an extension?
I am creating a visual extension (a table) in which, if I change the value of a cell, I should be able to change the value at the application level (not at the database level). How can I achieve this?
Changing the value in Qhypercube.[].qDataPages.qDataPages... only changes the value at the extension level.
I think the problem here is data persistence, since Qlik Sense itself is not a data warehouse or a true "data store" in the traditional sense. When you load data from a database into an app and it goes through the app's load script, it's then cached to the underlying QVF file for the app. Updating the data would need to happen at either the source level (the database in this case), an intermediary store like a QVD, or "on the fly" via variables and chart scripting. Those first two options are persistent and that third one is not.
That's why if you look at other similar Qlik extensions that enable users to input data, they are "writeback" solutions, as they update the underlying database that the app is pulling from. You can find a few examples of those here, here, and here.
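
At its core, a writeback extension just issues an ordinary UPDATE against the source database and then triggers a reload so the change flows back through the load script. A minimal sketch, with hypothetical table and column names:

-- Hypothetical writeback: persist the value the user typed into a cell.
UPDATE sales_targets
SET target_value = 42000   -- new value entered in the extension
WHERE region = 'EMEA';     -- key identifying the edited row
-- A (partial) reload of the app then picks the change up from the source.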
A few existing ones also take the approach of outputting to QVDs, which could be your best bet if you want to avoid updating a database. See this one as an example, as well as their implementation docs here.
You could probably achieve all of this with a combination of:
Getting the hypercube of your (updated) table (more info)
Creating a session app (more info)
Writing to a new or existing QVD (more info)
(Partially) reloading the current app (more info)
This would all depend on the Update rights of the users of the app, though.

PostgreSQL in Node-RED

I use Node-RED to create an API on a server. I want to read and send data via HTTP, using the browser-based programming method. I want to send data from a PostgreSQL database, so I installed the package node-red-contrib-postgres-multi. I don't know how to put data into the database or how to read data from it, because I cannot find any examples.
Does anybody know how I can do that?
You can use the postgrestor package, where you can define a query or data-modification statement directly in the node window. It also allows parameterized queries via a template engine.
The problem I've encountered is working with more than one database instance in the same flow, but if you only need a single database connection it should work for you.
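
For the SQL side, the statements are ordinary PostgreSQL; the node simply executes whatever you attach to the incoming message (check the node's README for the exact message structure and parameter syntax, which differs between the various postgres nodes). A minimal sketch with made-up table and column names:

-- One-time setup: a table to write sensor readings into.
CREATE TABLE IF NOT EXISTS readings (
    id       serial PRIMARY KEY,
    sensor   text NOT NULL,
    value    numeric,
    taken_at timestamptz DEFAULT now()
);

-- Put data into the database (values would come from the message in the flow).
INSERT INTO readings (sensor, value) VALUES ('temp-1', 21.5);

-- Read data back, e.g. to answer an incoming HTTP request.
SELECT sensor, value, taken_at
FROM readings
ORDER BY taken_at DESC
LIMIT 10;

A typical flow wires an http in node to a function node that attaches the query to the message, then to the postgres node, and finally to an http response node.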

Load a PostgreSQL database using CloudConnect

Alongside my GoodData project, I maintain a small PostgreSQL database that contains a few tables.
I would like to integrate both of my ETL processes using the same tool, and CloudConnect seems the easiest way, since my whole GoodData ETL is already in it.
Here are the ways I tried to do it without success:
I had a look at the documentation, and it seems that the CloverETL components that would enable this (DBOutput, PostgreSQLDataWriter) are not available in CloudConnect.
I managed to connect to the Agile Data Warehousing Service (the database attached to GoodData), but it seems that only the ADS database understands the request:
COPY MyDataBaseTable (field1,field2) FROM LOCAL '${DATA_TMP_DIR}/CIforADS.csv'
even when I adapt the syntax to PostgreSQL, because the dynamic addressing I use here does not seem to work.
Is there any way to proceed that I'm missing? Can anyone think of a workaround?
In general this could be achieved with the "DBExecute" component, but I'm not sure I understand the goal correctly - do you want to load data into your own PostgreSQL instance using CloudConnect?
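
If the target really is your own PostgreSQL instance, note that COPY ... FROM LOCAL is Vertica syntax (ADS is Vertica-based), which is why only ADS accepts the statement above. Issued through DBExecute (or any plain SQL channel), a PostgreSQL-flavored version would look roughly like this; the path is illustrative, since PostgreSQL's COPY reads files on the database server, not the client:

-- PostgreSQL equivalent of the Vertica-style COPY from the question.
-- The database server must be able to resolve the file path itself.
COPY MyDataBaseTable (field1, field2)
FROM '/tmp/CIforADS.csv'
WITH (FORMAT csv);

-- For a file on the client side, psql's \copy variant reads it locally:
-- \copy MyDataBaseTable (field1, field2) FROM 'CIforADS.csv' WITH (FORMAT csv)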

How to create/insert a product programmatically in WebSphere Commerce 7 (WCS7)

I'm developing an e-commerce site based on WebSphere Commerce 7 (WCS7). I need to import products from an external supplier, which exposes a web service. I've already implemented a Controller Command that performs all the operations needed to extract the products from the remote service, and I have them available as custom Java classes.
I'm a little confused about the approach I should follow here. I've defined the attributes needed in my scenario and used the dataload utility to import them into the DB. What should I do next? I expect to be able to "create" WCS products programmatically from my Controller Command, but I don't know how to use the attributes I've defined in a programmatic insert.
Can someone point me in the right direction on how to perform this kind of operation? I went through the documentation but, given that I'm quite new to the WCS environment, I don't know how to proceed according to current best practices.
It is possible to create a new catalog entry programmatically if you copy what is done in the LOBTools, though I have not done this myself. I have always added new products via the data loads; when we did need to add products from an external service, I just output the information to a file and loaded it along with our other products. The reason was to keep the catalog in sync with the product management system.
Have a look at the various DataBean classes in WCS, like CatalogEntryDataBean.
See here for WCS data beans: Link
And here for "activation" of a DataBean: Link

Set GeoServer to access a PostgreSQL database (Simple or Snapshot schema) populated by Osmosis

I have a PostgreSQL database which I keep updated using Osmosis. Osmosis can write to two different database schemas, named Simple and Snapshot. They are not that different from the database layout GeoServer uses, but I can't get GeoServer to use them properly.
The main problem seems to be the way tags are stored in those schemas. I can add the nodes layer and display it with the default Points style, but as soon as I use an "ogc:Filter" in my style to filter the nodes by their "place" tag, the WMS breaks and does not respond (it says: The requested Style can not be used with this layer. The style specifies an attribute of place and the layer is: TestDB:nodes).
Is there any way to make GeoServer understand one of those schemas, or to make Osmosis write to the database layout GeoServer knows?
This is a case for using TRIGGERs to manage the integration. The two programs use two different schemas. You can CREATE TRIGGERs in the database which ensure that data written by one application is made available to the other. Another option is to have one or both sides use VIEWs populated in part by the other application; in PostgreSQL, a VIEW can have triggers attached, so the two approaches are not mutually exclusive.
This is, in any case, a potentially large project, so rather than offering a full implementation, I will offer a general outline of the sorts of things you need to think about (with a minimal sketch after the list):
Are these needs generally applicable? If so, do you want to start an open-source integration project?
Are both of these read-only workloads, or does the data ever update? In general, if you are going to use views, updates pose the biggest concerns, so you want to run the views on the side that is not doing the updates.
What is the write model on each side? Insert/update? Append-only? Static data? What data do you have to "replicate" between the schemas?
Once you have those answers it should be relatively straightforward to get started and to ask for help (either as an open-source project or here) wherever you get stuck.
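
As a concrete starting point, here is a minimal sketch of the view and trigger ideas, assuming the Snapshot schema (where tags live in an hstore column on the nodes table) and hypothetical names for everything on the GeoServer side:

-- Expose the "place" tag as a plain column so a GeoServer style can filter on it.
CREATE VIEW places AS
SELECT id,
       tags -> 'place' AS place,
       geom
FROM nodes
WHERE tags ? 'place';

-- Alternatively, copy rows into a separate table laid out the way GeoServer
-- expects, kept in sync whenever Osmosis inserts into the source table.
-- (UPDATE/DELETE triggers would follow the same pattern.)
CREATE SCHEMA geoserver;

CREATE TABLE geoserver.place_nodes (
    id    bigint PRIMARY KEY,
    place text,
    geom  geometry(Point, 4326)
);

CREATE FUNCTION sync_place_nodes() RETURNS trigger AS $$
BEGIN
    INSERT INTO geoserver.place_nodes (id, place, geom)
    VALUES (NEW.id, NEW.tags -> 'place', NEW.geom);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER place_nodes_sync
AFTER INSERT ON nodes
FOR EACH ROW
WHEN (NEW.tags ? 'place')
EXECUTE PROCEDURE sync_place_nodes();

GeoServer can then publish the view (or the synced table) as a layer whose place attribute the SLD filter can reference.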