I have a problem with my Windows account. It disappeared and I can't find a way to recover it, so I can't access the data on my computer, and my jobs are not up in the cloud. Is there any chance I can recover my jobs through my Talend account?
Thank you!
If you are using Talend Open Studio, the source code of your jobs is stored only on your local disk.
With the subscription version of Talend, you would be using a remote repository (SVN or Git), where your jobs are stored.
With either version, your jobs are never stored outside these locations, so a Talend account is of no use.
You may have a chance of restoring the files from disk with data recovery software.
If your disk (HDD) is not encrypted, you can also connect it to another computer and read your data.
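As a minimal sketch of that last option, assuming you attach the old drive to a Linux machine where it shows up as /dev/sdb with the Windows partition on /dev/sdb2 (check with lsblk), and assuming your workspace is in a typical Talend Open Studio location (adjust the path to your setup):

```
# Identify the old disk and its partitions after plugging it in
sudo lsblk -f

# Mount the Windows partition read-only so nothing on it can be overwritten
sudo mkdir -p /mnt/olddisk
sudo mount -o ro /dev/sdb2 /mnt/olddisk

# Copy the Talend workspace off the disk (path is a common default, not guaranteed)
cp -a "/mnt/olddisk/Users/<your-user>/workspace" ~/recovered-talend-workspace
```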
I had 10-15 microservice databases running in production on an Ubuntu server. I accidentally deleted everything in the /var/lib/postgresql/** folder with the command sudo rm -r *. I think PGDATA is inside the /var/lib/postgresql/13/ folder.
I tried TestDisk to restore this folder, but it showed everything as deleted except the 13/ folder.
I only have backup files from a long time ago.
Is there any way to restore the latest data?
If you don't have a backup of the deleted files and TestDisk was not able to recover them, you may want to try another data recovery tool such as extundelete or PhotoRec. These tools work by scanning the partition and looking for data that is no longer referenced by the file system, which can include deleted files.
It's important to note that the chances of successfully recovering deleted files decrease as more time passes and more activity occurs on the partition, so it's best to try to recover the files as soon as possible after the deletion. In addition, the more you use the partition after the deletion, the more likely it is that the deleted data will be overwritten, making it impossible to recover.
If you are unable to recover the deleted files using these tools, you may want to consider seeking the assistance of a professional data recovery service. These services typically have specialized equipment and expertise that can be used to recover data from damaged or formatted disks. However, these services can be expensive, so it's important to weigh the value of the data against the cost of recovery.
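A rough sketch of how that might look, assuming the affected filesystem is ext4 on /dev/sda1 and that you work from a rescue/live system with a separate disk mounted at /mnt/usb (all device names and paths here are placeholders):

```
# If still booted into the affected system: stop writes immediately
sudo systemctl stop postgresql

# Image the partition first, so recovery attempts never touch the original
sudo dd if=/dev/sda1 of=/mnt/usb/pgdata.img bs=4M status=progress

# ext3/ext4 only, filesystem must be unmounted: restore the deleted tree by path
sudo extundelete /dev/sda1 --restore-directory var/lib/postgresql/13 -o /mnt/usb/recovered

# Filesystem-agnostic carving fallback (recovers contents, but not names or paths)
sudo photorec /d /mnt/usb/carved /dev/sda1
```

Keep in mind that a recovered PGDATA is only usable if the whole directory tree (including pg_wal) comes back consistent; carved files without names are of little use for a live cluster, so weigh this against restoring the old backups.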
The Rundeck backup guide notes that it is mandatory to stop Rundeck to take a full backup when using the data file. However, that guide doesn't show any safe method to back up a full Rundeck instance (Rundeck server + database) when using MariaDB, PostgreSQL, or any other supported database as a backend.
In a real production scenario, it doesn't seem possible to stop Rundeck on a daily basis.
Can anybody share best practices for taking a hot full backup of a Rundeck installation without stopping Rundeck?
Is there any secure and supported way to achieve a fully consistent backup of Rundeck projects, job definitions, and the database on a daily basis?
In this post, the answer is not clear, because the question doesn't describe what kind of backend is used.
The documentation suggests shutting down the instance because an execution could be active, which would mean an open transaction in the middle of the "hot backup" process and, with it, potential data corruption in your backup. Stopping the instance is the safest way to back up your database.
If you want a "hot" backup, you can export your projects (with all their content, including jobs) and keys.
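A hedged sketch of what such a nightly job could look like, assuming the Rundeck CLI (rd) is configured with RD_URL and an API token, and assuming a MariaDB backend with a database named rundeck (project and database names are placeholders):

```
# Export one project as an archive (jobs, ACLs, config, execution history)
rd projects archives export -p MyProject -f /backups/MyProject-$(date +%F).zip

# Dump the MariaDB backend; --single-transaction takes a consistent
# InnoDB snapshot without blocking Rundeck's writes
mysqldump --single-transaction --routines rundeck > /backups/rundeck-$(date +%F).sql
```

Note that the project export and the database dump are taken at slightly different instants, so treat them as two independent restore paths rather than one atomic backup.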
We have a VM that runs MongoDB and a web API in front of it.
Currently there is no backup solution enabled.
My goal is to protect data in MongoDB.
Is it a good idea to snapshot the whole Azure VM instance daily?
Or should I go for a database backup solution?
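For context on the second option, a minimal sketch with MongoDB's own tooling (URI and paths are placeholders; --oplog requires a replica set and should be dropped for a standalone server):

```
# Consistent logical backup while the server keeps accepting writes
mongodump --uri="mongodb://localhost:27017" --oplog --out=/backups/mongo-$(date +%F)

# Restore, replaying the operations captured during the dump
mongorestore --oplogReplay /backups/mongo-2024-01-01
```

A daily whole-VM snapshot, by contrast, is only crash-consistent unless you quiesce MongoDB (e.g. with fsyncLock) before the snapshot is taken.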
I have created an instance in Compute Engine with Windows Server 2012. I can't see any option to take an automatic backup of the instance's disk (with the database on it) every day. There is a snapshot option, but we have to run it manually. Please suggest a way to back up automatically that can be restored with a single click. If there is another possibility using Cloud SQL, Cloud Storage, or anything else, please recommend it.
Thanks!
There's an API to take snapshots; see the API section here:
https://cloud.google.com/compute/docs/disks/create-snapshots#create_your_snapshot
You can write a simple app triggered from cron (or similar) to take a snapshot periodically.
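As a sketch with today's gcloud tooling (which wraps that same API; disk, zone, and schedule names are placeholders), you can script the snapshot directly, or attach a snapshot schedule so no cron job is needed at all:

```
# One-off snapshot, scriptable from cron or Task Scheduler
gcloud compute disks snapshot my-disk --zone=us-central1-a \
    --snapshot-names=my-disk-$(date +%Y%m%d)

# Newer alternative: a daily snapshot schedule attached to the disk
gcloud compute resource-policies create snapshot-schedule daily-backup \
    --region=us-central1 --max-retention-days=14 \
    --daily-schedule --start-time=04:00
gcloud compute disks add-resource-policies my-disk \
    --zone=us-central1-a --resource-policies=daily-backup
```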
There is no built-in provision for automatic backups of a Compute Engine disk, but you can do a manual disk backup by creating a snapshot.
The best alternative is to create a bucket and move your files there; Google Cloud Storage buckets have automated backup facilities available.
Cloud Storage and Cloud SQL are your options for automated backups in Google Cloud.
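For the bucket route, a minimal sketch with gsutil (bucket name and paths are placeholders; on Windows Server the same commands run from the Cloud SDK shell):

```
# Create a bucket, then mirror a local directory into it on a schedule
gsutil mb gs://my-backup-bucket
gsutil -m rsync -r /path/to/data gs://my-backup-bucket/data
```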
Added more details at the bottom of the question.
We are testing deployment scenarios in Azure VM preview and have run into an issue.
Here is our scenario. We have a software stack that we use on all of our servers, and we have created an image of a VM, with that stack installed on an attached data drive, that we can use as a template. Now what we want to do is create a VM based on that template, make a copy of the data drive, and attach the copy to the newly created VM in an automated manner.
Our problem is that while we have found lots of information about creating drives, we can't find any guidance on how to copy the data drive using Azure PowerShell.
Any thoughts, code, or RTFMs happily accepted.
Cheers,
Terence
We have successfully created an operating system image that we can use to create VMs. But there is a data disk that holds our standard software stack that we want to reuse by copying it across VMs. The scenario that we are trying to implement is:
1. Create a VM from a standard VM image; call it PBIMaster.
2. Attach a disk as F: to that VM; call it PBIMasterDisk.
3. Install all of the software required for our app on F: (too big for the OS disk, and besides, sticking it on the OS disk seems messy).
4. Build an image from PBIMaster; call it PBIMasterImage and save it.
5. Create a new VM from PBIMasterImage; call it Node1.
6. Copy PBIMasterDisk to a new Azure disk; call it Node1SoftwareDisk.
7. Attach Node1SoftwareDisk to Node1 as F:.
8. Since the image has the correct registry settings from the previous installs, our stack is ready to go.
9. Add appropriate endpoints.
Rinse and repeat for each additional node.
Hopefully that makes our scenario clearer.
Thanks.
If I understood your objective correctly, you have already uploaded two VHDs in your subscription, and you have also created a VM based on your OS disk (VHD1):
OS Disk (VHD1)
Data Disk (VHD2)
Now you want to copy VHD2 to VHD3 and then attach VHD3 to your VM (which is based on the OS disk) via PowerShell.
As of now, there is no PowerShell command that will let you copy a data disk (VHD2) to another data disk (i.e., VHD3).
I haven't tried it, but you can use the approach described here to copy your data disk:
http://blogs.msdn.com/b/windowsazurestorage/archive/2012/06/12/introducing-asynchronous-cross-account-copy-blob.aspx
This method copies blobs directly at the cloud storage level, so there is no bandwidth usage toward on-premises and potentially zero cost if you stay in the same data center. Try using the same subscription and see if that solves your problem.
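That post predates today's tooling; as a rough modern equivalent of the same server-side Copy Blob operation (assuming classic unmanaged VHDs in a single storage account, with account, container, and blob names as placeholders taken from the question):

```
# Server-side copy of the data-disk VHD blob; no data flows through your machine
az storage blob copy start \
    --account-name mystorageaccount \
    --destination-container vhds \
    --destination-blob Node1SoftwareDisk.vhd \
    --source-uri "https://mystorageaccount.blob.core.windows.net/vhds/PBIMasterDisk.vhd"

# Poll until the copy finishes (status moves from "pending" to "success")
az storage blob show \
    --account-name mystorageaccount \
    --container-name vhds \
    --name Node1SoftwareDisk.vhd \
    --query "properties.copy.status"
```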