Talend issue while copying local files to HDFS

Hi, I want to know how to copy files from a source file system (the local file system) to HDFS. If a source file has already been copied to HDFS, how can I eliminate or ignore that file so it is not copied to HDFS again, using Talend?
Thanks
Venkat

To copy files from the local file system to HDFS, you can use the tHDFSPut component if you have Talend for Big Data. If you use Talend for Data Integration, you can use the tSystem component with the appropriate command (for example, hadoop fs -put).
To avoid copying duplicates, you can create a table in an RDBMS and keep track of all copied files. Each time the job starts copying a file, it should first check whether that file already exists in the table.
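For illustration, here is a minimal sketch of that check written as a Talend routine. The tracking table copied_files, its file_name column, and the JDBC URL and credentials are placeholders, not part of the original answer.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class CopiedFilesCheck {

    // Returns true if the file was already copied to HDFS, according to the tracking table.
    public static boolean alreadyCopied(String fileName) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/tracking", "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                "SELECT 1 FROM copied_files WHERE file_name = ?")) {
            ps.setString(1, fileName);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next();
            }
        }
    }

    // Call this after a successful tHDFSPut so the file is skipped next time.
    public static void markCopied(String fileName) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/tracking", "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO copied_files (file_name) VALUES (?)")) {
            ps.setString(1, fileName);
            ps.executeUpdate();
        }
    }
}

In the job, a tFileList can iterate over the local files, a "Run if" trigger using alreadyCopied(...) can decide whether tHDFSPut runs for the current file, and markCopied(...) can be called once the put succeeds.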

Related

How to load a .dat file into a PostgreSQL database

Please tell me how to load files with the .dat extension into PostgreSQL. I need to load about 5,000 files at a time - that is, a whole database.
You can use the DBeaver program.
This program can export and import databases.
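If you would rather script the load than use a GUI, here is a minimal sketch using the PostgreSQL JDBC driver's CopyManager, assuming the .dat files are delimited text that COPY can read; the connection settings, table name, and file path are placeholders.

import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.DriverManager;

import org.postgresql.copy.CopyManager;
import org.postgresql.core.BaseConnection;

public class DatLoader {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password")) {
            CopyManager copyManager = new CopyManager((BaseConnection) conn);
            // Stream one .dat file into the target table; repeat in a loop for all files.
            try (FileInputStream in = new FileInputStream("/data/file1.dat")) {
                long rows = copyManager.copyIn(
                        "COPY my_table FROM STDIN WITH (FORMAT text)", in);
                System.out.println("Loaded " + rows + " rows");
            }
        }
    }
}

Looping over the 5,000 files and calling copyIn for each one is generally much faster than inserting the rows one by one.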

Can I update a Zip file inside AWS S3 with PowerShell?

I recently learned how to modify the contents of a Zip file with PowerShell in order to update it without having to recreate it completely, but now I have to do the same on AWS S3 with the PowerShell module AWSToolsS3, and I'm afraid it won't be possible.
My goal is to have two scripts. The first archives files on a file server and creates monthly and yearly Zips.
This script is done and working.
The second script copies the monthly and yearly Zips to Amazon S3.
However, there is a very small chance that a previously archived folder will be modified while it is still accessible, and I therefore have to update any Zips that may have already been created.
These folders and Zips are very large and recreating them from scratch is not an option, so I have set up an optimized update process for the Zips on the file server, and I would like to know whether this is also possible with the AWSToolsS3 PowerShell module.
I can't seem to find in the Amazon documentation whether it is possible to modify the contents of a Zip file without unzipping it.
So I'm asking for help from PowerShell and AWS professionals who see my question.
Thank you in advance for your help.

Data Fusion: GCS Create plugin creating folders, not objects

I am trying to create a GCS object (file) with the GCS Create plugin of Data Fusion, but it is creating a folder instead.
How can I have a file created instead of a folder?
It seems that the description of the plugin leads to a misunderstanding. Cloud Storage doesn't work like a conventional filesystem, so you cannot "strictly" create empty files. The gsutil command doesn't have an equivalent of the touch command (on Linux), and the "basic" operations in this product are limited to the cp command (uploading and downloading files).
Therefore, since there is no file at the storage URL you specify, it's expected that a folder will be created instead of a file.
Based on this, I would like to suggest two workarounds:
If you are using this plugin to create a file as a "flag", you can keep using the plugin, since the created folder also serves as a flag (to trigger a Cloud Function, for example).
If you need to create a file, you can use the GCS plugin located in the "Sink" plugins group to write records to one or more files in a directory on Google Cloud Storage. Files can be written in various formats such as CSV, Avro, Parquet, and JSON.
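If a true zero-byte "flag" object is needed and you can step outside Data Fusion, it can also be created directly with the Cloud Storage client library. This is only a minimal sketch, assuming the google-cloud-storage Java client and placeholder bucket and object names:

import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

public class CreateFlagObject {
    public static void main(String[] args) {
        Storage storage = StorageOptions.getDefaultInstance().getService();
        // Creates an empty (zero-byte) object that can act as a flag or trigger.
        BlobInfo blobInfo = BlobInfo.newBuilder("my-bucket", "flags/_SUCCESS").build();
        storage.create(blobInfo, new byte[0]);
    }
}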

SQL Server Rename Data Folder

I lost several days' worth of SQL data in .mdf and .ldf files on my server, stored at C:\Program Files\Microsoft Server\MSSQL11.MyServer\MSSQL\DATA.
I have uploaded files with the missing data to a folder called DATA_NEW in the same path.
I would like to test the Restored Data by renaming my current DATA folder to DATA_OLD and then renaming the DATA_NEW folder to DATA so that my database points to the Restored Data.
Is this a good way of verifying that the renamed data folder contains the valid data, with an option to revert to the data in the folder named DATA_OLD?
Does anyone know of any reasons that I should not proceed with this approach?
Thanks very much,
Mark D

Can we move files from one folder to another on an FTP server using Talend DI Open Studio?

There is an FTP server. On that server there are two folders (Folder1 and Folder2). Folder1 contains 20 CSV files (total size more than 2 GB). I want to move all the CSV files from Folder1 to Folder2, but I don't want to use tFTPGet and tFTPPut, as downloading and re-uploading would take too much time.
Can anyone help me?
Yes, we can. You can use the tFTPRename component and give fully specified file paths for the two folders in the Filemask and New name fields.
There are two ways to accomplish this in Talend. If you wish to copy all contents in a directory, then you only need a tFileCopy component and check "Copy a Directory" specifying the source and destination directories.
If you need to copy only certain files in a directory, you can accomplish this in Talend using 2 components that work together. You need a tFileList and a tFileCopy, connecting them together with an Iterate flow.
Use the tFileList to generate your list of files from a specified directory. You can configure wildcards in the filemask section. For example, to only take .txt you would enter "*.txt" in the filemask section.
Then right-click tFileList in the designer and click Row --> Iterate. Connect this to the tFileCopy component. In tFileCopy, use this code in the "File Name" field:
((String)globalMap.get("tFileList_1_CURRENT_FILEPATH"))
You have other options in the tFileCopy component as well, including Remove Source File and Create the Directory if it doesn't exist.
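For reference, here is a hedged sketch of the related tFileCopy field values when iterating from tFileList; the destination directory is a placeholder, and the tFileList_1 globals assume the default component name.

// "File Name" (source file from the current tFileList iteration):
((String)globalMap.get("tFileList_1_CURRENT_FILEPATH"))

// "Destination directory" (replace with your own target folder):
"/data/target"

// "Destination filename" (keeps the original file name when the rename option is checked):
((String)globalMap.get("tFileList_1_CURRENT_FILE"))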
Select whichever of the two approaches best suits your needs.