Download directory content from SFTP server recursively - scala

How do we download directory content from SFTP server recursively in Scala? Can someone please help me with an example?
def recursiveDirectoryDownload(sourcePath: String, destinationPath: String): Unit = {
  val fileAndFolderList = channelSftp.ls(sourcePath)
  for (item <- fileAndFolderList) {
    if (!item.getAttrs.isDir) {
      channelSftp.get(sourcePath + "/" + item.getFilename, destinationPath + "/" + item.getFilename)
    }
  }
}

You have to call your recursiveDirectoryDownload function recursively when you encounter a subfolder.
See this question for implementation in Java:
Transfer folder and subfolders using channelsftp in JSch?
It should be trivial to translate to Scala.
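A minimal Scala sketch of that translation, assuming an already-connected JSch ChannelSftp (names like recursiveDirectoryDownload mirror the question; adapt to your session setup):

```scala
import com.jcraft.jsch.ChannelSftp
import java.io.File
import scala.jdk.CollectionConverters._

// channelSftp is assumed to be a connected ChannelSftp instance.
def recursiveDirectoryDownload(channelSftp: ChannelSftp,
                               sourcePath: String,
                               destinationPath: String): Unit = {
  new File(destinationPath).mkdirs() // make sure the local target exists
  val entries = channelSftp.ls(sourcePath).asScala
    .map(_.asInstanceOf[ChannelSftp#LsEntry])
  // Skip the "." and ".." pseudo-entries that ls() returns
  for (entry <- entries if entry.getFilename != "." && entry.getFilename != "..") {
    val remote = sourcePath + "/" + entry.getFilename
    val local  = destinationPath + File.separator + entry.getFilename
    if (entry.getAttrs.isDir)
      recursiveDirectoryDownload(channelSftp, remote, local) // recurse into subfolder
    else
      channelSftp.get(remote, local) // download a single file
  }
}
```

Note that remote paths use "/" regardless of the client OS; only the local side uses File.separator.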

Related

Is there a way to set a destination for unzipping gzip files when a user double-clicks the archive? Looking for a Scala/Java solution

Is there a way to get a gzip archive file to unzip to a different destination when a user double clicks on the archive? Currently, my compression code looks something like this in Scala:
val filename = SetFilename.getOrElse {
  val path = files.head.getAbsolutePath
  val baseUrl = FilenameUtils.getFullPathNoEndSeparator(path)
  ...
}
val output = new File(filename)
val fos = new FileOutputStream(output)
val gzos = new GZIPOutputStream(new BufferedOutputStream(fos))
try {
  files.foreach { input =>
    val fis = new FileInputStream(input)
    try {
      ioStream(fis, gzos)
      gzos.flush()
    } finally {
      fis.close()
    }
  }
} finally {
  gzos.close()
  fos.close()
}
Is there any way to tell the compressed files to decompress to a different destination when a user double-clicks the archive?
It is not the gzip archive that decides where it will be unzipped; that is decided by the operating system (or archive utility) you're unzipping it on.
If you need to unzip into a specific place, you should look at a packaging solution such as deb for Ubuntu or Debian systems, or dmg for OS X.
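If controlling the double-click behaviour is off the table, a programmatic workaround is to decompress to an explicit destination yourself. A minimal sketch (the helper name gunzipTo is just illustrative):

```scala
import java.io.{BufferedInputStream, File, FileInputStream, FileOutputStream}
import java.util.zip.GZIPInputStream

// Decompress a .gz file into an explicitly chosen destination directory,
// instead of relying on whatever location the OS picks on double-click.
def gunzipTo(source: File, destDir: File): File = {
  destDir.mkdirs()
  val target = new File(destDir, source.getName.stripSuffix(".gz"))
  val in  = new GZIPInputStream(new BufferedInputStream(new FileInputStream(source)))
  val out = new FileOutputStream(target)
  try {
    val buf = new Array[Byte](8192)
    Iterator.continually(in.read(buf)).takeWhile(_ != -1).foreach(n => out.write(buf, 0, n))
  } finally { in.close(); out.close() }
  target
}
```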

How do I get the file/folder path using Apache Commons net FTPS Client

I am using Apache commons ftps client to connect to an ftps server. I have the remote file path which is a directory. This directory contains a tree of sub-directories and files. I want to get the path for each file or folder. Is there any way I can get this property? Or if there is any way I could get the parent folder path, I could concatenate the file name to it.
I am currently using the function below to get the path and size of all files under a given directory. It lists everything in the current directory and checks whether each entry is a directory or a file: for a directory it recurses to the end of the tree; for a file it records the path and size. You may not need the Map result; edit it according to your needs.
Usage:
getServerFiles(ftp,"") // start from root
or
getServerFiles(ftp,"directory_name") // start from given directory
Implementation:
def getServerFiles(ftp: FTPClient, dir: String): Map[String, Double] = {
  var fileList = Array[FTPFile]()
  var base = ""
  if (dir == "") {
    fileList = ftp.listFiles
  } else {
    base = dir + "/"
    fileList = ftp.listFiles(dir)
  }
  fileList.flatMap { x =>
    if (x.isDirectory) {
      getServerFiles(ftp, base + x.getName)
    } else {
      Map[String, Double](base + x.getName -> x.getSize)
    }
  }.toMap
}

How can I get the project path in Scala?

I'm trying to read some files from my Scala project, and if I use new java.io.File(".").getCanonicalPath() I find that my current directory is far away from them (it is exactly where I installed Scala Eclipse). So how can I change the current directory to the root of my project, or get the path to my project? I really don't want to hard-code an absolute path to my input files.
val PATH = raw"E:\lang\scala\progfun\src\examples\"
def printFileContents(filename: String) {
  try {
    println("\n" + PATH + filename)
    io.Source.fromFile(PATH + filename).getLines.foreach(println)
  } catch {
    case _: Throwable => println("filename " + filename + " not found")
  }
}
val filenames = List("random.txt", "a.txt", "b.txt", "c.txt")
filenames foreach printFileContents
Add your files to src/main/resources/<packageName>, where <packageName> is your class's package.
Then change the PATH line to: val PATH = getClass.getResource("").getPath
new File(".").getCanonicalPath will give you the base path you need.
Another workaround is to put the path you need in a user environment variable and read it with sys.env (throws an exception if the variable is missing) or System.getenv (returns null if it is missing), for example val PATH = sys.env("ScalaProjectPath"). The problem is that if you move the project you have to update the variable, which I didn't want.
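A small sketch of that workaround with a fallback, so a missing variable doesn't throw (the variable name ScalaProjectPath is just an example):

```scala
import java.io.File

// Read the project path from an environment variable, falling back to the
// current working directory when the variable is not set.
val projectPath: String =
  sys.env.getOrElse("ScalaProjectPath", new File(".").getCanonicalPath)
```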

What do files deleted by groovy leave behind in Windows 7?

I have a groovy script that saves multiple files from a remote directory to my temp directory and parses them into xml. It has an interesting bug. Each time it runs, it can't find one file in my temp directory. The next time it runs, it finds that file, but can't find a new file. If I have 20 files, it won't find all 20 files until the 20th run. The temp directory is cleared after each run. I'm wondering if there are other artifacts the program is leaving behind?
If I clean the project after 16 runs, it still finds the first 16 files. So it seems it's not an artifact in Eclipse.
This is running in Eclipse 3, Java 1.5, Windows 7, Groovy 1.0.
remoteftpFile.findAll {
    println "in find"
    ftp.getReply();
    it.isFile()
}.each {
    println "in each"
    ftp.getReply();
    println it
    ftp.getReply();
    def tempDestination = PropertiesUtil.getTempDir()
    def procDestination = PropertiesUtil.getProcessedDir()
    def tempFile = new File(tempDestination + it.name)
    def procFile = new File(procDestination + it.name)
    // set it to delete
    ftp.getReply();
    println "got tempfile"
    def localftpFile = ftp.SaveToDisk(it, tempFile) // Save each file to disk
    //************** Handles decryption via gpgexe
    println "Decrypting file";
    println localftpFile.toString();
    def localftpFileStr = localftpFile.toString();
    def processedftpFileStr = procFile.toString();
    def gpgstring = PropertiesUtil.getGpgString();
    def decryptedOutputName = localftpFileStr.substring(0, (localftpFileStr.length() - 4));
    def decryptedProcOutputName = processedftpFileStr.substring(0, (processedftpFileStr.length() - 4));
    def decryptedOutputXMLName = decryptedOutputName.substring(0, (decryptedOutputName.length() - 4)) + ".xml";
    def decryptedProcOutputXMLName = decryptedProcOutputName.substring(0, (decryptedProcOutputName.length() - 4)) + ".xml";
    println decryptedOutputName;
    def xmlfile = new File(decryptedOutputName)
    def cdmpXmlFile = new File(decryptedOutputXMLName)
    def procCdmpXmlFile = decryptedProcOutputXMLName
    println gpgstring + " --output ${decryptedOutputName} --decrypt ${localftpFile} "
    (new ExternalExecute()).run(gpgstring + " --output ${decryptedOutputName} --decrypt ${localftpFile} ");
    Thread.sleep(1000);
    //************* Now parse CSV file(s) into XML using the Stack Overflow solution
    println "parsing file"
    def reader = new FileReader(xmlfile)
    def writer = new FileWriter(cdmpXmlFile)
    def csvdata = []
    xmlfile.eachLine { line ->
        if (line) {
            csvdata << line.split(',')
        }
    }
    def headers = csvdata[0]
    def dataRows = csvdata[1..-1]
    def xml = new groovy.xml.MarkupBuilder(writer)
    // write 'root' element
    xml.root {
        dataRows.eachWithIndex { dataRow, index ->
            // write 'entry' element with 'id' attribute
            entry(id: index + 1) {
                headers.eachWithIndex { heading, i ->
                    // write each heading with associated content
                    "${heading}"(dataRow[i])
                }
            }
        }
    }
    println "Performing XSL Translation on ${cdmpXmlFile}"
    def cdmpXML = new XMLTransformer(xmlTranslate).run(cdmpXmlFile) // Run it on each of the xml files and set the output
    new File("C:\\temp\\temp.xml").write(cdmpXML)
    new File(procCdmpXmlFile).write(cdmpXML)
    println "Done Performing XSL Translation"
    println "Uploading Data to CDMP"
    def cdmpUp = new UpdateCDMP(updateDB)
    cdmpUp.run(cdmpXML)
    println "Finished Upload and Import"
    // do clean-up, backing it up AND removing the files
    println "Finished"
    println "Closing Buffers"
    reader.close();
    writer.close();
    println "Deleting Local Files"
    new File(decryptedOutputName).deleteOnExit();
    new File(localftpFile).deleteOnExit();
    xmlfile.deleteOnExit();
    cdmpXmlFile.deleteOnExit();
    println "Deleting " + cdmpXmlFile.getName()
    new File("C:\\temp\\temp.xml").deleteOnExit();
}
ftp.close()
}
This is because you are using deleteOnExit, which is not guaranteed to delete a file. It only deletes if:
the files are closed correctly,
the JVM exits correctly (with no exceptions), and
a System.exit() was called with 0 argument (or the VM terminated naturally).
It's especially problematic on Windows OSes. I can't point to a specific Stack Overflow question about this, but most questions involving deleteOnExit discuss this issue.
If you actually want to delete a file, then you should always use aFile.delete() directly. There is really no good reason to delay the deletion until later in your example.
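A minimal sketch of deleting eagerly instead of deferring to JVM exit (shown in Scala; the same java.io.File API applies from Groovy):

```scala
import java.io.File

// Delete a file immediately and check the result, instead of relying on
// deleteOnExit(), which only runs if the JVM shuts down cleanly.
val tmp = File.createTempFile("demo", ".tmp")
val deleted = tmp.delete() // returns true only if the file was actually removed
if (!deleted) println(s"could not delete ${tmp.getPath}")
```

Checking the boolean return value also surfaces the Windows-specific failures (e.g. a stream still holding the file open) that deleteOnExit silently swallows.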

Potential flaw with SBT's IO.zip method?

I'm working on an SBT plugin where I'd like to zip up a directory. This is possible due to the following method in IO:
def zip(sources: Traversable[(File,String)], outputZip: File): Unit
After tinkering with this method, it seems that simply passing it a directory and expecting the resulting zip file to have the same file & folder structure is wrong. Passing a directory (empty or otherwise) results in the following:
[error]...:zipper: java.util.zip.ZipException: ZIP file must have at least one entry
Therefore, it appears that the way to use the zip method is by stepping through the directory and adding each file individually to the Traversable object.
Assuming my understanding is correct, this strikes me as very odd; very rarely do users need to cherry-pick what is to be added to an archive.
Any thoughts on this?
It seems like you can use this to compose a zip with files from multiple places. I can see the use of that in a build system.
A bit late to the party, but this should do what you need:
val parentFolder: File = ???
val folderName: String = ???
val src: File = parentFolder / folderName
val tgt: File = parentFolder / s"$folderName.zip"
IO.zip(allSubpaths(src), tgt)
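For comparison, here is a plain-JDK sketch (no sbt dependency) that zips a directory recursively; the relative entry names it stores are the kind of (File, String) pairs that IO.zip's Traversable is expected to contain:

```scala
import java.io.{File, FileInputStream, FileOutputStream}
import java.util.zip.{ZipEntry, ZipOutputStream}

// Zip a directory recursively, storing each entry under its path
// relative to `src`, so the zip preserves the folder structure.
def zipDirectory(src: File, outputZip: File): Unit = {
  val zos = new ZipOutputStream(new FileOutputStream(outputZip))
  def walk(f: File): Unit =
    if (f.isDirectory) f.listFiles.foreach(walk)
    else {
      val rel = src.toURI.relativize(f.toURI).getPath // e.g. "sub/b.txt"
      zos.putNextEntry(new ZipEntry(rel))
      val in = new FileInputStream(f)
      try {
        val buf = new Array[Byte](8192)
        Iterator.continually(in.read(buf)).takeWhile(_ != -1)
          .foreach(n => zos.write(buf, 0, n))
      } finally in.close()
      zos.closeEntry()
    }
  try walk(src) finally zos.close()
}
```

Note that, like IO.zip, this fails on a directory containing no files at all, since a zip must have at least one entry.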
Here is some code for zipping directories using sbt's IO class:
IO.withTemporaryDirectory(base => {
  val dirToZip = new File(base, "lib")
  IO.createDirectory(dirToZip)
  IO.write(dirToZip / "test1", "test")
  IO.write(dirToZip / "test2", "test")
  val zip: File = base / "test.zip"
  IO.zip(allSubpaths(dirToZip), zip)
  val out: File = base / "out"
  IO.createDirectory(out)
  IO.unzip(zip, out) mustEqual Set(out / "test1", out / "test2")
  IO.delete((out ** "*").get)
  // Create a zip containing this lib directory but under a different directory in the zip
  val finder: PathFinder = dirToZip ** "*" --- dirToZip // Remove dirToZip as you can't rebase a directory to itself
  IO.zip(finder x rebase(dirToZip, "newlib"), base / "rebased.zip")
  IO.createDirectory(out)
  IO.unzip(base / "rebased.zip", out) mustEqual Set(out / "newlib" / "test1", out / "newlib" / "test2")
})
See the docs
http://www.scala-sbt.org/0.12.2/docs/Detailed-Topics/Mapping-Files.html
http://www.scala-sbt.org/0.12.3/docs/Detailed-Topics/Paths.html
for tips on creating the Traversable object to pass to IO.zip