Create a zip file inside dropbox - dropbox-api

I have a system in Dropbox where I keep folders for multiple users, each with contents that I have them download every now and then. I want to zip the contents of each person's folder for faster downloading, since downloading folders over Dropbox is very slow (at least as far as I know, unless there is a faster way to download than sharing a URL). How could I go about doing that?

No, unfortunately the Dropbox API doesn't currently offer a way to create/download a zip of a folder, or otherwise download folders in bulk, but we'll consider it a feature request.
Edit:
The Dropbox API now offers the ability to download folders as zips:
https://www.dropbox.com/developers/documentation/http/documentation#files-download_zip
If you're using an official SDK, there will also be a corresponding method for this endpoint.
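As a rough sketch of what calling that endpoint directly looks like, here is a Python-stdlib-only example (no SDK). The access token and folder path are placeholders you must supply; the endpoint URL and the `Dropbox-API-Arg` header follow the documentation linked above. If you use the official Python SDK instead, `files_download_zip` wraps the same call.

```python
import json
import urllib.request

API_URL = "https://content.dropboxapi.com/2/files/download_zip"

def build_download_zip_request(access_token: str, folder_path: str) -> urllib.request.Request:
    """Build the HTTP request for /2/files/download_zip.

    Per the docs, this is a content endpoint: the folder path travels in
    the Dropbox-API-Arg header as JSON, not in the request body.
    """
    headers = {
        "Authorization": f"Bearer {access_token}",
        "Dropbox-API-Arg": json.dumps({"path": folder_path}),
    }
    return urllib.request.Request(API_URL, headers=headers, method="POST")

def download_folder_as_zip(access_token: str, folder_path: str, out_file: str) -> None:
    """Fetch the folder as a zip and write it locally."""
    req = build_download_zip_request(access_token, folder_path)
    with urllib.request.urlopen(req) as resp, open(out_file, "wb") as f:
        f.write(resp.read())

# Usage (placeholders): download_folder_as_zip("YOUR_TOKEN", "/users/alice", "alice.zip")
```

Note the endpoint has size limits (it refuses folders that are too large or contain too many files), so for very big user folders you may still need to fall back to per-file downloads.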

Related

Is it possible to download Github-Actions artifacts directly?

When I want to download an artifact I use the following kind of URL: https://github.com/some_user/some_repo/suites/some_id/artifacts/some_id. This, however, always leads me to a .zip package, even if the result is just a single file. In my case this additional layer is completely redundant and I'd like to skip it (it's especially annoying when I build a PDF that I want to be able to preview conveniently).
(How) can I set up the automated workflow to expose unpacked files?
It's not possible at the moment:
Note: We only currently support downloading an artifact from the GitHub UI by zipping all the files together (this is independent of how the artifact gets uploaded). If you upload an individual file, in the UI the artifact will still present itself as a Zip because that is currently only what we support. We have plans in our roadmap to offer a more enhanced artifact UI experience that will allow you to browse and download individual files from an artifact. No ETA on when that might arrive, but it is something that we really really want to enhance.
https://github.com/actions/upload-artifact/issues/3#issuecomment-598820814
This is a limitation of our APIs and our UI; some of my earlier comments go into more detail: #39 (comment) and #39 (comment)
If you also look at our public API to download an artifact, you'll notice that we currently require a zip `:archive_format`: https://developer.github.com/v3/actions/artifacts/#download-an-artifact and that is effectively what is used when you click to download an artifact. Ideally we should have options that let you get the raw contents of whatever was uploaded without any archiving format, but we currently don't have any solutions.
https://github.com/actions/upload-artifact/issues/109#issuecomment-671853296
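Since the zip wrapper can't be avoided server-side, a practical workaround is to strip it client-side. A sketch in Python (stdlib only): `owner`, `repo`, and `artifact_id` are placeholders, and the URL follows the documented "download an artifact" REST endpoint, where `zip` is the only supported `archive_format`.

```python
import io
import zipfile

def artifact_zip_url(owner: str, repo: str, artifact_id: int) -> str:
    """Build the REST API download URL; the trailing 'zip' is the
    archive_format, currently the only value the API accepts."""
    return (f"https://api.github.com/repos/{owner}/{repo}"
            f"/actions/artifacts/{artifact_id}/zip")

def unwrap_single_file(artifact_zip: bytes) -> bytes:
    """Given the downloaded artifact zip bytes, return the contents of
    the single file inside it (raises if there isn't exactly one)."""
    with zipfile.ZipFile(io.BytesIO(artifact_zip)) as zf:
        names = zf.namelist()
        if len(names) != 1:
            raise ValueError(f"expected exactly one file, found {names}")
        return zf.read(names[0])
```

So for the PDF case above, you would fetch the artifact zip (with an authenticated request) and call `unwrap_single_file` on the response body to get the raw PDF bytes.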

How can I upload files to https://github.com/downloads/<user>/<repo>/?

I see people hosting files (usually binary files) under https://github.com/downloads/<user>/<repository>/. How do they do it?
You can't; that URL scheme no longer exists. GitHub used to have a Downloads API, which was replaced by the Releases API back in 2013. Old links were redirected and still work, as you can see with the example you provided in the comments.
Nowadays, if you want to make files available for download, you should use either the Releases feature or the Packages feature.
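For the Releases route, here is a hedged Python-stdlib sketch of the two documented API steps: create a release, then upload an asset to the `upload_url` the API returns (a URI template ending in `{?name,label}`). The token, owner, repo, and tag are placeholders.

```python
import json
import urllib.parse
import urllib.request

API = "https://api.github.com"

def create_release_request(token: str, owner: str, repo: str, tag: str) -> urllib.request.Request:
    """Build the 'create a release' POST request (POST /repos/{owner}/{repo}/releases)."""
    body = json.dumps({"tag_name": tag, "name": tag}).encode()
    return urllib.request.Request(
        f"{API}/repos/{owner}/{repo}/releases",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
        method="POST",
    )

def asset_upload_url(upload_url_template: str, filename: str) -> str:
    """The create-release response contains an upload_url like
    '.../releases/123/assets{?name,label}'. Drop the URI-template suffix
    and pass the asset name as a query parameter."""
    base = upload_url_template.split("{", 1)[0]
    return f"{base}?name={urllib.parse.quote(filename)}"
```

Sending the actual requests (and the asset bytes, with an appropriate `Content-Type`) is left out here; the point is just the two-step shape of the API.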

What is the difference between Netsuite Bundle Copy and Push?

I am trying to figure out a better way to transfer bundles within our company and to multiple clients account. Which one will be the better way to install a bundle and later update it - Copy or Push? What are the limitations?
Copy is intended for versioning, deprecation, and release testing.
You can think of copying a bundle as forking a project.
When you copy a bundle, it gets a new bundle number and appears in the bundle list.
If you want to move a bundle to client accounts, install/push it to their accounts; never copy it.

Atomic upload of many files to dropbox?

This was asked in one of the interviews. I would like to know the possible answers to this question.
"You have a shared folder, which everybody can see. You want to upload 100 files. This upload of 100 files should be atomic, i.e. either all files are available for download by any user, or no file is available.
One could argue that the uploaded files can simply be deleted if the operation fails partway, but that is not an option, because once a file is uploaded it becomes visible to other users."
What can be the possible solutions?
My solution - Upload them first to a private folder and then share that folder inside the main shared folder.
It will be almost impossible to achieve this atomicity if you are limited to these cloud services as-is; you can do it if you control your own server. Distributed systems is the field that deals with this kind of problem.
Alternatively, you can put a lock on the folder, upload all the files, and then release the lock.
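The staging-then-publish idea from the answers above can be sketched locally: write everything into a hidden staging directory, then publish with a single directory rename, which is atomic on POSIX filesystems. On Dropbox, a folder move (the `/files/move_v2` endpoint) would play the role of the rename; this local version is just an illustration of the pattern.

```python
import os
import tempfile

def atomic_publish(files: dict, shared_dir: str, batch_name: str) -> str:
    """Write files (a name -> bytes mapping) so they all become visible at once.

    Everything is written into a hidden staging directory inside
    shared_dir (same filesystem, so the rename is atomic), then a single
    os.rename publishes the whole batch: readers see either no files or
    all of them, never a partial upload.
    """
    staging = tempfile.mkdtemp(dir=shared_dir, prefix=".staging-")
    for name, content in files.items():
        with open(os.path.join(staging, name), "wb") as f:
            f.write(content)
    final = os.path.join(shared_dir, batch_name)
    os.rename(staging, final)  # the single all-or-nothing step
    return final
```

If writing fails partway, only the hidden staging directory is affected; nothing under the published name ever exists in a half-written state.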

Sharing of Sub Folders ownCloud

I have a complicated system of folders and I need to share 2nd and 3rd level folders with certain groups of users while maintaining the full path to the folder.
Is this possible? I tried without success: if I share a folder, e.g. Project 1 -> Administration, with the "Group Administration" group, on the client I only see the Administration folder, whereas I need to replicate the entire structure.
Thanks for the support
With the current ownCloud sharing implementation this is simply not possible. Every shared item appears directly in the "Shared" folder of the user the file/folder is shared with.
Update: at the moment ownCloud (and, I would guess, Nextcloud as well) allows users to move around and rename files/folders shared with them. So even if you could enforce a certain structure on your users, they could always change it afterwards.
You could always file a feature request for it (or maybe there already is one) here: https://github.com/owncloud/core/issues/ .