I am trying to upload files using the document management API. The file gets uploaded but then fails with the error message "We couldn't process the following document. Extraction failed."
I tried uploading an image, a PDF, etc. The same file is extracted and published successfully when uploaded manually through the UI.
Just tested with this sample (source code here) and it worked fine.
I am getting the following error while running my AnyLogic application, so I am not able to run the application:
"The source attachment does not contain the source for the file Throwables.class"
Is it possible to look up the audio metadata for a file stored in Google Cloud without having to download it? When building a Google Speech-to-Text API service you pass it a gs://bucket/file.flac, and I know the sox and ffmpeg bash and Python commands for metadata lookup on locally stored files, but I can't figure out a way to look up audio metadata for a file on Google Cloud Storage.
Additionally if I have a gs://bucket/audio.wav, can I re-encode that using sox/py-sox and write the new audio.flac directly to gs://bucket/audio.flac? Or do I have to download the audio.wav to re-encode it?
Any thoughts or directions appreciated.
No, it is not possible to access the metadata you want directly in Google Cloud Storage. The command gsutil ls -L gs://[bucket_name]/[file_name] will print the metadata of that file within the bucket. You can modify some of this metadata, but not the audio properties you are referring to. You will need to download the files, re-encode them, and upload them again.
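As noted above, the audio properties have to be read from the file bytes themselves. For formats with a fixed header such as WAV, only the first bytes are needed; for instance, the google-cloud-storage Python client can fetch a byte range with blob.download_as_bytes(start=0, end=43). A minimal sketch (standard library only, assuming a plain PCM RIFF/WAVE header with the canonical 44-byte layout) of parsing those header bytes:

```python
import io
import struct
import wave

def wav_header_info(data: bytes) -> dict:
    """Parse basic audio metadata from a canonical RIFF/WAVE header
    (plain PCM; files with extra chunks need a real parser)."""
    if data[0:4] != b"RIFF" or data[8:12] != b"WAVE":
        raise ValueError("not a RIFF/WAVE file")
    # In the canonical layout the 'fmt ' chunk starts at byte 12:
    # channels at offset 22, sample rate at 24, bits per sample at 34.
    channels, sample_rate = struct.unpack_from("<HI", data, 22)
    bits_per_sample = struct.unpack_from("<H", data, 34)[0]
    return {"channels": channels,
            "sample_rate": sample_rate,
            "bits_per_sample": bits_per_sample}

# Demo: build a tiny in-memory WAV and read its header back.
buf = io.BytesIO()
with wave.open(buf, "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)       # 16-bit samples
    w.setframerate(16000)
    w.writeframes(b"\x00\x00" * 16)
print(wav_header_info(buf.getvalue()))
# → {'channels': 1, 'sample_rate': 16000, 'bits_per_sample': 16}
```

FLAC headers have a different (variable-length) layout, so this exact offset arithmetic only applies to WAV.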
You cannot do that re-encoding operation in Cloud Storage; you will need to download the file and process it the way you want before uploading it again to your bucket. However, here is a workaround that may work for you:
Create a Cloud Function triggered when your file is uploaded. Then retrieve the file that you just uploaded and perform any operation you want on it (such as re-encoding into .flac). After that, upload it again (careful! If you give the new file the same name and extension, it will overwrite the old one in the bucket).
About your library: Cloud Functions run Python 3.7, which for the time being does not support the py-sox library, so you will need to find another one.
I am going through the Get started tutorial of OpenUI5, using Notepad to enter the JavaScript code. I am on step 3 - XML Views.
My files are as follows:
XML file App.view - copy-pasted from step 3
HTML file example - copy-pasted from step 3
When I run this I am getting an error which says
Unfortunately I cannot use any other tool due to customer restrictions. I can only use notepad.
Please help.
Your issue is not related to the editor you use. It seems that you are loading index.html from your file system. Please serve it from a local webserver and open it over HTTP.
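No extra tooling is needed for that: if Python happens to be installed, its standard library ships a static file server that is enough for the tutorial. Running `python -m http.server 8080` in the project folder does the same thing as this sketch (port 8080 is an arbitrary choice):

```python
# serve.py - run it from the folder containing index.html, then open
# http://localhost:8080/index.html in the browser.
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

def make_server(port: int = 8080) -> ThreadingHTTPServer:
    """Serve the current working directory over HTTP on localhost."""
    return ThreadingHTTPServer(("127.0.0.1", port), SimpleHTTPRequestHandler)

if __name__ == "__main__":
    make_server().serve_forever()
```

Loading the page over http:// instead of file:// lets the browser issue the XHR requests OpenUI5 needs (e.g. for the XML view), which are blocked on the file system.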
I am trying to troubleshoot deploying a Data/Service to a live server in Flash Builder with an AIR application.
I am using AMF/Zend but provided my own PHP file called database.php. I'm trying to find where in the code base that is referenced, but I can't find anything other than a link to my PHP file in the /services folder, and I have no idea how to change that to link to a remote file.
I am using CGI.pm version 3.10 for file upload with Perl. I have a Perl script which uploads the file, and one of my applications keeps track of different revisions of the uploaded document with a check-in/check-out facility.
Steps to reproduce:
Check out (download) a file using my application (which is web based and uses Apache).
Log out from the current user session.
Log in again with the same credentials and then check in (upload) a new file.
Output:
Upload successful
Perl upload script shows the correct uploaded data
New revision of the file created
The output is correct and expected, except for one case, which is the issue.
Issue:
The content of the newly uploaded file is the same as the content of the last uploaded revision in the DB.
I am using a temp folder for copying the new content, and if I print the new content in the upload script, it comes out correct. I have no limit on the CGI upload size. It seems that it fails somewhere in the CGI environment; it might be the version I am using. I am not using taint mode.
Can anybody help me understand what the possible reason might be?
It sounds like you're getting the old file name stuck in the file upload field. I'm not sure whether that can happen for filefield, but it is a feature for other field types.
Try adding the -nosticky pragma, e.g. use CGI qw(-nosticky :all);. Another pragma to try is -private_tempfiles, which should prevent the user from "eavesdropping" even on their own uploads.
Of course, it could be that you need to localize (my) some variable or add -force to the filefield.
I found the issue. The destination path of the copied file was not correct. One of my application's events maps the path of the copied file to a different directory, and this path is stored in the user session. This happens only when I run that event just before starting the upload script, which is why it was hard to catch. Since the upload script is designed to pick up the newly copied file from that same path, it always ended up uploading the old file to the DB as a new revision, while the newly copied file was lying at the new path.
Solved by mapping the correct path before the upload.
Thanks