Uploading blob to Azure - HTTP header not in correct format - REST

I am trying to upload videos to Azure Media Services via the REST API. I have reached the step of uploading the video, however I am getting an error. I use the following code to upload the video:
var client = new HttpClient();
client.DefaultRequestHeaders.TryAddWithoutValidation("Authorization", "Bearer " + token);
client.DefaultRequestHeaders.Add("x-ms-version", "2.8");
client.DefaultRequestHeaders.Add("x-ms-date", "2015-02-5");
client.DefaultRequestHeaders.Add("DataServiceVersion", "3.0");
client.DefaultRequestHeaders.Add("MaxDataServiceVersion", "3.0");
client.DefaultRequestHeaders.Add("x-ms-blob-type", "BlockBlob");
var formcontent = new MultipartFormDataContent();
FileStream stream = File.OpenRead(@"C:\AzureMediaUploadTest\MediaUploadTest\VideoFiles\tom.mp4");
byte[] fileBytes = new byte[stream.Length];
stream.Read(fileBytes, 0, fileBytes.Length);
stream.Close();
var streamcontent = new StreamContent(new MemoryStream(fileBytes));
formcontent.Add(streamcontent);
formcontent.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
var result = await client.PutAsync(uploadurl, formcontent);
However, the request returns a 400 - "An HTTP header is not in the correct format". I am not sure which header is referred to, or whether I am missing something.
Any help is appreciated.
UPDATE: I have marked the question as answered however I am now having issues with the authentication header - the new issue is asked here - Uploading blob to azure - Create authentication header

According to this documentation here:
All authenticated requests must include the Coordinated Universal Time
(UTC) timestamp for the request. You can specify the timestamp either
in the x-ms-date header, or in the standard HTTP/HTTPS Date header. If
both headers are specified on the request, the value of x-ms-date is
used as the request's time of creation. The storage services ensure
that a request is no older than 15 minutes by the time it reaches the
service. This guards against certain security attacks, including
replay attacks. When this check fails, the server returns response
code 403 (Forbidden).
Your "2015-02-5" is far from a valid UTC date format.
And according to this documentation here, and the sample PUT request, the Date header is represented as x-ms-date: Wed, 23 Oct 2013 22:41:55 GMT
There is no place in the Azure Blob REST API documentation where the Date is referred to in yyyy-mm-dd format.
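For illustration, a correctly formatted x-ms-date value can be generated like this (sketched in Python rather than the question's C#; the x-ms-version shown is only an example value, not necessarily what your endpoint expects):

```python
from datetime import datetime, timezone
from email.utils import format_datetime

def x_ms_date() -> str:
    # RFC 1123 timestamp, e.g. "Wed, 23 Oct 2013 22:41:55 GMT"
    return format_datetime(datetime.now(timezone.utc), usegmt=True)

headers = {
    "x-ms-date": x_ms_date(),
    "x-ms-blob-type": "BlockBlob",
    "x-ms-version": "2015-02-21",  # example version, adjust to your service
}
print(headers["x-ms-date"])
```

In C#, `DateTime.UtcNow.ToString("R")` produces the same RFC 1123 shape.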

Related

Amazon S3 DELETE Request Successful but returns no confirmation?

I am testing the S3 REST endpoints using Postman creating the DELETE request as explained here.
The DELETE request returns a 204 (No Content) response with the following headers:
Content-Length: 0
Date: Wed, 15 Aug 2018 21:11:38 GMT
x-amz-request-id: 3458780640
The file was present before the request and deleted afterwards, so I know the request is working. But there is nothing in the response to really confirm it.
Can anyone explain why? I am planning to run this in an automated script, so I need to be able to monitor whether files have been successfully deleted.
In the documentation they state:
This implementation of the operation does not return response elements.
If versioning is enabled on the S3 bucket, they do however return some values in the response headers:
x-amz-delete-marker and x-amz-version-id
The delete marker is an indication that the object was deleted, and you can also use it to un-delete that object.
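For an automated script, checking for the 204 status (and, with versioning enabled, the x-amz-delete-marker header) is usually enough; a follow-up HEAD on the key returning 404 is the strongest confirmation. A minimal sketch of interpreting the response (the header values below are illustrative, not from a live bucket):

```python
def interpret_delete(status_code: int, headers: dict) -> dict:
    """Interpret an S3 DELETE Object response."""
    return {
        # S3 returns 204 No Content even if the key never existed
        "accepted": status_code == 204,
        # Only present when versioning is enabled on the bucket
        "delete_marker": headers.get("x-amz-delete-marker", "").lower() == "true",
        "version_id": headers.get("x-amz-version-id"),
    }

result = interpret_delete(204, {"x-amz-delete-marker": "true",
                                "x-amz-version-id": "example-version-id"})
```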

JSON format Not supported exception from Azure storage REST API

I am working on REST API calls against an Azure Storage table. I can query the table and get the response in XML format, but when I change the Accept header to JSON I get exceptions.
Note: I set the value of x-ms-version to 2018-03-28
headers.put("Authorization", "SharedKey " + store + ":" + hash);
headers.put("x-ms-date", date);
// headers.put("x-ms-version","2009-09-19");
headers.put("x-ms-version","2018-03-28");
headers.put("Accept-Charset","UTF-8");
// headers.put("Accept","application/atom+xml,application/xml");
headers.put("Accept","application/json;odata=nometadata");
headers.put("DataServiceVersion","1.0;NetFx");
headers.put("MaxDataServiceVersion","1.0;NetFx");
I am getting response status code 415 with the message "JsonFormatNotSupported: JSON format is not supported."
DataServiceVersion and MaxDataServiceVersion are not necessary, but if you want to use them, change them to 3.0;NetFx.
Only 3.0 is compatible with x-ms-version 2013-08-15 or later. See the documentation.
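Putting that together, a corrected header set would look like the sketch below (in Python for brevity, mirroring the question's Java; account name, signature and date are placeholders you must compute yourself):

```python
store = "myaccount"            # placeholder storage account name
sig = "<computed-signature>"   # placeholder SharedKey signature
date = "Tue, 30 Aug 2011 01:03:21 GMT"  # RFC 1123 timestamp

headers = {
    "Authorization": "SharedKey " + store + ":" + sig,
    "x-ms-date": date,
    "x-ms-version": "2018-03-28",
    "Accept-Charset": "UTF-8",
    "Accept": "application/json;odata=nometadata",
    # Optional, but if present they must be 3.0;NetFx for JSON:
    "DataServiceVersion": "3.0;NetFx",
    "MaxDataServiceVersion": "3.0;NetFx",
}
```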
I have removed the two headers below and now I am getting the response in JSON format.
headers.put("DataServiceVersion","1.0;NetFx");
headers.put("MaxDataServiceVersion","1.0;NetFx");

Azure bing cognitive services speech to text in javascript via REST fails

In JavaScript we use recorder.js to capture microphone input, down sample it to 16kHz, encode it as a WAV file and get a blob.
Next, we obtain the raw blob bytes via a FileReader onload() callback and then use an XMLHttpRequest to send() the raw bytes to Bing.
The XMLHttpRequest includes the headers:
'Ocp-Apim-Subscription-Key' : 'xxxxxx'
'Content-Type' : 'audio/wav; codec=audio/pcm; samplerate=16000'
A sample blob size is 62456 bytes.
Firefox network tracing shows 2 interactions. The first is
Request URL: https://speech.platform.bing.com/speech/recognition/interactive/cognitiveservices/v1?language=en-US&format=simple
Request Method: OPTIONS
and the second
Request URL:https://speech.platform.bing.com/speech/recognition/interactive/cognitiveservices/v1?language=en-US&format=simple
Request Method: POST
content-length: 94476
However, I keep getting the following reply
{"RecognitionStatus":"InitialSilenceTimeout","Offset":29000000,"Duration":0}
FWIW, any idea why the source blob size of 62456 would result in content-length: 94476?
The same raw blob bytes are processed by Amazon Lex properly.
Is there any JavaScript RESTful example?
Many thanks.
/--------------------------------------------------------------
After putting together the test case below I also tried the following without success.
console.log("Send to BING blob");
var self = this;
console.log(blob);
var msUrl = 'https://speech.platform.bing.com/speech/recognition/interactive/cognitiveservices/v1';
msUrl += '?language=en-US';
msUrl += '&format=simple';
console.log(msUrl);
var xhr = new XMLHttpRequest();
xhr.onload = function(evt) { console.log('onload', xhr, evt);};
xhr.open('POST', msUrl, true);
xhr.setRequestHeader('Accept', 'application/json;text/xml');
xhr.setRequestHeader('Ocp-Apim-Subscription-Key', 'xxx');
var bingContentType = 'audio/wav; codec=audio/pcm; samplerate=16000';
xhr.setRequestHeader('Content-Type', bingContentType);
xhr.send(blob);
The shorter code version of sending to Bing was fine. The problem was that the recorder worker's encodeWAV(samples) function did not take into account the down sampling to 16000: it was incorrectly writing the captured sampling rate into the header. The lines to be tweaked are:
view.setUint32(24, downSampleRate, true);
view.setUint32(28, downSampleRate * 2, true); /*MONO*/
Apparently AWS Lex ignores the header values, as it only expects 16 kHz mono, whereas the Bing service has to look at the header information to determine which of the supported audio formats is being sent.
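The fix can be sanity-checked outside the browser. This sketch (Python here, mirroring what the recorder's JavaScript writes) sets the sample-rate and byte-rate fields at offsets 24 and 28 of a RIFF/WAV header for 16 kHz mono 16-bit PCM:

```python
import struct

sample_rate = 16000   # the down-sampled rate, not the capture rate
channels = 1          # mono
bytes_per_sample = 2  # 16-bit PCM

# Offset 24: sample rate; offset 28: byte rate
# (sample_rate * channels * bytes_per_sample, i.e. downSampleRate * 2 for mono)
header = bytearray(44)
struct.pack_into("<I", header, 24, sample_rate)
struct.pack_into("<I", header, 28, sample_rate * channels * bytes_per_sample)

# Reading them back confirms what a header-sensitive service would parse:
rate, = struct.unpack_from("<I", header, 24)
byte_rate, = struct.unpack_from("<I", header, 28)
```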
Today I came across this problem; after spending half an hour, I was able to find the real cause of my issue. Let me go through the steps mentioned in this link.
Verified my Bing Speech API is in running status.
Verified my key by running the below code in PowerShell:
$FetchTokenHeader = @{
    'Content-type' = 'application/x-www-form-urlencoded';
    'Content-Length' = '0';
    'Ocp-Apim-Subscription-Key' = ''
}
$OAuthToken = Invoke-RestMethod -Method POST -Uri https://api.cognitive.microsoft.com/sts/v1.0/issueToken -Headers $FetchTokenHeader
# Show the token received
$OAuthToken
As mentioned in the last point in that link, InitialSilenceTimeout may be the result of an unformatted/invalid WAV file. So I downloaded a new WAV file from the internet and tested with it.
Bingo, that worked. And finally, I was able to get my speech in text format.

How to make a REST call to an Azure Queue

I am able to make a C# library call to a queue using the SDK. However I am unable to make a REST call to the queue.
How shall I proceed? Any code sample will be appreciated.
Firstly, this link lists the REST operations for working with message queues that Azure Storage provides; please check the link for detailed information.
Secondly, here is a sample request to create a queue under the given account, you could construct your request like this.
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(string.Format(CultureInfo.InvariantCulture,
"https://{0}.queue.core.windows.net/{1}",
StorageAccount, queuename));
req.Method = "PUT";
req.Headers.Add("Authorization", AuthorizationHeader);
req.Headers.Add("x-ms-date", mxdate);
req.Headers.Add("x-ms-version", storageServiceVersion);
req.ContentLength = 0;
Please refer to the following code, and to Authentication for the Azure Storage Services, to construct the signature string for generating AuthorizationHeader.
string canonicalizedHeaders = string.Format(
"x-ms-date:{0}\nx-ms-version:{1}",
mxdate,
storageServiceVersion);
string canonicalizedResource = string.Format("/{0}/{1}", StorageAccount, queuename);
string stringToSign = string.Format(
"{0}\n\n\n\n\n\n\n\n\n\n\n\n{1}\n{2}",
requestMethod,
canonicalizedHeaders,
canonicalizedResource);
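The string-to-sign is then HMAC-SHA256 signed with the storage account key and base64-encoded. A sketch of that step (Python for illustration; the account key, date and x-ms-version below are made-up example values, not real credentials):

```python
import base64
import hashlib
import hmac

# Made-up base64 "account key" standing in for the real storage key
account_key = base64.b64encode(b"not-a-real-key").decode()

# PUT followed by 12 empty header slots, then canonicalized headers/resource,
# matching the stringToSign format in the C# above
string_to_sign = ("PUT\n\n\n\n\n\n\n\n\n\n\n\n"
                  "x-ms-date:Tue, 30 Aug 2011 01:03:21 GMT\n"
                  "x-ms-version:2011-08-18\n"
                  "/myaccount/myqueue")

signature = base64.b64encode(
    hmac.new(base64.b64decode(account_key),
             string_to_sign.encode("utf-8"),
             hashlib.sha256).digest()
).decode()

authorization = "SharedKey myaccount:" + signature
```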
There are examples in the official documentation:
Request:
POST https://myaccount.queue.core.windows.net/messages?visibilitytimeout=30&timeout=30 HTTP/1.1
Headers:
x-ms-version: 2011-08-18
x-ms-date: Tue, 30 Aug 2011 01:03:21 GMT
Authorization: SharedKey myaccount:sr8rIheJmCd6npMSx7DfAY3L//V3uWvSXOzUBCV9wnk=
Content-Length: 100
Body:
<QueueMessage>
<MessageText>PHNhbXBsZT5zYW1wbGUgbWVzc2FnZTwvc2FtcGxlPg==</MessageText>
</QueueMessage>
https://learn.microsoft.com/en-us/rest/api/storageservices/fileservices/put-message
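Note that the MessageText in the documentation example is the base64 encoding of the actual message. A quick sketch of building such a body (Python, for illustration):

```python
import base64

# Put Message expects the message text base64-encoded inside QueueMessage
message = "<sample>sample message</sample>"
encoded = base64.b64encode(message.encode("utf-8")).decode("ascii")
body = "<QueueMessage><MessageText>{}</MessageText></QueueMessage>".format(encoded)
```

This reproduces the MessageText value shown in the request above.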

Lightstreamer Client for Matlab

I am trying to build a Lightstreamer client for Matlab. There do exist a couple of client libraries for platforms like Java, Python, .NET etc., but unfortunately not Matlab.
However, it turns out that most of these client implementations use the same text-mode protocol for Lightstreamer, which is pretty basic HTTP requesting.
I figured out how to establish/close a Lightstreamer session. I get the sessionId and I can use this id to subscribe to the data I want to stream. But although I do get a valid response for the subscription call, no data is pushed.
I use the urlread2 function and the response seems fine:
[output,extras] = urlread2([lightstream_url,'/lightstreamer/control.txt'],'POST',body,headers);
allHeaders =
Response: {'HTTP/1.1 200 OK'}
Server: {'Lightstreamer'}
Content_Type: {'text/plain; charset=iso-8859-1'}
Cache_Control: {'no-store' 'no-cache'}
Pragma: {'no-cache'}
Expires: {'Thu, 1 Jan 1970 00:00:00 GMT'}
Date: {'Wed, 8 Apr 2015 11:15:02 GMT'}
Content_Length: {'4'}
status =
value: 200
msg: 'OK'
isGood =
1
output =
OK
It is correct that the response body contains "OK"; this is documented (documentation, page 20ff.). But the stream data itself is supposed to be there as well, isn't it?
So how do I get the actual data?
Somewhere in your code you should have a create_session.txt/bind_session.txt request; otherwise you would not have the valid session id that is required to obtain an OK answer from a control.txt request. (For example, the following generates a SYNC ERROR, meaning the server does not recognize the specified session: http://push.lightstreamer.com/lightstreamer/control.txt?LS_op=add&LS_session=invalid )
The data stream is not received on the control.txt response; that OK response simply means "OK, I have added the subscription to your session".
The data stream is received on the create_session.txt/bind_session.txt response. Sections 4.1, 4.2 and 4.5 of the document you linked explain how the data is received.
I've found that opening a polling connection by setting LS_polling=true works fine without needing a listener. urlread2 hangs if you leave LS_polling at its default of false.
Create the session with /lightstreamer/create_session.txt
Request a subscription with /lightstreamer/control.txt
Repeatedly poll the connection to get the data with /lightstreamer/bind_session.txt
The return from urlread2 will look something like this:
d =
OK
SessionId:S9b09da8ebd6b835aT5316913
ControlAddress:apd119a.marketdatasystems.com
KeepaliveMillis:1000
MaxBandwidth:0.0
RequestLimit:50000
1,1|10162.00|0.00|0.00
2,2|10686.8|TRADEABLE|0.5524861
2,13|1202.6|CLOSED|0.5714285
2,14|5900.51|CLOSED|0.5714285
...
LOOP 1000