Azure Function BlobTrigger fires, but blob not found

I finally managed to reproduce this issue with the code below.
I am simply triggering a durable function from a blob trigger, and in one of the activity functions I read the blob. But when I read the blob, I get an error saying that the blob doesn't exist.
Can someone explain what I am doing wrong here?
Code:
[FunctionName("BlobTrigger")]
public static async void Trigger(
[BlobTrigger("incoming-blob/{filename}", Connection = "")]Stream myBlob,
[OrchestrationClient]DurableOrchestrationClient starter,
string filename,
ILogger log)
{
var instanceId = await starter.StartNewAsync("Orchestrator", filename);
}
[FunctionName("Orchestrator")]
public static async Task RunOrchestrator(
[OrchestrationTrigger] DurableOrchestrationContext context)
{
var filename = context.GetInput<string>();
await context.CallActivityAsync("Read_Blob", filename);
}
[FunctionName("Read_Blob")]
public static async Task Activity(
[ActivityTrigger] string filename,
[Blob("incoming-blob")] CloudBlobContainer container,
ILogger log)
{
var stream = new MemoryStream();
var blob = container.GetBlockBlobReference(filename);
await blob.DownloadToStreamAsync(stream);
//EXCEPTION THROWN AT ABOVE LINE.
stream.Dispose();
}

It looks like you are calling container.GetBlockBlobReference("filename") with a literal string, rather than container.GetBlockBlobReference(filename) using the passed-in [ActivityTrigger] string filename parameter.
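In other words, as a minimal before/after sketch of that line:

// Wrong: asks for a blob literally named "filename"
var blob = container.GetBlockBlobReference("filename");

// Right: uses the filename passed in from the orchestrator
var blob = container.GetBlockBlobReference(filename);

If timing could also be a factor (the trigger firing before the blob is fully visible), a cheap guard such as await blob.ExistsAsync() before downloading can help rule that out; that is a general precaution, not something the question confirms is needed.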

Related

Generating a corrupted PDF file from a Visualforce page

I am trying to generate a PDF file attachment in an after-update trigger; below is the trigger handler method, but the file that is sent is corrupted.
public static void sendEOB(List<Claim_Payment__c> newList, Map<Id, Claim_Payment__c> oldMap) {
    for (Claim_Payment__c claimPayment : newList) {
        if (claimPayment.Status__c == 'Pending') {
            String sfUrl = URL.getSalesforceBaseUrl().getHost();
            String myURL = 'https://' + sfUrl + '/apex/ClaimPayments?Id=' + claimPayment.Id;
            PageReference pdf = new PageReference(myURL);
            // The contents of the report in PDF form
            Blob body;
            try {
                body = pdf.getContent();
            } catch (VisualforceException e) {
                body = Blob.valueOf('PDF Get Failed');
            }
            Messaging.EmailFileAttachment attach = new Messaging.EmailFileAttachment();
            attach.setContentType('application/pdf');
            attach.setFileName('Payment_Statement.pdf');
            attach.setInline(false);
            attach.Body = body;
            Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
            mail.setUseSignature(false);
            mail.setToAddresses(new String[] { 'whatever@gmail.com' });
            mail.setSubject('Payment Statement');
            mail.setHtmlBody('Important Statement Attachment!');
            mail.setFileAttachments(new Messaging.EmailFileAttachment[] { attach });
            // Send the email
            Messaging.sendEmail(new Messaging.SingleEmailMessage[] { mail });
        }
    }
}
Update:
Using @future(callout=true) and getContentAsPDF(), I now get this error message: FATAL_ERROR Internal Salesforce.com Error on executing body = pdf.getContent();
public static void sendEOB(List<Claim_Payment__c> newList, Map<Id, Claim_Payment__c> oldMap) {
    for (Claim_Payment__c claimPayment : newList) {
        if (claimPayment.Status__c == 'Pending') {
            String sfUrl = URL.getSalesforceBaseUrl().getHost();
            String myURL = 'https://' + sfUrl + '/apex/ClaimPayments?Id=' + claimPayment.Id;
            sendPDFEmail.sendPDF(myURL);
        }
    }
}

public class sendPDFEmail {
    @future(callout=true)
    public static void sendPDF(String myURL) {
        PageReference pdf = new PageReference(myURL);
        // The contents of the report in PDF form
        Blob body;
        body = pdf.getContentAsPDF();
        // Rest of the implementation goes below
    }
}
What errors (if any) do you see in the debug log? From what I remember, you might need @future(callout=true) or other asynchronous processing (Queueable, etc.) because calling getContent counts as a callout. And shouldn't that be getContentAsPdf()? Or is the page already set to renderAs="pdf"?
If you download the file, what size is it: a couple of bytes or bigger? That try-catch is swallowing the actual error message; you don't even log it with System.debug(e);. Remove the try-catch and check in the debug log what exactly it explodes on.

HTTP for images between client and server sends empty stream

I am trying to send a stream (containing an image file) from a WASM client to a backend .NET Core 5 server. In the WASM app, I start with a MemoryStream that contains the file data. In order to send the data contained in this MemoryStream using HttpClient.PostAsync, I seem to have to convert it to a StreamContent object:
StreamContent streamContent = new StreamContent(imageMemoryStream);
I use the debugger to verify that the length of the content of streamContent is not zero at this point. So far so good.
I then use HttpClient.PostAsync to send this stream to the server:
var response = await Http.PostAsync("api/HttpStreamReceiver", streamContent);
On the server side, I have a controller that receives HTTP messages:
[Route("api/[controller]")]
[ApiController]
public class HttpStreamReceiverController : ControllerBase
{
[HttpPost]
public async Task<ActionResult> Get()
{
Stream imageStream;
try
{
imageStream = Request.Body;
}
catch (Exception)
{
return new BadRequestObjectResult("Error saving file");
}
}
}
Here, it seems that Request.Body is empty. Trying to evaluate the length of either Request.Body or imageStream on the server side results in a System.NotSupportedException, and
await imageStream.ReadAsync(buffer);
leaves the buffer empty. What am I doing wrong here?
The image file cannot be transmitted through the request body unless it is serialized appropriately. I suggest you use MultipartFormDataContent to pass the file.
This is an example.
class Program
{
    static async Task Main(string[] args)
    {
        string filePath = @"D:\upload\images\1.png";
        HttpClient _httpClient = new HttpClient();
        string _url = "https://localhost:44324/api/HttpStreamReceiver/";
        if (string.IsNullOrWhiteSpace(filePath))
        {
            throw new ArgumentNullException(nameof(filePath));
        }
        if (!File.Exists(filePath))
        {
            throw new FileNotFoundException($"File [{filePath}] not found.");
        }
        // Create the multipart form
        using var form = new MultipartFormDataContent();
        // Read the file into a byte array
        byte[] buffer = File.ReadAllBytes(filePath);
        var fileContent = new ByteArrayContent(buffer);
        fileContent.Headers.ContentType = MediaTypeHeaderValue.Parse("image/png");
        form.Add(fileContent, "image", Path.GetFileName(filePath));
        // The other data in the form goes here
        var response = await _httpClient.PostAsync($"{_url}", form);
        response.EnsureSuccessStatusCode();
        var responseContent = await response.Content.ReadAsStringAsync();
    }
}
Web API:
[Route("api/[controller]")]
[ApiController]
public class HttpStreamReceiverController: ControllerBase
{
[HttpPost]
public async Task<ActionResult> Get(IFormFile image)
{
//...
return Ok("get");
}
}
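On the client side, the same multipart approach works with the MemoryStream from the question. A minimal sketch, assuming imageMemoryStream and Http are the stream and HttpClient from the question, and that the image is a PNG; the file name "upload.png" is just a placeholder:

// Rewind the stream in case it was just written to.
imageMemoryStream.Position = 0;
var imageContent = new StreamContent(imageMemoryStream);
imageContent.Headers.ContentType = MediaTypeHeaderValue.Parse("image/png");

using var form = new MultipartFormDataContent();
// "image" must match the IFormFile parameter name in the controller action.
form.Add(imageContent, "image", "upload.png");

var response = await Http.PostAsync("api/HttpStreamReceiver", form);
response.EnsureSuccessStatusCode();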

PUT requests to LogStash fail when sent using HttpClient, succeed when sent using cURL

Here is my logstash.conf file. (Apologies for not pasting the code here directly; StackOverflow does not allow posts exceeding a certain code-to-text ratio.)
My remote VM, which also hosts my ElasticSearch and LogStash servers, listens on Port 8080.
On my local machine, I periodically send zipped folders (containing JSON documents) over TCP to my remote server, which receives the data into a memory stream, unzips the folders, and sends the contents to LogStash. LogStash in turn forwards the data to ElasticSearch.
I am currently testing the workflow with some dummy data.
On my remote server, here is the method for receiving data over TCP:
private static void ReceiveAndUnzipElasticSearchDocumentFolder(int numBytesExpectedToReceive)
{
    int numBytesLeftToReceive = numBytesExpectedToReceive;
    using (MemoryStream zippedFolderStream = new MemoryStream(new byte[numBytesExpectedToReceive]))
    {
        while (numBytesLeftToReceive > 0)
        {
            // Receive data in small packets
        }
        zippedFolderStream.Unzip(afterReadingEachDocument: LogStashDataSender.Send);
    }
}
Here is the code for unzipping the received folder:
public static class StreamExtensions
{
    public static void Unzip(this Stream zippedElasticSearchDocumentFolderStream, Action<ElasticSearchJsonDocument> afterReadingEachDocument)
    {
        JsonSerializer jsonSerializer = new JsonSerializer();
        foreach (ZipArchiveEntry entry in new ZipArchive(zippedElasticSearchDocumentFolderStream).Entries)
        {
            using (JsonTextReader jsonReader = new JsonTextReader(new StreamReader(entry.Open())))
            {
                dynamic jsonObject = jsonSerializer.Deserialize<ExpandoObject>(jsonReader);
                string jsonIndexId = jsonObject.IndexId;
                string jsonDocumentId = jsonObject.DocumentId;
                afterReadingEachDocument(new ElasticSearchJsonDocument(jsonObject, jsonIndexId, jsonDocumentId));
            }
        }
    }
}
And here is the method for sending data to LogStash:
public static async void Send(ElasticSearchJsonDocument document)
{
    HttpResponseMessage response =
        await httpClient.PutAsJsonAsync(
            IsNullOrWhiteSpace(document.DocumentId)
                ? $"{document.IndexId}"
                : $"{document.IndexId}/{document.DocumentId}",
            document.JsonObject);
    try
    {
        response.EnsureSuccessStatusCode();
    }
    catch (Exception exception)
    {
        Console.WriteLine(exception.Message);
    }
    Console.WriteLine($"{response.Content}");
}
The httpClient referenced in the public static async void Send(ElasticSearchJsonDocument document) method was created using the following code:
private const string LogStashHostAddress = "http://127.0.0.1";
private const int LogStashPort = 31311;
httpClient = new HttpClient { BaseAddress = new Uri($"{LogStashHostAddress}:{LogStashPort}/") };
httpClient.DefaultRequestHeaders.Accept.Clear();
httpClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
When I step into a new debug instance, the program runs smoothly but dies immediately after executing await httpClient.PutAsJsonAsync for each of the documents contained inside the zipped folder; response.EnsureSuccessStatusCode(); is never hit, and neither is Console.WriteLine(exception.Message); nor Console.WriteLine($"{response.Content}");.
Here is an example of ElasticSearchJsonDocument that is passed to the public static async void Send(ElasticSearchJsonDocument document) method:
When I ran the same PUT request using cURL, the Book index was successfully created, and I could then send a GET request to retrieve the data from ElasticSearch.
My questions are:
Why did the program die immediately (with no visible exception messages) after executing await httpClient.PutAsJsonAsync(...) for each of the JSON documents inside the received zipped folder?
What changes should I make to ensure that I can make successful PUT requests to LogStash using a HttpClient instance?
I changed my httpClient instantiation code from
httpClient = new HttpClient { BaseAddress = new Uri($"{LogStashHostAddress}:{LogStashPort}/") };
httpClient.DefaultRequestHeaders.Accept.Clear();
httpClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
to
httpClient = new HttpClient();
httpClient.DefaultRequestHeaders.Accept.Clear();
httpClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
And I changed await httpClient.PutAsJsonAsync(...) to
HttpResponseMessage response =
    await httpClient.PutAsJsonAsync(
        IsNullOrWhiteSpace(document.DocumentId)
            ? $"{LogStashHostAddress}:{LogStashPort}/{document.IndexId}"
            : $"{LogStashHostAddress}:{LogStashPort}/{document.IndexId}/{document.DocumentId}",
        document.JsonObject);
response.EnsureSuccessStatusCode();
It turns out that the BaseAddress property of HttpClient is extremely user-unfriendly (relative URIs only combine with it as expected when the base address ends with a slash and the relative path does not start with one), so instead of wasting more time on it, I decided to eliminate it entirely.
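For reference, a consolidated sketch of the final send method under the same names as above. The switch from async void to async Task is my addition, not part of the original fix: it lets the caller await the request, so failures surface instead of the process dying silently (which may be why none of the Console.WriteLine calls were ever hit):

private const string LogStashHostAddress = "http://127.0.0.1";
private const int LogStashPort = 31311;
private static readonly HttpClient httpClient = new HttpClient();

// async Task (not async void) so callers can await and observe failures.
public static async Task SendAsync(ElasticSearchJsonDocument document)
{
    string url = string.IsNullOrWhiteSpace(document.DocumentId)
        ? $"{LogStashHostAddress}:{LogStashPort}/{document.IndexId}"
        : $"{LogStashHostAddress}:{LogStashPort}/{document.IndexId}/{document.DocumentId}";

    HttpResponseMessage response = await httpClient.PutAsJsonAsync(url, document.JsonObject);
    response.EnsureSuccessStatusCode();
}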

Spring Boot REST service to download a zip file containing multiple files

I am able to download a single file, but how can I download a zip file that contains multiple files?
Below is the code to download a single file, but I have multiple files to download. Any help would be greatly appreciated, as I have been stuck on this for the last two days.
@GET
@Path("/download/{fname}/{ext}")
@Produces(MediaType.APPLICATION_OCTET_STREAM)
public Response downloadFile(@PathParam("fname") String fileName, @PathParam("ext") String fileExt) {
    File file = new File("C:/temp/" + fileName + "." + fileExt);
    ResponseBuilder rb = Response.ok(file);
    rb.header("Content-Disposition", "attachment; filename=" + file.getName());
    Response response = rb.build();
    return response;
}
Here is my working code. I have used response.getOutputStream().
@RestController
public class DownloadFileController {

    @Autowired
    DownloadService service;

    @GetMapping("/downloadZip")
    public void downloadFile(HttpServletResponse response) {
        response.setContentType("application/octet-stream");
        response.setHeader("Content-Disposition", "attachment;filename=download.zip");
        response.setStatus(HttpServletResponse.SC_OK);
        List<String> fileNames = service.getFileName();
        System.out.println("############# file size ###########" + fileNames.size());
        try (ZipOutputStream zippedOut = new ZipOutputStream(response.getOutputStream())) {
            for (String file : fileNames) {
                FileSystemResource resource = new FileSystemResource(file);
                ZipEntry e = new ZipEntry(resource.getFilename());
                // Configure the zip entry: the properties of the file
                e.setSize(resource.contentLength());
                e.setTime(System.currentTimeMillis());
                // etc.
                zippedOut.putNextEntry(e);
                // And the content of the resource:
                StreamUtils.copy(resource.getInputStream(), zippedOut);
                zippedOut.closeEntry();
            }
            zippedOut.finish();
        } catch (Exception e) {
            // Exception handling goes here
        }
    }
}
Service class:
public class DownloadServiceImpl implements DownloadService {

    @Autowired
    DownloadServiceDao repo;

    @Override
    public List<String> getFileName() {
        String[] fileName = { "C:\\neon\\FileTest\\File1.xlsx", "C:\\neon\\FileTest\\File2.xlsx", "C:\\neon\\FileTest\\File3.xlsx" };
        List<String> fileList = new ArrayList<>(Arrays.asList(fileName));
        return fileList;
    }
}
Use the Spring-provided abstractions org.springframework.core.io.Resource and org.springframework.core.io.InputStreamSource to avoid loading the whole file into memory. This way, your underlying implementation can change without changing the controller interface, and your downloads are streamed byte by byte.
See the accepted answer here, which uses org.springframework.core.io.FileSystemResource to create a Resource and also contains the logic to create a zip file on the fly. That answer has a return type of void, whereas you should directly return a Resource or ResponseEntity<Resource>.
As demonstrated in this answer, loop over your actual files and put them into the zip stream. Have a look at the produces and content-type headers.
Combine these two answers to get what you are trying to achieve.
public void downloadSupportBundle(HttpServletResponse response) {
    File file = new File("supportbundle.tar.gz");
    Path path = Paths.get(file.getAbsolutePath());
    logger.debug("__path {} - absolute Path {}", path.getFileName(), path.getRoot().toAbsolutePath());

    response.setContentType("application/octet-stream");
    response.setHeader("Content-Disposition", "attachment;filename=supportbundle.tar.gz");
    response.setStatus(HttpServletResponse.SC_OK);
    System.out.println("############# file name ###########" + file.getName());

    try (ZipOutputStream zippedOut = new ZipOutputStream(response.getOutputStream())) {
        FileSystemResource resource = new FileSystemResource(file);
        ZipEntry e = new ZipEntry(resource.getFilename());
        e.setSize(resource.contentLength());
        e.setTime(System.currentTimeMillis());
        zippedOut.putNextEntry(e);
        StreamUtils.copy(resource.getInputStream(), zippedOut);
        zippedOut.closeEntry();
        zippedOut.finish();
    } catch (Exception e) {
        // Exception handling goes here
    }
}

Setting timeouts with Google Cloud Storage JSON API

I'm writing some bytes to gcs and would like to use the JSON API wrappers provided by Google, but with a timeout. Currently I have this:
storage = new Storage.Builder(GoogleNetHttpTransport...)
StorageObject storageObject = new StorageObject().setBucket(bucket).setName(path);
Storage.Objects.Insert insertObject =
    storage.objects().insert(bucket, storageObject, content).setName(path);
insertObject.execute();
Is there a simple way to add a timeout to either CloudStorage, StorageObject or the .execute?
It turns out that the storage abstraction com.google.api.services.storage.Storage has a way to set timeouts at initialization, via an HttpRequestInitializer separate from your credentials.
If you have a MyGCSAbstraction that you create for each GCS operation, you can do the following:
private static HttpRequestInitializer setHttpTimeout(final HttpRequestInitializer requestInitializer) {
    return new HttpRequestInitializer() {
        @Override
        public void initialize(HttpRequest httpRequest) throws IOException {
            requestInitializer.initialize(httpRequest);
            httpRequest.setConnectTimeout(1000); // ms
            httpRequest.setReadTimeout(1000); // ms
        }
    };
}
MyGCSAbstraction(String applicationName, Credential credential) throws GeneralSecurityException, IOException {
    Builder builder = new Storage.Builder(GoogleNetHttpTransport.newTrustedTransport(), JacksonFactory.getDefaultInstance(), setHttpTimeout(credential));
    builder.setApplicationName(applicationName);
    storage = builder.build();
}