Zapier: from spreadsheet to Facebook Offline Events tracking

I've got a database containing offline conversions (email, phone, name, purchase_amount, etc). I can export this database in .csv or .xls and I can also email this file on a daily basis to a Gmail account.
As Zapier has a Google Sheets to "Facebook offline event" integration, I tried this workflow with Zapier.com:
Export my database in .xls: OK
Mail it to my Gmail account as an email attachment: OK
Grab the attachment and upload file to Google Drive using Zapier: OK
This is the part where I'm in trouble: I want to copy the content of the .xls file that is on Google Drive to a new Google Sheet. I can't figure out how to do this in Zapier.
Finally, on every new spreadsheet created or new row added (depending on how I configure the Zap), push the data to the Facebook API.
I'm not a developer, so I want to avoid coding if possible. I thought I could do this easily with Zapier, but it seems that working with data inside a file is not so easy.
Any help would be much appreciated.
Thank you,
Best regards,
Tim.

If it were me, I would look into the scripting capabilities of Google Sheets to achieve this; having your code execute from a single place eliminates other possible points of failure. That said, I have put together a somewhat hacky, code-free solution that should set you up to do what you are looking to achieve. I break it down step by step below:
Step 1: Export the database as a .csv file. I could only get this to work with .csv files and not .xlsx files. It may be possible, but it would require further trial and error.
Step 2: Mail it to your Gmail account, where I assume you have a Zap that automatically uploads the attachment to your Drive account.
Step 3: Set up a second Zap connected to your Gmail account that triggers when you receive an email with an attachment.
Step 4: Isolate the attachment file from the results of the triggered Zap and use it as input for the following formatter action step.
Step 5: Set up your formatter action step using the Text option. Within the formatter template, select Trim Whitespace and use the attachment, isolated from the trigger step, as its input.
Step 6: Set up your final step, which is the create Google Sheet function of the Google Sheets Zap. Enter a title for your new sheet; it will probably need to be a unique value (I used the attachment ID from step one as my title, but you can set it to whatever you would like). In the headers section type =IMPORTDATA(""). Between the two quotation marks, place the output of the previous formatter step, and then run the Zap.
Explanation: When Zapier catches the attachment file from your inbound email, it seems to be stored as raw data. Given this, we cannot simply dump the information into a spreadsheet, as it would be unreadable. However, it seems Zapier has a method for converting this raw data through the endpoint https://zapier.com/engine/hydrate. When we input the raw attachment data into the formatter step, Zapier provides a link pointing to the URL for converting the data back into its original format. We take this URL and, using the Google Sheets function IMPORTDATA(), we are able to import the file via Zapier's file conversion engine. Now that the data is in your new sheet, you can set up an additional Zap to do something with it. Also note that the Zap to upload the attachment to your Google Drive is not necessary with this setup. That said, if you want to keep backups of your data, leave it on; otherwise you can save yourself a Zap.
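To make that last step concrete, the header cell of the new sheet ends up holding a formula along these lines (the hydrate URL below is purely illustrative; use whatever link the formatter step actually outputs for your attachment):

=IMPORTDATA("https://zapier.com/engine/hydrate/123456/.example-key")

When the sheet loads, Google Sheets fetches that URL, Zapier's conversion engine serves back the original .csv content, and the rows populate automatically.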
Hope this helps!

Many thanks for your awesome reply. I had also tried "Trim Whitespace" to get the data back; I was only missing the IMPORTDATA function, which is super powerful. Indeed, it only works with .csv. With an .xls file, IMPORTDATA returns the raw contents of the .xls file, which is useless.
I ended up with two Zaps:
Grab the Gmail attachment, upload it to Google Drive (for backup and monitoring), and create a new spreadsheet.
Send the Facebook offline conversion when a new spreadsheet is added (filter: only continue when the file name is xxxx), look up a spreadsheet row (I used a column that has the same value in every row), and finally I could match my columns with the Facebook API (see the sketch below).
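For anyone curious what that second Zap is doing behind the scenes, here is a rough Python sketch of an upload to Facebook's Offline Conversions API. It is only an illustration of the request shape: the event set ID, access token, API version, and field values are placeholders, and Zapier handles the hashing and field mapping for you.

import hashlib
import json
import time

import requests

# Placeholders -- use your own offline event set ID and access token.
OFFLINE_EVENT_SET_ID = "1234567890"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"

def sha256(value):
    # Facebook expects match keys (email, phone, etc.) normalized and SHA-256 hashed.
    return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()

event = {
    "match_keys": {"email": sha256("customer@example.com")},
    "event_time": int(time.time()),
    "event_name": "Purchase",
    "currency": "EUR",
    "value": 49.90,
}

resp = requests.post(
    f"https://graph.facebook.com/v2.10/{OFFLINE_EVENT_SET_ID}/events",
    data={
        "access_token": ACCESS_TOKEN,
        "upload_tag": "daily_csv_upload",  # arbitrary label for this batch
        "data": json.dumps([event]),       # list of offline events, JSON-encoded
    },
)
print(resp.status_code, resp.text)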

Related

Google Trends monthly data export

If I write the following URL in the browser, I get a CSV file with weekly data from Google Trends:
www.google.com/trends/trendsReport?q=SearchTerm&export=1
In the past, this file was the same as the one obtained by manually clicking the "export as CSV" button on the Google Trends page. But now, using that button instead gives me a data file with monthly data.
Does anyone know whether it is possible to obtain this monthly file using a URL similar to the one above?
You could use the date parameter with your request:
www.google.com/trends/trendsReport?q=SearchTerm&export=1&date=01/2010 37m
Some pages recommend using more than 36 months, but I tried with 140 months and it always returned weekly data. Check this page for more reference: https://github.com/GeneralMills/pytrends
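If you would rather not fight the raw URLs, the pytrends library linked above wraps this. A minimal sketch (the keyword and timeframe are just examples, and the interface may differ slightly between versions):

from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)
# Timeframes spanning more than roughly five years come back at monthly resolution.
pytrends.build_payload(["pope"], timeframe="2004-01-01 2016-10-07")
df = pytrends.interest_over_time()
df.to_csv("trends_monthly.csv")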
Also, you could check how to call this URL:
https://www.google.com.mx/trends/api/widgetdata/multiline/csv?req=%7B%22time%22%3A%222004-01-01%202016-10-07%22%2C%22resolution%22%3A%22MONTH%22%2C%22locale%22%3A%22es-419%22%2C%22comparisonItem%22%3A%5B%7B%22geo%22%3A%7B%7D%2C%22complexKeywordsRestriction%22%3A%7B%22keyword%22%3A%5B%7B%22type%22%3A%22BROAD%22%2C%22value%22%3A%22pope%22%7D%5D%7D%7D%5D%2C%22requestOptions%22%3A%7B%22property%22%3A%22%22%2C%22backend%22%3A%22IZG%22%2C%22category%22%3A0%7D%7D&token=APP6_UEAAAAAV_kTww_1wNWLYGrce91gQBIxkGPV4lGg&tz=300
This returns monthly data, but it appears you need to use a token.
Decoded URL:
https://www.google.com.mx/trends/api/widgetdata/multiline/csv?req={"time":"2004-01-01 2016-10-07","resolution":"MONTH","locale":"es-419","comparisonItem":[{"geo":{},"complexKeywordsRestriction":{"keyword":[{"type":"BROAD","value":"pope"}]}}],"requestOptions":{"property":"","backend":"IZG","category":0}}&token=APP6_UEAAAAAV_kTww_1wNWLYGrce91gQBIxkGPV4lGg&tz=300

(Drupal) Webform2PDF: blank submitted data tokens when PDF is sent by mail

I have run into an issue with the webform2pdf module which I have been unable to solve for a few days. I am using Commerce Kickstart as the Drupal Commerce distribution for handling all the shopping functionality, and we needed to add a webform for the returns policy (required by law in my country).
This form has many fields, such as when you purchased the product, what its serial number is, etc. The webform2pdf module is used to send the submitted data as a PDF attached to an email. But the PDF received by mail has blank data tokens, no matter what I try. The weird thing is that when I hit "download PDF" in the Drupal site administration, it fills in the data tokens correctly.
I have tried many tokens, all of these:
[submission:values:meno:withlabel]
[submission:values:meno]
[webform:val-meno]
[webform:meno]
%email[meno]
%email_values
%label_nl[meno]
%nl[meno]
%label_all[meno]
%label_all_nl[meno]
%all[meno]
%all_nl[meno]
[submission:values]
%value[meno]
None of them works; most of them simply print the token label. For example, [submission:values] prints all the labels but no data.
Higher up in the mail, I also have this token:
%label_all[typ_servisu]
which prints correctly when sent via mail, but refuses to print when using "download PDF" in the administration. This token holds a select/radio buttons field.
I have also tried sending the PDF as an attachment via Rules, but with no success. I am not a very experienced developer.
Any advice would be greatly appreciated. Thank you.

Extract Email Address from Nested Tables in HTML Emails

I will be receiving approx 500 emails over the next day or so. The emails are all identical in layout, i.e. HTML emails that display information inside a table. There are nested tables within the email.
I need to extract the email address from each email and store it in a file (text/csv). Rather than copy and paste, is there a PHP script or some browser plugin I can use to do this?
GF
I would use PHP to parse by id, class, or name, with a little script to go through all the pages inside that folder, and to extract it as a CSV I'd use fputcsv.
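The suggestion above is PHP with fputcsv; for illustration, here is the same idea sketched in Python instead, assuming the emails have been saved as .html files into a folder (folder and file names are placeholders). A regex keeps it dependency-free; if the address always sits in the same table cell, parsing by id/class as suggested is more precise.

import csv
import pathlib
import re

EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

addresses = set()
for path in pathlib.Path("saved_emails").glob("*.html"):  # placeholder folder of saved emails
    addresses.update(EMAIL_RE.findall(path.read_text(errors="ignore")))

with open("addresses.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for address in sorted(addresses):
        writer.writerow([address])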
Hope it helped.

Troubleshooting the Champagne extension in ExpressionEngine

I have an issue in ExpressionEngine using the Champagne extension, where it won't allow me to send out campaigns. This extension uses the https://www.campaignmonitor.com/ API to send out mass emails.
The error I get is "HTML Content URL Required" when I try to send out campaigns from the back end of the ExpressionEngine install.
What could be causing this issue in the ExpressionEngine install?
This error is received any time the URL to your HTML or text content is not visible. Most often this is seen when someone forgets to include the text version. A good way to test is to click the preview HTML/TEXT button and make sure both give you the correct results.
They cannot be blank.
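For context, this is roughly the shape of the request the extension makes when it creates a campaign through the Campaign Monitor API; the field names reflect my understanding of that API, and all values below are placeholders. Both content URLs have to point at pages Campaign Monitor can actually fetch; if either is blank, the campaign is rejected.

import requests

API_KEY = "your-campaignmonitor-api-key"  # placeholder
CLIENT_ID = "your-client-id"              # placeholder

payload = {
    "Name": "October newsletter",
    "Subject": "News from us",
    "FromName": "Example Co",
    "FromEmail": "news@example.com",
    "ReplyTo": "news@example.com",
    # Both URLs must be publicly reachable; otherwise the API rejects the campaign.
    "HtmlUrl": "https://example.com/campaigns/october.html",
    "TextUrl": "https://example.com/campaigns/october.txt",
    "ListIDs": ["your-list-id"],
    "SegmentIDs": [],
}

resp = requests.post(
    f"https://api.createsend.com/api/v3/campaigns/{CLIENT_ID}.json",
    json=payload,
    auth=(API_KEY, "x"),  # Campaign Monitor uses the API key as the basic-auth username
)
print(resp.status_code, resp.text)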
For an alternative solution, be sure to check out my add-on, Postmaster. It allows clients to publish email campaigns just by creating a new entry. You can set up any number of configurations, so you can even send draft emails to a test subscriber list. And since everything is within channel entries, you can use whatever fieldtypes you like, and it works with MailChimp as well as Campaign Monitor.
https://objectivehtml.com/articles/postmaster-the-definitive-email-solution-for-expressionengine

How to automatically save emails to ASCII files?

I have a data stream that will be sent as daily emails containing temperature and wind speed from a measurement site. I would like to automatically filter these emails out from the other emails I receive, then save each email's body content to its own text file. Each text file must have a distinct file name; for example, it could include the time that the email was sent or received. All files must end up in a chosen directory. And ideally the process would be robust enough that it could run unattended for weeks. Our email system is Outlook, but I could choose to send the email to my Gmail account, for example. What is the big picture of how to do this?
Bigger picture: create a VBA script that runs on the Items_ItemAdd event, which fires whenever an email arrives.
Specifics: Use the solution on this page, but in the Items_ItemAdd routine change the olSaveAsMsg to olSaveAsTxt to get the text format you want.
Note that the file name format in the example should match what you need, but you'll need to add criteria to the Items_ItemAdd routine to check that the message is one that you want to save. For example, you could read the Item.Subject property.
That means you are working with Exchange. I suggest using the IMAP protocol to read the mails; then you will be able to save the body.
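If you go the IMAP route, a small script run on a schedule can do the whole job. Below is a minimal Python sketch, assuming the measurement emails can be recognised by their subject line and that IMAP access is enabled on the account; the server, credentials, subject filter, and output folder are placeholders.

import email
import imaplib
import pathlib
from email.utils import parsedate_to_datetime

IMAP_SERVER = "imap.gmail.com"          # placeholder; use your own mail server
USERNAME = "you@example.com"            # placeholder
PASSWORD = "app-password"               # placeholder
SUBJECT_FILTER = "Daily measurements"   # placeholder subject used to spot the data mails
OUT_DIR = pathlib.Path("measurement_mails")

OUT_DIR.mkdir(exist_ok=True)

imap = imaplib.IMAP4_SSL(IMAP_SERVER)
imap.login(USERNAME, PASSWORD)
imap.select("INBOX")

# Only fetch unread messages whose subject matches the data feed.
_, data = imap.search(None, f'(UNSEEN SUBJECT "{SUBJECT_FILTER}")')
for num in data[0].split():
    _, msg_data = imap.fetch(num, "(RFC822)")
    msg = email.message_from_bytes(msg_data[0][1])

    # Use the sent time to build a distinct file name.
    stamp = parsedate_to_datetime(msg["Date"]).strftime("%Y%m%d_%H%M%S")

    # Grab the plain-text body (fall back to an empty body for odd messages).
    if msg.is_multipart():
        parts = [p for p in msg.walk() if p.get_content_type() == "text/plain"]
        body = parts[0].get_payload(decode=True) if parts else b""
    else:
        body = msg.get_payload(decode=True) or b""

    (OUT_DIR / f"{stamp}.txt").write_bytes(body)

imap.logout()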