I'm running a website that depends on some data which changes daily. Each day at a specific time, a webhook to Netlify triggers a build that grabs the data and saves it to the /static/data folder of Gridsome. After the update is completed, the script that collects the data also saves a JSON file with the date when the last update of the data occurred.
My problem is that I've committed this data to GitHub, and every time I push a commit, Netlify copies the previous data from the /static/data folder to the live site, overwriting the existing data and thereby exposing older data to my calls, since the committed copy is in most cases older!
How do I tackle this issue? Would it be safe to add the /static/data folder to .gitignore? The decision to write new data is made by comparing today's date with the one I read from mysite.org/data/update.json.
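For reference, the check I have in mind is roughly the following (a rough sketch in Python for brevity; mysite.org/data/update.json, the "updated" key, and the local path are just placeholders for my setup):

```python
from datetime import date, datetime
import json
import urllib.request

# Placeholder URL of the last-update marker published with the live site.
UPDATE_URL = "https://mysite.org/data/update.json"
LOCAL_DATA_DIR = "static/data"  # Gridsome static data folder

def needs_update() -> bool:
    """Compare today's date with the date of the last published update."""
    with urllib.request.urlopen(UPDATE_URL) as resp:
        last_update = json.load(resp)  # e.g. {"updated": "2022-09-05"}
    last_date = datetime.strptime(last_update["updated"], "%Y-%m-%d").date()
    return date.today() > last_date

if __name__ == "__main__":
    if needs_update():
        print("Data is stale, fetch fresh data into", LOCAL_DATA_DIR)
    else:
        print("Data is already up to date, skip the fetch")
```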
Any help will be appreciated.
Thanks!
I am new to the GitHub APIs and I was wondering if there is a way to get all the file names that have been committed since a given date. I am aware that the "since" filter exists, but it only gives back the most recent file, not all the files.
Here is my HTTP GET request:
https://api.github.com/repos/{Username}/{Repository-name}/commits/main?since=2021-06-1T08:15:46Z
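Based on the API docs, a two-step approach might work: list the commits since the date, then fetch each commit individually, since the single-commit endpoint includes the list of files it touched. A rough Python sketch (owner, repo, and date are placeholders, and pagination of the commit list is not handled):

```python
import requests

OWNER = "{Username}"          # placeholder
REPO = "{Repository-name}"    # placeholder
SINCE = "2021-06-01T08:15:46Z"

BASE = f"https://api.github.com/repos/{OWNER}/{REPO}"

# Step 1: list commits on the default branch since the given date.
commits = requests.get(f"{BASE}/commits", params={"since": SINCE}).json()

# Step 2: fetch each commit individually; the single-commit endpoint
# includes a "files" array with the paths touched by that commit.
changed_files = set()
for commit in commits:
    detail = requests.get(f"{BASE}/commits/{commit['sha']}").json()
    for f in detail.get("files", []):
        changed_files.add(f["filename"])

print(sorted(changed_files))
```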
I'm new to GitHub, and I recently finished an action that automatically crawls and processes data on a daily basis. So, after the workflow runs, I have the latest dataset.
My question is: in the GitHub README file, is there a way to show the last date in my dataset?
For example, after my daily workflow finishes, the last row of my dataset is '05/09/2022', and I want to see that in my README file without manually editing it.
I tried to Google it, but haven't found anything, maybe because I don't know how to phrase the right question?
Was wondering if anyone knows how to achieve this?
Thanks in advance.
Update:
I found a way to display code in the README file, called a permalink. It can show exactly what I want, but I need it to show data rather than code.
There is a GitHub Action that can help maintain the README file based on some data files. For example, you can easily maintain a big Markdown-based table from some YAML files.
See also https://github.com/LinuxSuRen/yaml-readme
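If you'd rather not pull in an action, the workflow can also run a small script after the data step that rewrites a marker in the README and commits the result. A rough Python sketch of that idea (the file names, the marker comments, and the date column are assumptions about your setup):

```python
import csv
import re

DATA_FILE = "data.csv"   # assumed name of the daily dataset
README = "README.md"
# Assumes the README contains: Last updated: <!--DATE-->05/09/2022<!--/DATE-->
PATTERN = re.compile(r"<!--DATE-->.*?<!--/DATE-->")

# Read the date from the last row of the dataset (assumed to be in the first column).
with open(DATA_FILE, newline="") as f:
    last_row = list(csv.reader(f))[-1]
last_date = last_row[0]

with open(README) as f:
    readme = f.read()

with open(README, "w") as f:
    f.write(PATTERN.sub(f"<!--DATE-->{last_date}<!--/DATE-->", readme))
```

The workflow step that runs this script would then commit the modified README back to the repository, so the date shown always matches the latest dataset.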
I am Peter and I am a Siri Shortcuts creator. My problem is that I store my images (base64) in a JSON file. In my 'shortcut' I use a cache system so it won't download the file every time. What I want to do is update another file with a version number every time a new version of that JSON file comes out (commit 1 is v1-v2, commit 2 is v2-v3, and so on). I want that version number to update automatically every time I update the base64 JSON file, so that my shortcut can check each time whether a cache update is needed.
I don't know how to do this, so help would be appreciated!
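What I picture is a small script run automatically on every update (for example as a step in a GitHub Action). A rough Python sketch (images.json and version.json are just the names I'd use, not anything existing):

```python
import hashlib
import json
from pathlib import Path

IMAGES_FILE = Path("images.json")    # assumed name of the base64 JSON file
VERSION_FILE = Path("version.json")  # assumed name of the version marker

# Hash the current images file so the version only bumps when its content changed.
digest = hashlib.sha256(IMAGES_FILE.read_bytes()).hexdigest()

if VERSION_FILE.exists():
    state = json.loads(VERSION_FILE.read_text())
else:
    state = {"version": 0, "hash": ""}

if state["hash"] != digest:
    state = {"version": state["version"] + 1, "hash": digest}
    VERSION_FILE.write_text(json.dumps(state, indent=2))
    print(f"Bumped to v{state['version']}")
else:
    print("images.json unchanged, version stays the same")
```

The shortcut would then download only version.json (which is tiny) and compare the version number against the cached one before deciding whether to re-download the images.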
We have a couple of Excel files that are used by several departments, and we would like to track which user saved a file and when. Excel can show some of this info, but it is not accurate if the file was recovered.
These files can be updated and then recovered to a prior version. There is a need to share all the info about the time and user of each save of the file over the last month.
Is this achievable using PowerShell?
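For context, this is the kind of metadata readable from the file itself (shown here as a Python/openpyxl sketch rather than PowerShell, and the file name is a placeholder); since each save overwrites these properties, what is still needed on top of this is some way to record them after every save:

```python
from openpyxl import load_workbook

# Read the "last saved by" metadata Excel stores in the workbook core properties.
# Note: each save overwrites these values, and restoring a prior version restores
# the old metadata with it, so this alone cannot give a per-save history.
wb = load_workbook("shared_report.xlsx")  # placeholder file name
props = wb.properties
print("Last modified by:", props.lastModifiedBy)
print("Last saved at:   ", props.modified)
```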
Thank you for your time in advance.
May
I'm currently setting up an Akeneo (2.1) instance that needs to communicate with an e-commerce solution. I was wondering what the best practices are when it comes to importing and exporting data. The documentation is somewhat lacking here; it tells you how to set things up, but I'm missing practical use cases.
Here's what I'm thinking of:
I want our customer to be able to upload their images / CSV files using an FTP connection.
Akeneo should ideally only start importing when a mutation in this (FTP) destination folder is detected.
Exporting should only be done once or twice a day, and upon completion the archive should be transferred via (S)FTP to a different location.
I'm currently having trouble figuring out how to implement this flow in Akeneo, because if I look at what comes out of the box, I can only come up with the following:
I can set up an FTP account that ends up in `app/uploads/product/` and allow the customer to upload to that location.
Akeneo does not detect file system changes, so I can only set up a cronjob that tries to import every hour or so. The drawback of this approach is that Akeneo will copy the CSV file(s) every time to `app/archive/import`. If you have big CSV files this can cause a noticeable increase in disk usage.
I can set up a cronjob to export twice a day, but again: Akeneo will create archives on every export, so `app/archive/export` will grow even bigger every day. Please note that my customer has 4GB+ of assets (images, documents, etc.). Does Akeneo clean up the `app/archive` folder every now and then?
Every exported archive ends up in a new folder (with an ever-incrementing job number, e.g. `app/archive/export/csv_product_export/28/`), so I'm wondering how I can detect this new folder and how I can trigger the upload of the archive to the remote (S)FTP server after the export is complete.
I was just wondering how other people who work with Akeneo have handled these challenges. I know I can write my own custom bundle and hook into a ton of events, or write shell scripts that do a lot of the magic for me, but I am wondering what Akeneo itself already offers on this subject.
Any thoughts / ideas / suggestions / experiences on this topic are welcome!
To answer your questions:
Akeneo doesn't need to have the CSV uploaded to the app/uploads/product/ folder. You can define the CSV location in the import profile, so you can use whatever location you want.
To import images, you need to zip them together with the CSV file (to see what the structure of the archive should look like, you can export some products with media on demo.akeneo.com).
Setting up a cronjob seems to be a good idea. If the disk usage is a problem, this cronjob could also clean the folder after the import.
To export twice a day, you can use the export builder to only export products that have been updated since the last export (delta export). That way, you don't use up space for nothing.
Again, the app/archive/export/csv_product_export/28/ path is only for internal use. This is a working directory used by Akeneo during the export (before zipping, for example), and the final file (CSV or ZIP) is moved to the destination defined in the job configuration.
With all that information, here is my recommendation:
Write a simple bash/PHP script to detect changes in a folder and, if there are any, move the file to another location and launch the import (see the sketch after this list).
If you want to handle images, you can add to your script a way to generate the zip file in the correct format.
Then, to export to your e-commerce platform, set up a cronjob to export every hour, exporting only new or updated products to the desired destination.
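As mentioned above, the watcher can stay very simple. A rough sketch of the idea (shown in Python rather than bash/PHP; the paths and the csv_product_import job code are assumptions about your setup, and akeneo:batch:job is the console command Akeneo 2.x uses to launch jobs):

```python
import shutil
import subprocess
import time
from pathlib import Path

WATCH_DIR = Path("/data/ftp/incoming")     # where the customer uploads via FTP (assumed)
WORK_DIR = Path("/data/akeneo/to_import")  # where the import profile points (assumed)
JOB_CODE = "csv_product_import"            # code of the import job profile (assumed)

while True:
    for csv_file in WATCH_DIR.glob("*.csv"):
        # Move the file out of the FTP folder so it is only processed once.
        target = WORK_DIR / csv_file.name
        shutil.move(str(csv_file), target)
        # Launch the Akeneo import job for this profile.
        subprocess.run(
            ["php", "bin/console", "akeneo:batch:job", JOB_CODE],
            cwd="/var/www/akeneo",  # Akeneo installation directory (assumed)
            check=True,
        )
    time.sleep(60)  # poll once a minute
```

In practice you would also want to make sure an upload has finished (for example by checking that the file size is stable) before moving the file.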
Another way would be to use the new REST API, which is well documented here: https://api.akeneo.com/
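For the REST API route, the flow is: get a token, then page through the products and hand them to your e-commerce side. A rough Python sketch (the host, credentials, and client id/secret are placeholders; check api.akeneo.com for the exact parameters of your version):

```python
import requests

AKENEO = "https://akeneo.example.com"  # placeholder host

# 1. Get an OAuth token (client id/secret come from an API connection in Akeneo).
token_resp = requests.post(
    f"{AKENEO}/api/oauth/v1/token",
    auth=("client_id", "client_secret"),  # placeholders
    json={
        "grant_type": "password",
        "username": "api_user",   # placeholder
        "password": "api_pass",   # placeholder
    },
)
headers = {"Authorization": f"Bearer {token_resp.json()['access_token']}"}

# 2. Page through the products and hand them to the e-commerce side.
url = f"{AKENEO}/api/rest/v1/products?limit=100"
while url:
    page = requests.get(url, headers=headers).json()
    for product in page["_embedded"]["items"]:
        print(product["identifier"])  # replace with the push to your e-commerce solution
    url = page["_links"].get("next", {}).get("href")
```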