How to open and edit an XLSM file online

Friends,
I have a macro-enabled Excel file. How and where can I upload my XLSM file and update it? I want my HR people at the branch office to update this XLSM file on a regular basis, and I want to see the updated file at my head office. Is there any way to do that? I don't think Google Docs can handle this.
Thanks in advance.

You may use a database as the place that stores the data. Your remote HR staff fill in the Excel tables, then upload the data to a database (Access, MySQL, or whatever); you can then download it wherever you are. This method requires writing a macro that uses the ADO component; there are plenty of tutorials about it.
Regards,
m
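The answer above suggests a VBA/ADO macro; purely for illustration, here is a rough sketch of the same upload step in Python instead, assuming pandas, openpyxl, SQLAlchemy and a MySQL driver are installed. The file name, sheet name, table name and connection string are all placeholders, not anything from the original question:

```python
# Hypothetical sketch: push rows from the shared XLSM into a central MySQL
# table so the head office can query the latest data at any time.
import pandas as pd
from sqlalchemy import create_engine

XLSM_PATH = "hr_updates.xlsm"                     # workbook the branch maintains
DB_URL = "mysql+pymysql://user:pass@db-host/hr"   # central database (placeholder)

def upload_sheet(sheet_name: str = "Sheet1") -> int:
    # openpyxl can read .xlsm files; macros are ignored, only cell data is read
    df = pd.read_excel(XLSM_PATH, sheet_name=sheet_name, engine="openpyxl")
    engine = create_engine(DB_URL)
    # replace the staging table wholesale with the latest upload
    df.to_sql("hr_updates", engine, if_exists="replace", index=False)
    return len(df)

if __name__ == "__main__":
    print(f"Uploaded {upload_sheet()} rows")
```

The head office can then read the same table from anywhere, instead of passing the XLSM file around.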

Related

Loading a .docx file into ETL/ELT tool?

Could someone please guide me on how to extract a .docx file and load it into a database using an ETL (Extract-Transform-Load) or ELT (Extract-Load-Transform) tool?
Assuming that the .docx file contains mostly unstructured data, isn't an ELT tool what I should go for instead of ETL?
The ETL and ELT tools I have found so far don't support an MS Word component. What other way is there to extract the content of a .docx file and store it in a database?
My requirement is to:
Extract the data inside the .docx file,
Convert it into meaningful data, and
Store it in a data lake so I can perform data analysis and make productive decisions based on the results.
It's just like how e-commerce companies convert customer reviews into meaningful data so they can make decisions to boost their sales. In my case, it's Word files I need to analyze.
I'm asking this because I've searched through so many ETL and ELT tools but couldn't find anything that supports Word files. Maybe it's because I haven't been searching for the right tool, or in the right way?
If somebody knows a way, please guide me through the process. What should I start looking for? A tool, or a way to code the entire thing?
I've been looking for an answer for weeks now but haven't found a helpful one. It's starting to get really frustrating to see all the tools supporting every other source, like social media, MongoDB, or whatever, EXCEPT Word files.
You have to do this in two steps:
Extract the data from the .docx file to txt or XML.
Then use SSIS to import it (Azure Data Factory if you are in the cloud).
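For step 1, a minimal sketch in Python, assuming the python-docx package is installed; the file names are placeholders:

```python
# Step 1 sketch: dump every paragraph of the .docx to a flat text file
# that SSIS (or Azure Data Factory) can then import like any other flat file.
from docx import Document

def docx_to_txt(src: str = "report.docx", dst: str = "report.txt") -> None:
    doc = Document(src)
    with open(dst, "w", encoding="utf-8") as out:
        for para in doc.paragraphs:
            out.write(para.text + "\n")  # one paragraph per line

docx_to_txt()
```

A .docx is just a zip archive around word/document.xml, which is why a small script like this covers the "extract" step that the off-the-shelf tools lack.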

How to Export Moodle Course Format to DSpace?

I need to export courses from Moodle, but as it is a very closed application and the courses are in Moodle's own format, is there any way to extract the contents/metadata from that format to facilitate the migration to DSpace?
I know it is possible to do this by hand, but that would take a lot of time, since DSpace and Moodle use very different and complex databases.
Moodle exports courses with a .mbz extension. Simply rename it to .zip and you can extract the XML files from inside. These files will have all the information you need. You could potentially create a tool that programmatically extracts this information and imports it to DSpace.
Also, Moodle is open source, not a closed application. Source available here: https://github.com/moodle/moodle
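As a hedged sketch of the rename-and-extract step in Python: zipfile ignores the extension, so no rename is actually needed, as long as your Moodle produces zip-based .mbz backups (newer Moodle versions may produce gzip-compressed tar backups instead). The paths are placeholders:

```python
# Unpack a Moodle backup (.mbz, here assumed zip-based) and list the
# XML files inside; these hold the course structure and metadata.
import zipfile
from pathlib import Path

def unpack_mbz(backup: str = "course.mbz", dest: str = "course_export") -> list[str]:
    with zipfile.ZipFile(backup) as z:
        z.extractall(dest)
    return sorted(str(p) for p in Path(dest).rglob("*.xml"))

for xml_file in unpack_mbz():
    print(xml_file)
```

A migration tool would then parse these XML files and map the course items onto DSpace's import format.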

Difference between a twb and twbx

Please provide some information about the difference between a TWB workbook with an extract and a TWBX workbook. Also, I am facing an issue: I have a workbook (TWBX) on Tableau Server which uses a published extract. The extract was refreshed today, but the workbook still shows old data.
TWB - an XML file describing your Tableau workbook; it contains all the selections and layout you've made, but no data. These tend to be very small.
TWBX - a zipped file that contains the TWB as well as the data the workbook uses, in an extract.
Here's some more info from the Tableau website.
http://kb.tableausoftware.com/articles/knowledgebase/sending-packaged-workbook
Try closing & reopening your workbook. If that doesn't refresh the data, check the following:
Make sure that the file path or database connection the Tableau Server points to is exactly the source you wish to refresh from.
Remember the Server may have different drives mounted and different firewall rules. If you are reading from a file like Excel or Access to create your extract, changing the version of the file elsewhere on the file system won't affect the extract on the Tableau Server if that extract points elsewhere (kind of obvious, but often forgotten, especially if a copy of the Excel file is bundled up into the TWBX file).
It is also often a good idea in production to publish a data source and its extract separately from the workbooks that use it, so that they can be updated independently. Look under the Data menu to find the Publish command.
TWBX is intended for sharing. It does not link to the original file source; instead it contains a copy of the data that was obtained when the file was created.
If you need to give your clients a TWBX, you can keep a TWB as a template and then use it to create the TWBX whenever your data source is updated. Your clients will get the TWBX they want and you don't have to do anything manually.
You can even have a batch process for that. Here is a video: https://www.youtube.com/watch?v=Odk2xr6qOoQ
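As a rough illustration of the template idea above, and assuming the usual TWBX layout (a zip archive bundling the TWB with its data files), a batch step to repackage the workbook after a data refresh could look something like this; all paths are placeholders:

```python
# Hedged sketch: rebuild a TWBX from a TWB template plus the refreshed
# data file. A TWBX is a zip archive, so standard zip tooling is enough.
import zipfile

def package_twbx(twb: str = "report.twb",
                 data: str = "Data/sales.xlsx",
                 out: str = "report.twbx") -> None:
    with zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as z:
        z.write(twb)   # workbook layout (XML, contains no data)
        z.write(data)  # the refreshed data the workbook points at

package_twbx()
```

Running this after each data update gives clients a fresh self-contained TWBX without touching the TWB itself.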
As Ryan mentioned, the TWBX file contains its own data extract. Since you have a TWBX file that uses a published data extract as its source, you have basically created an extract of the original extract. In other words, the data is no longer coming from the published extract; it is self-contained in the workbook itself, so refreshing the published extract won't update your workbook.
You can try scheduling a refresh of the workbook itself (after the refresh of the extract, of course). However, that didn't work for me, and I always have to refresh the extract manually from Tableau Desktop.

How do I edit files in place that were uploaded to Moodle?

I would like a better workflow for debugging uploaded SCOs. As things are, I must edit a file in the activity, repackage, upload, and test. Often, I just need to change a single line of code. It would be VERY nice to be able to edit that file, that line of code, on the server. So far, all I've found is that Moodle manages the files, so it seems impractical to locate and decipher the renamed files after upload.
Is there a way to configure Moodle so that it doesn't rename and relocate files in SCOs upon extraction? Actually, I'm open to any suggestions on the best, fastest workflow for debugging SCOs.
Problem background
Since Moodle 2.0, files are no longer stored on server in the conventional /this/is/the/path/to/my.file way. Instead, files are rehashed and stored in Repositories (i.e. spread all over the moodledata folder as a collection of seemingly random data). This increases security and cross-OS compatibility but complicates stuff for people who would like to simply upload a SCORM zip package via FTP. Here's more information on file handling in Moodle 2.0
Path to the solution
Let's locate the file you want to update, then update it.
Run phpMyAdmin, go to the mdl_files table, and find your file by name in the filename field (let's say it's portrait.jpg).
Look at the contenthash field; it'll look something like abcde1234567890. This means your file is stored in the moodledata/filedir/ab/cd/ folder under the name abcde1234567890.
Rename the updated portrait.jpg to abcde1234567890, then upload and overwrite.
Go back to phpMyAdmin and update the filesize field in the record for portrait.jpg with the size of the updated file.
Obviously, this process can be automated. You'll have to write a script that lets you upload a file, then searches for it in mdl_files, saves it to the correct folder, and updates all fields accordingly.
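A hedged automation sketch of the manual steps above, in Python. It assumes direct access to the Moodle database and the moodledata folder, and that the pymysql package is installed; the host, credentials, and paths below are placeholders, and the filename is assumed to be unique in mdl_files:

```python
# Find the mdl_files record for the uploaded file, overwrite the hashed
# file on disk, and fix the filesize field, mirroring the steps above.
import os
import shutil
import pymysql

MOODLEDATA = "/var/moodledata"  # placeholder path to your moodledata folder

def replace_moodle_file(filename: str, new_path: str) -> None:
    conn = pymysql.connect(host="localhost", user="moodle",
                           password="secret", database="moodle")
    cur = conn.cursor()
    cur.execute("SELECT contenthash FROM mdl_files WHERE filename = %s",
                (filename,))
    (contenthash,) = cur.fetchone()
    # files live in filedir/<first 2 hash chars>/<next 2>/<full hash>
    dest = os.path.join(MOODLEDATA, "filedir",
                        contenthash[:2], contenthash[2:4], contenthash)
    shutil.copyfile(new_path, dest)
    cur.execute("UPDATE mdl_files SET filesize = %s WHERE filename = %s",
                (os.path.getsize(new_path), filename))
    conn.commit()
    conn.close()

replace_moodle_file("portrait.jpg", "/tmp/portrait_updated.jpg")
```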
Alternative idea
Enable the external package type (and also enable 'Update on every launch'). Go to Site administration / Plugins / Activities / SCORM and check the box down below. Now you'll be able to launch SCORM packages directly from another server, so Moodle won't mess with them. Of course, you may run into other (probably cross-domain-related) problems.
Sergey's answer is very good, with one caveat:
In his example with the contenthash of abcde1234567890, the file is stored in the moodledata/filedir/ab/cd/ folder under the name abcde1234567890. Moodle uses the full contenthash to name the file.

Choosing the right tool

I have the following need:
1) Users will upload .xls or .csv files into an "uploads" folder.
2) The "uploads" folder has to be constantly monitored, and with each new file added to it, a job has to be started.
3) The job will process the data from the .xls or .csv file so it matches the DB table structure, and write the data into the DB table.
This has to be an automated process, and I'm looking for an all-in-one tool.
You didn't say which operating system you are on, and you didn't say whether the users upload the files to a different server or not. If the upload goes through a web application (using an HTTP POST request), that is also a different situation.
And I'm not sure that your wish scales well with many users.
You should take a look at Pentaho Data Integration, a.k.a. Kettle: http://sourceforge.net/projects/pentaho/
With Kettle you can design a job that polls the upload directory and, once a file is found, performs all the needed transformations and writes the data into the desired database table.
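Kettle aside, here is a bare-bones Python sketch of the same polling idea, just to make the flow concrete. It assumes pandas and SQLAlchemy are installed (plus openpyxl/xlrd for the Excel formats); the folder, table name, and DB URL are placeholders:

```python
# Poll the uploads folder and append each new .csv/.xls/.xlsx file to a
# staging table. Everything here (paths, table, DB URL) is a placeholder.
import time
from pathlib import Path

import pandas as pd
from sqlalchemy import create_engine

UPLOADS = Path("uploads")
engine = create_engine("sqlite:///staging.db")  # stand-in for your real DB
seen: set[Path] = set()

while True:
    for f in UPLOADS.glob("*"):
        if f.suffix.lower() not in (".csv", ".xls", ".xlsx") or f in seen:
            continue
        df = pd.read_csv(f) if f.suffix.lower() == ".csv" else pd.read_excel(f)
        df.to_sql("staging", engine, if_exists="append", index=False)
        seen.add(f)
    time.sleep(10)  # poll interval; a file-watcher library would also work
```

A real deployment would add error handling and move processed files out of the folder, which is exactly the kind of plumbing Kettle gives you out of the box.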