SugarCRM - Mass update Contacts in a Target List

I want to insert today's date into a field on all the Contacts that are in a Target List.
What would be the best way? SQL somehow/somewhere, an existing plugin, other ideas?
I often see BeanFactory used when I google Sugar customisations, but I have no clue where to insert this code in the Sugar files to accomplish what I want.
Using SugarCRM CE 6.5.8, and Kinamu Reports module to generate the Target List.
Rgds
Petter

Export the Target List so that you get a CSV file
Ensure the CSV contains the Contact records' ID and name fields
Update the CSV to include a column for whatever field you're updating, and add the value you want all records to have
Go to Contacts -> Import Contacts and select Insert & Update to update the records with the new values.
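Since the question mentions BeanFactory: if you would rather do this in code than via a CSV round-trip, a minimal sketch of a throwaway script run from the Sugar root might look like the following. This is untested against CE 6.5.8, and both the target list ID and the custom field name date_stamp_c are placeholders you would replace with your own values.

```php
<?php
// Hypothetical one-off script (e.g. stamp_target_list.php) dropped in the Sugar
// root and run from the command line: php -f stamp_target_list.php
if (!defined('sugarEntry')) {
    define('sugarEntry', true);
}
require_once 'include/entryPoint.php';

global $current_user;
$current_user = BeanFactory::getBean('Users', '1'); // run as the admin user

// The target list whose contacts should be stamped (placeholder ID)
$targetList = BeanFactory::getBean('ProspectLists', 'YOUR-TARGET-LIST-ID');

// Walk the target list's related contacts and write today's date into the field
if ($targetList->load_relationship('contacts')) {
    foreach ($targetList->contacts->getBeans() as $contact) {
        $contact->date_stamp_c = date('Y-m-d'); // placeholder custom date field
        $contact->save();
    }
}
```

For a one-off change the CSV import above is the simpler option; a script like this is mainly useful if you need to repeat the update.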

You can add a date field to the mass update panel.
Only fields with the types of relate, parent, enum, contact_id, assigned_user_name, account_id, account_name and date can be added to the mass update panel. Of these you can only create types date and enum with the studio editor (I think).
You would just add 'massupdate' => true to the field description in vardefs.php.
If it's a custom field, then find that field's file in the custom/Extension folder and add 'massupdate' => true there (see the sketch below), then run a Quick Repair and Rebuild!
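For illustration, a minimal Extension vardefs file for a hypothetical custom date field date_stamp_c on Contacts might look like this (the file name and field name are assumptions; run Quick Repair and Rebuild afterwards so the change is picked up):

```php
<?php
// custom/Extension/modules/Contacts/Ext/Vardefs/enable_massupdate.php
// (hypothetical file name) -- flags the custom date field as mass-updatable
$dictionary['Contact']['fields']['date_stamp_c']['massupdate'] = true;
```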

Related

How to automatically populate Custom Item Field with data from a saved report in NetSuite?

Software Platform - NetSuite
Goal - Run a weekly sales report. Use the data from that report to populate a custom item field on the item record (Kit item) in NetSuite.
Can this be done using a workflow, or something else?
Before you try to write a Script or create a Workflow, I recommend you investigate NetSuite's ability to populate Custom Fields with Search Results.
Check out the Help page titled Creating Custom Fields with Values Derived from Summary Search Results for the details.
The basic process will involve creating a Saved Search that generates the data you need, then using the Validation and Defaulting tab on your Custom Item field to select the Search you created.
This should be done using SuiteScript. In your script, run a search, store the data as JSON, and submit the data to the item record. You could submit the data into a single text field on the item record, but it would be better to create a Custom Record Type "Sales Report" which has a list/record field sourced from the item and with the "Record Is Parent" checkbox checked. This will display your custom record on the item in a sublist. Using a Custom Record Type will allow you to store the data over multiple iterations. If you use a field on the item record it will be replaced each time you run the script.

FileMaker Pro 14 History tables

With a few solutions I've worked with, I've created temp tables or history tables. Normally I script it to take a handful of fields needed from a main table and copy them over to the other table by setting a variable and then setting each field in the new table's new record from that variable.
I have a situation now where I'm building a history table that needs to copy the current record as-is: a snapshot where all fields from that instance of the record are copied to the history table.
Rather than setting a variable and then setting the field from the variable, I'd like some input on a quicker way to get this done at the record level, without typing it out field by field. Also, if fields are added to both tables, I have to make sure my script gets updated.
I'll keep hunting around... appreciate any help.
-Rich
Do you have a sample of copying a record from one table to another, including all fields and setting some fields?
As I suggested in comments, use the Import Records[] script step, and select the same file as the source. If you choose Arrange by: [ matching names ] in the Import Field Mapping dialog, it will automatically map all source fields to their similarly named counterparts.
Note that you must establish a found set in the source table before importing.
For "setting some fields", you can define auto-enter options and activate them during the import, or run Replace Field Contents[] immediately after the import.

MS Word, Import Table with Query Condition Based on Merge Field

I'm creating a compliance mailing for my organization; the mailing will include merge fields that identify the office location, physician, and SiteId. The mailing will also include a table of information that is dependent upon the particular SiteId.
I'd like to use the import table function of MS Word and set up a query that references a merge field (SiteId) so that the inserted tables populate the appropriate data for the particular site. I'm unable to do this.
How can I set up this document so that I can import only records from my source (an ms access query) that match the SiteId merge field?
Word's mail merge does not support one-to-many relationships. There are ways to coerce it, but only one of them can yield a table as a result and over the years it has become less and less reliable as Microsoft has not regarded it as important enough to maintain...
What you need to do is set up a query that provides ONLY the information you want displayed in the table, plus the key (SiteId). It's best to sort it so that all the SiteId entries list together, and are in the order the data will come through in the mail merge data source.
On the Insert tab go to Text/Quick Parts/Insert Field and select the Database field from the list in the dialog box. Click "Insert Database" and follow the instructions in the dialog box to link in the data. Be sure to set the Query Options to filter on the first SiteId from the data source. When you "Insert Data" make sure to choose the option to "Insert as a field".
This inserts a DATABASE field in the document which you can see by toggling field codes (Alt+F9). The field code can be edited and what you need to do is substitute the literal SiteId value you entered for the query with its corresponding MergeField.
When you execute the merge to a new document that should generate a table for each data record corresponding to the SiteId for the record. But, as I said, Microsoft hasn't done a great job of maintaining this, so it may require quite a bit of tweaking and experimenting.
If the results are not satisfactory then you should give up the idea of mail merge and use automation code to generate and populate the documents.
You can find more (albeit somewhat out-dated) information on this topic at http://homepage.swissonline.ch/cindymeister/mergfaq1.htm

Kentico Import Toolkit 8.1

I am currently using the Kentico Import Toolkit to create documents in the tree.
At this point, I have imported around 100 documents using the toolkit, and they are all located in the correct place in the tree. The issue is that since importing these documents, my spreadsheet has been updated, so extra fields and data were added. How do I go about importing this extra data into the already existing documents? Bear in mind that I don't want other fields or data to be affected by this, as some of the documents have since been updated with other content by the content editors using CMS Desk, content which isn't available in the spreadsheet.
Import toolkit is not the right tool to achieve this task. Even if you select "Import new and overwrite existing pages" it'll overwrite most of your columns. Actually it only preserves system and id columns from the existing documents - all other columns get overwritten.
Either you can write a piece of custom code or you can try following:
Open SSMS and navigate to the coupled table of your page type (something like CONTENT_MyDocType). This is where your custom columns are stored.
Right click -> Edit top 200 rows
Click "Show SQL Pane"
Adjust the columns, ORDER BY and WHERE clause to match your Excel file, and re-run the query
Select the desired rows in your Excel file and copy them to the clipboard
Paste the data into SSMS
rocky is right, the Import Toolkit is meant for importing complete objects, not partial/continuous updates.
You could map the fields that you know are not changed in the spreadsheet to a SQL query selecting the value from the target database.
To achieve this, just insert #<target> at the beginning of the SQL select statement you will be mapping the field to.
It will be rather laborious though and it also requires certain knowledge about the nature of the spreadsheet changes.

Copy Field contents from one Table to another table - FileMaker

I am new to FileMaker Pro. I am working with FileMaker Pro 13.
My database contains 3 tables:
category (fields = _pkCatID & CatName)
subcategory (fields = _pkSubcatID , _fkCatID & SubcatName)
books (many fields including _fkSubcatID)
I have no problems in conditional value lists, so making two popup menus in books layout for categories and subcategories was successful.
But I want to put both categories and subcategories in one menu/sub-menu using 2empowerfm Menu Popper plugin.
I created a new field in subcategory table to store a calculation to be used in the value list of the plugin.
The calculation is = CatName & ">" & SubcatName & ";" & _pkSubcatID .
So the value returned when making a choice on the Books layout will be the _pkSubcatID.
The problem is that CatName is not in the Subcategory table, and if I choose it from the related Category table, I can't make the calculation stored, which is a requirement for using a field in a value list.
So I need to copy the CatName field from the Category table into a newly created field in the Subcategory table, and I don't know how to do that.
You just need to create a lookup field in your subcat table pointed to the category name in the category table.
Create a field in the sub-cat table called "Category"
Click on Options
Auto-Enter Tab at the bottom, check "Looked-up value"
Select the correct starting (subcat) and related (Cat) tables and select the name field for the Category.
That is all.
To populate this for existing records, show all records on a subcat layout, click into the _fkCatID field, and choose Records -> Relookup Field Contents from the menu.
Michael Wallace's answer is correct and that solution should work.
I'd add, however, that if the table is likely to become large (and it could do if you're cataloguing books for a library), then I'd suggest you run some tests on fake data at a large scale to see if this menu technique holds up (especially if you are serving over a network). Running two global search fields with an ExecuteSQL lookup for subcategory within category would be more efficient in a big data set - this technique is well described here and in other places:
http://forums.filemaker.com/posts/c4ed6f9923