PowerShell script to update SharePoint Online list items does not save all column values

I'm processing a CSV file to create SharePoint Online list items and save metadata in the fields of the library's custom content type. At first I thought it was an issue with column type, but I'm failing to save even simple text field values. What's puzzling is that I can save some fields but not others, even though I'm setting them all the same way; somehow the list item is not preserving all the values I send in for the list item update.
I create a new item with the following code:
$newFile = $targetfolderObj.Files.Add($FileCreationInfo)
$spContext.Load($newFile)
$spContext.ExecuteQuery()
Then I get the list item and start setting the field values
$newFileListItem = $newFile.ListItemAllFields
$newFileListItem.Properties["ColumnNameA"] = $_.CustomValue1 # saves
$newFileListItem.Properties["ColumnNameB"] = $_.CustomValue2 # not saving !!!
$newFileListItem.Properties["ColumnNameC"] = $_.CustomValue3 # saves
After setting the properties with values, I call the update and execute functions.
$newFileListItem.Update()
$spContext.ExecuteQuery()
$_.CustomValue1, $_.CustomValue2, and $_.CustomValue3 all print as plain strings such as "Hello"; there are no special characters or symbols here. Even hardcoding the field values to simple strings fails to update field 2 but works on fields 1 and 3.
I'm giving this example with my code to show the failure isn't sequential: field 2 data is NOT saved, but field 1 and 3 data is saved.
It always fails on the same columns. I've verified that the columns exist, I'm using the internal field names, the string length is well within range, and there is no other validation applied to these columns. What could I do to troubleshoot this?

After a lot of debugging, I'm seeing SharePoint funnery at play...
While the content type is there, and the content type's internal field name is "ColumnNameA", something must have gone sideways in the library instance, because the column's internal name is now "ColumnNameA0" - yes, a zero added at the end of the column name. Every column that was not updating had a zero at the end of its internal column name.
Solution 1: add the zero
Solution 2: recreate the library and re-associate it with the site content type
I went with Solution 1 to test that my updates worked, then rebuilt the library.
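If you run into something similar, a quick way to confirm what the library actually calls its columns is to dump the internal field names with CSOM. A minimal sketch, assuming $spContext is the authenticated ClientContext from above and "Documents" is the library title (adjust to yours):
# list every column's display name alongside its internal name
$list = $spContext.Web.Lists.GetByTitle("Documents")
$spContext.Load($list.Fields)
$spContext.ExecuteQuery()
$list.Fields | ForEach-Object { "{0} -> {1}" -f $_.Title, $_.InternalName }
If the zero shows up in the output, you can either use that internal name directly in the update (Solution 1) or rebuild the library (Solution 2).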

Related

BIRT - Remove empty row from parameter

I have two cascading parameters that need to be set before you generate the report.
The first one is an ID I am selecting, and the second one can be a list of values (an array) that are linked to the first ID.
I handle the array with this line in the beforeOpen script:
this.queryText=this.queryText.replace("999",params["ID_BOBJECT"].value.join(","));
and my query looks like this:
SELECT V_MOUVEMENT_1.*
FROM V_MOUVEMENT_1
WHERE V_MOUVEMENT_1.ID_BOBJECT IN (999)
My problem is that, in the end, the list for my second parameter has a blank value first, and when I select all the values (including the empty one) I get an error that the value is not specified:
Example
Well, the only solution that worked is this one:
Solution
Too bad it works only from Eclipse...
If I deploy my rptdesign to another application, it doesn't work anymore.
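For what it's worth, another thing worth trying is skipping blank entries in beforeOpen before building the IN clause, so a selected empty row never reaches the SQL. A rough sketch, assuming params["ID_BOBJECT"].value behaves like a JavaScript array (as the join above suggests); since it runs in the report itself, it shouldn't be tied to the Eclipse viewer:
// beforeOpen: drop blank entries before substituting the placeholder
var vals = params["ID_BOBJECT"].value;
var ids = [];
for (var i = 0; i < vals.length; i++) {
    if (vals[i] != null && vals[i] != "") {
        ids.push(vals[i]);
    }
}
if (ids.length > 0) {
    this.queryText = this.queryText.replace("999", ids.join(","));
}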

Filemaker Pro 14 History tables

With a few solutions I've worked on, I've created temp tables or history tables. Normally I script it to take a handful of fields from a main table and copy them over to the other table by setting a variable, then setting a field to the variable for each field in the new table / new record.
I have a situation now where I'm building a history table that needs to copy the current record as is: a snapshot where all fields from that instance of the record are copied to the history table.
Rather than setting a variable and then setting a field to the variable, I'd like some input on a quicker way to get this done, where I can work at the record level and not type it out field by field. Also, if fields are added to both tables, I have to make sure my script gets updated.
I'll keep hunting around... appreciate any help.
-Rich
Do you have a sample of copying a record from one table to another, including all fields and setting some fields?
As I suggested in comments, use the Import Records[] script step, and select the same file as the source. If you choose Arrange by: [ matching names ] in the Import Field Mapping dialog, it will automatically map all source fields to their similarly named counterparts.
Note that you must establish a found set in the source table before importing.
For "setting some fields", you can define auto-enter options and activate them during the import, or run Replace Field Contents[] immediately after the import.
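A rough outline of what such a script could look like (this is a sketch: the layout names, the source file reference, and the History::ArchivedOn field are placeholders, and the field mapping is set to "matching names" in the Import Records options):
Go to Layout [ "Main" ]
Show All Records
Go to Layout [ "History" ]
Import Records [ With dialog: Off ; "<current file>" ]
Replace Field Contents [ History::ArchivedOn ; Get ( CurrentDate ) ]
The first two steps establish the found set in the source table, and the final step is the "setting some fields" part.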

SSRS Multiple Dataset Errors

I have a simple SSRS report that displays data from one table. What I want to do is have a distinct list from that table displayed in a drop down list for the user to select. If I only use one dataset I can get it to display, but it displays values from the column multiple times.
Example
Bob
Bob
Bob
Cathy
Cathy
If I create a second dataset that will list distinct values I get the following error message:
An Error occurred during local report processing. The definition of the report is invalid. The Variable expression for the report 'body' refers directly to the field without specifying a dataset aggregate. When the report contains multiple datasets, field references outside of a data region must be contained within aggregate functions which specify a dataset scope.
I'm trying to follow the example I found here:
http://msdn.microsoft.com/en-us/library/aa337400.aspx
The second dataset is only for the parameter list. I don't understand why it's causing problems with the actual report.
It's impossible to tell exactly where without the report definition, but there is an item on the report that references a field or Dataset. It was implicitly using the only Dataset present in the report, but now that more than one has been added it doesn't know which Dataset to use.
For example, when you create a table you can set a Dataset associated with it. If this is not set and there is only one Dataset, it doesn't matter as it will take the only one available. Once you add a new Dataset, the table doesn't know which one to use and you'll get the error you're seeing.
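In the underlying RDL, that association is just the table's DataSetName element (the table is a Tablix in newer RDL); a minimal sketch with placeholder names:
<Tablix Name="table1">
  <DataSetName>DatasetToUse</DataSetName>
  ...
</Tablix>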
Another way to get the error is referencing a field in an expression, e.g. in a TextBox somewhere in the report, without specifying the scope; just set the scope to a particular Dataset. For example, if you have:
=Count(Fields!name.Value)
change this to:
=Count(Fields!name.Value, "DatasetToUse")
If you've only got one Dataset the first expression will run fine by using the only one available, but once you add another it won't know which to use and it will error.
In the query (SQL) you should add a DISTINCT clause at the beginning; that way you will get only one record per value. Check out http://www.w3schools.com/sql/sql_distinct.asp
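For example, for the dataset that feeds the parameter (the table and column names here are placeholders):
SELECT DISTINCT Name
FROM YourTable
ORDER BY Name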
Double click the Dataset which contains that field.
Go to fields on the left and delete that field.
Add new field by clicking Add -> Query Field.
Just type in the name of the new field under field name and field source.
It happens when you have added a field by selecting "Calculated Field" instead of "Query Field" in the Dataset's Fields tab.
Cheers,
Ahmed Latif

How can I (partially) automate the transfer of a FileMaker database structure and field contents to a second database?

I'm trying to copy some field values to a duplicate database. One record at a time. This is used for history and so I can delete some records in the original database to keep it fast.
I don't want to manually save the values in a variable because there are hundreds of fields. So I want to go to the first field, save the field name and value and then go over to the other database and save the data. Then run a 'Go to Next Field' and loop through all the fields.
This works perfectly, but here is the problem: When a field is a calculation you cannot tab into it and therefore 'Go to Next Field' doesn't work. It skips it.
I thought of doing a 'Go to Object', but then I would need to name all the objects, and I can't find a way to name objects from a script.
Can anyone out there think of a solution?
Thanks!
This is one of those problems where I always found it easier to do an export/import.
Export all the data you want from the one database, and then import it into the other database. All you need to do is:
Manually specify which fields you want to copy
Map the data from the export to the right fields in the new database/table
You can even write a script to do these things for you.
There are several ways to achieve this.
To make a "history file", I have found there are several cases out there, so let's take a look.
CASE ONE
Single file: I just want to "keep" a very large file with historical data, because I need to erase all data in my main file.
In this case, you should create a "clone" table (in the same file or in another file, it's the same). Then change any calculation field to the type of the calculation result (number, text, date, and so on). Remove any auto-entered value or calculation from any field (auto number, auto creation date, etc.). You will have a "plain table" with no calculations or auto-entered data.
Then add a field to control duplicate data. If you have, let's say, a unique invoice number for each record, you can use it for this. But if you do not have a unique field that identifies the record, then you have to create one...
To create such a field, I recommend adding a new field on the clone table, set up as an auto-entered calculation that combines fields into something unique... something like this: invoiceNumber & "-" & lineNumber & "-" & date.
On the clone table, make sure validation is set to "always", empty values are not allowed, and the value must be unique.
Once you set up the clone table, you can import your records, making sure that the auto-enter option is on. You can do it as many times as you like; new records will be added and no duplicates.
If you want, you can make a script that moves all the current records to the historical table before deleting them.
NOTE:
This technique works fine when the data you are trying to keep does not change over time. That is, once the record is created it has no changes.
CASE TWO
A historical table must be created but some fields are updated.
In the beginning I thought historical data never changes. In some cases I found this is not true, like when I want to track historical invoices but at the same time keep track of whether they are paid or not...
In this case you may use the same technique above, but instead of importing data... you must update data based on the "unique" fields that identify the record.
Hope this technique helps
FileMaker's FieldNames() function, along with GetField(), can give you a list of field names and then their values.
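A rough sketch of how that could look in a script (the layout name "MainLayout" is a placeholder; the loop builds up name=value pairs in a variable):
Set Variable [ $fields ; Value: FieldNames ( Get ( FileName ) ; "MainLayout" ) ]
Set Variable [ $i ; Value: 1 ]
Loop
  Exit Loop If [ $i > ValueCount ( $fields ) ]
  Set Variable [ $name ; Value: GetValue ( $fields ; $i ) ]
  Set Variable [ $pairs ; Value: $pairs & $name & "=" & GetField ( $name ) & ¶ ]
  Set Variable [ $i ; Value: $i + 1 ]
End Loop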

Is it possible to store hidden metadata information that is tied to a specific Table or Cell within a Word document?

I am trying to store metadata (basically a unique id) along with each cell of a table in a Word document. Currently, for the add-in I'm developing, I am querying the database, and building a table inside the Word document using the data that is retrieved.
I want to be able to save any of the user's edits to the document and persist them back to the database. My initial thought was to store a unique id along with each cell in the table so that I would be able to tell which records to update. I would also like to store some sort of "isChanged" flag within each cell so that I could tell which cells were changed. I found that I could add the needed information into the "ID" property of the cell; however, that information was not retained if the user saved the document, closed it, and re-opened it. I then tried storing the data by adding a field to the "Fields" collection, but that did not work and threw a runtime error. Here is the code that I tried:
object t1 = Word.WdFieldType.wdFieldEmpty;
object val = "myValue: " + counter;
object preserveFormatting = true;
tbl.Cell(i, j).Range.Fields.Add(tbl.Cell(i, j).Range, ref t1, ref val, ref preserveFormatting);
This compiles fine, but throws this runtime error "This command is not available".
So, is this possible at all? Or am I headed in the wrong direction?
Thanks in advance.
Wound up using "ContentControls" to store the information I needed. I used the "Title" field to store the unique id, and the "tag" field to track whether the field had been changed or not. See this link for more info: http://blogs.technet.com/gray_knowlton/archive/2010/01/15/associating-data-with-content-controls.aspx
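A rough sketch of that approach (not verified against a specific PIA version; "tbl", "i", "j", and "counter" come from the code above, and the control type and event wiring are assumptions):
Word.Range cellRange = tbl.Cell(i, j).Range;
cellRange.End = cellRange.End - 1;  // keep the control inside the cell (drop the end-of-cell mark)
Word.ContentControl cc = cellRange.ContentControls.Add(
    Word.WdContentControlType.wdContentControlText, cellRange);
cc.Title = "myValue: " + counter;   // unique id, persisted with the saved document
cc.Tag = "unchanged";               // flip to "changed" from a ContentControlOnExit handler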
Since a "Word 2007 Document" is XML, you can add a namespace to the document and then adorn the elements with attributes from your namespace. Word should ignore your namespace when loading and saving. Moreover, you can add new elements to store any information (metadata) needed.
With that said, I have not used this technique with Word, but I have done it successfully using Excel 2003.
First thing to try is to create a bare "Word 2007 Document". In your case, add a simple two-by-two table. Open it with a text or XML editor, add your namespace, adorn an attribute, and add an element. Open it with Word, make a change, then save it. Open it with the editor again and make sure your namespace attribute and element have not been changed.