Weird "data has been changed" issue - postgresql

I'm experiencing a very weird issue with "data has been changed" errors.
I use MS Access as a frontend and PostgreSQL as the backend. The backend used to be in MS Access and there were no issues; then it was moved to SQL Server and there were no issues there either. The problem started when I moved to PostgreSQL.
I have a table called Orders and a table called Job. Each order has multiple jobs. I have two forms: one parent form for the Order and one subform for the Jobs (a continuous form). I put the subform on a separate tab; the first tab contains general order information and the second tab has the Job information. Job is connected to Orders using a foreign key called OrderID; the Id of Orders is equal to OrderID in Job.
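In SQL terms, the relationship looks roughly like this (a sketch; the table and column names are the real ones from above, but the column types are illustrative):

    CREATE TABLE "Orders" (
        "Id" serial PRIMARY KEY
        -- plus customer name, dates, etc.
    );

    CREATE TABLE "Job" (
        "Id" serial PRIMARY KEY,
        "OrderID" integer NOT NULL REFERENCES "Orders"("Id")
        -- plus job details
    );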
Here is my problem:
I enter some information in the first tab (customer name, dates, etc.), then move to the second tab, do nothing there, go back to the first one, and change a date. I get the "The data has been changed" error.
I'm confused as to why this is happening. Now, why do I call this weird?
First, if I put the subform on the first tab, I can change all fields of Orders just fine. It's only if I put it on the second tab, add some info, change tabs, then go back and change an already existing value that I get the error.
Second, if I make the subform on the second tab unbound (so no Id-OrderID connection), I get the SAME error.
Third, the "usual" id for "The data has been changed" error is Runtime Error 440. But what I get is Runtime Error: "-2147352567 (80020009)". Searching online for this error didn't help because it can mean a lot of different things, including "The value you entered isn't valid for this field" like here:
Access Run time error - '-2147352567 (80020009)': subform
or many different results for code 80020009, but none for "the data has been changed".
MS Access 2016, PostgreSQL 12.4.1.

I'm guessing you are using ODBC to connect Access to PostgreSQL. If so, do you have timestamp fields in the data you're working with? I have seen the above happen because a Postgres timestamp can have higher precision than Access supports. This means that when Access goes to UPDATE, it uses a truncated version of the timestamp, can't find the record, and you get the error. For this and other possible causes, see the "Microsoft Applications" section of the psqlODBC FAQ:
https://odbc.postgresql.org/faq.html#6.4
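If timestamp precision turns out to be the cause, one possible workaround is to reduce the precision on the Postgres side so the value round-trips exactly. A sketch, assuming a hypothetical OrderDate column (your question doesn't name the date fields):

    -- timestamp(0) stores whole seconds only, which Access can reproduce
    -- exactly in the WHERE clause of the UPDATE it issues.
    ALTER TABLE "Orders"
        ALTER COLUMN "OrderDate" TYPE timestamp(0);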

Power BI Desktop - REST API refresh times out

I'm connecting to a REST API to bring several tables into a Power BI file. I can connect to the API and retrieve the data without any issues using 'Get Data > Other > Web' from the main toolbar, and then entering a URL in the format:
https://api01.naturalhr.net/2.0/timeoff/key/(security key here)/format/xml
The data usually comes back quite quickly - within about 10-20 seconds.
My issue is that when I try to refresh the same data, it usually times out after five-ish minutes. To refresh, I go to 'Transform Data' (I think this was 'Edit Queries' in earlier versions) > select the query I'm interested in (in this case 'timeoff') > select the 'Refresh Preview' button on the main menu.
The source in the formula bar in the Power Query editor is again just:
= Xml.Tables(Web.Contents("https://api01.naturalhr.net/2.0/timeoff/key/(security key here)/format/xml"))
So I'm just trying to refresh the same URL with which I'd retrieved the data without any issues, but for some reason it is at best taking much longer, and more commonly just timing out altogether.
Note that I do apply some transformations to the original data, but even when I remove all of these I still see the time-out.
Can anyone explain why I can get, but not refresh, the same data? Many thanks.
EDIT:
To add some further information: I've used the newish Power BI diagnostic tools to try to troubleshoot this. What I've noticed is that while the Resource column displays the original URL, the Data Source Query column appends the text 'HTTP/1.1' to the original URL. Please see the screenshot below. If I try to establish a new connection with the added text, the query times out. Can anyone tell me why the extra text is added, why this prevents the data being returned, and how I can work around this? Thanks.
[Screenshot: Power BI Diagnostics Output]
Try it this way in a blank query:
let
    GetData =
        let
            source = Web.Contents("https://api01.naturalhr.net/2.0/timeoff/key/(security key here)/format/xml"),
            xml = Xml.Document(source)
        in
            xml
in
    GetData
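If the slow refresh persists, one variation worth trying (an assumption on my part, not something the question confirms will help) is to split the URL so that only the host is passed as the base and the rest goes into the RelativePath option of Web.Contents; Power BI can then evaluate the base URL statically, which sometimes avoids refresh problems:

    let
        // Same endpoint as above, split into a static base URL plus a
        // relative path; "(security key here)" is the original placeholder.
        source = Web.Contents(
            "https://api01.naturalhr.net",
            [RelativePath = "2.0/timeoff/key/(security key here)/format/xml"]
        ),
        xml = Xml.Document(source)
    in
        xml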
Use Fiddler, as @Rick Grimes said, to see whether your request is being sent normally.

Importing data into Address entity in CRM 13 failing

I have some Account info that contains multiple addresses that I need to enter into CRM 13 for a client. I was instructed by my PM to create a separate CSV doc with the duplicate accounts and their info on them and import it into the Address entity; this should tie those records to their respective account records. The problem I am having, though, is that every time I have tried the import, the process fails. When I look at the log, I have two references that the data failed to import because of a parent id issue. I get both this error: "The parent is not valid"; and this one under the last column of the log: "parent id not set for address type 1701".
I have no idea what parent id I am supposed to map to, because there aren't any to map to in the import field mapping setup. I have tried creating a lookup field on the Address entity that calls back to the Account form, but that is not allowed by the system. I have relabeled ALL the fields I have to map in the CSV doc to match the proper entity names, and it fails. I have even tried to map it to the one primary key field name in the drop-down, and it fails. When I look at the fields individually, there is only one lookup field that goes back to the Account data: the Parent field. It's the ONLY one I can think I could use, but when you go into the form editor it doesn't exist. This seems to be the only way I can do this like my PM wanted.
I am at a loss. I have scoured the internet, and I can't find an answer to how to resolve this. If anyone knows how I can import this data so that it ties to the Account entity, I would greatly appreciate it.
It turns out that CRM 15 has some issues with showing fields in Firefox at the moment, but not with IE. It was the last thing I thought to look at until a coworker brought it up. Problem solved.

InfoPath Form Reset Main Data Connection

I'm building an InfoPath form connected to a SharePoint List.
After the user has filled out a couple of fields (which can uniquely identify a record), I re-query the main data connection to see if the item already exists. This works fine and loads the remaining details of the existing record into the form.
However, if the record does not exist, all the fields on the form become disabled... how can I reset the main data connection back to its initial state to allow a new item to be submitted?
You will need to set a conditional rule. For example, on the last field that triggers your query, add rules (you may need multiple, depending on how many fields) saying: if the field is (whatever), then do (whatever). The disable behavior is set in the formatting section of your rule.

SSRS: Dropdown is not populated in filter in Report Builder

Whenever I try to apply a filter to an attribute which has ValueSelection = Dropdown, the dropdown is not populated and the error message "The requested list could not be retrieved because the query is not valid or a connection could not be made to the data source" is shown instead.
If I set ValueSelection = List, I get a different error message:
An attempt has been made to use a semantic query extension associated with the data extension 'SQL' that is not registered for this report server.
(Microsoft.ReportingServices.SemanticQueryEngine)
This happens within the BIDS environment and was observed in both SQL 2005 and SQL 2008.
I've already studied articles discussing similar problems, but none of them applied to my case. The user account in the data source has all the necessary rights, and data can be retrieved without any problem (for example, if I try "Explore data" in the data source view). SQL Profiler shows that no query is sent to SQL Server when there is an attempt to populate the dropdown. So nothing is wrong with the query; it is simply never executed.
Your connection is not working. Test your connection by trying a simple table and query output.
This will let you verify the connection before trying anything advanced.
I got this problem, and in my case it was caused by a wrong connection string in the Data Source: instead of just having a SQL Server name like "SOMESQLSERVER_MACHINE", I had for some reason "SOMESQLSERVER_MACHINE.our.corp.domain". It should have been the same, but then I realized that the domain was wrong; after removing it, everything worked like a charm again. That said, it's always a good idea to start with detailed checks on your basic settings.
Otherwise, this could be a problem with permissions on the folders in Report Manager.
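For reference, a minimal sketch of what the data source connection string should look like once the server name is corrected (the database name here is a placeholder):

    Data Source=SOMESQLSERVER_MACHINE;Initial Catalog=YourReportDatabase;Integrated Security=True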

How to get a list of aggregates using JOliver's CommonDomain and EventStore?

The repository in the CommonDomain only exposes "GetById()". So what do I do if my handler needs a list of Customers, for example?
Taking your question at face value: if you needed to perform operations on multiple aggregates, you would just provide the IDs of each aggregate in your command (which the client would obtain from the query side), then get each aggregate from the repository.
However, looking at one of your comments in response to another answer, I see that what you are actually referring to is set-based validation.
This very question has raised quite a lot of debate about how to do it, and Greg Young has written a blog post on it.
The classic question is: how do I check that the username hasn't already been used when processing my CreateUserCommand? I believe the suggested approach is to assume that the client has already done this check by asking the query side before issuing the command. When the user aggregate is created, the UserCreatedEvent will be raised and handled by the query side. There, the insert query will fail (either because of a check or a unique constraint in the DB), and a compensating command would be issued, which would delete the newly created aggregate and perhaps email the user telling them the username is already taken.
The main point is, you assume that the client has done the check. I know this approach is difficult to grasp at first, but it's the nature of eventual consistency.
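To make the query-side part concrete, here is a minimal sketch of the constraint described above (table and column names are illustrative, not from CommonDomain or EventStore):

    -- Read-model table on the query side. The UNIQUE constraint makes the
    -- insert performed by the UserCreatedEvent handler fail when the
    -- username is already taken, which triggers the compensating command.
    CREATE TABLE user_read_model (
        user_id  uuid PRIMARY KEY,
        username text NOT NULL UNIQUE
    );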
Also, you might want to read this other question, which is similar and contains some wise words from Udi Dahan.
In the classic event sourcing model, queries like "get all customers" would be carried out by a separate query handler which listens to all events in the domain and builds a query model to satisfy the relevant questions.
If you need to query customers by last name, for instance, you could listen to all customer-created and customer-name-changed events and just update one table of last-name-to-customer-id pairs. You could hold other information relevant to the UI that is showing the data, or you could simply hold IDs and go to the repository for the relevant customers in order to work further with them.
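Such a projection can be very small; a sketch, with assumed names:

    -- Query-model table kept up to date by the handlers for the
    -- customer-created and customer-name-changed events.
    CREATE TABLE customer_by_last_name (
        last_name   text NOT NULL,
        customer_id uuid NOT NULL,
        PRIMARY KEY (last_name, customer_id)
    );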
You don't need a list of customers in your handler. Each aggregate MUST be processed in its own transaction. If you want to show this list to the user, just build an appropriate view.
Your command needs to contain the id of the aggregate root it should operate on.
This ID will be looked up by the client sending the command, using a view in your read model. That view is populated with data from the events that your aggregate root emits.