SSAS - Time Dimension Increasing Time Structure - sql-server-2008-r2

I'm looking to increase the time dimension within a cube. It's currently set to a calendar end date of 31/12/2012, and I'm looking to update it to end 31/12/2016. Using BIDS, I've increased the CalendarEndDate property in the dimension to '31/12/2016', saved it, processed the dimension, and then viewed the data in the browser, but it only shows up to calendar 2012; no sign of 2013, 2014, etc.
Am I missing a step, or is there anything else I must do before this will update?
Using SQL Server 2008 R2. The dimension was set up before I took ownership, so I'm unsure how it was originally created, but it seems to be a pretty standard dimension with the DSV as its source.
Thanks

It sounds like the developer originally used the "Generate a time table on the server" option (first page of the Dimension Wizard), since you have CalendarEndDate available. If that is the case, changing the CalendarEndDate, processing the dimension from Visual Studio (not SQL Server!), and browsing the dimension should work. By processing the dimension from inside Visual Studio, you force the new definition to be deployed.
If you are still having problems, try creating a new time dimension inside a test project and deploying it to see if you get the same issue. Compare the properties of both time dimensions to see if you can spot anything unusual.

Related

Problem displaying a GeoServer layer when its resource is updated with the REST API

I am having a weird issue when using the GeoServer API to update a NetCDF resource of a coverage store layer. The resource is a NetCDF file containing one 3D (lon, lat, time) variable. However, the time dimension is only of length 1. My code runs within a Docker container and uses curl in a .sh file to run the API commands.
I must stress that the problem occurs only once in a while, maybe 10% of the time, maybe less.
When the problem occurs, the update of the store seems to go wrong and the layer cannot be displayed. Looking at the GetCapabilities, one of the weird things is that the time dimension is not the right date, but is instead equal to 1970-01-01T00:00:00.000Z, which is the reference date used in the NetCDF for the time dimension. Also, no problems are detected in the logs.
I do know that the problem is not with the file, and probably not with the upload of the file. Indeed, when the problem occurs, I can successfully create a store and a layer with the same resource and the same parameters as the layer that is not working.
I have tried multiple things via the API to solve this issue:
Reset the resource cache. It sometimes works, but not always.
Delete the layer and store and recreate them every time I need to update the resource.
Delete the resource, layer, and store and recreate everything every time the resource is updated.
Nothing seems to get rid of the problem permanently. Has anyone experienced the same kind of behavior? This is not the first time I have used GeoServer's API in a data harvester, but it is the first time I have had this problem! For reference, the update and cache-reset calls I run look roughly like the sketch below.
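This is a minimal Python equivalent of my curl commands; the GeoServer URL, credentials, workspace, store, and file names are placeholders, and I'm assuming the standard REST endpoints for uploading a file into a coverage store (the file.netcdf extension requires the NetCDF plugin) and for resetting the resource caches:

import requests

GEOSERVER = "http://localhost:8080/geoserver"  # placeholder URL
AUTH = ("admin", "geoserver")                  # placeholder credentials

def update_coverage(workspace, store, nc_path):
    # PUT a new NetCDF file into an existing coverage store.
    url = f"{GEOSERVER}/rest/workspaces/{workspace}/coveragestores/{store}/file.netcdf"
    with open(nc_path, "rb") as f:
        r = requests.put(url, data=f,
                         headers={"Content-Type": "application/octet-stream"},
                         auth=AUTH)
    r.raise_for_status()

def reset_caches():
    # Reset GeoServer's store/raster/schema caches (the workaround that sometimes helps).
    r = requests.post(f"{GEOSERVER}/rest/reset", auth=AUTH)
    r.raise_for_status()

update_coverage("my_ws", "my_store", "data.nc")
reset_caches()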
EDIT
I also tried to make the NetCDF file as simple as I could, by removing the time dimension.
So now, the NetCDF file only has 4 variables: lon, lat, the gridded variable, and a variable called crs that has dimension 0 and so is empty (I left it there for now since it comes from the outside source file).
But then again, the same kind of issue occurs, and again only once in a while. However, when it does occur, there is now something wrong caught in GeoServer's log:
2022-06-08 16:01:28,267 WARN [operation.projection] - Possible use of "Popular Visualisation Pseudo Mercator" projection outside its valid area.
Longitude 2147483287°00.0'W is out of range (±180°).
But again, when this happens, I can usually clear the resource cache and the layer will become visible again.
So I still don't know what is happening. Could it be the empty crs variable that sometimes creates problems?
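To rule out the file itself, a quick sanity check of the longitude range and the crs variable before each upload can help; here is a minimal sketch using the netCDF4 package (the file and variable names mirror the ones in my file and may need adjusting):

from netCDF4 import Dataset

# Open the file about to be uploaded and check the values the GeoServer
# warning complains about (longitudes far outside +/-180 degrees).
with Dataset("data.nc") as ds:
    lon = ds.variables["lon"][:]
    print("lon range:", lon.min(), lon.max())  # should stay within +/-180

    # The crs variable is dimensionless and empty; list its attributes to
    # see whether it still carries the projection metadata GeoServer reads.
    if "crs" in ds.variables:
        crs = ds.variables["crs"]
        print("crs attributes:", {name: crs.getncattr(name) for name in crs.ncattrs()})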
Thanks a lot for your help!

Is there a way to force the output of a Stat panel in Grafana to only show values in days

I'm building a Grafana dashboard with some Stat panels that show average, minimum, and maximum time values (see below) for specific fields in my database. I'm storing the data in seconds and setting the value's unit to seconds, after which the panel displays the time in weeks, days, hours, etc. For the sake of consistency, I would like everything to be shown in days, but I haven't been able to find a way to force the units of the output value. If it's possible to do this, could someone please point me to some docs or anything that could show me which configurations to make in my panels?
So far, I've tried (without success):
Configure each panel to use units of days
The result of this was that everything showed up in years, etc.
Configure each panel to create a new field by performing a binary calculation where I converted from seconds to days and then I updated the units to be days
The result was that the values were not changed at all: instead of showing X days, or whatever, it just showed the value in seconds without the units. I'm not sure what I messed up there, but it didn't change anything.
I found this link that discusses setting a time range for queries
This didn’t end up being useful for what I was trying to do because it was actually geared towards changing the query to a specific date range rather than the output.
I looked through the transformations documentation, the Stat panel documentation, and a few other panel documentation pages to see if there was any information on how to do this, but I was unable to find anything on forcing the output value to use a specific unit.
Edit:
So I kept messing around with the dashboard and got a solution that works - i.e. a "good-enough" solution (see below) - but now I'm curious whether it's possible to show the units along with the value without Grafana converting it to some other unit. Does anyone have any ideas about this?
One thing to note is that the data for this image is different than the data for the previous one so I'm expecting an inexact conversion to days.
You can use a custom unit. It is a bit tricky to enter the unit in the UI because of the automatic selection, but if you enter e.g. "days" (without the quotes) in the Unit field and then, instead of leaving the field with Tab or a mouse click, use the scrollbar of the combobox and select the last entry, "Custom Unit: days", it will stick.
Hope it helps. And for the record: I am using Grafana 7.1.4.

How do we change the "precision:ms" setting in the Grafana Query Inspector?

I have an InfluxDB database with only 11 data points in it. These data are not displaying correctly (or at least as I would expect) in Grafana when the time between them is shorter than 1 ms.
If I insert data points 1 ms apart, then everything works as expected and I see all 11 points at the correct times, as shown below:
However, if I delete these points and upload new ones, this time one point per 100 μs, then although the data displays correctly in InfluxDB, in Grafana I see only two points in my graph:
It seems like the data is being rounded/binned to the nearest millisecond, and that this is related to the "precision=ms" setting in the query here:
but I cannot find any way to change this setting. What is the correct way to fix this?
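For context, the points themselves can be written at sub-millisecond spacing without any problem on the InfluxDB side; here is a minimal Python sketch using the InfluxDB 1.x HTTP write API with microsecond precision (the server URL, database, and measurement names are placeholders):

import requests

INFLUX = "http://localhost:8086"  # placeholder InfluxDB 1.x instance
DB = "testdb"                     # placeholder database name

# Write 11 points spaced 100 microseconds apart. With precision=u the
# line-protocol timestamps are interpreted as microseconds since the epoch.
t0 = 1_600_000_000_000_000  # arbitrary start time, in microseconds
lines = "\n".join(f"demo value={i} {t0 + i * 100}" for i in range(11))

r = requests.post(f"{INFLUX}/write", params={"db": DB, "precision": "u"}, data=lines)
r.raise_for_status()

InfluxDB stores these at full resolution; the collapsing happens only on the Grafana query side, which is what the answer below is about.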
You can't configure Grafana to use a different time precision for InfluxDB. It is hardcoded in the source code: https://github.com/grafana/grafana/blob/36fd746c5df1438f27aa33fc74b24be77debc7ff/public/app/plugins/datasource/influxdb/datasource.ts#L364 (It may need to be fixed in multiple places in the source, not only this one.)
So the correct way to fix it is to change the code, which is of course not in the scope of this question.

AlphaVantage: Random data in downloaded adjusted time series

I download an adjusted time series from AlphaVantage using the following call (you need to insert your own API key):
https://www.alphavantage.co/query?function=TIME_SERIES_daily_adjusted&symbol=^GDAXI&outputsize=full&apikey=yourAPIkey
Next, I look at one particular (and faulty) data point at date 2003-04-18:
"5. adjusted close": "766464.0000"
Then, I reload the exact same API call and check the same data point again. However, this time there is a different value for the adjusted close! Every time I reload, I get a different value (and always a wrong one, too). Why is this happening, and how do I fix this wrong data?
For those who come across the same problem with AlphaVantage data, I will try to answer my own question.
The random-data problem only occurs on some (not all) non-trading days. For example, the date above is Good Friday in 2003. I have written a function to filter out all non-trading days from the downloaded AlphaVantage data, and that "fixed" the problem of the random-data days. A sketch of the idea follows.
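A minimal Python sketch of that filtering idea (the JSON key and column layout follow the sample above; the hand-maintained holiday list is a placeholder, and a proper trading calendar, e.g. from the exchange_calendars package, would be more robust):

import requests
import pandas as pd

URL = ("https://www.alphavantage.co/query"
       "?function=TIME_SERIES_DAILY_ADJUSTED&symbol=^GDAXI"
       "&outputsize=full&apikey=yourAPIkey")

# Fetch the full daily adjusted series and load it into a DataFrame.
payload = requests.get(URL).json()
df = pd.DataFrame.from_dict(payload["Time Series (Daily)"], orient="index").astype(float)
df.index = pd.to_datetime(df.index)

# Drop weekends first; this already removes most non-trading-day artifacts.
df = df[df.index.dayofweek < 5]

# Exchange holidays (e.g. Good Friday 2003) still need a real trading
# calendar; a hand-maintained blacklist stands in here as a placeholder.
holidays = pd.to_datetime(["2003-04-18"])  # extend as needed
df = df[~df.index.isin(holidays)]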

SSRS Error: "One or more parameters required to run the report have not been specified. (rsParametersNotSpecified)"

Okay, there are similar questions to this, but this is NOT a duplicate. This error seems to come up when you have parameters referencing a dataset which is shared. Deleting the report from the server and redeploying does not fix it in my case.
So I am developing on VS 2010 Professional with Business Intelligence Development Studio (BIDS), which is under source control with Team Foundation Server. I am deploying to a 2008 R2 server, which I thought might be the issue. The workaround is to change the dataset references to be embedded instead, which stops this error dead in its tracks, but that is pretty poor in my opinion, and I would ultimately like this to work with shared datasets.
Things I have tried:
Ensure the naming of the dataset matches the reference. E.g.: "Name is ClientQuery, shared dataset is ClientQuery".
Ensure the naming on the server matches the references in step 1.
Ensure that this is what is breaking it, by removing the reference to the shared dataset; it works right away then.
Ensure that the shared dataset is not enabling some type of caching on the server.
I had a filter on a second shared dataset limiting scope; I removed that and there was still an error.
Removed all parameters and only added a single shared dataset; it gives the error right away.
Added an option to the parameter bindings to say "Allow Empty values". Did this with Nulls as well.
Recreated EVERYTHING: a whole brand new RDL file where I copied and pasted only the elements on the body of the report but explicitly created the parameters and the datasets, and this STILL HAPPENED.
UPDATED - I have tried the old "destroy the RDL and then hope to redeploy" approach, which I found a lot online. That does not work in this case. It is almost like this reference in the RDL:
<DataSet Name="ClientQuery">
  <SharedDataSet>
    <SharedDataSetReference>ClientQuery</SharedDataSetReference>
  </SharedDataSet>
  <Fields>
    <Field Name="CUSTOMER_ID">
      <DataField>CUSTOMER_ID</DataField>
      <rd:TypeName>System.String</rd:TypeName>
    </Field>
    <Field Name="CUSTOMER_NAME">
      <DataField>CUSTOMER_NAME</DataField>
      <rd:TypeName>System.String</rd:TypeName>
    </Field>
  </Fields>
</DataSet>
It appears that somehow the mention of this reference causes havoc. I would examine my bin (environment) directory under my project. (I deploy for multiple environments and set up QA, UAT, PROD, etc. under solution configurations.) Each time, the RDL is updated as it should be and posts the updates I am showing. I think 'rebuild' is a lot of the issue at times when people see their report files not updating on a server; in my case a rebuild usually gets updates into the RDL, versus just hitting deploy first.
While all of this is happening, the hard part is that it works seamlessly through every change in BIDS. So the error deals entirely with what the source server believes the RDL data to represent.
Any help is much appreciated. I would rate myself advanced at SSRS, but this one has me stumped as to what the error is referencing that it is not getting.
I know this is an old question, but I just ran across this and was able to resolve my issue, so I thought an updated option was warranted for others struggling with it. My issue had to do with the parameter settings in the Shared Dataset properties. The menu looks like this:
Specifically, make sure that you check the "Allows null value" option where needed. This instantly resolved my issue, where a dataset would not work when pointing to a shared dataset but embedding the dataset did.
Okay, so the answer Jeroen proposed, and others like it, is half right. My issue was that my source code was under an older SVN source control, was deployed to an SSRS 2008 server, and then we migrated the code base to TFS source control. The issue appears to be that the shared datasets were believed to have different identifiers than they actually had. The simple workaround, IN ADDITION to deleting the files, is to redeploy the shared datasets as well. In my case I went into my project settings and deployed them to a different location entirely under the report structure, to keep them in the same area: Reports/Datasets instead of just Datasets. This seems to clear up the issue in my case, so I believe this was just a perfect storm. When in doubt with SSRS, just delete everything and start from the ground up, I guess.