Due date not updated in Google Tasks - Gmail UI - google-workspace

I created a new task in the new Gmail Tasks side panel and set the due date. I then edited the due date of the task using the Google Tasks update API.
The change is not reflected in the Tasks side panel of Gmail.
But when I fetch the task using the Tasks list API, the due date is correctly updated on the task.
The rest of the task details, such as notes and title, are updated in the Gmail UI when changed using the Google Tasks API.
Is this a known issue, or am I missing something?
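For reference, here is a minimal sketch of what such an update looks like through the Tasks API (Python; the task list ID, task ID, and OAuth token are placeholders):

```python
# Minimal sketch: update a task's due date via the Google Tasks API
# (tasks.patch). TASKLIST_ID, TASK_ID and ACCESS_TOKEN are placeholders;
# the token needs the https://www.googleapis.com/auth/tasks scope.
import requests

TASKLIST_ID = "your-task-list-id"   # placeholder
TASK_ID = "your-task-id"            # placeholder
ACCESS_TOKEN = "your-oauth2-token"  # placeholder

url = f"https://tasks.googleapis.com/tasks/v1/lists/{TASKLIST_ID}/tasks/{TASK_ID}"

# "due" is an RFC 3339 timestamp, but the API only stores the date portion.
resp = requests.patch(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"due": "2019-08-01T00:00:00.000Z"},
)
resp.raise_for_status()
print(resp.json().get("due"))  # reading the task back returns the new due date
```

Reading the task back afterwards (tasks.get or tasks.list) returns the new due date, which matches what you describe; only the Gmail side panel fails to pick the change up.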

Related

How can I trigger a contact workflow on a specific date in Bitrix24?

I would like to trigger a workflow that resets a sum on all contacts every 1st of January. The workflow is implemented. It could be triggered on that specific date or even every day, because the date is verified in the workflow before resetting.
How can I run it on every contact?
Hello and welcome to Stack Overflow. The answer to your question is as follows:
You need to create a REST API inbound webhook in your Bitrix24 account with the "CRM" and "Business Process" permissions.
You then write a PHP script that is executed at the exact date/time of your choice.
The PHP script will do two things:
It will retrieve all the contacts in your Bitrix24 account and store their IDs in a string value.
It will then start the business process, passing it the list of all the found contacts; inside your business process, you set it up so that it runs the required actions on every contact from the contact IDs passed in the first step.
The PHP script can be hosted on a web host or even on your local machine, since it only runs once a year.
This is only the concept of the solution; I can't share the script code because I don't have it at the moment. Please contact me if you face problems.
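Purely as an illustration of that concept (this is not the answerer's script, and it uses Python rather than PHP only to keep the example short), a minimal sketch of the two steps, assuming a Bitrix24 inbound webhook URL, a business process template ID, and a template parameter named ContactIds, all of which are placeholders:

```python
# Minimal sketch of the two-step flow described above. WEBHOOK_BASE,
# TEMPLATE_ID and the "ContactIds" parameter name are placeholders.
import requests

WEBHOOK_BASE = "https://example.bitrix24.com/rest/1/xxxxxxxxxxxx"  # placeholder webhook
TEMPLATE_ID = 123  # placeholder business process template ID

# Step 1: retrieve all contact IDs (crm.contact.list is paginated, 50 per page).
contact_ids, start = [], 0
while True:
    page = requests.get(
        f"{WEBHOOK_BASE}/crm.contact.list",
        params={"select[]": "ID", "start": start},
    ).json()
    contact_ids += [c["ID"] for c in page["result"]]
    if "next" not in page:
        break
    start = page["next"]

# Step 2: start the business process once, handing over the ID list as a string.
requests.post(
    f"{WEBHOOK_BASE}/bizproc.workflow.start",
    json={
        "TEMPLATE_ID": TEMPLATE_ID,
        # The workflow is started against one contact document; the full list
        # is passed as a template parameter for the workflow to iterate over.
        "DOCUMENT_ID": ["crm", "CCrmDocumentContact", f"CONTACT_{contact_ids[0]}"],
        "PARAMETERS": {"ContactIds": ",".join(contact_ids)},
    },
).raise_for_status()
```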

Send an email with query results on a scheduled basis (every morning at 9 AM)

My heads of engineering, product, and other departments want a daily email of shared queries or of particular work item queries (Bug, Task, Epic) to know the status of work items.
Are there any extensions available to send the query results on a scheduled basis?
You could do it through a scheduled pipeline that calls the REST APIs to get the results and sends the mail programmatically.
You could also build an app (e.g. a service application or a PowerShell job) that calls the REST APIs to get the necessary results and sends the mail programmatically.
Related REST APIs:
Wiql - Query By Id
Work Items - Get Work Items Batch
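A rough sketch of that approach (Python; the organization, project, query ID, personal access token, and SMTP details are placeholders):

```python
# Minimal sketch: run a saved query (Wiql - Query By Id), fetch the work items
# in batch, and e-mail a summary. ORG, PROJECT, QUERY_ID, PAT and the SMTP
# settings are all placeholders.
import smtplib
from email.message import EmailMessage

import requests

ORG, PROJECT = "myorg", "MyProject"                 # placeholders
QUERY_ID = "00000000-0000-0000-0000-000000000000"   # saved query ID (placeholder)
PAT = "personal-access-token"                       # placeholder
auth = ("", PAT)                                    # basic auth, empty username
base = f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/wit"

# Wiql - Query By Id: a flat query returns the matching work item references.
wiql = requests.get(f"{base}/wiql/{QUERY_ID}?api-version=6.0", auth=auth).json()
ids = [ref["id"] for ref in wiql["workItems"]][:200]  # the batch API caps at 200 IDs

# Work Items - Get Work Items Batch: fetch the fields to report on.
batch = requests.post(
    f"{base}/workitemsbatch?api-version=6.0",
    auth=auth,
    json={"ids": ids, "fields": ["System.Id", "System.Title", "System.State"]},
).json()

lines = [
    f"{wi['fields']['System.Id']}: {wi['fields']['System.Title']} "
    f"({wi['fields']['System.State']})"
    for wi in batch["value"]
]

# Send the summary by mail (SMTP host and addresses are placeholders).
msg = EmailMessage()
msg["Subject"] = "Daily work item status"
msg["From"] = "devops-bot@example.com"
msg["To"] = "team-leads@example.com"
msg.set_content("\n".join(lines))
with smtplib.SMTP("smtp.example.com") as smtp:
    smtp.send_message(msg)
```

Either a scheduled pipeline or a small service/PowerShell job can run this on the 9 AM schedule.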
There is an extension for this purpose - Scheduled Work Item Query:
The Goal of this extension was to keep as much functionality as possible inside Azure DevOps / Team Foundation Server. We have opted to realize this by using a standard Build Pipeline in Azure DevOps / Team Foundation Server as our "Scheduling" Engine.
The actual functionality is realized as a Build Pipeline Task called "Scheduled Work Item Query". This task executes a query that is saved in either the "My Queries" or "Shared Queries" folders, and sends the results by e-mail using either SendGrid or standard SMTP.

Platform / extranet suggestions for capturing data from external sales agents

I'm looking for suggestions for a tool/platform that can help in capturing form and file-attachment data from external sales agents. I currently use Google Forms and a plugin that sends email notifications to specific contacts, but the hope is to create something more advanced where the people providing the data can log in to submit data and make updates to the records. The platform should also enable approval workflows with email notifications to ensure the incoming data meets requirements.
Here is a description of the desired process:
Agent logs in to the platform
Agent submits data (and optional file attachments) into the system using a form
Upon submission, the system triggers a review task and sends an email notification to a reviewer
If the reviewer deems the information invalid/incomplete, they will reject it, with a comment, and the system will automatically send the information back to the agent while copying their manager. The record status will be updated automatically.
If the reviewer deems the information valid, they will approve it, with or without a comment, and the system will automatically send the information back to the agent while copying their manager and project administrator. The record status will be updated automatically.
Additional requirements:
Contacts for email notifications will vary by agent, and they should be determined by identifying fields on the user/agent profile
Agents need to be able to update records but a new review task will be triggered once an update is submitted
The system should have functionality for searching the record fields
An agent should see their own data and data submitted by others in the same agency, but not the data submitted by agents outside their agency
I should be able to see the full list of submissions and version history of the records
Approximate scale: 100-150 agencies, 1000-1500 agents
So far, I've thought of using a list within SharePoint Online combined with Microsoft Flow, but I'm not sure it would enable all the desired functionality. I'd be interested in your thoughts about that setup as well as others that you think could work and/or have found successful. Even if your solution does not tick all the boxes, I'd be interested to know about it.
Thanks a lot for your help!
JP

VSTS Analytics request blocked due to exceeding usage of resource AnalyticsBlockingResource

I have a Power BI mashup that performs 4 queries against different VSTS projects on our tenant via the VSTS Analytics module. I have set up each query as a specific Analytics view. Each view returns < 200 records and is a simple "Get Story Backlog items" query for a single team, for today only.
I am frequently getting a message like the following:
An error occurred in the ‘DS BI WorkItems’ query. Error: Request was
blocked due to exceeding usage of resource 'AnalyticsBlockingResource'
in namespace 'User'. For more information on why your request was
blocked, see the topic "Rate limits" on the Microsoft Web site
(https://go.microsoft.com/fwlink/?LinkId=823950). Details:
DataSourceKind=Visual Studio Team Services
ActivityId=a6ac93f3-549c-4eb0-b64e-2b38e18ae7ee
Url=https://vrmobility.analytics.visualstudio.com/_odata/v2.0-preview/WorkItems?$filter=((ProjectSK%20eq%208e25983d-a154-4b53-915f-1394b34e5338)%20and%20((ProjectSK%20eq%208e25983d-a154-4b53-915f-1394b34e5338%20and%20Teams/any(t:t/TeamSK%20eq%2019afa381-35ca-47db-9060-51baa5d0485e))))%20and%20Processes/any(b:(b/BacklogName%20eq%20'Stories')%20and%20((b/ProjectSK%20eq%208e25983d-a154-4b53-915f-1394b34e5338%20and%20(b/TeamSK%20eq%2019afa381-35ca-47db-9060-51baa5d0485e))))&$select=LeadTimeDays,CycleTimeDays,CompletedDate,StateCategory,ParentWorkItemId,ActivatedDate,Activity,VRAgile_ActualCompletionIteration,VRAgile_ActualUatIteration,BusinessValue,VRAgile_ChangeAreaOwnerTeam,ChangedDate,ClosedDate,CompletedWork,VRAgile_CompletionTargetConfidence,CreatedDate,FinishDate,FoundIn,WorkItemId,VRAgile_IncludedinVersion,IntegrationBuild,OriginalEstimate,VRAgile_PlannedCompletionIteration,VRAgile_PlannedUATIteration,Priority,Reason,VRAgile_ReleaseQuality,RemainingWork,vrmobility_VRAgile_RequestedBy,VRAgile_RequestedDept,R...
error=Record
I have checked that page and looked at the Usage page on our VSTS tenant, but during these times my user is not flagged as blocked and the VSTS user interface works normally.
The issue goes away after a few minutes, but it then returns after a couple of changes made in Power BI (like adding a new column or changing a data type), because this automatically refreshes all 4 queries again and seems to trigger the excessive usage.
It is really frustrating, as I can't continue working on the report and have to go and do something else for 5 minutes, which really disrupts my flow.
Any ideas on the cause, or on solutions/workarounds? It feels to me like an overly sensitive limit on the VSTS Analytics service.
Azure DevOps Services limits the resources individuals can consume and the number of requests they can make to certain commands. When these limits are exceeded, subsequent requests may be either delayed or blocked.
See the link below:
https://learn.microsoft.com/en-us/azure/devops/integrate/concepts/rate-limits?view=azure-devops
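If you were calling the Analytics OData feed from your own code (rather than through the Power BI connector), the throttled response carries a Retry-After header that you can honor before retrying. A minimal sketch, with the organization, personal access token, and query as placeholders:

```python
# Minimal sketch: query the Analytics OData endpoint and back off when the
# service signals rate limiting. ORG and PAT are placeholders.
import time

import requests

ORG = "myorg"                  # placeholder organization
PAT = "personal-access-token"  # placeholder
url = f"https://{ORG}.analytics.visualstudio.com/_odata/v2.0-preview/WorkItems?$top=200"

for attempt in range(5):
    resp = requests.get(url, auth=("", PAT))
    if resp.status_code != 429 and "Retry-After" not in resp.headers:
        break  # not throttled
    # Delayed or blocked requests include a Retry-After value in seconds;
    # waiting it out avoids the 'AnalyticsBlockingResource' error.
    time.sleep(int(resp.headers.get("Retry-After", 30)))

resp.raise_for_status()
print(len(resp.json()["value"]), "work items returned")
```

Within Power BI itself, the practical mitigations are essentially to reduce how many views refresh at once and to keep each view's row and column count small, so the combined usage stays under the limit.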

Can I take control of a user's Google Cloud Print printer?

We've written an application, replacing a third-party tool, to download and print jobs through Google Cloud Print. For new customers this will work well: we create the printer in the cloud and download jobs, and it works. Customers already up and running with the third-party tool are using a printer created with that tool. I thought I'd be able to access that printer's jobs by getting the user to go through OAuth authentication to give our application permission to manage the user's printers. However, having done this, and with everything seeming to work, when I fetch jobs from that printer the response is that there are no jobs, even though there is a job. Is this behaviour to be expected? Is there any way around this? We'd just like to avoid our customers having to create new printers.
The question is a little unclear; feel free to edit your question and I'll edit this answer.
Being able to manage jobs is not the same as being able to download jobs. Each printer belongs to a user, and each has a robot account. Only those two accounts (I believe) can download the job ticket and payload.
After a job is marked as completed (through the /control API), the payload is deleted.
A third user account that can manage jobs is allowed to view information about the job, as well as cancel/delete the job, but can't (I believe) download the job payload.
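To make the ownership point concrete, here is a minimal sketch of listing a printer's queued jobs through Cloud Print's /fetch interface (Python; the printer ID and OAuth token are placeholders, and the job field names follow the GCP job schema as I recall it). A managing account can list and /control jobs this way, but, as noted above, the job payload itself is believed to download only for the owner or the printer's robot account:

```python
# Minimal sketch: list the jobs queued on a Cloud Print printer via /fetch.
# PRINTER_ID and ACCESS_TOKEN are placeholders; the token needs the
# https://www.googleapis.com/auth/cloudprint scope.
import requests

PRINTER_ID = "printer-id-placeholder"
ACCESS_TOKEN = "oauth-token-placeholder"
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# /fetch lists jobs waiting on the printer. With a non-owner account that only
# manages the printer, this can come back empty even though a job exists,
# which matches the behaviour described in the question.
resp = requests.post(
    "https://www.google.com/cloudprint/fetch",
    headers=headers,
    data={"printerid": PRINTER_ID},
).json()

for job in resp.get("jobs", []):
    print(job["id"], job["title"], job["fileUrl"])
    # Downloading job["fileUrl"] requires the owner or robot account, and
    # marking the job DONE via /control deletes the payload, e.g.:
    # requests.post("https://www.google.com/cloudprint/control",
    #               headers=headers,
    #               data={"jobid": job["id"], "status": "DONE"})
```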