I made a .NET MAUI app and tried to make an online version of it using SignalR.
The connection is established successfully, I get an answer from the server, and the Shell.Current.GoToAsync command is executed, but the page does not change at all.
If I use the same command outside the connection handler, the page changes as expected.
I believe the cause is that the handler is not running on the main thread, but I need to navigate to another page when I get the correct response from the SignalR server.
Through debugging I can see that the other page's initialization is executed, but from another thread.
Below is a much simpler version of the code, showing only what is relevant to this issue.
[ObservableProperty]
private string connectionID;

[ObservableProperty]
private string word;

private HubConnection _connection;

[RelayCommand]
async Task JoinGame()
{
    _connection = new HubConnectionBuilder()
        .WithUrl($"{baseUrl}/Hub")
        .Build();

    _connection.On<string, string, string>("CheckIfConnectionMatchAnswer", async (connectionID, answer, word) =>
    {
        if (ConnectionID == connectionID && answer == "True")
        {
            Word = word;
            await Shell.Current.GoToAsync(nameof(GamePage));
        }
    });

    await _connection.StartAsync();
    await _connection.InvokeCoreAsync("CheckConnectionID", args: new[] { ConnectionID });
}
I expected to navigate to another page but that didn't happen.
The solution was to use MainThread.BeginInvokeOnMainThread(action); to run the action on the UI thread and update the page. Please note that this also applies to updating UI elements, where I was struggling with the same issue once again.
Microsoft - Run code on the main UI thread .Net Maui
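Concretely, the handler registration above can be adjusted like this (a minimal sketch of the fix, assuming the same view model, property names, and GamePage route as in the code above):

```csharp
_connection.On<string, string, string>("CheckIfConnectionMatchAnswer", (connectionID, answer, word) =>
{
    if (ConnectionID == connectionID && answer == "True")
    {
        Word = word;
        // The SignalR callback runs on a background thread; navigation
        // (and any UI update) must be marshalled to the UI thread.
        MainThread.BeginInvokeOnMainThread(async () =>
        {
            await Shell.Current.GoToAsync(nameof(GamePage));
        });
    }
});
```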
Say I am designing an exam portal page where I want to show a countdown for the exam date and time, and when that specific date and time is reached I want to redirect the user to a login page. How can I do this? I have tried using Hangfire, but it doesn't redirect me to the target page when the time is reached. The scheduled job gets updated in the database (HangFire.Job), but no redirect happens. I am a total newbie in programming and don't know much, so guidance on how to achieve something like this would be appreciated a lot. And is it even possible to do something like this using Hangfire?
public void Configuration(IAppBuilder app)
{
    // For more information on how to configure your application, visit https://go.microsoft.com/fwlink/?LinkID=316888
    GlobalConfiguration.Configuration
        .UseSqlServerStorage("calanders");

    var option = new DashboardOptions { AppPath = VirtualPathUtility.ToAbsolute("/Default.aspx") };
    app.UseHangfireDashboard("/hangfire", option);
    app.UseHangfireServer();
}
This is Startup.cs:
protected void Button1_Click(object sender, EventArgs e)
{
    GlobalConfiguration.Configuration
        .SetDataCompatibilityLevel(CompatibilityLevel.Version_170)
        .UseColouredConsoleLogProvider()
        .UseSimpleAssemblyNameTypeSerializer()
        .UseRecommendedSerializerSettings()
        .UseSqlServerStorage("calanders", new SqlServerStorageOptions
        {
            CommandBatchMaxTimeout = TimeSpan.FromMinutes(5),
            SlidingInvisibilityTimeout = TimeSpan.FromMinutes(5),
            QueuePollInterval = TimeSpan.Zero,
            UseRecommendedIsolationLevel = true
        });

    BackgroundJob.Schedule(() => startCountdown(), TimeSpan.FromSeconds(20));
    //d.InsertDate(TextBox1.Text.ToDa);
}

public void startCountdown()
{
    Response.Write("<script>alert('Time Reached')</script>");
    Response.Redirect("WebForm1.aspx");
}
This is the main page (Default.aspx).
In WebForms there is no ongoing connection to the server between postbacks, so a job that runs on the server side won't be able to interact with the user at an arbitrary future time; only during the immediate window between the button click being triggered and the response being returned.
If you want to show a timer and then do something when it is done, you should look into doing it with JavaScript on the client side, if that works for your scenario. You can use JavaScript to make background requests to the server if you need to, but your job won't be directly driven by the server side.
I'm using a .NET Core app with a PostgreSQL database (via Npgsql), combined with SignalR, to receive real-time data and the latest data entries. However, I am not receiving the latest entry, and sometimes the Clients.All.SendAsync method sends more than one entry to the client. Here is my code:
Hub method that sends new data to client:
public async Task SendForexAsync(string name)
{
    var product = GetForex(name);
    await Clients.All.SendAsync("CurrentData", product);

    using (var conn = new NpgsqlConnection(ApplicationDbContext.GetConnectionString()))
    {
        conn.Open();
        new NpgsqlCommand("LISTEN new_forex", conn).ExecuteNonQuery();

        conn.Notification += async (o, e) =>
        {
            var newProduct = GetForex(name);
            await Clients.All.SendAsync("NewData", newProduct);
        };

        while (true)
        {
            await conn.WaitAsync();
        }
    }
}
Console app that periodically polls for new data from an API:
var addedStocksDJI = FetchNewStocks("DJI");
if (addedStocksAAPL > 0 || addedStocksDJI > 0)
{
    using (var conn = new NpgsqlConnection(ApplicationDbContext.GetConnectionString()))
    {
        conn.Open();
        new NpgsqlCommand("NOTIFY new_stocks", conn).ExecuteNonQuery();
    }
}
The rest of the app is almost certainly correct, because I was receiving new and correct data before I tried implementing the LISTEN/NOTIFY feature. But now I get one (or more) entries of newProduct on my client, and it is the "old" product; that is, the query does not return the latest entries from the database, only the old ones, which are then sent via SignalR. When I refresh the page manually, though, the new data is displayed correctly.
I believe it has something to do with a single connection staying open, so that I constantly receive only the "old" set of data. But even if that is the case, I am unable to figure out why I sometimes get more than one packet of data, even though I am only trying to send one and am calling NOTIFY only once.
I figured it out. Hopefully this will help someone else who gets stuck with this in the future!
The issue was that I was obtaining my dbContext via .NET Core's dependency injection in my Hub class, which created the context only once for that class, and therefore once per page or WebSocket session. That, I assume, is why I was unable to get the latest data: the dbContext was "old" and unaware of changes.
I fixed the problem by creating the dbContext with a using block inside my methods: twice in my SendForexAsync method (once per call of the GetForex function), as well as in the GetForex function itself. That way a dbContext is created and disposed of immediately, so the next time I poll the database for new data via GetForex (when I get a notification from the database due to the NOTIFY from the console app), a new dbContext instance is created that can see the new data.
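A sketch of what that pattern looks like. This assumes ApplicationDbContext can be constructed directly (or via a factory), and the Forex entity with its Name/Timestamp properties is hypothetical, named here only for illustration:

```csharp
private Forex GetForex(string name)
{
    // Create a fresh context per call so each query sees the latest rows,
    // instead of reusing a long-lived injected context with stale tracked data.
    using (var db = new ApplicationDbContext())
    {
        return db.Forex
                 .Where(f => f.Name == name)
                 .OrderByDescending(f => f.Timestamp)
                 .FirstOrDefault();
    }
}
```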
I am having a phantom problem in my application where one in every five requests on a specific page (in an ASP.NET MVC application) throws this error:
Npgsql.NpgsqlException: ERROR: 57014: canceling statement due to user request
at Npgsql.NpgsqlState.<ProcessBackendResponses>d__0.MoveNext()
at Npgsql.ForwardsOnlyDataReader.GetNextResponseObject(Boolean cleanup)
at Npgsql.ForwardsOnlyDataReader.GetNextRow(Boolean clearPending)
at Npgsql.ForwardsOnlyDataReader.Read()
at Npgsql.NpgsqlCommand.GetReader(CommandBehavior cb)
...
On the Npgsql GitHub page I found bug report #615, which says:
Regardless of what exactly is happening with Dapper, there's
definitely a race condition when cancelling commands. Part of this is
by design, because of PostgreSQL: cancel requests are totally
"asynchronous" (they're delivered via an unrelated socket, not as part
of the connection to be cancelled), and you can't restrict the
cancellation to take effect only on a specific command. In other
words, if you want to cancel command A, by the time your cancellation
is delivered command B may already be in progress and it will be
cancelled instead.
Although they made "changes to hopefully make cancellations much safer" in Npgsql 3.0.2, my current code is incompatible with that version because of the migration described here.
My current workaround (stupid): I have commented out the code in Dapper that calls command.Cancel(); and the problem seems to be gone.
if (reader != null)
{
    if (!reader.IsClosed && command != null)
    {
        //command.Cancel();
    }
    reader.Dispose();
    reader = null;
}
Is there a better solution to the problem? And secondly, what am I losing with the current fix (other than having to remember to reapply the change every time I update Dapper)?
Configuration:
.NET 4.5,
Npgsql 2.2.5,
PostgreSQL 9.3
I found out why my code didn't dispose the reader, resulting in the call to command.Cancel(). This only happens with the QueryMultiple method when not every refcursor is read.
Changing the code from:
using (var multipleResults = connection.QueryMultiple("schema.getuserbysocialsecurity", new { socialSecurityNumber }))
{
    var client = multipleResults.Read<Client>().SingleOrDefault();
    if (client != null)
    {
        client.Address = multipleResults.Read<Address>().Single();
    }
    return client;
}
To:
using (var multipleResults = connection.QueryMultiple("schema.getuserbysocialsecurity", new { socialSecurityNumber }))
{
    var client = multipleResults.Read<Client>().SingleOrDefault();
    var address = multipleResults.Read<Address>().SingleOrDefault();
    if (client != null)
    {
        client.Address = address;
    }
    return client;
}
This fixed the issue and now the reader is properly disposed and command.Cancel() is not invoked.
Hope this helps anyone else!
UPDATE
The Npgsql docs for version 2.2 state:
Npgsql is able to ask the server to cancel commands in progress. To do
this, call the NpgsqlCommand’s Cancel method. Note that another thread
must handle the request as the main thread will be blocked waiting for
command to finish. Also, the main thread will raise an exception as a
result of user cancellation. (The error code is 57014.)
I have also posted an issue on the Dapper github page.
I'm trying to have a SignalR hub as part of a plugin using MEF. But after calling ImportMany on a List<> and adding the catalog/container/ComposeParts part to the Application_Start() method of Global.asax, all I get is:
Uncaught TypeError: Cannot read property 'server' of undefined.
I've got no clue whether the problem comes from my interface, the plugin, the Global.asax file, or the JavaScript.
The interface:
public interface IPlugin
{
}
the plugin:
[Export(typeof(IPlugin))]
[HubName("testHub")]
public class TestHub : Hub, IPlugin
{
    public string Message()
    {
        return "Hello World!";
    }
}
in the Global.asax file:
[ImportMany(typeof(IPlugin))]
private IEnumerable<IPlugin> _plugins { get; set; }

protected void Application_Start()
{
    var catalog = new AggregateCatalog();
    catalog.Catalogs.Add(new DirectoryCatalog(@"./Plugins"));
    var container = new CompositionContainer(catalog);
    container.ComposeParts(this);

    RouteTable.Routes.MapHubs();

    //log4net
    log4net.Config.XmlConfigurator.Configure();

    AreaRegistration.RegisterAllAreas();
    WebApiConfig.Register(GlobalConfiguration.Configuration);
    FilterConfig.RegisterGlobalFilters(GlobalFilters.Filters);
    RouteConfig.RegisterRoutes(RouteTable.Routes);
}
and finally the javascript:
$(document).ready(function () {
    $.connection.hub.url = 'http://127.0.0.1/signalr/';
    var proxy = $.connection.testHub;
    $.connection.hub.start({ transport: ['webSockets', 'serverSentEvents', 'longPolling'] })
        .done(function () {
            proxy.invoke('Message').done(function (res) {
                alert(res);
            });
        })
        .fail(function () { alert("Could not Connect!"); });
});
The only information I've found was this post, but I could not make it work. Everything works fine when I add the reference manually, but when I look at "signalr/hubs" after loading the plugin, there is no reference to my hub's method.
Thanks a lot for your help.
Your problem is that SignalR caches the generated "signalr/hubs" proxy script the first time it is requested, and serves the cached script in response to every subsequent request to "signalr/hubs".
SignalR not only caches the script itself, but it also caches the collection of Hubs it finds at the start of the process.
You can work around the cached proxy script issue by simply not using the proxy script, but that still won't enable you to actually connect to Hubs defined in assemblies that are loaded after the process starts.
If you want to be able to connect to such Hubs, you will need to implement your own IHubDescriptorProvider that is aware of Hubs defined in plugins loaded at runtime.
You can register your provider with SignalR's DependencyResolver which can be passed into SignalR via the Resolver property of the HubConfiguration object you pass into MapSignalR.
That said, it would probably be easier to restart the app pool/server process whenever a plugin is added to the "./Plugins" directory.
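If you do go the custom-provider route, the registration looks roughly like this (a sketch only: PluginHubDescriptorProvider is a hypothetical class you would implement yourself, enumerating the hubs your MEF catalog discovers; shown in SignalR 2 / OWIN style rather than the MapHubs call from the question):

```csharp
public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var config = new HubConfiguration();

        // Replace the default hub locator with one that also knows about
        // hubs exported from plugin assemblies loaded at runtime.
        // PluginHubDescriptorProvider is hypothetical; it would implement
        // Microsoft.AspNet.SignalR.Hubs.IHubDescriptorProvider.
        config.Resolver.Register(
            typeof(IHubDescriptorProvider),
            () => new PluginHubDescriptorProvider(config.Resolver));

        app.MapSignalR(config);
    }
}
```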
I have a GWT application and I want to run some code when the user leaves the application to force a logout and remove any data etc.
To do this I am using a CloseHandler and registering it using Window.addCloseHandler.
I have noticed that when the refresh button is clicked, the onClose method runs, but I have been unable to differentiate this event from one where the user has closed the browser. If it is a refresh I do not want to do the logout; I only want to do it when the user closes the browser/tab or navigates away from the site.
Does anybody know how I can do this?
There is no way to differentiate a 'close' from a 'refresh'. But you can set a cookie in the CloseHandler and, when loading the module, check whether it is still present before showing the page.
You can do that with the following utility class (BrowserCloseDetector). Here is an example using it in onModuleLoad.
The test lines:
@Override
public void onModuleLoad() {
    if (BrowserCloseDetector.get().wasClosed()) {
        GWT.log("Browser was closed.");
    }
    else {
        GWT.log("Refreshing or returning from another page.");
    }
}
The utility class:
import com.google.gwt.user.client.Cookies;
import com.google.gwt.user.client.Window;

public class BrowserCloseDetector {
    private static final String COOKIE = "detector";
    private static BrowserCloseDetector instance;

    private BrowserCloseDetector() {
        Window.addWindowClosingHandler(new Window.ClosingHandler() {
            public void onWindowClosing(Window.ClosingEvent closingEvent) {
                Cookies.setCookie(COOKIE, "");
            }
        });
    }

    public static BrowserCloseDetector get() {
        return (instance == null) ? instance = new BrowserCloseDetector() : instance;
    }

    public boolean wasClosed() {
        return Cookies.getCookie(COOKIE) == null;
    }
}
Have you tried
<BODY onUnload="scriptname">
in your GWT hosting/launching HTML file?
I am thinking that if you defined a map (a JavaScript pseudo-hash) in the hosting file and then accessed it in GWT through the Dictionary class, you could update values in that map as the user progresses through the GWT app. This means your programming style would require you to log milestones of the user's progress into this map.
When the user closes the browser page, the onunload script of the launching HTML page would be triggered. That script could consult the map to figure out what needs to be reported to the server, or what other URL to launch.
I am interested too if someone has a solution (GWT/Java side only).
Maybe we can do it with a HistoryListener?
1. Set a flag for your current viewing page.
2. On the ClosingHandler event, start a timeout on the server side (for example 10 s).
3. If during this time you get a message from the HistoryListener with the same last flag, it was just a refresh,
or disconnect if the timer runs out.
It's not a good solution, but I think it is easy to do... If someone has a better one...