Desktop application: WPF + MVVM, Prism, Entity Framework, Unit of Work + Repository.
A UnitOfWork was initialized on two computers at the same time, and someUser was added on the first PC.
How can I get someUser into _unitOfWork.Users on the second PC? The view model is created once for the view, right? So the context inside the UnitOfWork is unaware of the new someUser.
public class UnitOfWork : IUnitOfWork
{
    private readonly MyDbContext _context;
    private IUserRepository _users;

    public IUserRepository Users => _users ??= new UserRepository(_context);

    public UnitOfWork(MyDbContext context)
    {
        _context = context;
    }

    public async Task CommitAsync()
    {
        await _context.SaveChangesAsync();
    }

    public Task Rollback()
    {
        return Task.CompletedTask;
    }
}
public class UsersViewModel : BindableBase
{
    private readonly IUnitOfWork _unitOfWork;

    public UsersViewModel(IUnitOfWork unitOfWork)
    {
        _unitOfWork = unitOfWork;
    }

    public async Task Foo(User someUser)
    {
        _unitOfWork.Users.Add(someUser);
        await _unitOfWork.CommitAsync();
    }
}
With WPF you do not want to rely on constructor injection for the DbContext. That works fine for ASP.NET MVC/Razor Pages because the scope of a controller is per-request, so a DbContext is only "alive" while processing each request. In a WPF application an instance of a class will be alive considerably longer. Instead of injecting a DbContext (or a Unit of Work wrapping a DbContext), consider injecting a DbContextFactory or using a UoW pattern that acts as a scope around the DbContext (e.g. Zejji.DbContextScope).
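As a minimal sketch of the factory route, assuming EF Core 5+ and a Microsoft.Extensions.DependencyInjection-style container (the SQL Server provider, connectionString, and the AddUserAsync method are illustrative, not from the question):

    // Composition root, assuming an IServiceCollection-based container:
    // register a factory instead of one long-lived context.
    services.AddDbContextFactory<MyDbContext>(options =>
        options.UseSqlServer(connectionString));

    // Consumer: inject the factory and create a short-lived context per operation.
    public class UsersViewModel : BindableBase
    {
        private readonly IDbContextFactory<MyDbContext> _contextFactory;

        public UsersViewModel(IDbContextFactory<MyDbContext> contextFactory)
        {
            _contextFactory = contextFactory;
        }

        public async Task AddUserAsync(User someUser)
        {
            // The context lives only for the duration of this operation, then is disposed.
            using var context = _contextFactory.CreateDbContext();
            context.Users.Add(someUser);
            await context.SaveChangesAsync();
        }
    }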
Long-running DbContexts are not good because, by default, the DbContext uses a tracking cache for all entities it reads. Tracking caches are not there for performance; they are for change tracking and reference tracking (avoiding two or more different object instances for the same record). The more entities that are tracked, the slower the DbContext gets, and you also run into stale data situations: two clients "read" the same record, one client updates the row and saves, and the second client "reads" again but gets its cached copy, never seeing the other client's change.
You can mitigate the above scenario by using AsNoTracking() queries. For instance, if you use context.Users.AsNoTracking(), client B will fetch the current data state from the database and avoid adding user instances to its tracking cache.
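As a sketch of what the second PC can do to see the new row, a refresh method can open a fresh, short-lived context (via the factory above) and read with AsNoTracking; LoadUsersAsync and the Users collection are illustrative names, not part of the question's code:

    public async Task LoadUsersAsync()
    {
        // A fresh context never serves stale entities from a tracking cache.
        using var context = _contextFactory.CreateDbContext();

        var users = await context.Users
            .AsNoTracking()   // read-only snapshot; nothing is added to the change tracker
            .ToListAsync();

        Users = new ObservableCollection<User>(users);
    }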
Separate applications will not be notified about new inserts and updates unless you implement a centralized messaging system (see publish/subscribe libraries like the EventAggregator in Prism, or a message queue like RabbitMQ). EF has no built-in support for push-based notifications; you need to incorporate a suitable solution yourself. For instance, when one client inserts or updates a user, it publishes a message to a message bus saying "{Event:UserInserted, Id:10234}" or "{Event:UserUpdated, Id:10234}", and each application subscribes to events depending on which view it is on. If another client is on a user list page, it subscribes and, on receiving a UserInserted or UserUpdated event, takes action such as refreshing the user list or re-loading/updating the affected row. (A sketch of the Prism EventAggregator approach is at the end of this answer.) Alternatively you can use a polling-based (pull) system where, in the background, your page periodically checks for user rows that have changed since the last poll timestamp. For instance, if you are on a search page with filters, run the filtered query with an appended timestamp check:
var dataChanged = context.Users
    .Where( /* build your normal Where clause based on search criteria */ )
    .Where(x => x.LastModifiedDateTime >= _lastPolledDateTime)
    .Any();

_lastPolledDateTime = DateTime.Now;

if (dataChanged)
{
    // Reload your search results.
}
This would trigger your page to refresh if any row in your results had changed, or if a new record might now be included. If you are using pagination (Skip/Take), you would instead fetch the IDs for the current page and compare them to the IDs currently displayed, reloading the page if the list changes. For example:
var updatedIds = context.Users
    .Where( /* build your normal Where clause based on search criteria */ )
    .OrderBy( /* order criteria */ )
    .Select(x => x.Id)
    .Skip(_currentPage * _pageSize)
    .Take(_pageSize)
    .ToList();

if (_currentIds.Count != updatedIds.Count || _currentIds.Except(updatedIds).Any())
{
    // Reload your search results.
}
Here the page keeps a list of the IDs for the rows currently displayed, and we poll for just the IDs of the current page. If there are any changes in that list, we refresh the current page.
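One way to drive the poll, as a sketch assuming the ID comparison above is wrapped in a hypothetical CheckForChangesAsync method on the view model, is a WPF DispatcherTimer (System.Windows.Threading):

    private readonly DispatcherTimer _pollTimer = new DispatcherTimer
    {
        // The polling interval is a trade-off between freshness and database chatter.
        Interval = TimeSpan.FromSeconds(30)
    };

    public void StartPolling()
    {
        // Tick fires on the UI thread, so refreshing bound collections afterwards is safe.
        _pollTimer.Tick += async (s, e) => await CheckForChangesAsync();
        _pollTimer.Start();
    }

    public void StopPolling() => _pollTimer.Stop();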
Push-based systems are better for larger multi-client setups, but they can be trickier to test and more complex to write if you want to avoid kicking off a full refresh every time you receive a subscribed event. Pull-based systems are straightforward to test but can be "chatty" with the database, especially with a short polling cycle (to see changes earlier).
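For the push-based option, here is a minimal in-process sketch using Prism's EventAggregator; the event class and view model names are hypothetical, and note that the EventAggregator only reaches subscribers inside the same process, so notifying a second PC still requires a real message bus (e.g. RabbitMQ) relaying the same kind of event:

    // Prism PubSubEvent carrying the changed user's Id as the payload.
    public class UserInsertedEvent : PubSubEvent<int> { }

    // Publisher: after a successful save, announce the change.
    public class AddUserViewModel
    {
        private readonly IEventAggregator _eventAggregator;

        public AddUserViewModel(IEventAggregator eventAggregator)
        {
            _eventAggregator = eventAggregator;
        }

        private void OnUserSaved(int userId)
        {
            _eventAggregator.GetEvent<UserInsertedEvent>().Publish(userId);
        }
    }

    // Subscriber: the user list refreshes (or reloads one row) when notified.
    public class UserListViewModel
    {
        public UserListViewModel(IEventAggregator eventAggregator)
        {
            eventAggregator.GetEvent<UserInsertedEvent>()
                .Subscribe(OnUserInserted, ThreadOption.UIThread);
        }

        private void OnUserInserted(int userId)
        {
            // Re-query with a fresh context / AsNoTracking, as sketched earlier.
        }
    }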