Reputation: 1149
I have an XML file that I need to have concurrent access to. I do not need concurrent write, but I do need concurrent read. The prescribed method for reading/modifying this XML file is a PowerShell script, which uses Linq to XML (XElement) to work with the XML file.
The ideal scenario is: lock the file, load it into an XElement object, perform the desired operations, save the XElement back to the file, and then release the lock.
I have seen this related post, 'Concurrent file usage in C#', and attempted to implement this, using the below code. In one instance of PowerShell ISE, I execute the Read-Write function, and then, while in the "Enter an item name" loop, I execute the Read-Only function in another instance of PowerShell ISE. When attempting to open the document read-only, I receive an exception stating "The process cannot access the file 'C:\Path\To\file.xml' because it is being used by another process."
using namespace System.Xml.Linq
using namespace System.IO
Add-Type -AssemblyName 'System.Xml.Linq (rest of Assembly's Full Name)'
$filename = "C:\Path\To\file.xml"
function Read-Only()
{
    $filestream = [FileStream]::new($filename, [FileMode]::Open, [FileAccess]::Read)
    $database = [XElement]::Load($filestream)
    foreach ($item in $database.Element("Items").Elements("Item"))
    {
        Write-Host "Item name $($item.Attribute("Name").Value)"
    }
    $filestream.Close()
    $filestream.Dispose()
}
function Read-Write()
{
    $filestream = [FileStream]::new($filename, [FileMode]::Open, [FileAccess]::ReadWrite, [FileShare]::Read)
    $database = [XElement]::Load($filestream)
    $itemname = Read-Host "Enter an item name ('quit' to quit)"
    while ($itemname -ne 'quit')
    {
        $database.Element("Items").Add([XElement]::new([XName]"Item", [XAttribute]::new([XName]"Name", $itemname)))
        $itemname = Read-Host "Enter an item name ('quit' to quit)"
    }
    $filestream.Seek(0, [SeekOrigin]::Begin)
    $database.Save($filestream)
    $filestream.Close()
    $filestream.Dispose()
}
How can I lock a file for exclusive editing, while still allowing read-only access to any other client?
Edit: I have solved this problem by using the solution suggested by mklement0 - using a separate file as a lock file. Code is posted here: https://pastebin.com/ytzGE7se
Upvotes: 1
Views: 868
Reputation: 438793
If you use [FileAccess]::Read without specifying a file-share mode explicitly, the FileStream constructor defaults to [FileShare]::Read, which denies write access to any other handle. Since your Read-Write function (sensibly) already holds the file open with write access, sharing only [FileShare]::Read, the subsequent read-only open conflicts with that existing writer and fails.
Your immediate problem goes away if you explicitly open your read-only filestream with file-share mode [FileShare]::ReadWrite:
[System.IO.FileStream]::new(
$path,
[System.IO.FileMode]::Open,
[System.IO.FileAccess]::Read,
[System.IO.FileShare]::ReadWrite # !! required
)
This allows concurrent access both for other readers and for the one and only (read+)writer (which may have opened the file first).
However, a file getting rewritten while being read by others can be problematic, so for robust and predictable operation I suggest a different approach:
Note: This answer to a related question shows a simpler alternative to the solution below, which, however, takes longer to update the file.
Make modifications in a temporary copy of your file, then replace the original.
This requires explicit synchronization to coordinate between would-be updaters so as to serialize updates in order to prevent updates from overwriting each other.
You could achieve this with a separate lock file (sentinel file) named, say, updating, which acts as an indicator to other would-be writers that an update is in progress.
Mike Christiansen (the OP himself) ended up also storing the username of the locking user in that file, to provide feedback to other would-be lockers.
When modification is requested:

1. Keep trying in a loop until creating lock file updating succeeds, failing if the file already exists. The presence of the updating file signals to other would-be writers that a modification has started. Readers, by contrast, can continue to read at this point.

2. Create a (temporary) copy of the current XML file and perform the modifications there.

3. Replace the original file with the modified copy - use a retry loop until copying over / rewriting the original succeeds, given that other readers may temporarily prevent deleting (re-creating) / rewriting the original file.

4. Delete file updating, signaling to other would-be modifiers that the update has completed. Be sure to do this even if the update fails (e.g., by using try .. finally), because letting the lock file linger would block future updates; you may also need a timeout-based mechanism that, while waiting for a preexisting lock file to be deleted, eventually forces deletion, if it can be assumed that the previous updater has crashed.

As for read access:
While no modifications are taking place, concurrent read access should work fine.
While the XML file is being replaced with the temporary copy / rewritten, opening the file for reading will fail for the duration of the file-copy operation / write operation, so you'll need a retry loop there too.
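The protocol above can be sketched in PowerShell roughly as follows. This is a minimal illustration of the lock-file approach, not the code the OP ultimately used; the paths, retry interval, and function names are assumptions for the sketch.

```powershell
Add-Type -AssemblyName System.Xml.Linq  # needed in Windows PowerShell

# Illustrative paths; adjust to your environment.
$xmlFile  = 'C:\Path\To\file.xml'
$lockFile = 'C:\Path\To\updating'

function Update-Database([scriptblock] $Modification) {
    # 1. Acquire the lock: CreateNew throws if the lock file already exists.
    while ($true) {
        try {
            $lock = [System.IO.File]::Open($lockFile, [System.IO.FileMode]::CreateNew)
            $lock.Close()
            break
        } catch [System.IO.IOException] {
            Start-Sleep -Milliseconds 200   # another updater is active; retry
        }
    }
    try {
        # 2. Perform the modification on a temporary copy.
        $tempFile = "$xmlFile.tmp"
        Copy-Item $xmlFile $tempFile -ErrorAction Stop
        & $Modification $tempFile
        # 3. Replace the original, retrying while readers briefly block the rewrite.
        while ($true) {
            try {
                Move-Item $tempFile $xmlFile -Force -ErrorAction Stop
                break
            } catch {
                Start-Sleep -Milliseconds 200
            }
        }
    } finally {
        # 4. Release the lock even if the update failed.
        Remove-Item $lockFile -ErrorAction SilentlyContinue
    }
}

function Read-Database {
    # Readers retry while the file is momentarily unavailable during a replace.
    while ($true) {
        try {
            return [System.Xml.Linq.XElement]::Load($xmlFile)
        } catch [System.IO.IOException] {
            Start-Sleep -Milliseconds 200
        }
    }
}
```

Note that the CreateNew file mode is what makes lock acquisition atomic: two concurrent updaters cannot both succeed in creating the same file.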
The code Mike ultimately used can be found here.
Upvotes: 1
Reputation: 2355
Maybe this can help. When you create a filestream, there is a FileShare option. If you set it to ReadWrite, multiple processes can open that file:
$fsMain = [System.IO.File]::Open("C:\stack\out.txt", "Open", "ReadWrite", "ReadWrite")
$fsReadOnly = [System.IO.File]::Open("C:\stack\out.txt", "Open", "Read", "ReadWrite")
Write-Host ("fsMain: CanRead=" + $fsMain.CanRead + ", CanWrite=" + $fsMain.CanWrite)
Write-Host ("fsReadOnly: CanRead=" + $fsReadOnly.CanRead + ", CanWrite=" + $fsReadOnly.CanWrite)
$fsMain.Close()
$fsReadOnly.Close()
Upvotes: 1
Reputation: 3372
May I suggest a slightly different approach to it all. You can do the following with PowerShell, which allows you to avoid the issue on the reads. My XML navigation may be a little off, but it will work as described here.
function Read-Only()
{
    $database = [xml](Get-Content $filename)
    foreach ($item in $database.Items.Item)
    {
        Write-Host "Item name $($item.Name)"
    }
}
Now you can keep your same write function or you can do it all with PowerShell. If you wanted to do it with PowerShell you can do something like this.
function Read-Write()
{
    $database = [xml](Get-Content $filename)
    $itemname = Read-Host "Enter an item name ('quit' to quit)"
    while ($itemname -ne 'quit')
    {
        $newItem = $database.CreateElement("Item")
        $newItem.SetAttribute("Name", $itemname)
        $null = $database.Items.AppendChild($newItem)
        # You can also save all the changes up and write once.
        $database.Save($filename)
        $itemname = Read-Host "Enter an item name ('quit' to quit)"
    }
}
Upvotes: 1