dtlowndes asked:

Installing and running executable off a shared drive vs shared service

I've inherited some legacy commercial Windows desktop MFC Win32/C++ software (let's call it "MyApp") that has its own installation program, which defaults to installing the executables in C:\MyApp. It will also quite happily install MyApp.exe to a shared network folder so multiple users can run the executable at the same time from different workstations. The application reads its configuration from this folder and also writes data there that is shared amongst users. This creates concurrency issues when users modify and save that data at the same time.

I'm new to managing desktop apps, so I want advice on:

- modern approaches that allow multiple users to use a single app installation across multiple workstations. Moving to one installation per user is okay if I can solve:
- architecture and installation/deployment options for multiple installations (clients) to use a common backend service, where that service is C++ and needs to be decoupled from the MFC UI.

If we need to migrate to a client/server setup, the move needs to stay as close to the current single-process model as possible. Off the top of my head, I'm considering options like installing a Windows service that the client (installed with the same installer) can automatically detect and operate with at a low level (as if it were calling a class). Does Windows have something like this? Could further client installations be configured at install time (WiX) to detect and use this shared service running on another workstation? Does anything like this exist in the Windows world? Is there anything other than a Windows service that fits this model?

Some of our customers just need a single installation for a single user on one workstation, so a solution that works well for both would be ideal.


This creates concurrency issues if users are modifying and saving this data at the same time.

Why don't you prevent this by allowing only single concurrent use of that data - open the file(s) in question exclusively for the duration of the operation?

If doing so makes use of the application painful, how do you think that re-architecting it will solve that fundamental problem?


Yes, I've thought about introducing a lock as a stopgap solution. The long-term solution I had in mind was a shared service that can enforce rules and logic to handle concurrency issues before a persistence request reaches the data. But I suppose the current application could do that itself, e.g. by checking the current file's checksum for changes before saving. Thanks.


0 Answers