I found a solution for replicating the NDI files we receive on the Inventory server from the Beacon, but is it possible to replicate them to multiple locations so that we can send them to a DEV instance as well as to a different drive in PROD?
The replication option I found is described in this post:
It would be great to hear a positive response.
Apr 01, 2022 09:02 AM
We have shared the replication folder on our Prod app/batch server. I then have a daily scheduled task running in each of our non-prod environments that pulls the latest X number of files from the share and places them into the Incoming folder in that environment. In Dev, we might only copy 10 files/day; in Test or Perf, we could copy more, depending on needs.
The process doesn't pull the same files every day, so some inventory gets stale, but it is a way to keep inventory processing running in the non-prod environments on a regular basis without cloning anything at the database level.
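The daily pull described above could be sketched as follows. This is a minimal illustration, not the poster's actual script: the share and Incoming paths, the `.ndi` filter, and the file count are all placeholder assumptions you would adjust per environment.

```python
import shutil
from pathlib import Path

# Placeholder paths -- adjust to your own share and environment.
SHARE = Path(r"\\prod-app01\Replication")
INCOMING = Path(r"C:\ProgramData\Incoming\Inventories")
MAX_FILES = 10  # e.g. 10/day in Dev; more in Test or Perf

def pull_latest(share: Path, incoming: Path, max_files: int) -> list[Path]:
    """Copy the newest max_files .ndi files from the share into Incoming."""
    candidates = sorted(share.glob("*.ndi"),
                        key=lambda p: p.stat().st_mtime,
                        reverse=True)
    copied = []
    for src in candidates[:max_files]:
        dest = incoming / src.name
        shutil.copy2(src, dest)  # copy2 preserves timestamps
        copied.append(dest)
    return copied
```

Run from a daily scheduled task in each non-prod environment, with `MAX_FILES` tuned per environment.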
Apr 01, 2022 09:21 AM
Would it be possible to share the purge task scheduler job that removes files from the replication folder that are older than X days? I want to implement it in one of my customer environments.
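A purge job like the one asked about here could look like the sketch below. This is a generic age-based cleanup, not the original poster's job; the folder path, file pattern, and age threshold are assumptions.

```python
import time
from pathlib import Path

def purge_old_files(folder: Path, max_age_days: int, pattern: str = "*.ndi") -> int:
    """Delete files in folder older than max_age_days; return how many were removed."""
    cutoff = time.time() - max_age_days * 86400
    removed = 0
    for f in folder.glob(pattern):
        if f.is_file() and f.stat().st_mtime < cutoff:
            f.unlink()
            removed += 1
    return removed
```

Scheduled daily (e.g. via Windows Task Scheduler), this keeps the replication folder from growing without bound.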
Oct 17, 2023 03:45 AM
The mechanism described on the page at https://community.flexera.com/t5/FlexNet-Manager-Knowledge-Base/How-to-configure-replication-to-collect-a-uploaded-agent-files/ta-p/2060 will replicate uploaded NDI (or other) files to a single folder.
If you want to have them in multiple folders, you would need to set up a process to copy/move them out of the replication folder to the other location(s) where you want them - something along the lines of what @darren_haehnel has nicely described.
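A copy/move process like the one suggested here could be sketched as a simple fan-out: copy each file from the single replication folder to every target location, then remove the original. The destination paths are placeholders.

```python
import shutil
from pathlib import Path

def fan_out(replication: Path, destinations: list[Path]) -> None:
    """Copy every file in the replication folder to each destination, then delete the original."""
    for src in replication.iterdir():
        if not src.is_file():
            continue
        for dest_dir in destinations:
            dest_dir.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest_dir / src.name)
        src.unlink()  # move semantics: remove only after all copies succeeded
```

Deleting only after all copies succeed means a failed copy leaves the file in the replication folder for the next run.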
Apr 04, 2022 03:04 AM
I did something like this, but at the beacon level: we have a multi-tenant implementation, so the hassle was too big to do this job at the main application level. I also had an additional requirement - the customer wanted the .ndi files replicated to his own beacon server, but only the .ndi files that are in his scope.
So I did the following:
On the beacon server, I modified the upload tasks and put a robocopy task in place before the upload, which copies the incoming folder to a staging folder.
On the upload server, I have a dump .csv file that contains all the devices that are in scope for the customer. This CSV file is put into a virtual directory, and with curl I download it to the beacon. Once the file is on the beacon, I use PowerShell to check it and move the in-scope data to an upload folder. After that, using ndupload with some parameters, I send the data directly to the customer's beacon as requested.
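The scope-filtering step above could be sketched as follows. Note the assumptions: the CSV is assumed to have a `DeviceName` column, and each .ndi file is assumed to be named after its device - in a real deployment you may instead need to parse the device name out of the NDI content. The ndupload call that follows this step is not shown.

```python
import csv
from pathlib import Path

def load_scope(csv_path: Path, column: str = "DeviceName") -> set[str]:
    """Read the dump CSV and return the set of in-scope device names (lowercased)."""
    with csv_path.open(newline="") as fh:
        return {row[column].strip().lower() for row in csv.DictReader(fh)}

def move_in_scope(staging: Path, upload: Path, scope: set[str]) -> list[Path]:
    """Move .ndi files whose name matches an in-scope device into the upload folder."""
    upload.mkdir(parents=True, exist_ok=True)
    moved = []
    for ndi in staging.glob("*.ndi"):
        if ndi.stem.lower() in scope:  # assumes filename == device name
            dest = upload / ndi.name
            ndi.rename(dest)
            moved.append(dest)
    return moved
```

Out-of-scope files stay in staging, so nothing outside the customer's scope ever reaches their beacon.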
For your implementation you can do the same and send the data to your dev beacon.
Apr 04, 2022 10:45 AM - edited Apr 04, 2022 10:46 AM