We're working on scaling up dashboarding outside of Flexera (e.g. Tableau/PowerBI - not the Cognos module). Specifically, this will use the IT Asset Management Data API (flexera.com).
We have run some pilots with Tableau & PowerBI in which we extract data in one of two ways: a direct query against the API from within the BI tool, or exporting the data to CSV and importing it.
Neither of these is very scalable compared to a traditional database connection. The direct query also runs into issues scaling up, given that ITAM has a 10,000-row limit on each call: between the login, iterating over the responses, and appending the data, it hangs and causes problems on scheduled refresh.
Does anyone have a method in which they scale up large datasets outside of the tool?
I don't want to re-invent the wheel, but I also don't want to have dozens of CSVs all over the place being imported by both manual and automated means.
Thanks in advance for any guidance you all may have!
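For context, the direct pull we're describing looks roughly like this in PowerShell. This is a minimal sketch only: the login host, report URL, paging parameter names, and the 'values' property are placeholders/assumptions to adapt to your tenant and the specific Data API report, but it shows why a 100K+ row dataset needs at least ten sequential calls at 10,000 rows per page.

```powershell
# Minimal sketch - URLs, IDs, and parameter names below are placeholders, not the exact Data API contract.
$refreshToken = $env:FLEXERA_REFRESH_TOKEN   # assumed to be stored outside the script

# 1. Login: exchange the refresh token for a short-lived bearer token
#    (verify the login host/path for your tenant and region).
$tokenResponse = Invoke-RestMethod -Method Post -Uri 'https://login.flexera.com/oidc/token' -Body @{
    grant_type    = 'refresh_token'
    refresh_token = $refreshToken
}
$headers = @{ Authorization = "Bearer $($tokenResponse.access_token)" }

# 2. One page of a report. Each call is capped at 10,000 rows,
#    so 100K+ rows means at least ten sequential calls like this one.
$reportUrl = 'https://api.flexera.com/<data-api-path>/reports/<reportId>'   # placeholder path
$page = Invoke-RestMethod -Method Get -Uri "${reportUrl}?limit=10000&offset=0" -Headers $headers
$page.values | Select-Object -First 5   # the property holding rows ('values' here) is an assumption
```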
‎Apr 17, 2023 01:44 PM
In my experience it is most common to extract data into CSV format (including allowing for multiple pages of data to be retrieved), and then load it into Power BI.
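Roughly, that pattern can look like the sketch below (paging parameter names and the response shape are assumptions to adapt to the specific report; $headers and $reportUrl come from the login step earlier in the thread). Streaming each page into a single CSV keeps memory flat and gives Power BI one file to import.

```powershell
# Sketch: loop over the pages and stream each one into a single CSV for Power BI to import.
$pageSize = 10000                        # the per-call row cap discussed above
$offset   = 0
$outFile  = 'C:\Data\itam_extract.csv'   # hypothetical output path
if (Test-Path $outFile) { Remove-Item $outFile }

do {
    $page = Invoke-RestMethod -Method Get -Headers $headers `
                              -Uri "${reportUrl}?limit=$pageSize&offset=$offset"
    $rows = @($page.values)              # 'values' is an assumption about the response shape
    if ($rows.Count -gt 0) {
        $rows | Export-Csv -Path $outFile -NoTypeInformation -Append
        $offset += $pageSize
    }
} while ($rows.Count -eq $pageSize)      # a short page means we have reached the end
```

Power BI then simply refreshes from that one file, and Tableau can point at the same extract.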
‎Sep 11, 2023 10:56 PM
Thanks Chris. It looks like that is the case - we are working on scheduled scripts to extract CSVs, and will likely store them in a SQL DB to stage and cleanse the data for PowerBI.
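For the staging step, something along these lines is what we have in mind, assuming a SQL Server instance and the SqlServer PowerShell module; the instance, database, table, and file names are hypothetical, and BULK INSERT needs the CSV path to be readable from the SQL Server itself.

```powershell
# Sketch: bulk-load the extracted CSV into a SQL Server staging table for cleansing,
# then point Power BI at the database instead of the raw files.
Import-Module SqlServer    # Install-Module SqlServer, if not already present

$bulkLoad = @"
TRUNCATE TABLE stg.ItamExtract;
BULK INSERT stg.ItamExtract
FROM 'C:\Data\itam_extract.csv'
WITH (FORMAT = 'CSV', FIRSTROW = 2, TABLOCK);   -- FORMAT='CSV' needs SQL Server 2017+
"@

Invoke-Sqlcmd -ServerInstance 'SQLSTAGE01' -Database 'ItamStaging' -Query $bulkLoad
```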
For a 'light' report, PowerBI direct query to the ITAM API works well as long as the result stays under the 10K pagination limit, especially if it's a simple refresh-and-replace of the data.
For any sort of aggregation, or when looping through and pulling 100K+ rows, I ran into issues with Direct Query, as PowerBI starts getting tricky when you try to add query logic to loop and append lists in the query editor itself. It's much easier to do that outside of PowerBI with PowerShell and then import the data.
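And since scheduled refresh was part of the original pain, it can help to schedule the PowerShell extract itself (on a Windows host, Task Scheduler is the simplest option) so the Power BI refresh only ever reads the already-staged CSV or SQL data. The script path and time below are placeholders.

```powershell
# Sketch: register the extraction script as a nightly scheduled task (Windows; path and time are placeholders).
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
           -Argument '-NoProfile -File C:\Scripts\Export-ItamData.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 5am
Register-ScheduledTask -TaskName 'ITAM API Extract' -Action $action -Trigger $trigger
```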
‎Sep 12, 2023 07:42 AM - edited ‎Sep 12, 2023 07:42 AM
One suggestion: fetch the data with PowerShell scripting, and make sure you fetch it in sets of 500/1,000 records with the help of a loop.
This is the best way to enhance performance!
‎Sep 13, 2023 01:51 PM
Would it help to have Flexera add custom (SQL-based) reports, which could do some of the heavy lifting of filtering, aggregation, and ordering, and then to use those reports to pull the data via the API?
‎Sep 13, 2023 01:57 AM
Certainly - Is there a standardized process for requesting those sorts of reports?
As a cloud customer I can't self-implement the reports Nicolas publishes from time to time, especially in cases where we just want to expose more fields from the database that already exist but aren't in the WebUI.
‎Sep 13, 2023 07:54 AM
I haven't done custom reports in Flexera One yet. I've just requested custom fields in Flexera One and would assume that custom reports work in the same way, by requesting them via a support ticket. I would prepare the code for them in a local VM and they would have to add the TenantID as a filter.
‎Sep 13, 2023 08:30 AM