FNMS Cloud - ITAM API to BI Tool Architecture

We're working on scaling up dashboarding outside of Flexera (e.g. Tableau/PowerBI, not the Cognos module), specifically using the IT Asset Management Data API (flexera.com).

We have run some pilots with Tableau & PowerBI in which we extract data in one of two ways:

  1. Scheduled CSVs: automate a script that queries the API (reportsExecute) and saves the resulting data as .csv files for upload to the reporting server for Tableau/PowerBI.
  2. Direct Query: in PowerBI, for example, write a Blank Query in Power Query that calls the REST API and pulls the data in one shot.

Neither approach scales well compared to a traditional database connection. Direct query in particular struggles because the ITAM API has a 10,000-row limit on each call. Between logging in, then iterating over the responses and appending the data, the query hangs and causes issues on scheduled refresh.
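To illustrate why the per-call row limit forces a pagination loop, here is a minimal Python sketch of offset-based paging. The `fetch_page` stub stands in for a real API call such as reportsExecute, and the `offset`/`limit` parameter names are illustrative assumptions, not the actual ITAM API contract.

```python
PAGE_LIMIT = 10_000  # the per-call row cap described above

# Fake backing dataset standing in for the report's full result set.
_FAKE_REPORT = [{"row": i} for i in range(25_000)]

def fetch_page(offset: int, limit: int) -> list:
    """Return one page of rows, as the API would for a single call."""
    return _FAKE_REPORT[offset : offset + limit]

def fetch_all(limit: int = PAGE_LIMIT) -> list:
    """Loop over pages, appending until a short page signals the end."""
    rows = []
    offset = 0
    while True:
        page = fetch_page(offset, limit)
        rows.extend(page)
        if len(page) < limit:  # last page reached
            break
        offset += limit
    return rows

all_rows = fetch_all()
print(len(all_rows))  # 25000: three calls (10000 + 10000 + 5000)
```

The same loop is what has to be expressed inside Power Query for a true direct-query approach, which is exactly where it becomes fragile on scheduled refresh.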

Does anyone have a method in which they scale up large datasets outside of the tool?

I don't want to re-invent the wheel, but I also don't want dozens of .csv files scattered around, imported through a mix of manual and automated means.

 

Thanks in advance for any guidance you all may have!

(1) Solution
ChrisG
Community Manager

In my experience it is most common to extract data into CSV format (including allowing for multiple pages of data to be retrieved), and then load it into Power BI.
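A minimal Python sketch of that pattern, with a `fetch_page` stub standing in for the real paged API call (the parameter names and field names are made up for illustration): each page is appended to a single CSV file that Power BI can then import.

```python
import csv

# Stand-in data and paged call; names and page size are illustrative.
_DATA = [{"name": f"asset-{i}", "qty": i} for i in range(12)]

def fetch_page(offset, limit):
    """Mock of one paged API call."""
    return _DATA[offset : offset + limit]

def export_csv(path, limit=5):
    """Page through the (mock) API and write all rows to one CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["name", "qty"])
        writer.writeheader()
        offset = 0
        while True:
            page = fetch_page(offset, limit)
            writer.writerows(page)
            if len(page) < limit:
                break
            offset += limit

export_csv("report.csv")  # one file, all pages, ready for Power BI import
```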

(Did my reply solve the question? Click "ACCEPT AS SOLUTION" to help others find answers faster. Liked something? Click "KUDO". Anything expressed here is my own view and not necessarily that of my employer, Flexera.)


(6) Replies

Thanks Chris. It looks like that is the case - we are working on scheduled scripts to extract CSVs, and will likely store them in a SQL DB to stage and cleanse the data for PowerBI.
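As a rough sketch of that staging step, using SQLite in place of the real SQL DB (the table and column names are made up): CSV rows are bulk-loaded and lightly cleansed before the BI tool reads the table.

```python
import csv
import io
import sqlite3

# Sample CSV content standing in for one extracted file.
CSV_TEXT = "name,qty\nasset-1, 5\nasset-2,7\n"

conn = sqlite3.connect(":memory:")  # stand-in for the staging SQL DB
conn.execute("CREATE TABLE staging_assets (name TEXT, qty INTEGER)")

rows = csv.DictReader(io.StringIO(CSV_TEXT))
# Light cleansing: trim whitespace and cast qty to an integer.
cleaned = [(r["name"].strip(), int(r["qty"].strip())) for r in rows]
conn.executemany("INSERT INTO staging_assets VALUES (?, ?)", cleaned)
conn.commit()

total = conn.execute("SELECT SUM(qty) FROM staging_assets").fetchone()[0]
print(total)  # 12
```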

For a 'light' report, PowerBI direct query to the ITAM API works well when the result stays under the 10K pagination limit, especially if it's a simple refresh-and-replace of the data.

For any sort of aggregation, or having to loop through and pull 100K+ rows, I ran into issues with Direct Query, as PowerBI gets tricky when you try to add query logic to loop and append lists in the query editor itself. It's much easier to do that outside of PowerBI with PowerShell and then import the data.

One suggestion: fetch the data via PowerShell scripting, and make sure you fetch it in sets of 500/1,000 records using a loop.

This is the best way to improve performance!
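A sketch of that idea, shown in Python rather than PowerShell for brevity (the batch size and the `fetch_batch` stub are illustrative; the same loop translates directly to PowerShell): a generator yields fixed-size batches so the caller can stream them instead of holding everything in memory at once.

```python
_SOURCE = list(range(2_300))  # stand-in for the remote dataset

def fetch_batch(offset, size):
    """Mock of a single batched API call."""
    return _SOURCE[offset : offset + size]

def batches(size=500):
    """Yield successive batches until the source is exhausted."""
    offset = 0
    while True:
        batch = fetch_batch(offset, size)
        if not batch:
            return
        yield batch
        offset += size

sizes = [len(b) for b in batches(500)]
print(sizes)  # [500, 500, 500, 500, 300]
```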

mfranz
Level 17 Champion

Would it help to have Flexera add custom (SQL-based) reports that could do some of the heavy lifting of filtering, aggregation, and ordering, and then pull the data from those reports via the API?

Certainly. Is there a standardized process for requesting those sorts of reports?

As a cloud customer, I can't self-implement the reports Nicolas publishes from time to time, especially in cases where we just want to expose more fields that already exist in the database but aren't in the WebUI.

I haven't done custom reports in Flexera One yet. I've only requested custom fields in Flexera One, and would assume custom reports work the same way: requested via a support ticket. I would prepare the code for them in a local VM, and they would have to add the TenantID as a filter.