dwampach1 (Level 8 Flexeran)

The Flexera SaaS Management (FSM) team has developed an application, with built-in workflow and error messaging, for importing SaaS data from more than 26,000 SaaS vendors. The FSM Data Ingestion Utility is a client-based Windows desktop tool that helps you upload data from your organization’s multiple SaaS applications on a regular schedule, giving the Software Asset Manager a complete view across all of their SaaS applications.

FSM Data Ingestion Utility Workflow

The diagram below describes the key steps of the FSM Data Ingestion Utility.

  1. Map the input CSV file or API data source fields to Flexera One’s SaaS Import Job API (see the mapping sketch below).
  2. Schedule how frequently data is uploaded from your organization’s SaaS applications via the FSM Data Ingestion Utility.
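
To make step 1 concrete, the Python sketch below maps the columns of a vendor CSV export onto the field names an import job expects. The column names, target field names, and the mapping itself are illustrative assumptions only; in the utility the mapping is configured through the application rather than in code.

```python
import csv
import json

# Illustrative only: these source columns and target field names are placeholder
# assumptions, not the actual SaaS Import Job API contract.
FIELD_MAPPING = {
    "Email Address": "email",         # CSV column -> import job field
    "Given Name": "firstName",
    "Surname": "lastName",
    "Last Sign-In": "lastActivityDate",
}

def map_rows(csv_path):
    """Translate each CSV row into the field names the import job expects."""
    with open(csv_path, newline="", encoding="utf-8") as handle:
        return [
            {target: row.get(source, "") for source, target in FIELD_MAPPING.items()}
            for row in csv.DictReader(handle)
        ]

if __name__ == "__main__":
    records = map_rows("vendor_usage_export.csv")  # hypothetical export file
    print(json.dumps(records[:3], indent=2))       # preview the first few mapped records
```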

 

[Diagram: FSM Data Ingestion Utility workflow (FSMDIU1.png)]

Key Features

The FSM Data Ingestion Utility has the following key features.

All Inclusive

Ingests all your organization’s SaaS vendor data to reflect your entire SaaS environment, irrespective of whether that data is available via a vendor API or stored in a CSV file.

Flexible & Extendable

  • Creates and changes field mappings as needed
  • Creates job upload schedules to suit application usage data availability

Reliable

  • Tracks and provides visibility into application activity through audit logs
  • Retries job uploads multiple times to ensure reliable delivery
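
As a rough illustration of the retry behaviour, the sketch below retries a failed upload a fixed number of times with exponential backoff. The callable, attempt count, and delays are assumptions for the example, not the utility’s actual settings.

```python
import time

def upload_with_retries(send, payload, attempts=3, base_delay=2.0):
    """Call send(payload), retrying on failure with exponential backoff.

    'send' is any callable that raises an exception when the upload fails;
    the attempt count and delays are illustrative only.
    """
    for attempt in range(1, attempts + 1):
        try:
            return send(payload)
        except Exception:
            if attempt == attempts:
                raise  # out of retries: surface the failure (e.g. to an audit log)
            time.sleep(base_delay * 2 ** (attempt - 1))  # 2s, 4s, 8s, ...
```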

Scalable

  • Enables the ingestion of enterprise-level data
  • Automatically handles ingestion of large CSV files by splitting files into 10MB chunks
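
For a sense of how such chunking can work, here is a minimal sketch that splits a large CSV into roughly 10 MB pieces, repeating the header row in each piece. It illustrates the general technique only and is not the utility’s actual implementation.

```python
CHUNK_SIZE = 10 * 1024 * 1024  # ~10 MB per chunk

def split_csv(path, chunk_size=CHUNK_SIZE):
    """Split a CSV into ~chunk_size pieces, repeating the header in each piece."""
    chunk_paths = []
    with open(path, encoding="utf-8") as source:
        header = source.readline()
        rows, size, index = [], 0, 0
        for line in source:
            rows.append(line)
            size += len(line.encode("utf-8"))
            if size >= chunk_size:
                chunk_paths.append(_write_chunk(path, index, header, rows))
                rows, size, index = [], 0, index + 1
        if rows:  # write any remaining rows as the final chunk
            chunk_paths.append(_write_chunk(path, index, header, rows))
    return chunk_paths

def _write_chunk(path, index, header, rows):
    chunk_path = f"{path}.part{index}.csv"
    with open(chunk_path, "w", encoding="utf-8") as target:
        target.write(header)
        target.writelines(rows)
    return chunk_path
```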

Secure

  • Requires token-based authentication
  • Encrypts and stores the refresh token in a database
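
As a loose illustration of the last point, the sketch below encrypts a refresh token with a symmetric key before persisting it to a local SQLite database. The library, schema, and key handling are assumptions made for the example; they do not describe how the utility itself stores credentials.

```python
import sqlite3
from cryptography.fernet import Fernet  # pip install cryptography

def store_refresh_token(db_path, key, token):
    """Encrypt the refresh token and persist only the ciphertext locally.

    'key' is a Fernet key, e.g. generated once with Fernet.generate_key().
    """
    ciphertext = Fernet(key).encrypt(token.encode("utf-8"))
    with sqlite3.connect(db_path) as db:
        db.execute("CREATE TABLE IF NOT EXISTS credentials (name TEXT PRIMARY KEY, value BLOB)")
        db.execute("INSERT OR REPLACE INTO credentials VALUES ('refresh_token', ?)", (ciphertext,))

def load_refresh_token(db_path, key):
    """Read the ciphertext back and decrypt it for use."""
    with sqlite3.connect(db_path) as db:
        (ciphertext,) = db.execute(
            "SELECT value FROM credentials WHERE name = 'refresh_token'"
        ).fetchone()
    return Fernet(key).decrypt(ciphertext).decode("utf-8")
```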

Additional Resources

Comments (4)
ryanhardcastle0 (Level 3)

It's good to see something like this. I can see the potential.

I fully appreciate it's version 1.0, but just as a suggestion, it would be good to be able to view a sample of the API data so we can be sure the fields contain the values we need.

Also, the ability to transform the API data that's received before it's mapped would be helpful. For example, if my API data doesn't contain the first name and last name values (which are mandatory), I'd like the option to apply some transformation to the email address to obtain these missing values.

I know I can do this in a CSV file, but that then relies on someone having to generate it, etc.

aswindells (Technical Writer)

Thanks for the response, Ryan; these are great ideas! I will get them added to the backlog.

Thanks

ryanhardcastle0 (Level 3)

Another suggestion 🙂

I have a couple of customisations that require me to retrieve a bearer / access token using a refresh token (identical to the F1 API). So the first part of the job is to POST the refresh token and then use the access token from the reply in the GET headers.

I can't seem to do the POST job correctly; it's possibly the content type (it needs to be x-www-form-urlencoded), but even if I could, I can't do anything with the result in the sense of treating it as a variable for the GET job.

We could do with a facility to store and manage credentials for the API connections and then use those as variables in the job configuration - a bit like how Postman does it.
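
To illustrate, this is roughly the two-step exchange I'm doing by hand today (the endpoint URLs and field names are just placeholders, not my vendor's real API):

```python
import requests  # pip install requests

# Placeholder endpoints - stand-ins for the vendor's real URLs.
TOKEN_URL = "https://vendor.example.com/oauth/token"
USAGE_URL = "https://vendor.example.com/api/v1/usage"

def fetch_usage(refresh_token):
    """POST the refresh token as x-www-form-urlencoded, then reuse the access
    token from the response as a bearer header on the GET request."""
    token_response = requests.post(
        TOKEN_URL,
        data={"grant_type": "refresh_token", "refresh_token": refresh_token},
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        timeout=30,
    )
    token_response.raise_for_status()
    access_token = token_response.json()["access_token"]

    usage_response = requests.get(
        USAGE_URL,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=30,
    )
    usage_response.raise_for_status()
    return usage_response.json()
```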

aswindells (Technical Writer)

Hi Ryan - the bearer token requirement is already on the backlog for '23. More details to follow.