Automating Inphinity API data ingestion with QTC and Snowflake UDTF
This guide follows on from our previous blog on ingesting Inphinity API data with a Snowflake UDTF. Now we'll show you how to automate that process within Qlik Talend Cloud (QTC) and prepare the data for use in cross-pipeline projects.
By implementing this solution, you'll create an automated pipeline that not only ingests API data but also makes it readily available for business users across your organisation.
Before you start
To follow this guide successfully, you'll need:
- A Snowflake Python UDTF that calls an API and returns a JSON string as a table
- Qlik Talend Cloud with Snowflake connections configured
- Snowflake with users and schemas properly set up
If you haven't created your Snowflake UDTF yet, we recommend reading our previous blog first or adapting an existing UDTF you already have.
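For orientation, here's a minimal sketch of the kind of UDTF this guide assumes: a Python table function that calls an API and returns the raw JSON body as a single-column table. The function name, endpoint URL, external access integration and returned column are illustrative placeholders, not the exact objects from the previous blog:

```sql
-- Sketch of a Snowflake Python UDTF that fetches API data.
-- All names and the URL below are placeholders.
CREATE OR REPLACE FUNCTION inphinity_api_data()
RETURNS TABLE (payload VARCHAR)
LANGUAGE PYTHON
RUNTIME_VERSION = '3.9'
PACKAGES = ('requests')
EXTERNAL_ACCESS_INTEGRATIONS = (inphinity_api_access)  -- hypothetical integration
HANDLER = 'ApiReader'
AS $$
import requests

class ApiReader:
    def process(self):
        # One row per call: the raw JSON body returned by the API
        response = requests.get('https://api.example.com/inphinity/data')
        response.raise_for_status()
        yield (response.text,)
$$;
```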
Why bring your Inphinity API ingestion into QTC?
Moving your API ingestion process from a purely Snowflake-based setup into QTC provides significant advantages:
- Enables cross-pipeline features for business users
- Creates automated dependencies between pipelines
- Allows business users to consume the data in their own pipelines
- Sets up automatic triggering of downstream processes
This approach bridges the technical API ingestion with business-focused data consumption.
Building your QTC pipeline for Inphinity API
The complete QTC pipeline consists of four key steps:
- Register
- Store
- Land
- Clean
Let's break down each step in detail.
Steps 1 and 2: register and store configuration
Since QTC cannot use a UDTF as a registered source, we need a workaround:
- Register an empty table as a placeholder
- This table only needs to contain a timestamp field (or any minimal data)
- Position these two blocks in your pipeline as a foundation
We just want these two blocks in place, as the main logic lives in the first transform block; a sketch of the placeholder table follows below.
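The placeholder can be as simple as a single-column table. The database, schema and table names here are hypothetical:

```sql
-- Minimal placeholder table to register as the pipeline source.
-- Names are illustrative; use your own database and schema.
CREATE TABLE IF NOT EXISTS RAW_DB.INPHINITY.API_PLACEHOLDER (
    LOADED_AT TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);
```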
Step 3: implementing the transform logic
The core functionality happens in your first transform block, which lands the raw data:
- Use the same query from our previous blog about Snowflake UDTF
- This query exports data using the table function and breaks it into columns
- Simply paste that query into the QTC UI
When you describe the table, you'll see all the API data coming through, since the query already exports and tabularises the data. The sketch below shows the general shape of such a query.
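As an illustration of that shape, here's a hedged example that calls the UDTF, parses the returned JSON string and flattens it into columns. It assumes the API returns a JSON array of records; the UDTF name matches the placeholder above, and the column paths are hypothetical:

```sql
-- Sketch of the transform query: call the UDTF, parse the JSON
-- payload and break each record out into varchar columns.
SELECT
    v.value:id::VARCHAR      AS record_id,
    v.value:name::VARCHAR    AS record_name,
    v.value:updated::VARCHAR AS updated_at
FROM TABLE(inphinity_api_data()) t,
     LATERAL FLATTEN(input => PARSE_JSON(t.payload)) v;
```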
Step 4: data landing and cleaning approach
For optimal data management:
- Land all your API data as strings/varchars in a raw landing table in Snowflake
- Create a separate cleaning step to prepare this data for downstream consumption
- Configure your project settings to land data in your chosen database and schema
This two-stage approach ensures data integrity while making it accessible for business users.
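As a rough sketch of that second stage, a follow-up transform might cast the landed varchars into typed columns for downstream use. The table and column names are the hypothetical ones used above:

```sql
-- Sketch of the cleaning step: convert raw varchar columns into
-- typed columns, using TRY_* functions so bad values become NULL
-- rather than failing the load.
SELECT
    record_id,
    record_name,
    TRY_TO_TIMESTAMP_NTZ(updated_at) AS updated_at
FROM RAW_DB.INPHINITY.API_LANDING;
```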
Setting up automation schedules
Make your pipeline fully automated by:
- Setting a schedule on the storage task using the QTC UI
- Configuring how often the pipeline should run based on your needs
- Scheduling further transforms to start upon completion of the storage task
This approach eliminates manual intervention once the pipeline is established.
The business advantage: cross-pipeline capabilities
The primary benefit of bringing this process into QTC rather than keeping it in Snowflake is the cross-pipeline functionality:
- Expose clean data to business users who can consume it in their own pipelines
- Enable teams to set up their pipelines to automatically start after successful ingestion
- Create dependencies between technical and business processes
This kind of orchestration isn't available in Snowflake alone, which creates significant advantages for business workflow integration.
Implementation considerations
When setting up your pipeline:
- Test each component thoroughly before connecting the full process
- Document your approach for future reference
- Ensure your scheduling aligns with business needs and API limitations
These steps help ensure a smooth implementation and reliable operation.
Get expert help with your Snowflake and Qlik integration
Need help implementing this solution? We've helped over 250 UK businesses get more value from their data. We specialise in Qlik, Snowflake and Inphinity integration.
Book a demo call today to see how we can integrate your Inphinity data with Qlik Talend Cloud and Snowflake.