
Snowflake Data Warehousing for a US-Based Logistics & Transportation Company

 

Snowflake DW – Data Flow Architecture

SOURCE SYSTEMS

Flat files from the various source systems – CRT, GP, and other utility applications – will be uploaded to AWS S3 based on each source's data cadence.
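As an illustration, a minimal sketch of the upload step using the AWS SDK for Python (boto3). The bucket name, key prefix, and file names are hypothetical placeholders; in practice the uploads may be driven by the source applications themselves or by a scheduled job.

```python
# Minimal sketch of the flat-file upload to the S3 landing area.
# Bucket, prefix, and file names below are assumed placeholders.
import boto3

s3 = boto3.client("s3")  # credentials resolved from the environment or IAM role

BUCKET = "logistics-dw-landing"   # hypothetical landing bucket
PREFIX = "crt/sales/"             # hypothetical per-source prefix

for local_file in ["sales_20240101.csv", "inventory_20240101.csv"]:
    s3.upload_file(
        Filename=local_file,
        Bucket=BUCKET,
        Key=PREFIX + local_file,
    )
    print(f"Uploaded {local_file} to s3://{BUCKET}/{PREFIX}{local_file}")
```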

STORAGE & COMPUTATION

Source files from the external stage on AWS S3 will be loaded into staging tables in Snowflake through Snowpipe via its auto-ingest mechanism.
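A sketch of how the external stage and auto-ingest pipe might be defined, using the Snowflake Python connector to run the DDL. All object names (crt_stage, crt_pipe, stg_sales), the S3 URL, and the connection details are assumptions; AUTO_INGEST additionally requires an S3 event notification pointed at the pipe's SQS queue, which is configured on the AWS side.

```python
# Sketch of the Snowpipe auto-ingest setup; all object names are assumed.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",      # placeholder connection details
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="LOGISTICS_DW",
    schema="STAGING",
)
cur = conn.cursor()

# External stage over the S3 landing area (storage integration assumed to exist).
cur.execute("""
    CREATE STAGE IF NOT EXISTS crt_stage
      URL = 's3://logistics-dw-landing/crt/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

# Pipe that copies new files into the staging table as they land.
# AUTO_INGEST = TRUE also requires an S3 event notification targeting
# the pipe's SQS queue (SHOW PIPES exposes the notification_channel ARN).
cur.execute("""
    CREATE PIPE IF NOT EXISTS crt_pipe AUTO_INGEST = TRUE AS
      COPY INTO stg_sales
      FROM @crt_stage
""")
```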

SEMANTIC LAYER

Views or tables to be utilized by BI reports, ad hoc user queries, and outbound extracts for vendors.
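For instance, a semantic-layer view might be published over a storage table as below; the view and table names are illustrative only, and the connection setup mirrors the Snowpipe sketch above.

```python
# Illustrative semantic-layer view over a storage table (names assumed).
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="LOGISTICS_DW", schema="SEMANTIC",
)
conn.cursor().execute("""
    CREATE OR REPLACE VIEW v_sales_by_store AS
    SELECT store_id, order_date, SUM(sale_amount) AS total_sales
    FROM LOGISTICS_DW.DW.SALES_FACT
    GROUP BY store_id, order_date
""")
```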

Snowflake DW – Data Flow Architecture – Features

  • Sales details, inventory details, pick ticket details, etc. are pushed as flat files from CRT to the S3 bucket.
  • Raw data from the source files is loaded into staging tables for data transformation and mapping.
  • Tables and views built for Tableau reporting are connected from Tableau Desktop through the Tableau ODBC connector and published to Tableau Online as data sources (see the ODBC sketch after this list).
  • Sales employee details and store master details are sourced from the utility applications Sales Reporting and TP Locations; quote details are sourced from the customer web application.
  • Snowflake virtual warehouses process the staged data using Streams, Tasks, and stored procedures for change data capture, and load it into storage tables (see the CDC sketch after this list).
  • Entities for SSRS reports are likewise connected through ODBC connectors and published as datasets.
  • Active employee data is extracted from the ADP file.
  • The final layer of storage tables in the data warehouse holds the transformed data to be utilized by the semantic layer and downstream utilities.
  • Ad hoc queries through other BI tools such as Excel are provisioned via an ODBC connection that adds Snowflake as a data source (see the ODBC sketch after this list).
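A minimal sketch of the Streams-and-Tasks change-data-capture pattern referenced above. The stream, task, and table names, the merge key, and the schedule are all assumptions; the real pipeline may also involve stored procedures, as the feature list notes.

```python
# Sketch of CDC from staging to storage with a Stream and a scheduled Task.
# All object names, the merge key, and the schedule are assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="LOGISTICS_DW", schema="STAGING",
)
cur = conn.cursor()

# Stream records inserts/updates/deletes on the staging table.
cur.execute("CREATE STREAM IF NOT EXISTS stg_sales_stream ON TABLE stg_sales")

# Task drains the stream on a schedule, but only when it has data.
cur.execute("""
    CREATE TASK IF NOT EXISTS load_sales_task
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('STG_SALES_STREAM')
    AS
      MERGE INTO LOGISTICS_DW.DW.SALES_FACT t
      USING stg_sales_stream s
        ON t.sale_id = s.sale_id
      WHEN MATCHED THEN UPDATE SET t.sale_amount = s.sale_amount
      WHEN NOT MATCHED THEN INSERT (sale_id, store_id, sale_amount)
        VALUES (s.sale_id, s.store_id, s.sale_amount)
""")

# Tasks are created suspended; resume to start the schedule.
cur.execute("ALTER TASK load_sales_task RESUME")
```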
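And a sketch of the ODBC provisioning mentioned for Tableau, SSRS, and Excel: once the Snowflake ODBC driver is installed and a DSN is configured, any ODBC-capable client can query the semantic layer. The DSN, credentials, and view name below are placeholders; Tableau Desktop and Excel would use the same DSN through their own connection dialogs rather than code.

```python
# Sketch of an ODBC connection to Snowflake (DSN and credentials are placeholders).
# The same DSN serves Tableau, SSRS, and Excel through their connection dialogs.
import pyodbc

conn = pyodbc.connect("DSN=snowflake_dw;UID=report_user;PWD=***")
cur = conn.cursor()
cur.execute("SELECT store_id, total_sales FROM SEMANTIC.v_sales_by_store LIMIT 10")
for row in cur.fetchall():
    print(row)
```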

ETL and Error Handling – Process Flow Detail