Simon Sjönneby

Data Engineer & Architect
10+ years of experience

Summary

Extensive experience in data and analytics, leading implementations, ensuring stable operation and driving business value through data. Skilled in both data engineering and architecture, with a focus on optimizing data warehouses for cost efficiency and performance. Strong business acumen and a passion for helping organizations improve operations through data-driven insights.

Key Competencies

Data Engineering · Data Architecture · Data Warehouse Optimization · Business Intelligence · Cost Efficiency · Data Quality Assurance

Previous Missions

Stora Enso

Data Architect

Implementation of a data warehouse solution supporting 250+ users with daily data needs, dashboards, ad-hoc reports, and automated Excel data flows for deeper analysis. Collaborated closely with developers and key business stakeholders to ensure a reliable and scalable data platform.

Challenges:

  • Designing a cost-efficient and optimized data warehouse
  • Meeting diverse data needs across many business functions
  • Simplifying and consolidating 20+ data sources

Key Deliverables:

  • Complete data warehouse architecture in Snowflake
  • dbt for ingestion, transformation, and data quality assurance
  • Data models supporting 500+ Power BI reports, Excel analysis, and ad-hoc queries
  • Data extractions from 20+ sources using Azure Data Factory

Impact: Delivered a scalable data platform serving 250+ users daily, with comprehensive reporting capabilities and automated data flows.

Snowflake · dbt · Power BI · Azure Data Factory

Playpilot

Data Engineer

Migration from an on-premise Qlik Sense solution to a cloud-based data warehouse built with dbt and PostgreSQL. Improved data quality through enhanced testing, reduced load times, and lowered overall costs.

Challenges:

  • Efficiently migrating existing logic from Qlik Sense to PostgreSQL with dbt
  • Reducing time-to-insight through optimized data loads
  • Maintaining business continuity during migration

Key Deliverables:

  • Data warehouse built on PostgreSQL
  • dbt transformation models migrated from Qlik Sense
  • Daily data quality monitoring and testing framework
  • Business intelligence reporting in Qlik Cloud

Impact: Successfully migrated to a cloud-based solution with improved data quality, reduced load times, and lower operational costs.

PostgreSQL · dbt · Qlik Cloud · Qlik Sense

Klarsynt

Data Engineer

Migration of complex business logic and heavy Power BI transformations (Power Query and DAX) to SQL transformations using dbt in Microsoft Fabric. Simplified troubleshooting and improved data quality transparency for business users.

Challenges:

  • Understanding and migrating intricate business logic for calculating store bonus payments based on pricing tiers
  • Simplifying architecture to improve maintainability and reduce future workload
  • Ensuring business users could monitor data quality transparently

Key Deliverables:

  • dbt jobs performing all required transformations
  • Simplified and transparent data quality monitoring for business users
  • Migrated complex Power Query and DAX logic to SQL
  • Microsoft Fabric-based data architecture

Impact: Simplified the architecture, enabling easier troubleshooting, improved maintainability, and transparent data quality for business stakeholders.

Microsoft Fabric · dbt · Power BI · SQL · DAX

Skills & Technologies

Data Platforms

Snowflake · dbt · Azure Data Factory · Microsoft Fabric · PostgreSQL

BI & Analytics

Power BI · Qlik Sense · Qlik Cloud · Excel

Programming

SQL · Python · DAX · M (Power Query)

Cloud Platforms

Microsoft Azure · Azure DevOps · Git
CV created by Stormyran