BI Solution Architect

Location: Stockholm, Sweden
Salary: 600–700 kr per hour
Sector: Consultancy
Type: Contract
Reference #: CR/077116_1611823554

BI Solution Architect / Must be based in Stockholm / 12 months / Start ASAP

Local language preferred

BI Solution Architect to cover the following:

-Legacy BI/DW Competencies (SSIS/SSRS/SSAS/SQL Server DB, MDS, MPP, etc.)
-Modern Azure Data Lake Competencies (Azure Data Factory, Databricks, ADLS Gen2, AML, etc.)

Description:

Multiple Data Applications exist within the customer's application landscape (Legacy BI Data Warehouses and a Modern Azure Data Lake residing on the Azure Cloud).
The Legacy BI/DW Applications manage production data and handle large volumes: daily loads of millions of transactions, plus storage and presentation of billions of transactions.

The Legacy BI/DW Applications consist mostly of the following components:

* ETL/ELT - SQL Server Integration Services
* DB - SQL Server and Azure Synapse (formerly Azure SQL Data Warehouse)
* Cubes - Azure Analysis Services
* Reference data - SQL Server Master Data Services (MDS)
* SSRS reports
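
For orientation, a minimal sketch of querying such a legacy SQL Server DW from Python via pyodbc; the server, database, and table names are hypothetical placeholders, not the customer's actual environment:

```python
# Minimal sketch: querying a legacy SQL Server DW from Python via pyodbc.
# Host, database, and table names are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=legacy-dw.example.com;"   # hypothetical host
    "DATABASE=SalesDW;"               # hypothetical database
    "Trusted_Connection=yes;"
)

with conn:
    cursor = conn.cursor()
    # Aggregate daily transaction volumes from a hypothetical fact table.
    cursor.execute(
        """
        SELECT CAST(LoadDate AS date) AS load_day, COUNT(*) AS tx_count
        FROM dbo.FactTransactions
        GROUP BY CAST(LoadDate AS date)
        ORDER BY load_day DESC
        """
    )
    for load_day, tx_count in cursor.fetchmany(7):
        print(load_day, tx_count)
```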

The Data Lake is a modern data platform on the Azure cloud where new Applications/Transformations are delivered. Some of the existing Legacy DW Applications will be consolidated and migrated to the new Data Lake (Azure Data Factory, Databricks, ADLS Gen2, AML, Power BI, Azure DevOps, etc.).
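
As an illustration of the batch ingestion pattern on such a platform, a minimal Databricks-style PySpark sketch; the storage account, containers, paths, and key column are hypothetical placeholders:

```python
# Minimal sketch: batch ingestion from ADLS Gen2 with PySpark on Databricks.
# Storage account, containers, paths, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("lake-ingest").getOrCreate()

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/sales/"        # hypothetical
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/sales/"  # hypothetical

# Read raw CSV drops, apply light curation, and write columnar output
# partitioned by load date for downstream consumption (e.g. Power BI).
df = (
    spark.read.option("header", "true").csv(raw_path)
    .withColumn("load_date", F.current_date())
    .dropDuplicates(["transaction_id"])  # hypothetical business key
)

df.write.mode("append").partitionBy("load_date").parquet(curated_path)
```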

Competencies and Experience: The following are the expected competencies and responsibilities:

* Solution Experience in Azure Data Lake (Azure Data Factory, Databricks, ADLS, AML etc)
* Solution Experience in SSIS, SSAS (Tabular), SSRS, Power BI, MS SQL Server
* Ability to analyse existing Legacy BI/DW Components and map/migrate to Azure Data Lake components.
* Design Batch and Streaming Data Integration from Kafka (see the streaming sketch after this list)
* Design ingestion, curation and data consumption patterns in Data Lake
* Expand ETL/ELT processes to load more data into the DW/Data Lake
* Create integrations to production systems, using component development and Azure products, to load data into the DW/Data Lake
* Expand the existing DW/DB solution with more information
* Expand reference data in MDS (Master Data Services) to manage more information
* Optimize the existing solution for better performance
* Work with Azure Synapse and, to some extent, SQL Server
* Experience working with MPP environments is a merit
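
As an illustration of the Kafka streaming-integration and ingestion patterns named above, a minimal Spark Structured Streaming sketch; the broker, topic, and lake paths are hypothetical placeholders:

```python
# Minimal sketch: streaming integration from Kafka into the lake with
# Spark Structured Streaming. Broker, topic, and paths are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.com:9092")  # hypothetical
    .option("subscribe", "transactions")                           # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
    .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value")
)

# Land the raw events in the lake; the checkpoint makes the stream restartable.
query = (
    stream.writeStream.format("parquet")
    .option("path", "abfss://raw@examplelake.dfs.core.windows.net/tx/")          # hypothetical
    .option("checkpointLocation", "abfss://raw@examplelake.dfs.core.windows.net/_chk/tx/")
    .start()
)
query.awaitTermination()
```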

Agile: The assignment is performed in a team using an agile development methodology according to Scrum in a SAFe context. The project works in short, frequent deliveries, where the end result is a solution with higher data quality and a richer information model, as well as improved performance in both data loads and data retrieval.
DevOps: Experience with a DevOps way of working, agile testing methods, and automation frameworks
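
By way of example, a minimal pytest sketch of automated data-quality checks of the kind such a pipeline could run (e.g. from Azure DevOps); the lake path and column names are hypothetical:

```python
# Minimal sketch: automated data-quality tests runnable under pytest.
# Lake path and column names are hypothetical placeholders.
import pytest
from pyspark.sql import SparkSession

CURATED = "abfss://curated@examplelake.dfs.core.windows.net/sales/"  # hypothetical

@pytest.fixture(scope="module")
def spark():
    # Local session for test runs; a cluster session would be used in practice.
    return SparkSession.builder.master("local[1]").appName("dq-tests").getOrCreate()

def test_no_duplicate_transaction_ids(spark):
    df = spark.read.parquet(CURATED)
    assert df.count() == df.dropDuplicates(["transaction_id"]).count()

def test_no_null_amounts(spark):
    df = spark.read.parquet(CURATED)
    assert df.filter(df["amount"].isNull()).count() == 0
```
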
Good to have: Microsoft Azure certifications in Azure Data Lake