Data Modeller

Location: American Canyon, California
Salary: €300 - €325 per day
Sector: Consultancy
Type: Contract
Reference #: CR/081037_1623156603

Data Modeller / Remote role / 6-9 months / Start ASAP

Working hours: 8am - 5pm EST / 1pm - 10pm GMT / 2pm - 11pm CET

The modeller/developer will help model the data required to support GGB BI needs and the migration from QlikView to PowerBI. This resource will work with the Data Hub team to bring data from different sources, including SAP, into the Data Hub (Redshift/S3). The primary responsibilities of this person will be to create the enterprise model for the reporting database and to implement the ETL jobs that move data from the Data Hub landing zone (data extracted from source systems) into the reporting database. PowerBI will connect to the reporting database to create the BI dashboards.
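
As a rough illustration of that flow, the sketch below shows what one such ETL step could look like in Redshift SQL: copying a SAP extract from the S3 landing zone into a staging table, then loading it into the reporting database that PowerBI queries. The bucket path, IAM role, schemas, and table names are all hypothetical and not taken from the role description.

```sql
-- Hypothetical names throughout; the actual Data Hub buckets, schemas,
-- and IAM role are not specified in the posting.

-- 1. Land a raw SAP extract from the S3 landing zone into a staging table.
COPY staging.sap_sales_orders
FROM 's3://datahub-landing/sap/sales_orders/'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
FORMAT AS PARQUET;

-- 2. Transform and load into the reporting database that PowerBI connects to.
INSERT INTO reporting.fact_sales (order_id, date_key, customer_key, net_amount)
SELECT
    o.order_id,
    d.date_key,
    c.customer_key,
    o.net_amount
FROM staging.sap_sales_orders o
JOIN reporting.dim_date     d ON d.calendar_date = o.order_date
JOIN reporting.dim_customer c ON c.source_customer_id = o.customer_id;
```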

Here are the skill sets/job duties in more detail:

* Develop the company data model based on business requirements and collaboration with the lead enterprise data modeller
* Analyse and determine informational needs and elements, data relationships and attributes, proposed manipulation, data flow, storage requirements, and data output and reporting capabilities
* Define logical attributes and inter-relationships and design data structures to accommodate database production, storage, maintenance, and accessibility (a sketch follows this list)
* Develop, implement, and support Extract, Transform, and Load (ETL) processes in AWS Redshift environment to accommodate a wide variety of data sources and user needs
* Analyse the data stored in the data warehouse and make recommendations relating to the performance and efficiency of the stored data
* Facilitate monitoring and optimisation of service performance
* Write, test, and validate queries and reports
* Additional tasks as required
* Demonstrate strong virtual communication and collaboration skills
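
To illustrate the data-structure design work mentioned above, here is a minimal star-schema fragment in Redshift DDL, assuming a dimensional model for sales reporting. Every table, column, key, and distribution choice is illustrative only and not part of the actual requirements.

```sql
-- Hypothetical star-schema fragment for the reporting database.
CREATE TABLE reporting.dim_customer (
    customer_key        BIGINT IDENTITY(1,1),
    source_customer_id  VARCHAR(32) NOT NULL,    -- natural key from the source system (e.g. SAP)
    customer_name       VARCHAR(256),
    region              VARCHAR(64),
    PRIMARY KEY (customer_key)
) DISTSTYLE ALL;                                 -- small dimension, replicated to every node

CREATE TABLE reporting.fact_sales (
    order_id      VARCHAR(32)   NOT NULL,
    date_key      INTEGER       NOT NULL,
    customer_key  BIGINT        NOT NULL REFERENCES reporting.dim_customer (customer_key),
    net_amount    DECIMAL(18,2)
)
DISTKEY (customer_key)    -- co-locate fact rows with a common join key
SORTKEY (date_key);       -- BI dashboards typically filter on date ranges
```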

Skills:

* Bachelor's degree or equivalent in Computer Science, Computer Engineering, Information Systems or similar. Master's degree preferred
* Hands-on experience working in enterprise-scale data warehousing, data lake, and/or data engineering environments.
* 5+ years of designing, implementing, and building pipelines (ETL) that deliver data with measurable quality within SLAs
* 5+ years of SQL experience (Snowflake, Oracle, AWS Redshift, Databricks, Hive/Hadoop, etc.) is required
* Hands-on data modelling experience, especially dimensional data modelling
* Nice to have: knowledge of Tableau, PowerBI, Qlik, or other data visualisation tools
* Familiarity with manufacturing and ERP systems such as SAP, Oracle, or NetSuite is a plus