
About the Role
Responsibilities
• Develop an understanding of business obstacles, create solutions based on advanced analytics, and draw implications for model development
• Combine, explore, and draw insights from data, often large and complex data assets from different parts of the business
• Design and build exploratory, predictive, or prescriptive models, using optimization, simulation, and machine learning techniques
• Prototype and pilot new solutions and help "productify" the valuable solutions that can have impact at a global scale
• Guide and coach chapter colleagues in solving data and technical problems at an operational level, and in methodologies that improve development processes
• Identify and interpret trends and patterns in complex data sets to enable the business to make data-driven decisions
• Manage cloud infrastructure
Requirements
Required competence
Skills and Abilities
• Experience in Business Intelligence and Data Warehousing
• Create, develop, automate, and manage data pipelines
• Document ways of working and ensure good quality of the information and data
• Set a plan, requirements, and strategy for the governance and structure of the information and data, as well as for the data pipeline itself
• Set requirements for the IT systems that provide the data pipeline with information and data
• May also be asked to assist business analysts and data scientists in understanding what the end consumer wants
• Ability to evaluate different options proactively and to solve problems in an innovative way, developing new solutions or combining existing methods to create new approaches
• Strong communication skills, both oral and written
• Several years of prior experience with Big Data, cloud platforms (e.g., AWS), and BI & ETL tools and technologies
• Designing and implementing performance-optimized, scalable data pipelines with cleansing, transformation, and loading via Spark, Scala, and SQL is a must (see the sketch after this list)
• Experience working with different forms of storage (e.g., HDFS, object storage, S3) and data formats (e.g., Avro, Parquet, JSON, XML)
• Experience with CI/CD, Airflow, and GitHub
• Experience with agile workflows and strong knowledge of DevOps
• Experience in developing ETL pipelines
• Exposure to data modelling techniques and working with databases is a plus
• Data modelling, data governance, and BI presentation (Power BI)
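
As a rough illustration of the kind of Spark/Scala cleansing, transformation, and loading work referred to above (the bucket, paths, and column names are hypothetical placeholders):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object OrdersPipeline {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("orders-pipeline")
      .getOrCreate()

    // Read raw events from object storage (e.g., S3) in Parquet format
    val raw = spark.read.parquet("s3a://example-bucket/raw/orders/")

    // Cleanse: drop rows with missing keys and deduplicate on the order id
    val cleansed = raw
      .filter(col("order_id").isNotNull && col("amount").isNotNull)
      .dropDuplicates("order_id")

    // Transform: aggregate daily revenue per market
    val daily = cleansed
      .withColumn("order_date", to_date(col("order_ts")))
      .groupBy("order_date", "market")
      .agg(sum("amount").alias("revenue"))

    // Load: write partitioned Parquet for downstream BI consumption
    daily.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3a://example-bucket/curated/daily_revenue/")

    spark.stop()
  }
}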
About the Company
Multiply is an Engineering & IT services company providing cutting-edge Cloud and Engineering services, as well as our own product, to customers in retail, finance, logistics, and engineering. We use data to multiply the business value of our customers. We are based in Stockholm, and this role is also based here. For this particular role, only candidates who are already in Sweden or hold a valid residence permit may apply.
Use the Multiply CV template at the following link for quicker handling:
https://en.multiply.se/forum/templates-1/cv-mall-cv-template