Combining your specific industry knowledge and needs with our interdisciplinary BIM knowledge and software engineering skills, we build simple solutions that increase the efficiency and quality of your work. Whether you have an idea with potential or you are facing a challenge that existing BIM systems cannot solve, we create add-ons and standalone BIM applications to bridge that gap.
To achieve this effectively, we use SAFe as our work method, JIRA as our work environment, and the following proven technologies as our building blocks: Revit API, .NET, PHP, Angular, Node.js, React, TypeScript, three.js, MongoDB, PostgreSQL, HTML, and CSS.
Job Description
We are seeking a talented and experienced Lead Data Engineer to join our dynamic Data Engineering team. As a Lead Data Engineer, you will play a crucial role in building and optimizing our data infrastructure, ensuring the smooth flow of data from diverse sources to support critical business functions. You will be responsible for leading a team of engineers and collaborating with cross-functional teams to design and implement data solutions at scale. The ideal candidate will have a strong technical background, excellent leadership skills, and a passion for driving data innovation.
Key Responsibilities:
Lead and mentor a team of data engineers, fostering professional growth and continuous learning.
Design, develop, and optimize scalable data pipelines and architectures on Azure.
Collaborate with product, data science, and business teams to align data solutions with company objectives.
Provide technical guidance to solve complex data engineering challenges.
Develop ETL pipelines to integrate data from multiple sources into Azure Synapse Analytics and Data Lake Storage.
Ensure data quality, integrity, and compliance with governance best practices.
Implement automation and monitoring to enhance system reliability and performance.
Optimize query performance, storage solutions, and processing pipelines for large datasets.
Stay up to date with emerging trends in data engineering and cloud technologies to drive innovation.
Qualifications
Requirements:
Bachelor’s degree in Computer Science, Engineering, Data Science, or a related field
6+ years of experience in data engineering, with at least 1 year in a leadership or senior role
Proven experience in building scalable and efficient data pipelines, ETL processes, and cloud-based data solutions
Strong expertise in data warehousing, data lakes, and cloud-based data platforms, with a focus on Microsoft Azure (Azure Data Lake Storage Gen2, Azure Synapse Analytics, Azure Data Share)
Advanced proficiency in SQL and Python, with experience in other languages or frameworks (e.g., Java, Scala, Spark) a plus
Hands-on experience with big data processing frameworks such as Apache Spark (preferably on Azure Databricks) and streaming solutions like Azure Stream Analytics or Kafka
Experience with data pipeline orchestration tools such as Azure Data Factory, Apache Airflow, or similar
Proficiency in version control (Git, Azure DevOps) and CI/CD pipeline automation for data workflows
Strong understanding of data governance, security, and compliance practices
Excellent leadership, communication, and collaboration skills
Strong problem-solving skills with a focus on innovation and practical implementation
Experience with machine learning data pipelines and integrations with Azure Machine Learning
Familiarity with containerization and orchestration (e.g., Docker, Kubernetes)
Knowledge of data visualization tools such as Tableau or Power BI
Previous experience with data quality frameworks and data testing methodologies
Additional Information
Working at Walter Code:
You become a member of the Walter Code Club, which comes with a wide range of benefits such as private health and life insurance, a team and education budget, a fitness budget, team activities, sporting events, parties, and other perks from the following categories: