Dash Job - Senior Data Engineer - Circunomics

:mega: Plotly Dash Skills: Experience with the Dash framework for data visualization.

Company: Circunomics

Title: Senior Data Engineer

Category: Full time

Location: Frankfurt, Germany

Experience: Mid-level

Application: Link to apply

@OpenToWork


About the job

You will

  • End-to-End Development: Design, develop, and deploy scalable, maintainable, efficient, and reliable applications using Python 3 with FastAPI or Flask, and optionally Angular. Ensure seamless integration of back-end components.
  • Big Data Management: Handle (near) real-time big data environments, managing and processing datasets in the terabyte range and beyond with precision and efficiency.
  • ETL Processes: Design, implement, and manage ETL (Extract, Transform, Load) processes to ensure efficient data handling through its entire lifecycle.
  • Performance Optimization: Implement caching services like Redis to improve application performance.
  • Data Pipelines: Develop and manage data pipelines using Airflow.
  • Mentorship: Provide guidance and mentorship to junior developers, fostering a culture of continuous learning and improvement within the team.
  • Architecture & Design: Collaborate with architects and other developers to design robust, high-performance, secure application architectures.
  • Collaboration: Work closely with product managers, designers, and other stakeholders to understand requirements, provide technical insights, and ensure the successful delivery of features.
  • Code Quality & Best Practices: Write clean, efficient, and well-documented code while setting and maintaining high standards for code quality through code reviews, testing, and automated tooling.
  • Containerization: Containerize applications using Docker and manage container orchestration with Kubernetes.
  • Infrastructure Management: Create and manage infrastructure using Helm Charts and Terraform.
  • Cloud Deployment: Deploy and manage applications on AWS cloud platforms, including S3, ECR, EKS, RDS, and EFS.
  • Data Visualization: Create data visualizations using Plotly (a brief illustrative sketch follows this list).
  • CI/CD Implementation: Implement CI/CD pipelines using GitHub.
  • Troubleshooting & Optimization: Troubleshoot and resolve application issues while optimizing performance.
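
As a point of reference for the Plotly/Dash work highlighted above, here is a minimal, illustrative sketch of a Dash app serving a Plotly figure. It is not Circunomics code; the bundled gapminder sample dataset and the example figure are placeholders standing in for whatever data the role would actually visualize.

```python
# Minimal, illustrative Dash app; placeholder data, not Circunomics code.
from dash import Dash, dcc, html
import plotly.express as px

# Sample dataset bundled with plotly.express, used purely as a stand-in.
df = px.data.gapminder().query("year == 2007")

# Build a Plotly figure from the sample data.
fig = px.scatter(
    df, x="gdpPercap", y="lifeExp", size="pop", color="continent",
    log_x=True, title="Illustrative scatter plot",
)

app = Dash(__name__)
app.layout = html.Div([
    html.H3("Example dashboard"),
    dcc.Graph(figure=fig),  # embed the Plotly figure in the Dash layout
])

if __name__ == "__main__":
    app.run(debug=True)
```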

You have

  • Experience: You have 5 to 7 years or more of professional experience as a Data Engineer, particularly with Python 3 and FastAPI or Flask; familiarity with Angular is optional. You have a strong track record of designing and deploying scalable applications across back-end and, optionally, front-end components.
  • Big Data Management: You possess deep knowledge of managing (near) real-time big data environments and working with datasets in the terabyte range and beyond. You have implemented processes to efficiently handle, analyze, and process large volumes of data within strict deadlines.
  • ETL Processes: You are proficient in implementing ETL (Extract, Transform, Load) processes, ensuring effective and efficient data handling throughout its lifecycle.
  • Deep Understanding: You have in-depth knowledge of software development methodologies, RESTful APIs, microservices architecture, and CI/CD pipelines.
  • Problem-solving: You have strong analytical and problem-solving skills and can tackle complex technical challenges.
  • Communication: You have excellent communication and interpersonal skills and can convey complex technical concepts to non-technical stakeholders.
  • Adaptability: You thrive in a fast-paced, dynamic environment and can manage multiple priorities and projects simultaneously.
  • Language Skills: You have a good command of the English language.
  • Nice to Have:
      • Experience with the Dash framework for data visualization.
      • Familiarity with other cloud platforms and technologies.