DataOps Engineer

  • Location: Zürich, Switzerland

  • Sector: Data Analytics

  • Job type: Contract

  • Salary: Negotiable

  • Contact: Ashley Morton

  • Email: Ashley.Morton@darwinrecruitment.com

  • Job ref: JN -122019-85244_1575473165

  • Published: 8 days ago

  • Duration: 12 months

  • Start date: December 2019

Main tasks/activities:

  • Install, operate and maintain the infrastructure for efficient data analytics, from inception to operationalization
  • Contribute key know-how for accessing, securing, configuring, running and maintaining SQL and NoSQL databases, as well as all related tools used to build and operate data lakes and data marts
  • Ensure the integration of all tools and data pipelines with source systems and data warehouses
  • Collaborate closely with all stakeholders during all project phases and the product lifecycle, including data engineers, data scientists, data analysts, product owners and IT owners
  • Keep the data analytics infrastructure up to date with current technology while remaining in line with our enterprise architecture standards
  • Become the lead person in the analytics teams for operationalizing and operating data analytics solutions

Position requirements:

  • Passionate about modern approaches to building and running a data analytics infrastructure, leveraging the Python technology stack for data analytics while integrating with tools and databases such as Jupyter, Tableau, Power BI, Oracle, SQL Server, Denodo and external REST APIs
  • Senior practical DevOps/DataOps skills in a Python, SQL/NoSQL and private cloud context, using our modern cloud infrastructure (IaaS) in a mixed Windows and Linux environment; knowledge of Docker/Kubernetes is a plus
  • A strong drive to deliver operationalized solutions that enable reporting and data visualisation
  • 5+ years of professional experience in data engineering, preferably applying DataOps best practices
  • Highly motivated self-starter, proactive, keen to learn and open to new ideas
  • Self-organised, with a good sense of when to simply inform, when to consult, when to involve and when to ask others for approval
  • Able to navigate ambiguity and take ownership; you perceive this as a positive challenge and a source of self-fulfilment, not a stress factor