Description

Our client is looking for a savvy Data Engineer to join our growing team of analytics experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. These data pipelines are core to the technology that makes investment decisions and provides the best offers to artists.

The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on most initiatives and will ensure the data delivery architecture is consistent across ongoing projects. You should be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even redesigning our company’s data architecture to support first-generation products and data initiatives.

This role is hybrid (3 days onsite, 2 days remote).

You will:

  • Collaborate with teams across the company to normalize data from different sources to company standards.
  • Improve and optimize complex ETL processes for batch data processing from a variety of SQL and NoSQL data sources, while ensuring data integrity and normalization to company standards.
  • Ensure that data pipelines are scalable, repeatable, and secure.
  • Build integrations to third-party APIs.
  • Establish cloud deployment processes.
  • Assist the Data Science team in optimizing newly developed algorithms.
  • Ensure all deliverables and processes are of high quality throughout the project by adhering to best practices.

Requirements

  • 2–3 years of hands-on experience designing and implementing large-scale, complex ETL applications using industry-leading products/platforms.
  • 2+ years of experience using SQL and Python to perform complex data manipulation.
  • Ability to deal with ambiguity and work with rapidly changing business data.
  • Team player, great communicator, collaborative and optimistic by nature.
  • Comfortable working in a startup environment with remote colleagues.
  • Familiarity with common programming tools and best practices such as unit testing, Git, and Jira.

Nice to Have:

  • Exposure to time series data / signal processing.
  • Experience writing client API libraries.
  • Experience working with Docker containers.
  • Experience with any of the major cloud platforms (Azure, AWS, or GCP).