Data Engineer

The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.


What We Are Looking For:

We are looking for a skilled Data Engineer to join our expanding team of experts. This role is central to the design and development of Snowflake Data Cloud solutions, with responsibilities that include building data ingestion pipelines, establishing sound data architecture, and implementing rigorous data governance and security controls.

As an experienced data pipeline builder and adept data wrangler, the Data Engineer will work closely with database architects, data analysts, and data scientists to ensure a consistent and optimal data delivery architecture across ongoing customer projects.

This position demands a self-directed individual who is comfortable supporting the data needs of multiple teams, systems, and products. If you are excited to contribute in a startup environment and to support our customers in their next generation of data initiatives, we invite you to explore this opportunity.
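To give a concrete flavour of the ingestion work described above, here is a minimal sketch of loading staged files into Snowflake using Python and the snowflake-connector-python library. The account, credentials, stage, bucket, and table names are illustrative placeholders, not details of any actual environment.

```python
# Minimal sketch of a Snowflake ingestion step (illustrative names throughout).
# Assumes the snowflake-connector-python package and placeholder credentials
# supplied via environment variables.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # External stage pointing at a hypothetical bucket; a real setup would
    # typically attach a storage integration for access control.
    cur.execute("""
        CREATE STAGE IF NOT EXISTS raw_events_stage
        URL = 's3://example-bucket/events/'
        FILE_FORMAT = (TYPE = JSON)
    """)
    # Bulk-load staged JSON files; raw_events is assumed to have a single
    # VARIANT column that receives one JSON document per row.
    cur.execute("COPY INTO raw_events FROM @raw_events_stage")
finally:
    conn.close()
```

In practice a step like this would be scheduled and monitored by an orchestrator or an ingestion tool such as the ones named in the qualifications below, rather than run as a standalone script.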

Qualifications:

  • Bachelor's degree in engineering, computer science, or a related field.
  • 3-6 years of experience in relevant technical roles, with proficiency in data management, database development, ETL, and/or data preparation.
  • At least one year of experience with the Snowflake Data Cloud.
  • Hands-on experience with Snowflake architectural design, data modelling, and implementation.
  • Proven experience in developing data warehouses and constructing ETL / ELT ingestion pipelines (a brief ELT transform sketch follows this list).
  • Adept knowledge in manipulating, processing, and extracting value from extensive disconnected datasets.
  • Proficiency in SQL and Python scripting is required; additional proficiency in Scala and JavaScript is advantageous.
  • Exposure to cloud platforms (AWS, Azure, or GCP) is a favourable attribute.
  • Proven experience with Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) tools, especially those compatible with Snowflake (e.g., Matillion, Fivetran).
  • Knowledge of dbt is an advantage.
  • Strong interpersonal skills, including assertiveness and the ability to foster robust client relationships.
  • Demonstrated proficiency in project management and organizational skills.
  • Capability to collaborate and support cross-functional and agile teams within a dynamic environment.
  • Advanced proficiency in English is required.
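As referenced in the ETL / ELT item above, the sketch below shows the "transform inside the warehouse" half of an ELT pipeline, again in Python with hypothetical table and column names. In a dbt project the same logic would typically live in a SQL model; here it is written as a plain MERGE statement so the example is self-contained.

```python
# Minimal ELT transform sketch: deduplicate raw JSON events into a curated
# table. All object names (raw.raw_events, core.events) are hypothetical,
# and connection setup mirrors the ingestion sketch earlier in this posting.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
)

MERGE_SQL = """
MERGE INTO core.events AS tgt
USING (
    -- Keep only the latest record per event_id from the raw landing table.
    SELECT payload:event_id::STRING  AS event_id,
           payload:user_id::STRING   AS user_id,
           payload:ts::TIMESTAMP_NTZ AS event_ts
    FROM raw.raw_events
    QUALIFY ROW_NUMBER() OVER (
        PARTITION BY payload:event_id::STRING
        ORDER BY payload:ts::TIMESTAMP_NTZ DESC) = 1
) AS src
ON tgt.event_id = src.event_id
WHEN MATCHED THEN UPDATE SET user_id = src.user_id, event_ts = src.event_ts
WHEN NOT MATCHED THEN INSERT (event_id, user_id, event_ts)
    VALUES (src.event_id, src.user_id, src.event_ts)
"""

try:
    conn.cursor().execute(MERGE_SQL)
finally:
    conn.close()
```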