Avenue Code

Position

Data Engineer

Canada

About the company:

Avenue Code is an e-commerce consulting company headquartered in San Francisco, with three additional offices in Brazil (Porto Alegre, Belo Horizonte, and São Paulo). We are a 100% privately funded, profitable company, and we have been on a solid growth trajectory for several years. We care deeply about our clients, our partners, and our consultants. We prefer the word “partner” to “vendor”, and our investment in professional relationships is a reflection of that philosophy. We pride ourselves on our technical acumen, our collaborative approach to problem-solving, and the warmth and professionalism of our consultants.

About the opportunity:

We are seeking an energetic and talented Data Engineer to deliver high-value, high-quality business capabilities on our data technology platform. You will be an integral member of the engineering team, delivering across multiple business functional areas. You will build data analysis infrastructure for effective prototyping and visualization of various data-driven approaches.

Responsibilities:
  • Partner in building the infrastructure required for optimal extraction, transformation, visualization, and loading of data from a wide variety of data sources using SQL and big data technologies (a minimal ETL sketch follows this list);
  • Design and build large and complex datasets that meet functional and non-functional business requirements;
  • Optimize data storage and query performance; ensure data integrity, cleanliness, and availability; and document data sources, methodologies, and test plans/results;
  • Build analytics, visualization and dashboards to provide actionable insights and key business metrics;
  • Identify, design, and implement process improvements by automating and integrating manual processes for greater efficiency and scalability;
  • Provide technical leadership for the development of highly complex analytics/models;
  • Collaborate with stakeholders across organizations to support their data analytics needs;
  • Build Communities-of-Practice in key data technologies.
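
As a minimal illustration of the extraction, transformation, and loading work described above, the sketch below reads rows from a CSV file, applies a basic transformation, and loads the result into a SQL table. This is illustrative only: the file, column, and table names are hypothetical, and sqlite3 from the standard library stands in for a production warehouse.

    """
    Illustrative only: a tiny extract-transform-load step. The CSV file,
    column names, and target table are hypothetical; sqlite3 stands in
    for a production warehouse.
    """
    import csv
    import sqlite3


    def run_etl(source_csv: str, db_path: str) -> None:
        # Extract: read raw rows from the source file.
        with open(source_csv, newline="") as f:
            rows = list(csv.DictReader(f))

        # Transform: cast types and drop rows with a missing amount.
        cleaned = [
            (row["order_id"], row["customer_id"], float(row["amount"]))
            for row in rows
            if row.get("amount")
        ]

        # Load: write the cleaned rows into the target table.
        conn = sqlite3.connect(db_path)
        with conn:
            conn.execute(
                "CREATE TABLE IF NOT EXISTS orders "
                "(order_id TEXT, customer_id TEXT, amount REAL)"
            )
            conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", cleaned)
        conn.close()


    if __name__ == "__main__":
        run_etl("orders.csv", "warehouse.db")

In practice the same extract/transform/load shape scales up to Spark jobs or warehouse-native SQL; the point here is only the separation of the three stages.
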
Required Qualifications:
  • Python experience;
  • Experience with Azure or any other cloud platform;
  • SQL (Teradata/Redshift) experience for large datasets;
  • Create and maintain data ingestion pipelines (a brief loading sketch follows this list);
  • Glue, Kafka, Redshift (with a focus on infrastructure-as-code);
  • Collaborate daily with the product team, including pairing on all aspects of software delivery;
  • Create and maintain optimal data pipeline architecture;
  • Assemble large, complex data sets that meet functional and non-functional business requirements;
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability;
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
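
As a hedged sketch of the ingestion work listed above, the snippet below loads one partition of Parquet files from S3 into a Redshift table with a COPY statement issued through psycopg2. The cluster endpoint, credentials, table, bucket path, and IAM role are placeholder values, not real resources; in a Glue- or orchestrator-driven setup this would typically run as one step of a scheduled job.

    """
    Illustrative only: load one S3 partition into a Redshift table with COPY.
    Endpoint, credentials, table, bucket path, and IAM role are placeholders.
    """
    import psycopg2

    COPY_SQL = """
    COPY analytics.orders
    FROM 's3://example-bucket/orders/dt=2024-01-01/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy'
    FORMAT AS PARQUET;
    """


    def load_partition() -> None:
        conn = psycopg2.connect(
            host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
            port=5439,
            dbname="analytics",
            user="loader",
            password="example-password",
        )
        try:
            # Redshift reads the Parquet files from S3 in parallel; the
            # transaction commits when the `with conn` block exits cleanly.
            with conn, conn.cursor() as cur:
                cur.execute(COPY_SQL)
        finally:
            conn.close()


    if __name__ == "__main__":
        load_partition()

Bulk COPY from S3 is generally preferred over row-by-row inserts for warehouse loads; a scheduler such as Glue or Airflow would own the cadence and credential management around a step like this.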