Avenue Code


Data Engineer


About the company:

Avenue Code is an e-commerce consulting company headquartered in San Francisco, with three additional offices in Brazil (Porto Alegre, Belo Horizonte, and São Paulo). We are a 100% privately funded, profitable company and have sustained a solid growth trajectory for several years. We care deeply about our clients, our partners, and our consultants. We prefer the word "partner" to "vendor," and our investment in professional relationships reflects that philosophy. We pride ourselves on our technical acumen, our collaborative problem-solving ability, and the warmth and professionalism of our consultants.

About the opportunity:

We are seeking an energetic and talented Data Engineer to deliver high-value, high-quality business capabilities on our data technology platform. You will be an integral member of an engineering team delivering across multiple business functional areas, and you will build data analysis infrastructure for effective prototyping and visualization of data-driven approaches.

  • Partner in building the infrastructure required for optimal extraction, transformation, visualization, and loading of data from a wide variety of data sources using SQL and big data technologies.
  • Design and build large and complex datasets that meet functional and non-functional business requirements.
  • Optimize data storage and query performance; ensure data integrity, cleanliness, and availability; and document data sources, methodologies, and test plans/results.
  • Support consumer use cases by managing and transforming data into different formats across a hybrid (on-premises and cloud) data set.
  • Build analytics, visualization and dashboards to provide actionable insights and key business metrics.
  • Identify, design, and implement process improvements by automating and integrating manual processes for greater efficiency and scalability.
  • Provide technical leadership for development of highly complex analytics/models.
  • Collaborate with stakeholders across organizations to support their data analytics needs.
  • Create data pipelines: rewrite existing Scalding jobs in Dataflow (GCP-specific Dataflow).
  • Build communities of practice in key data technologies.
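The Scalding-to-Dataflow rewrite mentioned above centers on the same map/group/reduce shape in both frameworks. As a minimal sketch of that transformation pattern in plain Scala collections (no Scalding or Beam dependency; the object and method names here are illustrative, not part of either API):

```scala
// Word count expressed as the map -> group -> reduce pattern that both
// Scalding pipes and Dataflow/Beam PCollections are built around.
object PipelineSketch {
  // Split lines into words, group identical words, and count each group.
  def countWords(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.toLowerCase.split("\\s+")) // map: each line -> its words
      .filter(_.nonEmpty)                   // drop empty tokens
      .groupBy(identity)                    // group by key (the word itself)
      .map { case (word, occurrences) => word -> occurrences.size } // reduce

  def main(args: Array[String]): Unit = {
    val counts = countWords(Seq("big data big pipelines", "data flow"))
    println(counts("big"))  // 2
    println(counts("data")) // 2
  }
}
```

In an actual migration, the same logic would map to `flatMap`/`groupBy` on a Scalding `TypedPipe` and to `ParDo`/`GroupByKey`/`Combine` transforms on a Beam `PCollection`; this sketch only shows the shape of the computation being ported.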
Required Qualifications:

  • Understanding of Scala (functional programming, algorithms, and data structures).
  • Experience with AWS or any other cloud platform.
  • Experience building out pipelines; MapReduce is heavily preferred.
  • Create and maintain data ingestion pipelines.
  • Collaborate daily with the product team, including pairing on all aspects of software delivery.
  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional and non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.