Data Engineer with GCP

Job Details

This is a remote position based in LATAM, working as a consultant directly with US clients. Working hours must align with the US Pacific time zone.

We offer:
- 100% remote position
- Paid national holidays off
- $30 to $35 USD per hour

Title: Data Engineer with GCP

We are seeking a highly motivated and experienced Data Engineer to join our Data Engineering team. In this role, you will be at the forefront of designing and developing scalable, robust data architectures and solutions using the latest technologies from Google Cloud Platform (GCP) and AWS. You will collaborate closely with cross-functional teams to understand their data needs and will focus on building, optimizing, and scaling data platform solutions that drive insights for marketing strategies, personalization efforts, and operational efficiencies. As a senior member of the team, you will work closely with data scientists, machine learning engineers, data analysts, and cross-functional teams, playing a critical role in shaping the company's data architecture.

Responsibilities:
- Design, develop, and maintain scalable, high-performance data infrastructure to support the collection, storage, and processing of large datasets in real-time and batch modes.
- Build reliable, reusable services and APIs that allow teams to interact with the data platform for ingestion, transformation, and querying of data.
- Develop internal tools and frameworks to automate and streamline data engineering processes.
- Collaborate with senior management, product management, and other engineers in the development of data products.
- Develop tools to monitor, debug, and analyze data pipelines.
- Design and implement data schemas and models that can scale.
- Mentor team members to build the company's overall expertise.
- Work to make The RealReal an innovator in the space by bringing passion and new ideas to work every day.

Required Skills:
- At least 5 years of proven experience as a Data Engineer developing platform-level capabilities for data-driven midsize to large corporations.
- Strong object-oriented programming skills in languages such as Python, Java, or Scala, with experience building large-scale, fault-tolerant systems.
- Experience with cloud platforms (GCP, AWS, Azure), with a strong preference for GCP.
- Experience with BigQuery or similar (Redshift, Snowflake, other MPP databases).
- Experience building data pipelines and ETL.
- Experience with the command line and version control software (Git).
- Excellent communication and collaboration skills.
- Ability to work independently and quickly become productive after joining.

Preferred Requirements:
- Knowledge of distributed data processing frameworks such as Apache Kafka, Flink, Spark, or similar.
- Experience with dbt (Data Build Tool) and Looker.
- Experience with machine learning pipelines or MLOps.
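For illustration only, and not part of the posting itself: the sketch below shows the kind of batch ingestion task this role describes, loading a CSV file from Cloud Storage into BigQuery with the google-cloud-bigquery Python client. All project, dataset, table, and bucket names are placeholder assumptions.

# Minimal sketch: batch-load a CSV from Cloud Storage into BigQuery.
# All resource names below (project, dataset, table, bucket) are placeholders.
from google.cloud import bigquery

def load_csv_to_bigquery(project_id: str, dataset: str, table: str, gcs_uri: str) -> None:
    client = bigquery.Client(project=project_id)
    table_id = f"{project_id}.{dataset}.{table}"

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,        # assume the CSV has a header row
        autodetect=True,            # let BigQuery infer the schema
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
    load_job.result()  # block until the load job finishes

    destination = client.get_table(table_id)
    print(f"Loaded {destination.num_rows} rows into {table_id}")

if __name__ == "__main__":
    # Hypothetical example values; replace with real resources.
    load_csv_to_bigquery(
        project_id="my-gcp-project",
        dataset="analytics",
        table="orders_raw",
        gcs_uri="gs://my-bucket/exports/orders.csv",
    )

In production this kind of load would typically be orchestrated and monitored rather than run ad hoc, which is the pipeline tooling work the responsibilities above describe.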


Nominal salary: To be agreed

Source: Adzuna_Ppc

