Job description
QiBit was born to transform people's futures!
We are a global network for technology, UX, and digital professionals.
Here, companies discover extraordinary talent, while professionals find the best digital job opportunities around the world.
* This is a 100% remote position under the outsourcing model, in which you will be hired by our company and allocated to the client.
About our client
We are looking for a Data Engineer to work with one of our international clients.
Want to know more? Below you will find some details about the position:
Main responsibilities
Collaborate with Lead Data Engineers and the domain Architect to design, build, and optimize the data architecture and extract, transform, load (ETL) pipelines, making them accessible to data users;
Own and support data engineering platform tools using technologies such as Snowflake, AWS, HVR, KNIME, etc.;
Design robust ETL pipelines of medium complexity that adhere to existing patterns while maintaining performance, uptime, scalability, and extensibility and keeping technical debt low;
Develop and perform unit tests, and keep code up to date in source control;
Collaborate with other Data Engineers for code review and participate in pair programming when needed;
Independently troubleshoot issues reported by users and errors from ETL jobs with minimal guidance; participate in on-call rotation and perform root cause analysis;
Deliver quality code and follow best practices and standards, keeping performance and scalability in mind to keep costs in check in the cloud environment;
Partner with the Platform product manager to prioritize and deliver high-quality data products, working in an agile team;
Live a culture of sharing, re-use, designing for scale and stability, and operational efficiency in data and analytical solutions;
Demonstrate a passion for innovation and continuous improvement;
Maintain awareness of advancements and changes in technologies related to data engineering and cloud data platforms.
Overall:
Bring a positive Run Happy energy and work with the team to deliver the best possible solutions;
Learn the business, learn the data that supports the business; be a partner – don't just implement technology;
Other responsibilities as required.
Requirements and skills
Hard Skills
Bachelor's degree or equivalent work experience in Computer Science, Engineering, Math, Information Systems, or related disciplines;
5+ years of professional experience in data warehouse ETL or Data Engineering development;
3-5 years of hands-on experience with Python and a solid understanding of object-oriented programming concepts;
Proficiency in SQL and experience working with complex transformations;
2-3 years of experience working with cloud data platforms (Snowflake, Microsoft Fabric preferred);
Experience working with source control tools (Git/Bitbucket);
3+ years of current professional experience in cloud-based ETL data engineering development (AWS preferred); certification is a plus;
Solid understanding of data modeling and data architecture concepts;
Experience working with data orchestration and transformation tools (Airflow and dbt preferred);
Experience with data analysis, unit testing, and data quality validation;
Excellent verbal and written communication skills, including effective listening and clear, concise expression;
Good interpersonal skills and demonstrated problem-solving ability.
Additional information
Our client offers:
PJ contract (independent contractor position).