At BairesDev, we've been leading the way in technology projects for over 15 years.
We deliver cutting-edge solutions to giants like Google and the most innovative startups in Silicon Valley. Our diverse 4,000+ team, composed of the world's Top 1% of tech talent, works remotely on roles that drive significant impact worldwide.

When you apply for this position, you're taking the first step in a process that goes beyond the ordinary. We aim to align your passions and skills with our vacancies, setting you on a path to exceptional career development and success.

Data Engineer at BairesDev

We are looking for a highly skilled Data Engineer to join our team and tackle the challenge of designing and optimizing scalable data pipelines and infrastructure. As a Data Engineer, you will play a key role in our data-driven decision-making processes, working with cutting-edge technologies to solve complex data challenges and contribute to the organization's success.

Your responsibilities will include developing and maintaining data workflows, managing cloud-based data storage solutions, and ensuring data quality and governance.
You will collaborate closely with data scientists and analysts to deliver reliable, structured data, monitor data workflows, and troubleshoot issues to maintain operational efficiency.

What You Will Do:

- Develop and maintain scalable data pipelines using tools like Databricks, AWS Glue, and DBT to ensure seamless data integration and transformation.
- Optimize and manage data storage solutions, ensuring they meet the needs of the organization.
- Monitor and troubleshoot data workflows to maintain operational efficiency and reliability.
- Implement data governance and security measures to ensure data compliance, lineage tracking, and secure access across the organization.

Here is what we are looking for:

- 4+ years of hands-on experience in data engineering roles, with a strong emphasis on data pipeline development, ETL processes, and data integration.
- Experience with database management systems, including relational (SQL) and NoSQL databases.
- Knowledge of cloud computing concepts, with certifications or coursework in AWS, Azure, or GCP.
- Strong foundation in the Python programming language.
- Experience with big data storage tools (e.g., Lakeview, Lakehouse, Redshift, Databricks).

Nice to have:

- A strong grounding in statistics, probability, and applied mathematics relevant to data analysis and engineering.
- Familiarity with machine learning concepts and tools, beneficial for integrating predictive analytics into data pipelines.

How we make your work (and your life) easier:

- 100% remote work (from anywhere).
- Excellent compensation in USD or your local currency if preferred.
- Flexible hours: create your own schedule.
- Paid parental leave, vacations, and national holidays.
- Innovative and multicultural work environment: collaborate and learn from the global Top 1% of talent.
- Supportive environment with mentorship, promotions, skill development, and diverse growth opportunities.

Join a global team where your unique talents can truly thrive!