MLOps/DevOps Engineer_LATAM (Remote)_C2C

Hi,

Trust this finds you well! We've spotted your impressive profile and have an exciting opportunity tailored to your skills and passions.

Role: MLOps/DevOps Engineer
Mode of employment: C2C
Location: Remote (Mexico, Brazil, Argentina) – must be a citizen of, or authorized to work in, one of these countries. Candidates from other locations in Latin America (LATAM) who are ready to work remotely will also be considered. If you are based elsewhere in LATAM, please forward your resume and let us know whether you can work from one of these countries or remotely from your location.
Years of experience: 7+ years

Role Responsibilities:
• Design, implement, and maintain infrastructure and tooling for software development and deployment using Infrastructure as Code (IaC) tools.
• Automate continuous integration, delivery, and deployment (CI/CD) pipelines to ensure smooth software delivery.
• Collaborate closely with developers, testers, and operations teams to facilitate seamless software delivery.
• Monitor system health and performance, proactively identifying and resolving issues.
• Perform root cause analysis to diagnose and fix production errors.
• Implement logging and monitoring tools to gain insight into system behavior.
• Train and guide junior developers on DevOps principles, fostering their professional growth.
• Foster a collaborative learning environment within the team, promoting knowledge sharing and continuous learning.
• Communicate the value of DevOps practices and reusable components to stakeholders, highlighting their impact on business outcomes.
• Collaborate with stakeholders to understand their needs and align DevOps practices and reusable components with their goals.
• Continuously improve and optimize DevOps processes and workflows, leveraging feedback and lessons learned.
• Stay up to date with the latest trends in DevOps, AI, and data science, and share knowledge across the organization.
• Provide strategic input to the AI ecosystem, contributing to platform evolution and new capability development.
• Collaborate with AI development teams to integrate reusable components into production AI solutions.
• Partner with the AIDA Platforms team to enforce best practices for reusable component architecture and engineering principles.

Basic Qualifications:
• Bachelor's degree in Computer Science, Information Technology, Software Engineering, Data Science, Computer Engineering, Information Systems, or a related discipline.
• 7+ years of relevant work experience in DevOps or a related field.
• Strong scripting skills in languages such as Python or Bash.
• Experience with Infrastructure as Code (IaC) tools such as Terraform, Ansible, or Chef.
• Experience working in a cloud-based analytics ecosystem (AWS, Snowflake, etc.).
• Proficiency in Git for version control of infrastructure and application code.
• Familiarity with monitoring and observability tools such as Prometheus, Grafana, or the ELK stack.
• Knowledge of infrastructure security best practices and experience with security tools.
• Experience with automated testing frameworks and tools.
• Familiarity with database technologies and SQL query optimization.
• Knowledge of serverless computing and experience with serverless platforms such as AWS Lambda.
• Hands-on experience working in Agile teams, following Agile processes and practices.
• Self-directed learner with a strong desire to continuously improve coding skills.
• Highly self-motivated to deliver both independently and through strong team collaboration.
• Ability to take on new challenges creatively and work outside your comfort zone.
• Strong English communication skills (written and verbal).

Preferred Qualifications:
• Advanced degree in Data Science, Computer Engineering, Computer Science, Information Systems, or a related discipline (preferred, but not required).