Hello,

We are actively looking for a Senior Python Engineer. If you or your consultant are actively looking for a new role, please share your profile.

Role: Senior Python Engineer
Duration: 2+ months
Location: Brazil (remote)
Mandatory skills: Experience with Airflow on Kubernetes or KEDA

We are seeking a highly skilled Senior Python Engineer to join our Technical Data Delivery team. In this role, you will be pivotal in developing and managing critical components of our data platform, including Data APIs (REST APIs), Apache Airflow, and data engineering artifacts such as data ingestion and data curation pipelines. This position requires strong proficiency in Python, Apache Airflow, REST APIs, Azure DevOps, and Azure cloud services.

Responsibilities:
1. Collaborate closely with the Data Engineering Lead to devise effective data ingestion strategies aligned with business objectives.
2. Define and implement robust data ingestion patterns and processes to ensure efficient and reliable data flow into the organization's data platform.
3. Contribute to the development and deployment of Apache Airflow on Azure Kubernetes Service (AKS).
4. Develop and maintain reusable data engineering and ETL pipelines and codebases using Python, Airflow, REST APIs, PySpark, Databricks, and the Azure cloud platform.
5. Design and implement robust data APIs using Python frameworks such as FastAPI or Flask, and deploy them on Azure App Service.
6. Work with cross-functional teams to understand data requirements and provide scalable data engineering solutions.
7. Design and implement batch and streaming data architectures leveraging Azure cloud services such as Azure Data Factory and Azure Databricks.
8. Ensure adherence to software engineering best practices, including version control, testing, and continuous integration/continuous deployment (CI/CD) processes.
9. Participate in code reviews, technical discussions, and knowledge-sharing sessions within the team.

Skills (must have):
1. Strong experience with the Python programming language.
2. Extensive experience with Apache Airflow and its deployment on Azure Kubernetes Service.
3. Extensive experience with Apache Airflow development, including dynamic DAGs and the Airflow REST API.
4. Hands-on experience with Python frameworks such as FastAPI or Flask, and with deploying REST APIs on Azure App Service.
5. Sound understanding of software engineering practices, including version control, testing, and continuous integration/continuous deployment (CI/CD), with proven experience using Azure DevOps.
6. Demonstrated experience with core data engineering concepts and principles.
7. Proficiency in designing and implementing reusable, scalable data engineering pipelines and codebases.
8. Solid experience designing batch and streaming data architectures using Azure cloud platform services.
9. Effective problem-solving skills and the ability to troubleshoot complex data engineering issues.
10. Commitment to continuous learning and to staying current with industry trends and best practices in data engineering.
11. Strong communication and collaboration skills with English language proficiency; ability to work effectively in a team environment, provide training, and document processes.

Skills (nice to have):
1. Proficiency in writing PySpark code for data processing and transformation.
2. Familiarity with Python testing frameworks such as pytest and with build tools such as tox and Poetry.

Thank you,
Satti Reddy