About Rocket Lawyer
We believe everyone deserves access to affordable and simple legal services. Founded in 2008, Rocket Lawyer is the largest and most widely used online legal service platform in the world. With offices in North America, South America, and Europe, Rocket Lawyer has helped over 30 million people create over 50 million legal documents and get their legal questions answered.
We are in a unique position to enhance and expand the Rocket Lawyer platform to an unprecedented scale, reaching audiences worldwide. We are expanding our team to take on this challenge!
About the Role
Rocket Lawyer is seeking an experienced, passionate data engineer for the data engineering team. In this role, you will build a highly reliable, trustworthy data lakehouse that is leveraged across the organization to derive insights and make real-time decisions. You will drive new data projects, build reliable data pipelines, set up strong monitoring and alerting systems, and build data solutions to support a diverse set of use cases.
We value a fun, collaborative, team-oriented work environment, where we celebrate our accomplishments.
Responsibilities
Design and develop data pipelines using tools like Apache Airflow or by leveraging Snowflake's external tables functionality.
Translate HiveQL queries to Snowflake SQL, ensuring compatibility and efficient data processing.
Utilize dbt to model and transform data within Snowflake, adhering to best practices for data governance and maintainability.
Configure and manage data pipelines in GCP to orchestrate data movement and processing tasks.
Collaborate with data analysts and stakeholders to understand data requirements and ensure a successful migration outcome.
Monitor and optimize data pipelines for performance and scalability.
Develop and implement automated testing procedures to validate data quality and integrity after migration.
Document the migration process and provide ongoing support for the migrated data warehouse on Snowflake.
Requirements
5+ years of experience as a Data Engineer with a proven track record of successful data warehouse/data lake implementation and management, including at least 2 years leading small data teams, prioritizing work, and being accountable for all deliveries.
Deep understanding of leveraging Snowflake to build highly performant and resilient data warehouses (or experience with similar platforms).
Strong expertise in HiveQL and SQL, and experience with data warehousing/lakehouse concepts (dimensional modeling, data quality, etc.).
Strong programming knowledge in Python.
Experience with Apache Spark for large-scale data processing (a plus).
Proficiency in dbt for data modeling and transformation in Snowflake preferred.
Experience working with Google Cloud Platform (GCP) and its data storage services such as GCS and BigQuery (a plus), or experience with similar platforms.
Excellent written and verbal communication skills with the ability to collaborate effectively with cross-functional teams.
Strong problem-solving skills and a passion for building efficient and scalable data solutions.
Preferred Qualifications
Strong understanding of data architectures and patterns.
Experience in DataOps implementation and support.
Experience in MLOps implementation and support.
Experience in building and supporting AI/ML platforms.
Benefits & Perks
Private health insurance
Life insurance
Meal/Food voucher
Wellhub partnership
Mental health assistance
Birthday off
Daycare assistance
Financial support for those who have children with special needs and disabilities
Free Rocket Lawyer account with online access to an extensive legal documents library and brilliant licensed attorneys at discounted rates
Employment type: CLT
Brazil Monthly Compensation: R$26,500 - R$28,500 (BRL)
By applying for this position, you agree that your data will be processed in accordance with Rocket Lawyer's privacy policy.