About the Role/Team
We're the Platform Engineering team at WEX, specializing in Kafka and Streaming Data Enablement.
Our mission is to facilitate real-time data capabilities across the organization while continuously modernizing and improving our products and technologies.
By leveraging the latest and most efficient tools, we deliver top-tier solutions that meet the evolving needs of our customers.
We have a broad range of exciting projects and we're looking for talented individuals to join us.
Our team fosters collaboration, mutual support and a commitment to maintaining a healthy work-life balance.
We are passionate about empowering change across the organization by enabling data democratization, making data accessible, usable and valuable to teams throughout the enterprise.
Applying industry best practices in software development and DevOps, we drive innovation by providing scalable, secure, and efficient streaming data solutions that help our business units harness data to make informed decisions.
While we move quickly to deliver impactful results, we remain mindful of the compliance and regulatory standards that govern the payments industry.
How you'll make an impact
You will work with our Kafka as a Service (KaaS) Engineering team to help us configure and run Kafka at the Enterprise level.
You will help us create canonical Kafka topics for the entire enterprise.
Our Eventing Platform is growing, and we need to build a single source of truth from the many data sources within our company.
You will help us define standards for message content, serialization schemas, canonical topic naming conventions, and anything else that helps our Eventing Platform reach new levels of excellence.
You will work closely with one or more of our divisions to understand how their data is structured.
You will work with our Engineering team to ensure our Eventing Platform meets the data needs of our customers.
This involves estimating data storage and cluster size, estimating platform data growth, helping with new monitoring and observability metrics, and helping ensure optimal performance and high availability.
You will ultimately make the jobs of our company's software engineers easier by developing tools and pipelines that increase the speed of deployment and application workflows using GitOps principles.
We practice and set exemplary standards for modern end-to-end software development, utilizing best practices for all phases of the SDLC from requirements and design through engineering and testing to delivery and well-managed operations.
You will learn about, research, and prototype modern tools for our teams.
You will create Proof of Concepts (PoCs) and share your findings with a broader audience.
As a result, you will constantly learn new eventing technologies, processes, and tools.
You will help us enrich our events with meaningful information from various topics and data sources, working with Apache Flink or similar solutions.
You will help provide architectural blueprints, prototype solutions, analyze data flows, and create the necessary specs to roll out various solutions.
You're open-minded, with strong soft skills that let you relate, collaborate, and communicate well with a diverse audience.
You love to learn and code!
This role evolves rapidly: technology changes quickly, and we want to continuously provide the best options for our customers.
Experience you'll bring
You're passionate about data and technology and love to learn and try new things.
You're creative and feel comfortable with constant changes in the industry.
Solid development experience with at least one major programming language (Java, C#, Python, or Golang).
Proven success in delivering software features through all phases of the SDLC and building, testing, and deploying using DevOps principles.
Good experience designing and implementing data solutions.
Hands-on data warehousing, data lake, and/or data pipeline experience is required.
Hands-on experience with at least one major RDBMS and at least one NoSQL data store.
You know how to optimize and troubleshoot large data stores.
Working knowledge of designing and building data pipelines that meet business SLAs.
Experience in delivering solutions in the cloud, preferably AWS.
You've worked with major AWS components such as EC2, S3, SQS, etc.
Equivalent experience with GCP or Azure is also acceptable.
It would be nice if you have
Academic degree in Computer Science or equivalent field.
You know how to code and deliver containerized solutions with Docker.
Experience with AI/ML.
Experience with Apache Flink or similar tools.
Familiarity with GitHub and GitHub Actions or equivalent.
Familiarity with Terraform or equivalent tool.
You've delivered projects with strong failover capabilities, including multi-region or even multi-cloud support.
Good understanding of security-related concepts and best practices, such as OWASP, SSO, ACLs, TLS, tokenization, etc.
You've delivered solutions subject to PCI-DSS and/or HIPAA requirements.
You've participated in data and process audits.
If you are looking for a growing career – come be part of WEX today!