$110K - $150K a year
Develop, manage, and optimize AWS-based big data pipelines and platforms, ensuring data security, performance, and compliance while collaborating cross-functionally.
10+ years in data engineering with expertise in Python/Java/Scala, relational and NoSQL databases, cloud data platforms such as AWS, ETL tools, and big data technologies, plus strong data modeling skills.
Our Platform Engineering Team is working to solve the Multiplicity Problem. We are trusted by some of the most reputable and established FinTech firms. Recently, our team spearheaded the conversion and go-live of apps that support the backbone of the financial trading industry.

Are you a data enthusiast with a natural aptitude for analytics? We're looking for skilled Data/Analytics Engineers to fill multiple roles for our exciting new client. This is your chance to shine, demonstrating your dedication and commitment in a role that promises both challenge and reward.

What We Offer:
• A dynamic environment where your skills will make a direct impact.
• The opportunity to work with cutting-edge technologies and innovative projects.
• A collaborative team that values your passion and focus.

We are looking for Engineers who can:
• Develop and deliver AWS-based data solutions with hands-on cloud engineering expertise.
• Build and manage big data pipelines using Snowflake and Postgres, or similar Spark-based platforms.
• Design and implement parallel ETL processes to optimize resource use and processing speed.
• Architect data models and modern data solutions, including cloud technologies.
• Collaborate cross-functionally to translate business needs into technical requirements.
• Drive performance monitoring and reliability of the data platform.
• Lead the creation of technical specifications, designs, and end-to-end documentation.
• Ensure runbook SLA compliance and oversee daily cloud platform operations.
• Use tools like Git, Jira, and agile methodologies for project management.
• Ensure data security and compliance with relevant regulations.

Key Skills & Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 10+ years of experience in data engineering, with a strong focus on building and managing data pipelines.
• Proficiency in programming languages such as Python, Java, or Scala.
• Extensive experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., DynamoDB).
• Demonstrated experience with cloud data platforms (e.g., AWS) and their data services (e.g., S3, Redshift, Snowflake, BigQuery, Data Factory).
• Solid understanding of ETL/ELT principles and experience with tools like Apache Airflow, dbt, or similar.
• Experience with big data technologies such as Apache Spark and Kafka.
• Strong understanding of data modeling, data warehousing concepts, and dimensional modeling.
• Excellent problem-solving, analytical, and communication skills.
• Ability to work independently and as part of a collaborative team.

We work closely with: Data Wrangling, ETL, Talend, Jasper, Java, Python, Unix, AWS, Data Warehousing, Data Modeling, Database Migration, RBAC model, Data Migration.

Our Process:
• Schedule a 15-min video call with someone from our team
• Complete 4 proctored GQ tests (< 2 hours)
• Attend a 30-45 min final video interview
• Receive a job offer

If you are interested, please apply and our team will contact you within the hour.
This job posting was last updated on 10/11/2025