via Workable
Designing and implementing scalable data pipelines, leading data engineering teams, and ensuring data quality and security.
7+ years in data engineering or related field, proficiency in Python/Java/Scala, strong SQL skills, experience with ETL and cloud platforms, and knowledge of big data technologies.
At Accellor, we strive to be the premier consultancy that leverages cutting-edge Cloud technology to enhance customer engagement and drive business effectiveness across sectors including Finance, Retail, High Tech, Healthcare, and beyond. Our culture fosters curiosity, continuous learning, and tenacity. We empower our team to flourish and pursue their passions, cultivating an environment rich in collaboration, autonomy, and accountability. We value our people's commitment and pride in their work, as well as their enthusiasm and drive to build optimal solutions while keeping the overarching goals in mind.

As a Lead Data Engineer, you will play a crucial role in architecting and implementing robust data engineering solutions. Your expertise in managing and optimizing data flows, migrations, and transformations will be vital in supporting our projects. The ideal candidate will possess a mix of technical skills and strategic vision, ensuring that we leverage data to its fullest potential in pursuit of our business objectives.

Responsibilities:
- Design and build scalable, high-performance data pipelines and architecture.
- Lead and mentor a team of data engineers, fostering a culture of best practices in data management and engineering.
- Collaborate with cross-functional teams to understand business needs and translate them into technical requirements.
- Ensure data quality, integrity, and security throughout the data lifecycle.
- Stay current with emerging technologies and industry trends to innovate and enhance our data practices.
- Contribute to the continuous improvement of our data infrastructure and processes.

Requirements:
- 7+ years of experience in data engineering or a related field.
- Proficiency in programming languages such as Python, Java, or Scala.
- Strong knowledge of SQL and experience with database technologies (e.g., PostgreSQL, MySQL).
- Hands-on experience with ETL tools and data integration techniques.
- Excellent understanding of data modeling concepts and best practices.
- Ability to work collaboratively and communicate effectively with team members and stakeholders.
- Experience with cloud platforms (AWS, Azure, GCP) is a plus.
- Strong analytical and problem-solving skills, with attention to detail.
- Experience with big data technologies such as Hadoop or Spark is advantageous.
- Certification in relevant technologies (e.g., AWS, Azure) would be beneficial.
This job posting was last updated on 12/16/2025