via Dice
Analyzing data and developing dashboards and workflows for operational insights.
Proficiency in SQL, Python, data analysis, and visualization, with experience in BI tools and data workflows.
Title: Data Engineer
Location: Remote
Duration: Long-term
Mandatory Skills: Java, Spring Boot, Kafka, Python, and Airflow

Job Description
• Strong hands-on experience with Apache Airflow and Python
• Proficient in Python scripting for Airflow DAGs
• Experience with Airflow plugins, custom operators, and sensors
• Knowledge of data pipeline orchestration best practices
• Experience with Apache NiFi is nice to have
• Familiarity with cloud platforms (AWS, Google Cloud Platform, Azure) and on-premises environments
• Understanding of CI/CD pipelines and version control (e.g., Git) is a plus
• Exposure to Apache Kafka (brokers, topics, partitions, replication) preferred
• Experience with Kafka Connect, Schema Registry, and Kafka Streams preferred
• Exposure to data warehousing concepts and SQL
• Experience with data lakes preferred
• Agile/Scrum experience in live projects
• Ability to develop and execute tests, analyze data, identify defects, and fix issues
• Exposure to the banking domain, especially Treasury, is a plus
This job posting was last updated on 1/8/2026