via Indeed
$83K - 125K a year
Design and maintain scalable data pipelines and Python backend services, collaborating with cross-functional teams to deliver data-driven solutions.
5+ years of Python backend development, 3–5 years of data engineering experience, strong SQL and cloud platform skills, and experience with containerization and data processing frameworks.
Data Engineer with strong Python backend development expertise
Plano, TX (Remote – 5 days/week)

Job Summary:
We are seeking a highly skilled Data Engineer with strong Python backend development expertise to join our team in Plano, TX. The ideal candidate will have hands-on experience designing and building scalable data pipelines, integrating with APIs and databases, and developing robust backend systems. This role requires close collaboration with data scientists, analysts, and application developers to deliver high-performance data-driven solutions.

Key Responsibilities:
• Design, build, and maintain scalable data pipelines and ETL/ELT processes for large-scale data integration.
• Develop and optimize Python-based backend services, APIs, and microservices for data processing and delivery.
• Work with data ingestion frameworks to extract data from multiple structured and unstructured sources.
• Implement data transformation and cleansing logic to ensure data quality, accuracy, and consistency.
• Collaborate with cross-functional teams to define data architecture, schemas, and performance improvements.
• Deploy, monitor, and maintain backend and data processing applications in cloud environments (AWS/Azure/GCP).
• Optimize queries and pipelines for performance, reliability, and scalability.
• Implement CI/CD automation, testing, and documentation for data and backend workflows.
• Integrate security, logging, and error-handling best practices into all backend components.

Required Skills & Experience:
• 5+ years of experience in Python development with solid knowledge of backend frameworks such as FastAPI, Flask, or Django.
• 3–5 years of experience as a Data Engineer, building and maintaining production-grade ETL pipelines.
• Strong proficiency in SQL and data modeling (relational & NoSQL).
• Experience with cloud platforms (AWS, Azure, or GCP) and data services (S3, Lambda, Redshift, BigQuery, Snowflake, etc.).
• Solid understanding of data processing frameworks (e.g., Spark, Airflow, Kafka, Glue).
• Hands-on experience with containerization and orchestration tools (Docker, Kubernetes).
• Strong experience with API design, authentication, and integration.
• Knowledge of Git, CI/CD pipelines, and DevOps practices.
• Excellent debugging, optimization, and analytical skills.

Job Types: Full-time, Contract
Pay: $40.00 - $60.00 per hour
Expected hours: 40 per week
Work Location: Remote
This job posting was last updated on 11/24/2025