
Apply to this job.

Think you're the perfect candidate?

Data Engineer

Integrated Resources Inc · Painted Post, NY (Onsite) · Contractor


Duration: 12 months
Pay: $40 hourly

Job Description:
Education and Experience

This position focuses on Data Pipelines & Workflows

  • Bachelor's degree in computer science, information systems, data engineering, or a related field, or equivalent practical experience. An Associate's degree may be considered if the candidate has an additional 3–5 years of relevant experience beyond the stated requirement.

  • 2+ years of professional experience in data engineering, ETL development, or related work, or equivalent hands-on experience.

  • Experience or interest in scientific software, materials science, research environments, or technically complex domains is a plus.



Work Schedule

  • Typically 40 hours per week.

  • May require working weekends, holidays, or longer days to support projects.




Purpose of Position

  1. Embed within a cross-functional Agile team, participating in sprint planning, stand-ups, backlog refinement, and technical discussions.

  2. Design, build, troubleshoot, and maintain ETL/ELT workflows supporting application functionality, analytics, reporting, and scientific workflows.

  3. Develop and manage data pipelines using Apache Airflow, ensuring reliable orchestration, scheduling, monitoring, and recovery of data processes.

  4. Collaborate with software developers, scientists, and engineers to understand data sources, workflow requirements, and downstream data needs.

  5. Extract, transform, validate, and load data across systems, including relational databases such as PostgreSQL and Oracle.

  6. Write, optimize, and maintain complex SQL queries, scripts, and transformation logic for operational and analytical use cases.

  7. Troubleshoot data quality issues, ETL failures, pipeline bottlenecks, and schema inconsistencies; identify root causes and implement durable solutions.

  8. Support database exploration, data validation, and troubleshooting using DBeaver or similar database tools.

  9. Evaluate and help adopt new data tools and technologies, including lightweight analytics and transformation solutions such as DuckDB.

  10. Collaborate with engineering teams to support reliable integration between data pipelines, applications, APIs, and downstream consumers.

  11. Assist with schema evolution, data modeling, migration planning, and data consistency across systems.

  12. Document pipeline logic, data dependencies, transformation rules, and operational procedures to support maintainability and knowledge sharing.

  13. Improve data engineering standards, observability, testing practices, and operational reliability across the team.

  14. Regularly interact with scientists and engineers to understand research and technical workflows; experience in scientific or research environments is a plus.
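To make the core ETL duties above concrete, here is a minimal illustrative sketch of an extract–transform–load step of the kind items 2, 5, and 7 describe. It is not the employer's actual stack or code: the stdlib sqlite3 module stands in for PostgreSQL/Oracle, and the table and column names (`raw_readings`, `clean_readings`) are hypothetical. In production this logic would typically run as tasks orchestrated by Apache Airflow.

```python
import sqlite3

def extract(conn):
    """Pull raw rows from a source table."""
    return conn.execute("SELECT id, reading FROM raw_readings").fetchall()

def transform(rows):
    """Validate and normalize: drop null readings, cast to float."""
    return [(rid, float(val)) for rid, val in rows if val is not None]

def load(conn, rows):
    """Write cleaned rows to the target table."""
    conn.executemany("INSERT INTO clean_readings VALUES (?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    """One end-to-end ETL pass: extract, transform, load."""
    load(conn, transform(extract(conn)))

# Demo on an in-memory database (standing in for a real RDBMS).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_readings (id INTEGER, reading TEXT)")
conn.execute("CREATE TABLE clean_readings (id INTEGER, reading REAL)")
conn.executemany("INSERT INTO raw_readings VALUES (?, ?)",
                 [(1, "3.5"), (2, None), (3, "7.25")])
run_pipeline(conn)
print(conn.execute("SELECT COUNT(*) FROM clean_readings").fetchone()[0])  # 2
```

The validation step (dropping null readings) is the kind of data-quality guard item 7 asks the candidate to design and troubleshoot.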




Technical Skills: 2+ Years (or Equivalent Experience)

  1. Experience designing, building, and troubleshooting ETL/ELT pipelines.

  2. Hands-on experience with workflow orchestration tools, preferably Apache Airflow.

  3. Strong SQL development and optimization skills.

  4. Experience working with relational databases, especially PostgreSQL and Oracle.

  5. Ability to develop and maintain data transformations, validation steps, and pipeline logic across multiple systems.

  6. Experience with database tools such as DBeaver for query development, exploration, and troubleshooting.

  7. Familiarity with modern data processing and analytical tools such as DuckDB, or interest in evaluating emerging data technologies.

  8. Understanding of data modeling, schema design, data integrity, and performance tuning.

  9. Experience troubleshooting pipeline failures, performance issues, and inconsistent or incomplete datasets.

  10. Familiarity with scripting or programming for pipeline development and automation; Python experience strongly preferred.

  11. Understanding of version control and collaborative development workflows.

  12. Experience supporting production data systems with an emphasis on reliability, maintainability, and clear documentation.
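As a hedged illustration of the SQL development and optimization skills listed in items 3 and 8 (not code from the employer), the "latest row per group" pattern below uses a window function, a common analytical idiom in PostgreSQL. The `runs` table and its columns are invented for the example; sqlite3 is used only so the snippet is self-contained.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE runs (experiment TEXT, ts INTEGER, value REAL)")
conn.executemany("INSERT INTO runs VALUES (?, ?, ?)", [
    ("A", 1, 10.0), ("A", 2, 12.5), ("B", 1, 7.0), ("B", 3, 8.5),
])

# Latest value per experiment: rank rows within each experiment by
# timestamp (newest first), then keep only the top-ranked row.
query = """
SELECT experiment, value FROM (
    SELECT experiment, value,
           ROW_NUMBER() OVER (PARTITION BY experiment ORDER BY ts DESC) AS rn
    FROM runs
)
WHERE rn = 1
ORDER BY experiment
"""
print(conn.execute(query).fetchall())  # [('A', 12.5), ('B', 8.5)]
```

Replacing a correlated subquery with a single windowed scan like this is a typical optimization of the kind the posting's "complex SQL queries" duty involves.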




Team Skills

  1. Confident collaborating with developers, scientists, analysts, and product stakeholders.

  2. Ability to gather and clarify technical and data requirements and translate them into scalable data solutions.

  3. Strong communication skills regarding pipeline status, data quality issues, dependencies, and tradeoffs.

  4. Comfortable handling ambiguity, improving incomplete processes, and helping define best practices.

  5. Proactive in identifying opportunities to improve data workflows, tooling, performance, and operational stability.




Soft Skills

  1. Strong analytical and problem-solving skills.

  2. High attention to detail and commitment to data quality, consistency, and reliability.

  3. Demonstrated initiative in troubleshooting issues and improving pipeline robustness.

  4. Curiosity and willingness to evaluate and adopt new tools, technologies, and approaches.

  5. Ability to balance immediate operational needs with long-term maintainability and scalability.

  6. Comfortable proposing improvements, collaborating across teams, and building trust through reliable execution.


Job Snapshot

Employee Type

Contractor

Location

Painted Post, NY (Onsite)

Job Type

Engineering

Experience

Not Specified

Date Posted

05/14/2026

Job ID

26-11656
