Data Engineer
Careers Integrated Resources Inc
Houston, TX (Onsite)
Contractor
Title: Data Engineer
100% Remote
Contract: 4 Months and Possibility of extension
Must Have:
AWS, Snowflake, SQL, DBT, Python
Job Summary
The Data Engineer will be responsible for developing ETL and data pipelines using AWS, Snowflake, and DBT. The ideal candidate is an experienced data pipeline builder using ELT methodology. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
Essential Duties and Responsibilities
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Cloud Integration ETL Tools, Cloud Data Warehouse, SQL, and AWS technologies.
Design and develop ELT, ETL, and event-driven data integration architecture solutions
Work with the Data Analysts, Data Architects, BI Architects, Data Scientists, and Data Product Owners to establish an understanding of source data and determine data transformation and integration requirements
Troubleshoot and tune complex SQL
Utilize on-prem and cloud-based ETL platforms, cloud data warehouses, AWS, GitHub, various scripting languages, SQL, querying tools, data quality tools, and metadata management tools.
Develop data validation processes to ensure data quality
Demonstrated ability to work individually and as a part of the team in a collaborative manner
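The data-validation duty above can be sketched in Python (one of the posting's must-have skills). This is a minimal, generic illustration, not the employer's actual process; the record fields (`id`, `amount`) and the function name are hypothetical examples.

```python
# Minimal sketch of a data-validation step for a pipeline stage.
# Field names below are hypothetical, not taken from the job posting.

def validate_rows(rows, required_fields):
    """Split rows into valid records and (index, missing-fields) errors."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        # A field counts as missing if it is absent, None, or empty string.
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            errors.append((i, missing))
        else:
            valid.append(row)
    return valid, errors

# Example: two staged records, one missing a required field.
rows = [
    {"id": 1, "amount": 10.5},
    {"id": 2, "amount": None},
]
valid, errors = validate_rows(rows, ["id", "amount"])
```

In a real pipeline a check like this would typically run between the load and transform steps, with failing rows routed to a quarantine table for review rather than silently dropped.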
Qualifications
The requirements listed below are representative of the qualifications necessary to perform the job.
Education and Experience
Bachelor's degree (or foreign equivalent) in Computer Science, Computer Engineering, or a related field.
8+ years of experience with Data Engineering, ETL, data warehouse/data mart development, data lake development
Advanced SQL knowledge and experience with relational databases, including query authoring and working familiarity with a variety of databases.
Experience working with cloud data warehouses such as Snowflake, Google BigQuery, and Amazon Redshift
Experience with AWS cloud services: EC2, S3, Lambda, SQS, SNS, etc.
Experience with cloud integration tools such as Matillion, Dell Boomi, Informatica Cloud, Talend, or AWS Glue
Experience with GitHub and its integration with the ETL tools for version control
Experience with Informatica PowerCenter, various scripting languages, SQL, and querying tools
Familiarity with modern data management tools and platforms including Spark, Hadoop/Hive, NoSQL, APIs, Streaming, and other analytic data platforms
Experience with object-oriented/object function scripting languages: Python, Java, Scala, etc., a plus.
Experience with Agile/Scrum is valuable.
Other Knowledge, Skills, or Abilities Required
Must speak English.
Ability to work with business as well as IT stakeholders.
Strong written and oral communication skills with the ability to work with users, peers, and management.
Strong interpersonal skills.
Ability to work independently and as part of a team to successfully execute projects.
Highly motivated, self-starter with problem-solving skills.
Ability to multitask and meet aggressive deadlines efficiently and effectively.
Extraordinary attention to detail and accuracy.