Data Engineer 4
Careers Integrated Resources Inc
Remote, OR (Onsite)
Contractor
Job Description: Bachelor's degree or higher, or a combination of relevant education, experience, and training in Computer Science.
6+ years of experience in Data Engineering.
4+ years of experience working with Python, specifically for data processing, with proficiency in Python libraries such as Pandas, NumPy, PySpark, pyodbc, pymssql, Requests, Boto3, SimpleSalesforce, and JSON.
3+ years of experience with data warehouse technologies Databricks and Snowflake.
Strong data engineering fundamentals (ETL, modelling, lineage, governance, partitioning and optimization, migration).
Strong Databricks-specific skills (Apache Spark, DB SQL, Delta Lake, Delta Share, Notebooks, Workflows, RBAC, Unity Catalog, encryption and compliance).
Strong SQL skills (query performance, stored procedures, triggers, schema design) and knowledge of one or more RDBMS and NoSQL databases such as MSSQL/MySQL and DynamoDB/MongoDB/Redis.
Cloud platform expertise: AWS and/or Azure.
Experience with one or more ETL tools such as Apache Airflow, AWS Glue, Azure Data Factory, Talend, or Alteryx.
Excellent knowledge of coding and architectural design patterns.
Passion for troubleshooting, investigation, and root-cause analysis.
Excellent written and verbal communication skills.
Ability to multitask in a high-energy environment.
Experience with Agile methodologies and knowledge of Git, Jenkins, GitLab, Azure DevOps, and tools like Jira/Confluence.
Nice to have:
Tools like Collibra and Hackolade.
Migration Strategy and Tooling
Data Migration Tools: Experience with migration tools and frameworks or custom-built solutions to automate moving data from Snowflake to Databricks.
Testing and Validation: Ensuring data consistency and validation post-migration with testing strategies such as checksums, row counts, and query performance benchmarks.
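The validation strategy above (row counts plus checksums) can be sketched in a minimal, self-contained form. This is an illustrative example only, not the employer's tooling; the sample rows stand in for extracts that a real migration would pull via the Snowflake and Databricks connectors, where checksums are typically computed database-side with aggregate hash functions instead.

```python
import hashlib

def table_checksum(rows):
    """Order-insensitive checksum: SHA-256 each row, XOR the digests together.

    Illustrative only: XOR folding ignores row order but can miss paired
    duplicate rows; production validations usually use DB-side aggregate
    hashes (e.g. a hash-aggregate over all rows on each platform).
    """
    acc = 0
    for row in rows:
        digest = hashlib.sha256(repr(row).encode()).digest()
        acc ^= int.from_bytes(digest, "big")
    return acc

def validate_migration(source_rows, target_rows):
    """Compare row counts and checksums between source and target extracts."""
    return {
        "row_count": len(source_rows) == len(target_rows),
        "checksum": table_checksum(source_rows) == table_checksum(target_rows),
    }

# Hypothetical sample data standing in for Snowflake (source) and
# Databricks (target) extracts of the same table.
source = [(1, "a"), (2, "b"), (3, "c")]
target = [(3, "c"), (1, "a"), (2, "b")]  # same rows, different order
print(validate_migration(source, target))
```

Both checks pass here because the tables hold identical rows, just in a different physical order; a dropped or mutated row would flip `checksum` (and possibly `row_count`) to False.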
Comments for Suppliers: Beaverton WHQ preferred; please note if the candidate is local.
If remote, the candidate must work PST hours; please confirm willingness to work PST hours in the notes.
The candidate will be working with 2 lead engineers, one FTE engineer, and 4 other ETWs.