Data Engineer 4
Careers Integrated Resources Inc
Salem, OR (Onsite)
Contractor
Project Title: Data Engineer 4
Project Duration: 3 months, with possible extension
Project Location: Remote
Job Description:
Artifacts that you are accountable/responsible for:
- PySpark and SQL
  - Maintaining the codebase for data pipelines
  - Init scripts or Makefiles (if required for any part of the build or pipeline)
  - General workflow/job/task configurations
  - Compliance job configurations
  - Domain-specific business logic
  - Unit tests
  - Integration tests
  - Performance-testing utilities where required
  - Data product DDL/DML
- Code dependencies
  - Python wheels
  - JARs (e.g., *** OAuth or other compliance)
- DBX
  - Specific objects/entities related to the Unity Catalog
  - Volumes
  - Checkpoints
  - Workspaces/notebooks (Python, SQL, Scala)
  - Processes as code for interfacing with Sole utilities
- Infrastructure
  - Cluster configuration as code (JSON)
  - Other IaC for deployments (Terraform modules and scripts)
  - CI/CD and GitHub integrations
  - System integration profiles
- QA test scripts and results
- Performance tuning
  - High-volume data management via Spark Streaming
- Observability
  - Dashboard configurations
  - Alerts and queries for monitoring
- Technical documentation (README, wiki, Lucid, Confluence)
Responsibilities:
- Convert SQL scripts provided by the Data and Analytics manager into PySpark
- Write Python scripts for DAGs in Airflow and for Workflows (orchestration)
- Manage quality assurance testing (systems integration testing, regression testing, etc.)
- Conduct upgrades for existing products
- Manage data migration services
- Lead production support, low-level design, code development/scheduling, unit testing, quality assurance, best practices, documentation and PS handoff, continuous improvement, third-level production support, and engineering excellence
Notes:
We are seeking individuals who can adapt quickly and begin their work responsibilities right away.