Cloud ETL Developer
Careers Integrated Resources Inc
Richmond, VA (Onsite)
Contractor
Job Title: Sr. Cloud ETL Developer
Location: Richmond VA (Hybrid - 3 days/week)
Duration: 2+ months Contract (Possible extension)
Interview mode: Both Phone and In Person
Job Description:
Position Summary:
The client's Information Technology Division is seeking a senior ETL developer to ingest, transform, and load data assets and to implement a cloud-based data management platform that will support the agency.
The ETL developer will extract business data and load it into a data warehousing environment; design, program, and test the performance of the system; and consult with various teams to understand the agency's data storage needs and develop data warehousing options. The role requires deep knowledge of tools and languages such as Azure Data Factory, Databricks, Python, XML, and SQL, and familiarity with warehousing architecture techniques such as MOLAP, ROLAP, ODS, DM, and EDW.
Responsibilities:
- Design and develop integrations for the Enterprise Data Asset program, ETL processes, and business intelligence.
- Develop data engineering processes that leverage a cloud architecture, and extend or migrate existing data pipelines to that architecture as needed.
- Design and support the DW database and table schemas for new and existing data sources for the data hub and warehouse; design and develop data marts.
- Work closely with data analysts, data scientists, and other data consumers within the business to gather requirements and populate data hub and data warehouse table structures optimized for reporting.
- Partner with the data modeler and data architect to refine the business's data requirements for building and maintaining data assets.
- Apply an understanding of Agile methodologies and processes.
Preferred Skills:
- Advanced understanding of data integrations.
- Strong knowledge of database architectures.
- Strong analytical and problem-solving skills.
- Ability to build strong relationships both internally and externally.
- Ability to negotiate and resolve conflicts.
- Ability to effectively prioritize and handle multiple tasks and projects.
- Strong written and verbal communication skills.
- Desire to learn, innovate, and evolve technology.
Computer Skills/Ms Office/Software:
- Excellent computer skills and high proficiency with MS Word, PowerPoint, MS Excel, MS Project, MS Visio, and MS Team Foundation Server, all of which are necessary for creating visually and verbally engaging ETL and data designs and tables, and for communicating documentation and reporting.
- Deep passion for data analytics technologies as well as analytical and dimensional modeling. The candidate must be extensively familiar with ETL (Extraction, Transformation & Load), data warehousing, and business intelligence tools such as Business Objects, Power BI, and Tableau.
- The candidate must also have broad knowledge of database design and modeling in the context of data warehousing.
- Experience with key data warehousing architectures, including Kimball and Inmon, and broad experience designing solutions using a wide set of data stores (e.g., HDFS, Azure Data Lake Store, Azure Blob Storage, Azure SQL Data Warehouse, Databricks).
Technologies Required:
- Data Factory v2, Data Lake Store, Data Lake Analytics, Azure Analysis Services, Azure Synapse, Snowflake
- IBM DataStage, Erwin, SQL Server (SSIS, SSRS, SSAS), Oracle, T-SQL, Azure SQL Database, Azure SQL Data Warehouse
- Operating system environments (Windows, Unix, etc.)
- Scripting experience with Windows and/or Python, Linux client scripting
Skills:
- Design and develop systems for the maintenance of the Data Asset Program, ETL processes, and business intelligence.
- Design and support the DW database and table schemas for new and existing data sources for the data hub and warehouse; design and develop data marts.
- Work closely with data analysts, data scientists, and other data consumers within the business to gather requirements and populate data hub and data warehouse table structures.
- Advanced understanding of data integrations; strong knowledge of database architectures; strong understanding of ingesting spatial data.
- Ability to negotiate and resolve conflicts; ability to effectively prioritize and handle multiple tasks and projects.
- Excellent computer skills and high proficiency with MS Word, PowerPoint, MS Excel, MS Project, MS Visio, and MS Team Foundation Server.
- Experience with key data warehousing architectures, including Kimball and Inmon, and broad experience designing solutions using a wide set of data stores.
- Expertise in Data Factory v2, Data Lake Store, Data Lake Analytics, Azure Analysis Services, and Azure Synapse.
- IBM DataStage, Erwin, SQL Server (SSIS, SSRS, SSAS), Oracle, T-SQL, Azure SQL Database, Azure SQL Data Warehouse.
- Operating system environments (Windows, Unix, etc.); scripting experience with Windows and/or Python, Linux client scripting.
- Experience in Azure cloud engineering.
Desired:
- Experience with Snowflake