Cloud AWS Engineer
Careers Integrated Resources Inc
Richmond, VA (Onsite)
Contractor
Understand the technology vision and the strategic direction of the business
Understand our current data model and infrastructure, proactively identify gaps and areas for improvement, and prescribe architectural recommendations with a focus on performance and accessibility.
Partner across engineering teams to design, build, and support the next generation of our analytics systems.
Partner with business and analytics teams to understand specific requirements for data systems to support both development and deployment of data workloads ranging from Tableau reports to ad hoc analyses.
Own and develop architecture supporting the translation of analytical questions into effective reports that drive business action.
Automate and optimize existing data processing workloads by recognizing patterns of data and technology usage and implementing solutions.
Maintain a solid grasp of the intersection between analytics and engineering, taking a proactive approach to ensure solutions demonstrate high levels of performance, privacy, security, scalability, and reliability upon deployment.
Provide guidance to partners on effective use of the database management systems (DBMS) platform through collaboration, documentation, and associated standard methodologies.
Design and build end-to-end automation to support and maintain software currency
Create automation services for builds using Terraform, Python, and OS Client scripts.
Develop validation and certification processes through automation tools
Design integrated solutions in alignment with design patterns, blueprints, guidelines, and standard methodologies for products
Participate in developing solutions by incorporating cloud-native and third-party vendor products
Participate in research and perform POCs (proofs of concept) with emerging technologies and adopt industry best practices in the data space for advancing the cloud data platform.
Develop data streaming, migration, and replication solutions (a minimal streaming sketch follows this list)
Demonstrate leadership, collaboration, exceptional communication, negotiation, strategic thinking, and influencing skills to build consensus and produce the best solutions.
Engage with senior leadership, business leaders at the Client, and the Board to share the business value.
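As a rough illustration of the streaming work described above, the following minimal Python sketch publishes records to an Amazon Kinesis stream with boto3. The stream name, field names, and payload are placeholder assumptions rather than details from this role, and configured AWS credentials are assumed.

import json
import boto3

# Placeholder stream name; assumes AWS credentials and a default region are configured.
STREAM_NAME = "analytics-events"

kinesis = boto3.client("kinesis")

def publish_event(event: dict) -> None:
    # Serialize the event and send it to the Kinesis stream, partitioned by user_id.
    kinesis.put_record(
        StreamName=STREAM_NAME,
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event.get("user_id", "unknown")),
    )

publish_event({"user_id": 42, "action": "report_viewed"})

A producer like this is only meant to show the shape of the work; in practice it would sit behind whatever migration or replication tooling the team standardizes on.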
Qualifications:
Demonstrates mutual respect, embraces diversity, and acts with authenticity
Bachelor's degree in Computer Science, Management Information Systems, Computer Engineering, or a related field, or equivalent work experience; advanced degree preferred
Seven or more years of experience designing and building large-scale solutions in an enterprise setting, in both on-premises and cloud environments
Three years of experience designing and building solutions in the cloud
Expertise in building and maintaining SageMaker infrastructure and deploying AWS SageMaker components using Terraform as Infrastructure as Code (see the verification sketch after this list)
Work closely with Data Scientists to ensure seamless deployment of Jupyter notebooks
Deep SQL expertise, data modeling, and experience with data governance in relational databases
Experience with the practical application of data warehousing concepts, methodologies, and frameworks using traditional (Vertica, Teradata, etc.) and current (SparkSQL, Hadoop, Kafka) distributed technologies
Refined skills using one or more scripting languages (e.g., Python, bash, etc.)
Experience using ETL/ELT tools and technologies such as Talend or Informatica is a plus
Embrace data platform thinking; design and develop data pipelines with security, scale, uptime, and reliability in mind
Expertise in relational and dimensional data modeling
UNIX admin and general server administration experience required
Presto, Hive, SparkSQL, Cassandra, Solr, or other Big Data query and transformation experience is a plus
Experience using Spark, Kafka, Hadoop, or similar distributed data technologies is a plus
Able to expertly express the benefits and constraints of technology solutions to technology partners, business partners, and team members
Experience with leveraging CI/CD pipelines
Experience with Agile methodologies and the ability to work in an Agile manner is preferred
One or more cloud certifications.
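As a rough illustration of the SageMaker-on-Terraform qualification above, the following minimal Python/boto3 sketch checks that deployed SageMaker notebook instances report an InService status. It assumes configured AWS credentials, and every name in it is illustrative rather than taken from this role.

import boto3

# Assumes AWS credentials and a default region are configured.
sagemaker = boto3.client("sagemaker")

def notebook_statuses() -> dict:
    # Collect {instance_name: status} for every SageMaker notebook instance in the account.
    statuses = {}
    paginator = sagemaker.get_paginator("list_notebook_instances")
    for page in paginator.paginate():
        for nb in page["NotebookInstances"]:
            statuses[nb["NotebookInstanceName"]] = nb["NotebookInstanceStatus"]
    return statuses

for name, status in notebook_statuses().items():
    flag = "OK" if status == "InService" else "CHECK"
    print(f"[{flag}] {name}: {status}")

A check of this kind could run as a post-apply step in a CI/CD pipeline alongside the Terraform code that creates the instances.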