Software Engineer 3
Careers Integrated Resources Inc
Remote, OR (Onsite)
Contractor
Job Description: We are looking for a Senior Software Engineer on the Data Enablement team who has the knowledge, experience, and passion to deliver solutions that amaze our consumers, and who understands the needs of managing a modern data environment while maintaining privacy, compliance, and user consent, and adhering to the local jurisdictions governing usage of data. Our software comprises services, APIs, and solutions that address those challenges and enable our customers to deliver their overall objectives and support enterprise consumers of their data.
You will be working on software projects related to privacy & governance that support Analytics, Supply Chain, and Commerce, among others. Your strong problem-solving, collaboration, and interpersonal communication skills, along with your desire to learn and share knowledge with others, will position you well for this role.
In this role, you will work with a variety of talented *** teammates and be instrumental in delivering innovative, quality products that enable consistent, maintainable, and highly scalable solutions. We don't need you to just code; you must also be able to solve problems and be the champion for ensuring our software is of the highest quality. You will be an advocate of new technology and development techniques and a driving force for building world-class solutions for *** Technology and its business partners.
Role responsibilities:
Technology competencies:
Design and implement scalable software in collaboration with product owners, data engineers, and business partners using Agile/Scrum methodology
Maintain and improve existing software
Experience delivering software on AWS or another cloud provider
Provide work estimates and represent work progress and challenges
Profile and analyze data for the purpose of designing scalable solutions
Utilize continuous integration and deployment frameworks including automated unit tests and integration testing
Create integrations with various vendor software and in-house software
Experience with OLTP & OLAP databases as well as NoSQL databases, and with data warehouse, data lake & lakehouse concepts
Engineering Delivery:
Design and implement highly scalable cloud-based data services
Develop frameworks that enable data ingestion using various patterns
Anticipate, identify and solve issues concerning data management to improve data quality, address operational & performance issues, remove technical bottlenecks and perform root cause analysis as needed.
Contribute to collaborative reviews of designs, code, and test plans
Support existing applications, resolving defects and delivering enhancements
The following qualifications and technical skills are essential for this role:
Bachelor's degree in Computer Science or a related field, and/or relevant industry experience
5+ years of experience in software development building back-end components and services
Experience building services and APIs: REST, gRPC, GraphQL
Professional experience using AWS services such as EMR, Lambda, Elasticsearch, RDS, DynamoDB & Kinesis
Ability to code in Python, Java, JavaScript, Scala, or other OO & FP languages
Experience with IaC (infrastructure as code) using tools like Terraform and/or CloudFormation
Experience using build automation tools such as Jenkins
Experience with source code control tools like GitHub or Bitbucket
The following skills and experience are also relevant to our overall environment, and nice to have:
Experience using Jira
Experience using Splunk
Experience writing unit tests, integration tests & e2e tests
Knowledge of data warehouse, data lake & lakehouse concepts
Preferred Skills:
Python, Java, Scala, JavaScript, SQL, streams, streaming data, NoSQL, data warehousing, RDS, PostgreSQL, Copilot, JetBrains, unit testing, integration testing, OOD, FP, data modeling, distributed systems.
Comments for Suppliers: Open to remote, but the candidate must work PST hours.