DevOps Engineer/DataOps Engineer needs 3+ years of experience with core AWS services (Lambda, S3, ECS, Glue, Kinesis, IAM, CloudWatch, EC2, EKS, etc.)

DevOps Engineer/DataOps Engineer requires:

• Location: Detroit, MI or Charlotte, NC

• 4 years of experience working with CI/CD tools (GitLab, GitHub, etc.)

• Solid experience with multiple scripting languages (Python, Bash, PowerShell, etc.)

• Minimum 2 years’ experience with configuration management/infrastructure as code (Terraform, Ansible)

• AWS certification preferred

• Development experience with databases and SQL; familiarity with Snowflake is highly desired

• Solid understanding of DevSecOps methodologies and practices

• Six or more years of experience with core development languages (Java, Node.js, .NET, etc.)

• Experience with REST APIs

• Knowledge of common data serialization formats (YAML, JSON, XML, etc.)

• Automation experience with Unix/Red Hat Linux and Ansible is a plus

• Detail- and results-oriented

• Ability to work independently and prioritize tasks

• Working knowledge of Agile methodologies

• Prior work experience in financial services and/or other regulated environments is a plus.

• Bachelor’s degree in a technical field, or 3+ years of experience in lieu of a degree

DevOps Engineer/DataOps Engineer duties:

• Gather client requirements and design consistent, repeatable processes for complex operations

• Collaborate with IT operations, development teams, and other Line of Business (LOB) partners

• Drive day-to-day DevOps activities: intake, priority management, deliverables, etc.

• Partner with development/data engineering and platform teams to ensure proper knowledge and implementation of CI/CD processes and tools

• Provide ad hoc support by troubleshooting DevOps pipeline and deployment issues.

• Partner with Enterprise teams to ensure compliance with enterprise standards and patterns.