Software Engineer

Job Description

Project Description:

Join a high‑performing engineering squad building and operating production‑grade software and processing solutions on AWS. You will drive a shift‑left quality culture, strong automation, and resilient designs while owning features end to end from design to deployment and monitoring. The environment includes Python, PySpark and Spark on AWS (Glue, EMR, S3), Airflow for orchestration, and modern CI/CD. The team partners closely with product and platform groups to deliver reliable, observable services for financial services use cases.

Skills Required:

Experience:

• 8+ years of hands-on programming and software development.
• Proven ability to design and build automation frameworks.
• Familiarity with observability tools and microservices architecture.

The successful applicant will use a broad range of tools, languages, and frameworks. We encourage you to apply if you have strong experience with several of the skills below, even if you do not know all of them.

• Python and scripting: Strong hands‑on development in Python plus pragmatic shell scripting in Linux environments.
• AWS stack: Commercial experience with AWS Glue, Spark/PySpark and S3 for large‑scale processing.
• Orchestration: Building, scheduling and operating pipelines with Airflow, including DAG design, retries and SLAs.
• CI/CD, shift‑left testing and DevSecOps: Unit, integration and contract tests embedded in pipelines, using Git‑based workflows and common tools such as TeamCity, GitHub Actions, Jenkins or Octopus.
• Observability: Practical use of logging, metrics and tracing with tools like CloudWatch and Splunk to monitor production health.
• Cloud‑native engineering: Designing for scalability, reliability and cost on AWS, following security and governance standards.

Nice To Have:

• Ab Initio or SAS: Prior experience integrating or migrating legacy ETL workloads.
• API and microservices: Experience testing and integrating with RESTful services and event streams.
• Test automation frameworks: Familiarity with Playwright, DevTest, Appium, Sahi or similar, plus contract testing.
• Dashboards and reporting: Building engineering or quality dashboards for delivery and production health.
• Team leadership: Mentoring engineers and uplifting standards across squads.

Responsibilities:

• Design, build and maintain robust, scalable services and processing pipelines in Python and PySpark on AWS.
• Implement automation across testing, quality, security and deployment, embedding unit, integration and contract tests into CI/CD.
• Orchestrate and operate jobs and workflows using Airflow and AWS Glue, ensuring reliability, cost efficiency and observability.
• Collaborate with product, platform and QA to ensure systems are testable, observable and resilient, with clear SLOs and dashboards.
• Own features end to end including design, implementation, code reviews, performance tuning, deployment and production support.
• Contribute to engineering standards, code quality and governance, including DevSecOps practices and release processes.
• Mentor engineers, promote best practices and continuously improve tooling, pipelines and developer experience.