The opportunity
The Data Engineer is a new position, created in response to the rapid growth of the business.
We are looking for a skilled AWS Data Engineer to take ownership of our existing cloud-based data infrastructure and evolve it into a best-practice, scalable, and business-critical platform. You will be responsible for refining and expanding our AWS-based data ecosystem, ensuring it supports the organisation’s growing analytical and reporting needs across Development, Finance, Operations, and beyond.
This is a hands-on role where you will define technical direction, embed best practices, and drive data quality, efficiency, and cost optimisation across the platform.
Job requirements
- Strong hands-on experience with AWS data services (e.g., S3, Glue, Lambda, Athena, Redshift, DynamoDB).
- Strong hands-on experience with data visualisation tools such as QuickSight.
- Proficiency in Python for data processing, API integration, and automation (a typical task is sketched after this list).
- Strong SQL skills for data extraction and transformation.
- Experience working with REST APIs and JSON data structures.
- Practical experience with GitLab (version control, branching, and CI/CD pipelines).
- Solid understanding of data pipeline design and ETL best practices.
- Experience with infrastructure as code (e.g., Terraform or CloudFormation).
- Advantageous to have: knowledge of Salesforce and/or Sage Intacct data flows.
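To give candidates a flavour of the day-to-day work, here is a minimal sketch of one such task: pulling JSON records from a REST API in Python and landing them in S3 for downstream processing with Glue and Athena. The API URL and bucket name below are hypothetical placeholders, not our actual systems.

```python
import json
from datetime import datetime, timezone

import boto3
import requests

# Hypothetical endpoint and bucket -- placeholders for illustration only.
API_URL = "https://api.example.com/v1/orders"
LANDING_BUCKET = "example-data-lake-raw"


def ingest_orders() -> str:
    """Fetch JSON records from a REST API and land them in S3 as a
    date-partitioned object, ready for cataloguing by Glue/Athena."""
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    records = response.json()

    # A dt=YYYY-MM-DD partition layout keeps downstream Athena scans cheap.
    now = datetime.now(timezone.utc)
    key = f"orders/dt={now:%Y-%m-%d}/orders_{now:%H%M%S}.json"

    boto3.client("s3").put_object(
        Bucket=LANDING_BUCKET,
        Key=key,
        Body=json.dumps(records).encode("utf-8"),
        ContentType="application/json",
    )
    return key


if __name__ == "__main__":
    print(f"Landed raw data at s3://{LANDING_BUCKET}/{ingest_orders()}")
```

In practice a job like this would typically run as a scheduled Lambda function or Glue job, with the landed data catalogued for querying in Athena and visualisation in QuickSight.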
To understand the full requirements for this opportunity, please read the complete job spec.
Primary responsibilities
- Identify and implement improvements to our current data infrastructure, whether that means cost optimisation or closer adherence to best practice.
- Collaborate with business stakeholders – including Development, Finance, and Product teams – to identify new data opportunities and translate requirements into practical solutions.
- Design, build, and maintain data ingestion and transformation pipelines, along with visualisations, using AWS services (e.g., S3, Glue, Lambda, Athena, Redshift, Step Functions, QuickSight).
- Develop Python scripts and ETL processes for data processing and automation.
- Integrate data from various APIs and third-party sources into cloud-based data systems.
- Write and optimise SQL queries for data extraction, transformation, and validation (one approach is sketched after this list).
- Use GitLab for version control, CI/CD, and collaborative development.
- Implement best practices for data quality, scalability, and security.
- Monitor and control data platform costs.
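As an illustration of the SQL and AWS work described above, here is a sketch of running an Athena query from Python and collecting the results. The database name, table, and results location are placeholder assumptions, and the simple polling loop is for brevity (Step Functions would orchestrate this more robustly in production).

```python
import time

import boto3

# Placeholder Athena configuration -- assumed names for illustration.
DATABASE = "analytics"
OUTPUT_LOCATION = "s3://example-athena-results/"

QUERY = """
SELECT dt, COUNT(*) AS order_count
FROM raw_orders
GROUP BY dt
ORDER BY dt
"""


def run_athena_query(sql: str) -> list[dict]:
    """Run a SQL query via Athena and return the rows as dictionaries."""
    athena = boto3.client("athena")
    query_id = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": DATABASE},
        ResultConfiguration={"OutputLocation": OUTPUT_LOCATION},
    )["QueryExecutionId"]

    # Poll until the query completes; acceptable for a short sketch.
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(1)
    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")

    # The first result row holds the column headers; later rows hold the data.
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    header = [col["VarCharValue"] for col in rows[0]["Data"]]
    return [
        {name: col.get("VarCharValue") for name, col in zip(header, row["Data"])}
        for row in rows[1:]
    ]


if __name__ == "__main__":
    for row in run_athena_query(QUERY):
        print(row)
```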
What we offer
- 25 days holiday, rising to 28 days per annum with length of service
- Medical, dental and optical insurance cover
- An exciting and flexible working environment surrounded by friendly and committed co-workers
- UK: Electric Vehicle Scheme
- “Work from anywhere” 2 weeks per year policy
- Training and development opportunities
- Access to an employee assistance programme and wellbeing support hub
- Team events
- Ad-hoc incentives and competitions