
Senior Data Engineer

Remote with the option to use the Edinburgh office - Permanent/Full Time


BR-DGE is an award-winning FinTech founded in Edinburgh. Our platform gives e-commerce and technology businesses the freedom and flexibility to redefine the way they handle payments.

Since our inception in 2018, we have been leading the way in the future of payment orchestration. Our products enable enterprise businesses to optimise their payment infrastructure and create frictionless digital payment experiences for their end users. Now with a global reach, our customer base is made up of incredible brands and household names from across the travel, retail and gambling sectors, and it’s growing fast! Our world-class partners include Visa and Worldpay, and we’re continuing to build a strong partner network with the biggest players in the payments industry. It’s an exciting time to be part of BR-DGE!

The journey so far has been incredible, but we’re just getting started and with ambitious growth plans, we’re now looking for more exceptional talent to join our team.

• Flexible and remote working
• Remote working allowance
• 33 days holiday, including public holidays
• Your birthday as a day off
• Family healthcare
• Life insurance
• Employee assistance programme
• A culture that champions rapid career progression
• Investment in your learning and development
• Regular team events and socials

Why this role exists
1. Data is becoming a critical part of BR-DGE’s next growth phase, powering internal analytics as well as customer-facing insights and monitoring.
2. The data engineering space is largely greenfield. We need a production-grade data platform that can ingest, transform, validate, and monitor data from core systems and operational tooling.
3. The robustness, scalability, and governance of our data architecture impact our ability to grow safely and meet regulatory expectations.
4. This role owns the insights data platform, partnering closely with Analytics, Product, and Engineering to ensure the platform delivers trusted datasets and timely signals.

What you will do:
1. Design and ship a tiered data platform that supports multiple latency needs, including low-latency pipelines for operational monitoring and customer-facing insights, plus batch pipelines for reporting and deeper analysis.
2. Build and own end-to-end ingestion patterns across batch, micro-batch, and selected near-real-time use cases, with strong orchestration and dependency management.
3. Implement schema evolution, data contracts, and approaches for late-arriving and corrected data so consumers can trust the outputs.
4. Treat curated datasets as products that are well defined, documented, reliable, and safe to use for both internal and external consumers.
5. Set platform standards for idempotent ingestion, deduplication, data quality, lineage, and observability.
6. Ensure the platform meets regulated fintech and payments expectations for access control, security, and governance while staying cost-efficient as volumes grow.
7. Partner with Product and Engineering on event and domain modelling, deciding what data gets emitted and what latency and granularity are needed for analytics and product goals.
8. Support Data Science with reliable, feature-ready datasets and pragmatic collaboration, without owning reporting or business analysis.
9. Evolve the current lightweight tooling into a more observable, structured platform, improving standards without creating unnecessary complexity.
10. Automate data infrastructure and workflows using infrastructure-as-code and CI/CD practices.
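To give a flavour of the standards described in points 2, 3, and 5 above, idempotent ingestion with deduplication and late-arriving corrections can be sketched in miniature. This is an illustrative toy only, not BR-DGE’s actual platform; the `IdempotentStore` class and its versioning scheme are invented for the example:

```python
class IdempotentStore:
    """Toy event store: the same record ingested twice lands exactly once,
    and a late-arriving correction (same key, higher version) supersedes
    the earlier row instead of duplicating it."""

    def __init__(self):
        # business key -> (version, payload)
        self.rows = {}

    def ingest(self, key, payload, version):
        """Return True if the row was written, False if it was a no-op."""
        current = self.rows.get(key)
        # Dedup/idempotency: replays of the same or an older version are
        # safely ignored, so upstream retries cannot double-count.
        if current is not None and current[0] >= version:
            return False
        self.rows[key] = (version, payload)
        return True


store = IdempotentStore()
store.ingest("txn-1", {"amount": 100}, version=1)   # first write lands
store.ingest("txn-1", {"amount": 100}, version=1)   # replay: deduplicated
store.ingest("txn-1", {"amount": 95}, version=2)    # late correction wins
```

In a real pipeline the same idea is usually expressed as merge/upsert logic keyed on a stable business identifier plus a version or event timestamp, so that retries and out-of-order arrivals converge on one trustworthy row.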

What we are looking for:

Must have
1. Proven experience designing, building, and operating production-grade data pipelines and platforms.
2. Strong SQL, specifically PostgreSQL, plus at least one programming language such as Python or Java.
3. Experience with data processing or orchestration tooling such as Spark, Airflow, or Kafka.
4. Experience designing data models for analytics and reporting workloads.
5. Practical knowledge of data quality, testing, observability, lineage, and governance patterns.
6. Strong experience with AWS-based data platforms, with hands-on use of services like S3, Glue, Athena, Redshift, Kinesis, EMR, or MSK.
7. Infrastructure-as-code experience using Terraform or CloudFormation, and comfort running systems in production.
8. Ability to collaborate across Engineering, Product, Analytics, and Data Science, and drive standards through influence.

Nice to have

1. Experience building customer-facing data products where latency and correctness affect user outcomes.
2. Experience in regulated fintech or payments environments, especially around access control and auditability.
3. Experience with cost and performance optimisation at scale in AWS data stacks.

Tech context

This role will work across ingestion, orchestration, modelling, governance, and observability in an AWS-centric environment, with PostgreSQL and modern data tooling. Current tooling is intentionally lightweight, and the platform is evolving as BR-DGE grows. In some areas you do not need to be hands-on day to day, but you must be fluent enough to make strong technical decisions and review work.

What We Offer:
• Flexible, remote-first working
• 33 days holiday, including public holidays
• Birthday off
• Family healthcare
• Life insurance
• Employee assistance programme
• Investment in learning and development
• Regular team events and off-sites
• A collaborative culture where documentation is treated as a first-class product