Software Engineer - AI Workbench
PhysicsX
The Role
PhysicsX is developing a platform used by Data Scientists and Simulation Engineers to build, train, and deploy Deep Physics Models. The core of this platform relies on handling massive volumes of complex simulation data, enabling high-fidelity multi-physics simulation through AI inference.
We are looking for a Software Engineer with a strong background in building data platforms to join our team. You will not just be moving data from A to B; you will be architecting and building the distributed systems, services, and APIs that form the backbone of our data strategy. You will bridge the gap between complex physical simulations and modern data infrastructure, implementing storage solutions for AI/ML pipelines and creating the analytical layers that allow our engineers to visualise and understand their results.
You will also play a key role in shaping technical direction — contributing to Technical Decision Records, collaborating with experienced engineers, and helping to drive the standards that keep our platform reliable, secure, and performant. This is a role for a builder who loves coding robust software as much as they love designing efficient data architectures.
What you will do
- Contribute to the design and build of scalable microservices, APIs, and data pipelines for high-dimensional simulation data across the ML lifecycle, working within established architectural patterns.
- Build and maintain automated data ingestion and processing pipelines that power active learning loops, serving both no-code and pro-code users.
- Implement and integrate data infrastructure components (Data Warehouses, Data Lakes, storage solutions) for simulation and deep learning workloads.
- Build internal tools that enable BI dashboards and scientific data visualisations, making large datasets intuitive and accessible.
- Own features end-to-end — from implementation through testing, deployment, and maintenance — writing clean, well-tested, secure code.
- Contribute to reliability standards, performance monitoring, and quality of service metrics. Identify and help resolve performance bottlenecks.
- Follow and contribute to API schema standards, security practices, and data access control patterns.
- Participate in CI/CD pipeline maintenance, automated testing, and observability practices, including supporting zero-downtime deployments.
- Participate in code reviews, knowledge sharing, and cross-functional collaboration with data scientists and researchers.
- Contribute to Technical Decision Records and team discussions on tooling and architectural trade-offs.
What you bring to the table
- A passion for the craft — you're driven by engineering excellence and committed to fostering that culture across the team.
- Strong software engineering foundations — solid grasp of algorithms, data structures, and system design. You write clean, maintainable, testable code in Python with working knowledge of Golang or Rust.
- Data platform exposure — experience building or contributing to data processing systems in production. Familiarity with tools like Databricks, Snowflake, or BigQuery and concepts around Data Warehouses and Data Lakes.
- API and service design — experience working with multi-service architectures, understanding schema design and data access patterns.
- Security and reliability awareness — understanding of security fundamentals, monitoring, alerting, and quality of service in production systems.
- CI/CD familiarity — practical experience with CI/CD pipelines and deployment workflows.
- Safe code execution awareness — understanding of or interest in sandboxing, isolation, and security considerations for running user-submitted code.
- Problem-solving — ability to diagnose issues, debug across services, and optimize data processing performance.
- Communication and collaboration — strong communication skills for working with cross-functional teams. Comfortable participating in code reviews and supporting peers.
- Incremental mindset — you work in small steps, balance detail with the big picture, and proactively seek help when blocked.
Ideally
- Programming skills: strong Python expertise and experience with at least one compiled language (Golang, C++, or Rust).
- Data systems experience: practical experience building and maintaining data processing systems, working with diverse data types and running analytics on large datasets.
- Kubernetes and infrastructure: familiarity with Kubernetes concepts and infrastructure configuration tools (e.g., Helm, ArgoCD).
- Testing and debugging: solid debugging and troubleshooting skills, with exposure to automated testing practices.
- Bonus: understanding of 3D geometry processing, physics simulation data structures, or building software for different deployment environments (cloud, on-premises).
What We Offer
- Equity options – share in our success and growth.
- 10% employer pension contribution – invest in your future.
- Free office lunches – great food to fuel your workdays.
- Flexible working – balance your work and life in a way that works for you.
- Hybrid setup – enjoy our new Shoreditch office while keeping remote flexibility.
- Enhanced parental leave – support for life’s biggest milestones.
- Private healthcare – comprehensive coverage.
- Personal development – access learning and training to help you grow.
- Work from anywhere – extend your remote setup to enjoy the sun or reconnect with loved ones.