Data & Automation Engineer – Internship (4 months)
February 27, 2026
Join Pigment: The AI Platform Redefining Business Planning
Pigment is the AI‑powered business planning and performance management platform built for agility and scale. We connect people, data, and processes in one intuitive, feature‑rich solution, empowering every team—from Finance to HR—to build, adapt, and align strategic plans in real time.
Founded in 2019, Pigment is one of the fastest‑growing SaaS companies globally. Industry leaders like Unilever, Snowflake, Siemens, and DPD use Pigment daily to make more informed decisions and confidently navigate any scenario.
With a team of 500+ across Paris, London, New York, San Francisco, and Toronto, we’ve raised nearly $400M from top‑tier investors and were named a Visionary in the 2024 Gartner® Magic Quadrant™ for Financial Planning Software.
At Pigment, we take smart risks, celebrate bold ideas, and challenge the status quo—all while working as one team. If you’re driven by innovation and ready to make an impact at scale, we’d love to hear from you.
The Data & Automations Team
The Data & Automations team designs, implements, and maintains intelligent automation solutions that streamline business processes and enhance data‑driven decision making. We empower business and corporate functions through technology integration, robust data architecture, and process optimization, enabling teams to focus on high‑value activities while ensuring accuracy and scalability.
Our team operates across multiple domains, maintaining shared services and supporting Pigment’s internal functions through automation solutions. We work with a variety of technologies to build reliable data pipelines and automation workflows that support Pigment’s growth and operational efficiency.
Role
This internship centers on several key projects that improve code quality, documentation, and developer experience across the Data & Automations team’s internal tools and practices.
The internship is organized in phases that progress from revisiting one of the key internal development libraries we rely on and integrating it more closely with the rest of our modern data stack, through documentation and typing work, to more complex infrastructure and orchestration tasks.
This role involves collaborating with stakeholders across different teams at Pigment and requires strong communication skills in both French and English.
Tech environment
Our core stack relies on Google Cloud Platform (GCP) and Python, with components including BigQuery, Cloud Run, and Cloud Composer (Airflow), as well as dbt, Fivetran, and Hightouch.
You will have the opportunity to learn and grow your skills across these technologies while making a meaningful impact on Pigment’s operational efficiency.
Internship Overview
The internship is structured in three progressive phases, moving from foundational improvements to more complex data infrastructure work:
Phase 1: Onboarding & Peacock Improvements
Focus: Transform Peacock into a core asset of our data engineering toolbox through typing, documentation, and refactoring work, while enabling its direct usage within DAG components
Key projects:
- Type Peacock Objects – Add Python type hints to improve IDE support and catch bugs early (see the sketch after this list)
- Document Peacock Package – Create comprehensive documentation and usage examples
- Enable DAG Integration – Refactor Peacock to support direct usage from Airflow DAG components, eliminating the need for wrapper services
Skills learned: Python type hints, library design, cross‑service dependencies, technical writing, documentation best practices, Airflow integration patterns
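To give a flavor of the typing work, here is a minimal sketch of what a typed helper in a library like Peacock could look like. Peacock is Pigment‑internal, so every name below (push_rows, SyncResult) is hypothetical and purely illustrative:

```python
# Hypothetical sketch only: Peacock is an internal library, so these
# names are invented for illustration.
from dataclasses import dataclass, field


@dataclass
class SyncResult:
    rows_written: int
    errors: list[str] = field(default_factory=list)


def push_rows(
    table: str,
    rows: list[dict[str, object]],
    dry_run: bool = False,
) -> SyncResult:
    """Write rows to a destination table and return a typed result.

    Explicit signatures like this give IDE autocompletion and let a
    static checker (mypy, pyright) catch misuse before runtime.
    """
    if dry_run:
        return SyncResult(rows_written=0)
    # ... the real write logic would live here ...
    return SyncResult(rows_written=len(rows))
```

A function typed this way is also easier to call directly from an Airflow task, since its inputs and return value are explicit, which is the point of the DAG‑integration project above.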
Phase 2: Code Quality & Standards
Focus: Implement automated CI agents, enhance linting standards, and create unified frameworks
Key projects:
- Add Automated CI Agents – Build agents for PII detection, anonymization checks, and other data‑related validations (a sketch follows after this list)
- Review Linting – Enhance SQLFluff, Yamllint, and Ruff configurations across repositories
- Shared Utils Documentation – Audit and document duplicated utility functions
- Unified Automation Framework – Design standardized framework for automation services
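For a sense of what an automated CI agent might look like, here is an illustrative sketch (an assumed design, not the team’s actual tooling) of a check that scans files for common PII patterns and fails the build when it finds any:

```python
# Illustrative CI-style PII check: scan the files passed on the command
# line for common PII patterns and exit non-zero on any hit.
import re
import sys
from pathlib import Path

# Deliberately simple patterns; a real agent would be far more thorough.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone (FR)": re.compile(r"\b0[1-9](?:[ .-]?\d{2}){4}\b"),
}


def scan(paths: list[str]) -> int:
    hits = 0
    for name in paths:
        text = Path(name).read_text(encoding="utf-8", errors="ignore")
        for label, pattern in PII_PATTERNS.items():
            for match in pattern.finditer(text):
                print(f"{name}: possible {label}: {match.group(0)!r}")
                hits += 1
    return hits


if __name__ == "__main__":
    # A non-zero exit code makes the CI pipeline fail when PII is found.
    sys.exit(1 if scan(sys.argv[1:]) else 0)
```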
Phase 3: Data Synchronization
Focus: Implement data synchronization projects from the team backlog
Approach:
- Start with easy single‑direction syncs to build confidence (a minimal sketch follows below)
- Progress to medium‑complexity bi‑directional syncs
Skills learned: API integration, data mapping, ETL patterns, Airflow DAG development
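In its simplest form, a single‑direction sync of the kind this phase starts with might look like the following sketch. The endpoint, response shape, and table name are assumptions for illustration, and real syncs would run as Airflow DAG tasks rather than as a script:

```python
# Minimal single-direction sync sketch: pull records from a REST API
# and load them into BigQuery. All names here are hypothetical.
import requests
from google.cloud import bigquery

SOURCE_URL = "https://api.example.com/v1/records"  # assumed endpoint
DEST_TABLE = "my_project.raw.example_records"      # assumed table


def extract() -> list[dict]:
    resp = requests.get(SOURCE_URL, timeout=30)
    resp.raise_for_status()
    return resp.json()["records"]  # assumed response shape


def load(rows: list[dict]) -> None:
    client = bigquery.Client()
    # load_table_from_json ships the rows to BigQuery as one load job.
    job = client.load_table_from_json(rows, DEST_TABLE)
    job.result()  # block until the load completes or raises


if __name__ == "__main__":
    load(extract())
```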
Qualifications
Academic Background
- Currently enrolled in the 4th year of an engineering curriculum
- Strong academic foundation in Software Engineering, Data Engineering, or a related technical field
- Available for a 4‑month internship
Technical Skills
- Solid programming skills in Python (experience with libraries for automation and data processing is a plus)
- Good understanding of SQL and database concepts
- Familiarity with version control systems (Git)
- Basic knowledge of cloud platforms (GCP experience is a plus but not required)
- Understanding of API integrations and RESTful services
Soft Skills
- Strong problem‑solving mindset and analytical thinking
- Excellent attention to detail and a clear sense of what clean, maintainable code looks like
- Good communication skills in French and English (both written and verbal)
- Eagerness to learn new technologies and best practices
- Comfortable working in a collaborative, fast‑paced environment
Nice to Have
- Personal projects or open‑source contributions demonstrating coding skills
- Some exposure to modern data stack tools (dbt, Airflow, BigQuery)
- Understanding of CI/CD concepts and practices
- Experience with technical documentation and writing