Specialist / Manager Data and AI Engineering

Permanent employee, Full-time · Munich

Your Mission
You will be the technical backbone of Armira’s emerging Data & AI function, responsible for building and maintaining the firm’s internal data infrastructure and AI-powered workflows. This is a two-pillar builder role: you will be responsible for (1) designing and implementing Armira’s central data warehouse, ingestion pipelines, data models, and reporting layer, and (2) building AI- and LLM-powered internal tools and workflows that support the investment team’s day-to-day processes, leveraging existing APIs, agent frameworks, and workflow automation tools. Depending on your background, you may lean more toward one pillar initially – what matters is the ability and drive to work across both.

This role is not a research-focused data scientist position and does not involve training bespoke ML models. Instead, it combines data platform engineering with pragmatic use of modern LLM APIs, agent frameworks, and workflow automation platforms (such as n8n or Make) to ship production-ready tools that deliver immediate value to the investment team and the wider firm.

You will work closely with the Head of Origination, your role sponsor. While they own the “what” and the “why” – identifying use cases, setting priorities, and engaging with portfolio companies – you own the “how”: building and maintaining the internal infrastructure and tools that power the firm’s data and AI capabilities. This gives you direct access to investment decision-makers and business context while maintaining a high degree of technical autonomy.
Your Responsibilities
Pillar 1: Data Warehouse & Infrastructure
  • Architect, build, and maintain a centralised data warehouse consolidating fragmented data sources across the firm
  • Design ETL/ELT pipelines to ingest, transform, and structure data from market databases, deal pipeline sources, and internal systems
  • Implement data quality frameworks and governance standards appropriate for a regulated financial services environment 
  • Build dashboards and reporting tools to enable self-service analytics for the investment team
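To give a flavour of the consolidation work in Pillar 1, here is a minimal, purely illustrative Python sketch: two hypothetical sources (a CRM extract and a market database) are merged into one deduplicated record set, keyed on a normalised company name. All source names and fields are assumptions for the example, not Armira’s actual systems.

```python
# Hypothetical sketch: consolidating company records from fragmented sources
# into one deduplicated list, keyed on a normalised company name.

def normalise_name(name: str) -> str:
    """Lowercase, trim, and strip common legal suffixes for matching."""
    cleaned = name.strip().lower()
    for suffix in (" gmbh", " ag", " se", " ltd"):
        if cleaned.endswith(suffix):
            cleaned = cleaned[: -len(suffix)]
    return cleaned.strip()

def consolidate(*sources: list) -> list:
    """Merge records from several sources; earlier sources win on conflicts,
    later sources fill in fields that are still missing."""
    merged: dict = {}
    for source in sources:
        for record in source:
            key = normalise_name(record["company"])
            existing = merged.setdefault(key, {})
            for field, value in record.items():
                existing.setdefault(field, value)  # keep first value seen
    return list(merged.values())

# Illustrative inputs: a CRM extract and a market-database extract.
crm = [{"company": "Acme GmbH", "sector": "Industrials"}]
market_db = [{"company": "acme", "revenue_eur_m": 120}]
rows = consolidate(crm, market_db)
```

In a real pipeline this logic would live in a dbt model or an ingestion job; the sketch only shows the dedup-and-enrich shape of the task.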
 
Pillar 2: AI Workflow Development & Internal Tooling
  • Design and build AI-powered workflows (e.g., LLM integrations, n8n/Make automation) to automate and enhance deal sourcing, due diligence support, and internal reporting workflows
  • Develop internal tools and applications using LLM APIs, integrating with existing systems (CRM, document management, communication platforms)
  • Prototype, test, and iterate on AI-powered workflows, translating business requirements into technical solutions under the guidance of senior leadership
  • Stay current with the rapidly evolving AI/LLM ecosystem and evaluate and recommend new tools and approaches for implementation
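The workflow-building in Pillar 2 might look something like the following minimal sketch, in which the LLM client is injected as a callable so the logic runs and tests offline. The prompt wording, the JSON fields, and the `fake_llm` stand-in (in place of a real OpenAI or Anthropic client) are all illustrative assumptions.

```python
import json

# Hypothetical sketch of an LLM-assisted deal-screening step.
SCREEN_PROMPT = (
    "Summarise the following deal teaser in one sentence and classify the "
    "sector. Reply as JSON with keys 'summary' and 'sector'.\n\n{teaser}"
)

def screen_teaser(teaser: str, call_llm) -> dict:
    """Send a teaser to an LLM and parse the structured JSON reply."""
    reply = call_llm(SCREEN_PROMPT.format(teaser=teaser))
    parsed = json.loads(reply)
    # Minimal validation before the result enters downstream reporting.
    if not {"summary", "sector"} <= parsed.keys():
        raise ValueError("LLM reply missing required keys")
    return parsed

# Stand-in for a real client (e.g., a thin wrapper around the OpenAI or
# Anthropic SDK) so the example runs without an API key.
def fake_llm(prompt: str) -> str:
    return json.dumps({
        "summary": "Machine-tools maker seeking growth capital.",
        "sector": "Industrials",
    })

result = screen_teaser("Family-owned machine-tools maker ...", fake_llm)
```

Keeping the client injectable is one pragmatic way to make such workflows testable and provider-agnostic.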
Technical Approach & Stack Expectations

We expect you to leverage well-established, cloud-native tools and to keep the architecture simple, well-documented, and maintainable, so that another engineer could understand and operate key pipelines within a short onboarding period. We are not optimising for cutting-edge custom architectures, but for pragmatic, robust solutions. The expected tech stack centres on proven data engineering tools:
  • Data Warehouse: e.g., Snowflake, Microsoft Fabric
  • Cloud: Azure preferred
  • Transformation & Orchestration: dbt / Airflow
  • Visualisation: Tableau / Power BI
  • Programming: Python, SQL
  • AI/LLM: OpenAI and Anthropic APIs; agent frameworks and protocols (e.g., LangChain, MCP); workflow automation (n8n, Make)
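As a purely illustrative flavour of the Python + SQL work in this stack, the following sketch uses the standard-library sqlite3 module as a stand-in for the warehouse. The deals table and its columns are invented for the example; in practice the same aggregation would live in a dbt model against Snowflake or Fabric and feed a BI tool such as Power BI or Tableau.

```python
import sqlite3

# sqlite3 stands in for the warehouse so the pattern runs anywhere.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE deals (name TEXT, stage TEXT, ev_eur_m REAL)")
conn.executemany(
    "INSERT INTO deals VALUES (?, ?, ?)",
    [
        ("Alpha", "screening", 80.0),
        ("Beta", "dd", 250.0),
        ("Gamma", "screening", 40.0),
    ],
)

# A dbt-style transformation expressed as plain SQL: one aggregated
# reporting table per pipeline stage, ready for self-service dashboards.
rows = conn.execute(
    """
    SELECT stage, COUNT(*) AS n_deals, SUM(ev_eur_m) AS total_ev_eur_m
    FROM deals
    GROUP BY stage
    ORDER BY stage
    """
).fetchall()
```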
Your profile
Required Qualifications:
  • Degree in Computer Science, Data Science, Engineering, or a related quantitative field
  • 2+ years of professional experience (Specialist: 2–5 years; Manager: 5+ years) in data engineering, software development, or applied data science
  • Ideally, at least one end-to-end build of a data product, internal tool, or data platform in a professional setting (e.g., designing a data model, building pipelines, and putting dashboards or an internal application into production)
  • Strong programming skills in Python; solid experience with SQL and modern data stack tools (e.g., Snowflake, dbt, Airflow)
  • Experience with cloud platforms (AWS, GCP, or Azure)
  • Familiarity with LLM APIs (OpenAI, Anthropic, or similar), plus the willingness and hands-on experience to build AI-powered workflows (e.g., with n8n or Make)
  • Ability to work independently, manage ambiguity, and deliver end-to-end solutions
  • Fluent in English; German proficiency is a strong plus

Preferred Qualifications:
  • Experience in or exposure to financial services, consulting, or private equity
  • Familiarity with agentic AI frameworks (LangChain, CrewAI, or similar) and/or workflow automation platforms (n8n, Make)
  • Experience with data visualisation tools (Tableau, Power BI, or similar)
  • Understanding of PE workflows (deal sourcing, due diligence, reporting) as context for building effective internal tools
  • Track record of building data products or internal tools in a smaller-team environment

Why us?
What We Offer:
  • Unique opportunity to build a function from scratch at a leading DACH investment holding
  • Deep exposure to the investment team and PE deal-making: you will sit in on deal discussions, portfolio reviews, and strategy meetings, building a genuine understanding of how private equity works and developing your own professional network in the industry
  • Competitive compensation 
  • Munich-based role with work-from-home options, combined with a collaborative, entrepreneurial team culture
  • High autonomy with clear career growth path as the data function scales
  • Learning and development budget for conferences, courses, and certifications
About us
Armira is a Munich- and London-based investment holding with a truly differentiated DNA driven by entrepreneurial long-term capital. Backed by a unique capital base of entrepreneurs and entrepreneurial families, we invest with a long-term mindset and a highly flexible mandate. Across our three investment strategies, we partner with exceptional entrepreneurs across Europe – from fast-growing companies with EUR 10m+ in revenue to global family enterprises generating more than EUR 1.0bn in revenue.
Our team of approx. 80 professionals brings extensive top-tier investment experience, including previous positions at leading private equity firms, strategy consultancies and investment banks such as KKR, Auctus, Blackstone, Advent, McKinsey, BCG, Bain & Company, Goldman Sachs, and J.P. Morgan.

Your application!
We appreciate your interest in Armira! Please remember to specify your availability and upload the required documents. We look forward to receiving your application.