Engineering · Senior · Full-time
Senior Data Engineer
Design and own the data platforms that feed our Agentic AI systems — from ingestion and warehousing to retrieval indices and feature stores — for enterprise and public-sector clients across Canada.
About the Role
At Triomind, we build Agentic AI frameworks that sit alongside enterprise systems to automate complex workflows across facilities, assets, and capital programs. This is a role for engineers who want to ship AI that real institutions depend on — not another demo.
What You'll Do
- Architect and implement batch and streaming data pipelines that unify data from enterprise systems (EAM, CMMS, ERP, CRM) and operational data stores into analytics- and AI-ready datasets.
- Design data warehouses, lakehouses, and retrieval indices (including vector stores) that support LLM, agent, and analytics workloads.
- Define data contracts, quality checks, lineage, and observability so downstream AI and product teams can trust what they consume.
- Partner with AI engineers on feature engineering, embedding pipelines, and evaluation datasets; partner with software engineers on operational integrations.
- Lead technical discovery during client engagements: assess existing data landscapes, propose target architectures, and translate them into actionable delivery plans.
- Establish standards for security, privacy, and compliance when handling sensitive government, university, and enterprise data.
What We're Looking For
- 5+ years of professional data engineering experience, including at least 2 years leading designs for non-trivial data platforms in production.
- Deep SQL and strong Python skills; comfortable with at least one modern transformation framework (dbt, Spark, or equivalent).
- Hands-on experience with orchestration tools (Airflow, Dagster, Prefect, or Azure Data Factory) and cloud data warehouses / lakehouses (Snowflake, BigQuery, Databricks, or Azure Synapse).
- Solid understanding of data modelling (dimensional, Data Vault, or similar), schema evolution, and performance tuning.
- Experience designing secure pipelines that respect row-level access, PII handling, and auditing requirements.
- Clear communicator who can work directly with client stakeholders and mentor other engineers.
- Must be legally authorized to work in Canada.
Nice to Have
- Experience with vector databases (pgvector, Pinecone, Weaviate, Azure AI Search) and embedding pipelines for RAG systems.
- Streaming experience with Kafka, Event Hubs, or Kinesis.
- Prior delivery into regulated environments (government, healthcare, post-secondary) and familiarity with FOIP, PIPA, or PIPEDA.
- Infrastructure-as-code skills (Terraform, Bicep) and mature CI/CD practices for data pipelines.
Tech We Use
Python · SQL · dbt · Airflow · Spark · PostgreSQL · Databricks · Snowflake · Azure · AWS · Docker