Data Engineer
Required Skills & Experience
- 5+ years of experience in database engineering, building and deploying pipelines, ideally with customer data
- Strong experience with Azure (deployments, configurations, Storage Accounts)
- Hands-on experience with Azure Data Factory, Azure Databricks, Snowflake, dbt/Delta Live Tables (DLT), and the Medallion Architecture
- Strong Python (especially PySpark) for building and optimizing pipelines across GCP/Azure/Snowflake/Databricks
- Experience with CI/CD concepts and tooling (Azure DevOps, Repos, pipeline deployments)
- Experience working in an Agile environment
- Strong communication skills, both written and verbal
Nice to Have Skills & Experience
- Experience with Fivetran, Feedonomics, or similar marketing technology tools
- Experience with Terraform or ARM templates
- Experience with Unity Catalog or similar governance tools
- Experience with PII masking/encryption standards
- Background with Snowflake secure views and enterprise data sharing
Job Description
Insight Global is looking to hire a Data Engineer for a retail client based in Vancouver. This is a hybrid position requiring 3 days per week onsite in Downtown Vancouver. You will join a team that works heavily with customer data, owning the pipelines that gather data from multiple sources and consolidate it for different use cases.

As a Mid-level Data Engineer, you will bring a high level of technical knowledge along with an ability to share that knowledge with your co-workers. You will help form the core of the engineering practice by contributing to all areas of development and operations, from pre-production through production, setting an example of what good engineering looks like and helping others refine their skills. You will be part of a day-to-day production release team and may perform on-call support as needed. A DevOps mindset is key to success in this role, as engineers are commonly part of full DevOps teams that "own" all parts of software development, release pipelines, production monitoring, security, and support. Other duties and responsibilities include:
- Build, modernize, and maintain data pipelines using Azure, Databricks, Snowflake, and GCP.
- Create and publish secure Snowflake views to the Enterprise Data Exchange.
- Migrate pipelines to Delta Live Tables (DLT) and Unity Catalog.
- Deploy pipelines using existing CI/CD frameworks.
- Ensure compliance with PII masking/encryption requirements.
- Use ADF, Airflow, or Fivetran for orchestration; SQL and Python for development.
- Support streaming workloads (Kafka, Event Hubs, Spark Streaming).
- Participate in DevOps: improve CI/CD, monitor production, handle failures, join on-call rotations.
- Collaborate with global engineering teams as part of an Agile, DevOps, SRE-aligned culture.