Primus Global Services

Primus Global Services is looking for a Snowflake Data Engineer – REMOTE WORK

Snowflake Data Engineer – REMOTE WORK – 51329
Pay Range: $45 – $50/hr

We have an immediate long-term opportunity with one of our prime clients for the position of Snowflake Data Engineer, working on a remote basis.

Primary Responsibilities:
- Create and maintain data pipelines using Azure and Snowflake as primary tools
- Create SQL stored procedures and macros to perform complex transformations
- Create logical and physical data models to ensure data integrity is maintained
- Create and automate CI/CD pipelines using Git and GitHub Actions
- Tune and optimize data processes
- Design and build best-in-class processes to clean and standardize data
- Deploy code to the production environment; troubleshoot production data issues
- Model large-volume datasets to maximize performance for our BI and Data Science teams
- Create Docker images for various applications and deploy them on Kubernetes

Required Qualifications:
- Bachelor's degree in Computer Science or similar
- Minimum 3-6 years of industry experience as a hands-on data engineer
- Excellent communication skills
- Excellent knowledge of SQL and Python
- Excellent knowledge of Azure services such as Blobs, Functions, Azure Data Factory, Service Principal, Containers, and Key Vault
- Excellent knowledge of Snowflake architecture and best practices
- Excellent knowledge of data warehousing and BI solutions
- Excellent knowledge of change data capture (CDC), ETL, ELT, SCD, etc.
- Knowledge of CI/CD pipelines using Git and GitHub Actions
- Knowledge of data modeling techniques such as star schema, dimensional models, and Data Vault
- Hands-on experience with the following technologies:
  - Developing data pipelines in Azure and Snowflake
  - Writing complex SQL queries
  - Building ETL/ELT data pipelines using SCD logic
  - Kubernetes and Linux containers (e.g., Docker)
  - Related/complementary open-source software platforms and languages (e.g., Scala, Python, Java, Linux)
- Previous experience with relational (RDBMS) and non-relational databases
- Analytical and problem-solving experience applied to big data datasets
- Good understanding of access control and data masking
- Experience working on projects with agile/scrum methodologies and high-performing teams
- Exposure to DevOps methodology
- Data warehousing principles, architecture, and their implementation in large environments
- Very good understanding of integration with Tableau

Preferred Qualifications:
- Design and build data pipelines (in Spark) to process terabytes of data
- Very good understanding of Snowflake integration with data visualization tools such as Tableau
- Orchestrate data tasks in Airflow to run on Kubernetes/Hadoop for the ingestion, processing, and cleaning of data
- Terraform knowledge and automation
- Create real-time analytics pipelines using Kafka / Spark Streaming
- Work on proofs of concept for big data and data science
- Understanding of United States healthcare data

**ALL successful candidates for this position are required to work directly for PRIMUS. No agencies please; W2 only.**

For immediate consideration, please contact:
Tanya
PRIMUS Global Services
Direct: (972) 200-4514
Desk: (972) 753-6500 Ext. 258
Email:

