CAI Data Engineer in India

Data Engineer

Req number:

R2170

Employment type:

Full time

Worksite flexibility:

Remote

Who we are

CAI is a global technology services firm with over 8,500 associates worldwide and a yearly revenue of $1 billion+. We have over 40 years of excellence in uniting talent and technology to power the possible for our clients, colleagues, and communities. As a privately held company, we have the freedom and focus to do what is right—whatever it takes. Our tailor-made solutions create lasting results across the public and commercial sectors, and we are trailblazers in bringing neurodiversity to the enterprise.

Job Summary

We are looking for a Data Engineer who can design and develop data lakes and data warehouses. This position will be full-time and remote.

Job Description

What You’ll Do

· Design and develop data lakes; manage data flows that integrate information from various sources into a common data lake platform through an ETL tool

· Code and manage delta lake implementations on S3 using technologies like Databricks or Apache Hudi

· Triage, debug, and fix technical issues related to data lakes

· Design and develop data warehouses for scale

· Design and evaluate data models (star, snowflake, and flattened)

· Design data access patterns for OLTP- and OLAP-based transactions

· Coordinate with business and technical teams through all phases of the software development life cycle

· Participate in making major technical and architectural decisions

· Maintain and manage code repositories such as Git

What You'll Need

Required:

· Bachelor’s degree in computer science, information technology, data science, data analytics or related field

· 5+ years of experience operating on AWS Cloud and building data lake architectures

· 3+ years of experience with AWS data services such as S3, Glue, Lake Formation, EMR, Kinesis, RDS, DMS, and Redshift

· 3+ years of experience building data warehouses on Snowflake, Redshift, HANA, Teradata, Exasol, etc.

· 3+ years of working knowledge of Spark

· 3+ years of experience building delta lakes using technologies like Apache Hudi or Databricks

· 3+ years of experience working with any ETL tools and technologies

· 3+ years of experience in any programming language (Python, R, Scala, Java)

· Experience working on Agile projects and with Agile methodology in general

Preferred:

· AWS certification

· Strong RDBMS and data modeling skills

Physical Demands

· Sedentary work that involves sitting or remaining stationary most of the time with occasional need to move around the office to attend meetings, etc.

· Ability to conduct repetitive tasks on a computer, utilizing a mouse, keyboard, and monitor.

Reasonable accommodation statement

If you require a reasonable accommodation in completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employment selection process, please direct your inquiries to application.accommodations@cai.io or (888) 824-8111.
