Data Engineer Job at Zettalogix

Zettalogix Remote

Job Role: Mid-Level Data Engineer

Location: Remote

Duration: Long Term Contract

JOB DESCRIPTION

Retail Data Engineering & Data Strategy

Position Summary

This Data Engineer will collaborate with business partners to identify opportunities to leverage big data technologies in support of Personalization initiatives within CVS. You will build and architect a next-generation Big Data machine learning framework built on a core set of Big Data technologies, and design highly scalable, extensible Big Data platforms that enable the collection, storage, modeling, and analysis of massive data sets from numerous channels.

JOB RESPONSIBILITIES

  • Define and maintain the data architecture, focusing on applying technology to enable business solutions.
  • Assess and provide recommendations on business relevance, with appropriate timing and deployment.
  • Perform architecture design and data modeling, and build and implement CVS Big Data platforms and analytic applications.
  • Bring a DevOps mindset to enable big data and batch/real-time analytical solutions that leverage emerging technologies.
  • Develop prototypes and proofs of concept for selected solutions, and implement complex big data projects.
  • Apply a creative mindset to collecting, parsing, managing, and automating data feedback loops in support of business innovation.

REQUIRED EXPERIENCE

Required Qualifications

  • 3+ years of hands-on experience with cloud-based big data platforms, including Hadoop (preferably on Azure or GCP) and Spark
  • Proficiency in big data technologies including Spark, Airflow, Kafka, HBase, Pig, NoSQL databases, etc.
  • 3+ years of experience and background with traditional relational data warehouse technologies such as Oracle, Teradata, and DB2
  • 3+ years of experience in at least one of the following programming languages: Python, PySpark, Scala, or Java
  • Good knowledge of and experience with SQL, including analytical SQL functions
  • Proficiency in distributed computing frameworks: Dask, Kubernetes, and Docker
  • Preferred: DevOps, Python, Kubernetes, Snowflake

REQUIRED EDUCATION

Minimum of a BS in a computer-related field (Computer Science, Information Systems, etc.)

Job Type: Contract

Salary: $45.00 - $50.00 per hour

Schedule:

  • 8-hour shift
  • Monday to Friday

Experience:

  • Big data: 4 years (Preferred)
  • SQL: 3 years (Preferred)
  • DevOps: 3 years (Preferred)
  • Oracle: 3 years (Preferred)
  • Hadoop: 3 years (Required)

Work Location: Remote



