Hi,
Job Title: Enterprise Data Warehouse Scalable Engineer
Location: Princeton, NJ (Hybrid)
Duration: 6+ Months
Work Authorization: USC or GC (US Citizen or Green Card holder)
Job Description:
We are seeking a skilled Enterprise Data Warehouse Scalable Engineer with expertise in Hadoop and Python to join our team. The ideal candidate will have the following qualifications.
Required Skills and Experience:
- Over 5 years of experience with DBMS, RDBMS, and ETL methodologies.
- Experience in designing and implementing automated, scalable architectures in an enterprise environment.
- Proficiency in SQL, with strong database design knowledge and experience working with large-scale data volumes.
- Programming expertise in Python and PySpark.
- Familiarity with the Hadoop ecosystem (HDFS, Spark, Oozie).
- Strong understanding of data warehousing principles, ETL processes, and dimensional data modeling.
- Excellent problem-solving and troubleshooting skills.
- Knowledge of Apache Airflow is a plus.
- Bachelor’s, Master’s, or PhD in Computer Science, Engineering, or a related technology field.
Preferred Skills:
- Experience with MPP (Massively Parallel Processing) systems.
- Familiarity with streaming technologies such as Kafka.
Thanks & Regards,
Stephen Mcfeely
Senior Delivery Manager
Email: [email protected]
linkedin.com/in/stephen-mcfeely-2952972b7
CADRE TECHNOLOGIES SERVICES LLC
www.cadretechservices.com