Los Angeles, CA / Data Architect. Pay Rate Range: $60 to $70/hour C2C. Local candidates who are ready for a face-to-face (F2F) interview. Work authorization: USC/GC/H1B; first preference USC/GC.

Position Overview:
We are seeking a highly skilled Data Engineer/Architect with deep expertise in Python to join our dynamic team. The ideal candidate will play a critical role in designing, building, and maintaining scalable data architectures and pipelines. You will work closely with cross-functional teams to develop robust data solutions that enable data-driven decision-making across the organization.

Key Responsibilities:
• Design and Architecture: Lead the design and implementation of data architectures, ensuring scalability, reliability, and performance. Develop and optimize ETL/ELT pipelines.
• Data Integration: Integrate data from various sources, including databases, APIs, and third-party platforms, into a cohesive data ecosystem.
• Data Modeling: Develop and maintain complex data models that support business analytics and reporting needs.
• Python Development: Utilize advanced Python programming skills to build custom data processing frameworks, automation scripts, and analytical tools.
• Data Quality: Implement data quality checks and monitoring to ensure the accuracy and consistency of data across systems.
• Collaboration: Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions that meet their needs.
• Performance Optimization: Continuously improve the performance and efficiency of data processes and systems.

Qualifications:
• Experience: 5+ years of experience as a Data Engineer/Architect, with a strong focus on Python development.
• Technical Skills:
o Proficiency in Python for data processing, automation, and scripting.
o Experience with data warehousing solutions such as Snowflake, Redshift, or BigQuery.
o Expertise in designing and managing ETL/ELT pipelines using tools like Apache Airflow, Luigi, or similar.
o Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL).
o Familiarity with cloud platforms such as AWS, Azure, or GCP, and related services (e.g., S3, Lambda, Dataflow).
o Experience with data modeling, data lakes, and distributed data processing frameworks (e.g., Spark, Hadoop).
• Soft Skills:
o Strong problem-solving skills with the ability to think critically and troubleshoot complex issues.
o Excellent communication skills, with the ability to convey technical concepts to non-technical stakeholders.
o A collaborative mindset, with the ability to work effectively in a team-oriented environment.
• Education:
o Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
Thanks,
Manish Rai