Los Angeles, CA / Data Architect. Pay Rate Range: $60 to $70/hour (C2C). Local candidates who are ready for a face-to-face (F2F) interview (USC/GC/H1B); first preference: USC/GC.

Position Overview: We are seeking a highly skilled Data Engineer/Architect with deep expertise in Python to join our dynamic team. The ideal candidate will play a critical role in designing, building, and maintaining scalable data architectures and pipelines. You will work closely with cross-functional teams to develop robust data solutions that enable data-driven decision-making across the organization.

Key Responsibilities:
• Design and Architecture: Lead the design and implementation of data architectures, ensuring scalability, reliability, and performance. Develop and optimize ETL/ELT pipelines.
• Data Integration: Integrate data from various sources, including databases, APIs, and third-party platforms, into a cohesive data ecosystem.
• Data Modeling: Develop and maintain complex data models that support business analytics and reporting needs.
• Python Development: Utilize advanced Python programming skills to build custom data processing frameworks, automation scripts, and analytical tools.
• Data Quality: Implement data quality checks and monitoring to ensure the accuracy and consistency of data across systems.
• Collaboration: Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions that meet their needs.
• Performance Optimization: Continuously improve the performance and efficiency of data processes and systems.

Qualifications:
• Experience: 5+ years of experience as a Data Engineer/Architect, with a strong focus on Python development.
• Technical Skills:
  o Proficiency in Python for data processing, automation, and scripting.
  o Experience with data warehousing solutions such as Snowflake, Redshift, or BigQuery.
  o Expertise in designing and managing ETL/ELT pipelines using tools like Apache Airflow, Luigi, or similar.
  o Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL).
  o Familiarity with cloud platforms such as AWS, Azure, or GCP, and related services (e.g., S3, Lambda, Dataflow).
  o Experience with data modeling, data lakes, and distributed data processing frameworks (e.g., Spark, Hadoop).
• Soft Skills:
  o Strong problem-solving skills with the ability to think critically and troubleshoot complex issues.
  o Excellent communication skills, with the ability to convey technical concepts to non-technical stakeholders.
  o A collaborative mindset, with the ability to work effectively in a team-oriented environment.
• Education:
  o Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Thanks,
Manish Rai