Data Engineer, Hybrid/Dallas, TX (Green Card / GC EAD / L2 EAD / H4 EAD only) - LOCAL CANDIDATES ONLY

Visa: No H1B, CPT, OPT, USC

Hi,

Hope you are doing well!

Please find the attached job description. If this role is a good fit for you, please send me your updated resume or call me back at 512-898-7112.

Position: Data Engineer

Location: Hybrid/Dallas, TX

Duration: 4-Month Contract

Interview:  Phone/Video

 

Job Description:

We are currently seeking a Data Engineer to join our team in McKinney, Texas (US-TX), United States.

Job Summary

We are looking for an experienced and talented technical individual to join our team as a Cloud DevOps Engineer supporting our Cloud Data Management and Advanced Analytics platforms. In this role, you will work with the data services, data analyst, and data scientist teams to help the organization build out secure, scalable, fault-tolerant, and high-performing cloud-based architecture and make data-driven decisions.

Primary Duties & Responsibilities

Design, implement, and support Cloud Data Management and Advanced Analytics platforms.

Provide technological guidance on Data Lake and Enterprise Data Warehouse design, development, implementation and monitoring.

Understand, design, and implement data security around cloud infrastructure.

Provide support and guidance to Data Services and other application development teams on various AWS Products.

Work with leadership on process improvement and strategic initiatives on Cloud Platform.

Knowledge, Skills, & Abilities

Strong coding and scripting experience with Python

Strong knowledge in SQL

Expertise in Redshift and AWS

Extensive hands-on experience including design and implementation across broad range of Amazon Web Services (AWS).

Working knowledge of primary AWS services such as EC2, EBS, S3, Lambda, Batch, Glue, Athena, CloudWatch, CloudTrail, ECS, ECR, EMR, IAM, SNS, etc.

Good understanding of implementing a data lake and data warehouse in the cloud.

Experience in creating and deploying CloudFormation Templates (CFTs)

Solid understanding of various Data Management tools available in AWS

Proficiency in using the AWS CLI and/or AWS Tools for PowerShell

Good understanding of Windows and Linux administration, along with creating and managing images in AWS, is a strong plus.

Experience with Lifecycle Management of S3 Buckets.

Clear understanding of cloud security, including AWS IAM users and access, IAM roles and policies, and federated users and permissions.

Good understanding of AWS encryption methodologies and the AWS KMS service.

Broad understanding of networking concepts such as VPC, NAT, ACLs, DNS, proxies, firewalls, VPC endpoints, Direct Connect, VPN, etc.

Experience with any ETL and data visualization tools is a plus

 

Education & Work Experience required

Bachelor's degree in Computer Science/Engineering or Information Systems, or equivalent work experience in a technical position

At least 5 years of experience in Data Engineering and Big Data

3+ years of experience in AWS DevOps Engineering including EC2, S3, ECS, Lambda, DMS, Aurora, Batch, CloudFormation, RDS, VPC, IAM and Security.

Over 5 years of experience in Information Technology

Strong coding and scripting experience with PowerShell or similar languages.

Prior domain experience in Life Insurance, Annuity, or Financial Services is a plus

AWS Certifications are considered a strong plus

Excellent verbal and written communication skills.

Ability to work independently and as part of a team.

 

Thanks & Regards,

Tarun Gupta | Mobile: 512-898-7112

E-mail: [email protected]

5900 Belcones Drive, Suite #100, Austin, TX 78731
