Hi,
Greetings of the day,
This is Amar from Amaze Systems. I have an excellent job opportunity for you with one of my Fortune 100 clients, detailed below. Please reply with your updated resume, or reach me at ama[email protected].
Role: Solutions Data Architect (AWS/Fabric)
Location: Atlanta, GA (onsite hybrid; local to GA, H1B only)
Duration: Long term
Experience Required (Must have):
• Experience providing technical leadership in architecture, design, and engineering for the modernization of legacy data ingestion, ETL, and databases to new technologies in the public cloud (AWS) and big data space
• Strong understanding of AWS and its components (compute, storage, data, serverless compute) to deliver end-to-end ETL and infrastructure architecture solutions for clients
• Experience defining the technical direction and roadmap for cloud migrations and managing the implementation of big data technical solutions
• Strong experience with Lambda, Glue, EMR, and Redshift for data warehousing
• Experience in the design, development, or maintenance of big data applications
• Ability to produce solutions that are models of performance, scalability, and extensibility
• Ability to create proofs of concept, demos, and/or scripts from scratch or by leveraging reusable components
• At least 5 years of experience in data warehousing and designing solutions on a modern data lake
• At least 5 years of experience with major big data technologies and frameworks, including but not limited to Hadoop, Apache Spark, Hive, Kafka, and HBase/MongoDB
• Good knowledge of Microsoft Fabric
• Practical expertise in performance tuning and optimization
• Experience working on production-grade projects with terabyte- to petabyte-scale data sets
• Experience reviewing and implementing DevOps and Agile standards and practices
Secondary skills (Good to have):
• Proficiency in at least one of the following programming languages: Scala, Python, or Java
• Knowledge of Presto is a plus
• Conceptual knowledge of data lakes and ETL
• Experience with cloud data warehouses such as Snowflake
• Hands-on experience with Spark (Python/Scala) and SQL
• Good analytical and problem-solving skills for the design, creation, and testing of programs
• Good communication skills to interact with team members and support personnel, and to provide technical guidance and expertise to customers and management
• Good interpersonal skills to interact with customers and team members
• Willingness to explore and work with cloud technologies
• Ability to work in a self-directed environment
Roles & Responsibilities:
• Total experience: 12+ years
• Propose solution architectures and manage the deployment of cloud-based big data and analytics solutions according to complex customer requirements and implementation best practices
• Review, validate, and assess the client's requirements and the product's specifications/features
• Create the required artifacts (design documents, diagrams, STTM, HLD, LLD, etc.)
• Architect, design, and implement data solutions in the public cloud (Azure/AWS) and big data space
• Work closely with customer and internal teams to gather business and technical requirements
• Develop a point of view; capture, prioritize, and analyze technical (functional and non-functional) requirements
• Apply architectural methods, techniques, and best practices
• Educate, mentor, and coach teams in applying architectural techniques
• Successfully lead the customer's migration and transformation projects
Regards,
Amar Verma
Amaze Systems Inc.
USA: 8951 Cypress Waters Blvd, Suite 160, Dallas, TX 75019
Note: Amaze Systems is an Equal Opportunity Employer (EOE) and does not discriminate based on age, gender, religion, disability, marital status, or race, and adheres to laws relating to non-discrimination on the basis of national origin and citizenship status.
Phone: 669-888-7012