Need local and only H1B candidates for the role of Azure Databricks Developer (14+ years of experience required) @ Grand Rapids, US

Hi,
Hope you are doing well.
Please go through the job description and share resumes with current location, visa status, contact details, and a LinkedIn profile link.
Note:
• Only H1B candidates
• Passport number is required
• Candidate needs to be in the office 3 days every week; local candidates or candidates from nearby states only
Role: Azure Databricks Developer
Duration: 12 Months
Location: Grand Rapids, US (Hybrid, 3 days onsite)

Job Description:

Must have:
• Strong knowledge of architecture design, with the ability to provide solutions for any new requirement.
• Defines functional and non-functional requirements, including performance monitoring, alerting, and code management, ensuring alignment with technology best practices and SLAs.
• Hands-on experience both as a data engineer/developer and as a Databricks administrator.
• Provides subject matter expertise and technical consulting support on vendor or internal applications and interfaces, including Azure Data Factory, Log Analytics, Databricks, Synapse, Power BI, ADLS Gen2, PolyBase, and Machine Learning/AI.
• Experience with Azure services, focused on Function Apps, messaging services, Data Factory, Azure permissions, Cosmos DB, Logic Apps, and Synapse DWH.
• Experience building and designing data and analytics on enterprise solutions such as Azure Data Factory, Log Analytics, Databricks, Synapse, Power BI, ADLS Gen2, PolyBase, and Machine Learning/AI.
• Experience designing data pipelines (ingestion, storage, prep/train, model, and serve) using the above technologies, automating Azure workloads, and ensuring data quality, governance/standards, security, and legal compliance in the Azure architecture.

Experience in:
• Evaluating architectural options and defining the overall architecture of the enterprise Data Lake and Data Warehouse.
• Driving engagement with ITS Security and Infrastructure teams to ensure secure development and deployment of solutions.
• Conducting industry research, facilitating new product and vendor evaluations, and assisting in vendor selection.
• Power BI, data modeling, data classification and zones, data movement, data architecture, and reporting.
• In-depth understanding of compute and storage, including backup, monitoring, and DR environment requirements.
• Knowledge of and experience with Python, Scala, or PowerShell, and API architecture in Azure.
• Working with multiple, diverse technical configurations, technologies, and processing environments.
• Good to have: knowledge of Azure DevOps and Google Cloud.