Hi,
Hope you are doing well.
For local CA candidates only.
Requirement 1
Azure ML Engineer (Databricks, Data Factory, Delta Lake, Azure Synapse)
Location: Santa Clara, CA – Onsite
Required skill set: Machine Learning, Azure services, LLMs, NLP
Description:
• Experience designing and implementing scalable and secure Azure cloud solutions using the Databricks ML platform.
• Model development, deployment, and job scheduling using unstructured data.
• Certification in Databricks ML.
• Expert-level experience designing and architecting solutions using Azure Data Factory, Azure Databricks, Azure Data Lake, Delta Lake, and Azure Synapse Analytics.
• Experience with Azure cloud technologies (Synapse, ADF, Databricks) and with PySpark, Python, Scala, and SQL.
• Good experience configuring microservices using Docker and Kubernetes on Azure Databricks.
• Extensive experience working with Azure AI services, including Databricks and Azure Cognitive Services.
• Experience working with Azure OpenAI services is highly preferred.
• Hands-on experience designing and optimizing LLM and natural language processing (NLP) systems, frameworks, and tools.
Requirement 2
Role: ML – Azure Databricks (Model Deployments)
Location: Santa Clara, CA – Onsite opportunity
Required skill set: Databricks, Retrieval-Augmented Generation (RAG), Python scripting, Generative AI (Gen AI)
Job Description:
- At least 8 years' experience, ideally in a Data Science role.
- Primary responsibility will be to develop and optimize custom ML/AI algorithms for new and existing applications.
- Broad knowledge of computer vision, NLP, time series forecasting, and anomaly detection.
- Knowledge of traditional ML algorithms such as regression, classification, and clustering.
- Knowledge of state-of-the-art deep learning model architectures in computer vision (NLP would be a plus).
- Experience implementing and optimizing object detection and instance/semantic segmentation models.
- Experience setting up end-to-end pipelines for model deployment.
- Experience tracking model performance using appropriate KPIs.
- Strong fundamentals in Python programming.
- Good knowledge of OpenCV, scikit-image, TensorFlow, Torch, Pillow, NumPy, pandas, scikit-learn, etc.
- Understanding of the software development cycle, from requirements through testing, integration, and delivery.
- Familiarity with model-shrinking techniques for deployment on edge devices with a limited footprint.
Nice to have:
- Experience in process improvement in manufacturing industries using ML/AI
- Experience in defect identification and root cause analysis in manufacturing domain
Requirement 3
Role: Python Developer
Location: Santa Clara, CA – Onsite
Must-have skills: Elasticsearch, FastAPI, Tornado (Django), Python scripting, Docker, Kubernetes, MongoDB, Microservices, and REST APIs
Responsibilities:
1. Gather early functional and non-functional requirements.
2. Provide detailed technical architectural blueprints.
3. Set quality standards and manage change.
4. Perform regular code reviews to ensure design quality and avoid overly complicated structures.
5. Hands-on work on prototype development, code contributions, or technology assessment.
6. Collaborate with and mentor the development team to enhance their knowledge.
7. Knowledge of the software development lifecycle, DevOps (build, continuous integration, and deployment tools), and standard methodologies.
8. Experience working with source control management systems such as Git and Bitbucket, and managing packages using private registries such as JFrog.
9. Understanding of the fundamental design principles behind a scalable application.
Tech Stack:
1. Full Stack, Python, Go, REST APIs; Middleware – database middleware, application server middleware, message-oriented middleware, web middleware, and transaction-processing monitors
2. Docker, Kubernetes
3. Data Structures, design patterns
4. Microservices & REST APIs – FastAPI/Django/Tornado
5. Databases & SQL – Postgres, ClickHouse, MSSQL, MongoDB, Snowflake
6. Caching & Queuing – Kafka, Redis
7. Orchestrator tools