As a hands-on data engineer with experience in Snowflake, Azure Databricks, API development, and Kafka, your role would involve building and maintaining data pipelines and data processing workflows for healthcare organizations.
Your responsibilities may include:
1. Designing, building, and maintaining data pipelines and data processing workflows using technologies such as Snowflake, Azure Databricks, and Kafka.
2. Developing and maintaining APIs for data access and integration with other systems.
3. Collaborating with cross-functional teams such as data architects, developers, data scientists, and business analysts to ensure data solutions meet business requirements.
4. Ensuring data security, privacy, and compliance with regulations such as HIPAA.
5. Monitoring and troubleshooting data solutions to ensure they meet performance and scalability requirements.
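As one illustration of the HIPAA-related work in item 4, a pipeline stage often pseudonymizes patient identifiers before records flow downstream. The sketch below is a minimal, hypothetical example in Python; the field names, salt handling, and record shape are assumptions, not part of this role's actual codebase.

```python
import hashlib

# Hypothetical example: in production the salt would live in a secrets store,
# not in source code.
SALT = "example-rotation-salt"

def pseudonymize(record: dict) -> dict:
    """Replace the raw patient_id with a salted SHA-256 digest and drop
    free-text fields that could carry PHI."""
    # Drop fields (like clinician notes) that may contain identifying text.
    cleaned = {k: v for k, v in record.items() if k != "notes"}
    # Swap the raw identifier for an irreversible, salted hash.
    raw_id = str(cleaned.pop("patient_id"))
    cleaned["patient_key"] = hashlib.sha256((SALT + raw_id).encode()).hexdigest()
    return cleaned

record = {"patient_id": 12345, "visit_date": "2023-01-15", "notes": "free text"}
safe = pseudonymize(record)
```

The same record always maps to the same `patient_key`, so de-identified data can still be joined across pipeline stages without exposing the raw identifier.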
To be successful in this role, you should have hands-on experience with Snowflake, Azure Databricks, API development, and Kafka, as well as a strong understanding of healthcare data and systems. You should be able to write code in programming languages such as Python, Scala, or Java. You should also have excellent communication and collaboration skills.
Job Title: Azure Data Engineer
Location: Remote
Healthcare experience required
Visa: No GC or GC-EAD candidates
Experience: 8+ years overall, 5+ years in healthcare