Requirements:
• Experience in data modeling and advanced SQL techniques
• Experience with cloud migration methodologies and processes, including tools such as Databricks, Azure Data Factory, Azure Functions, and other Azure data services
• Expert-level proficiency in SQL, Python, Spark, and Databricks
• Experience working with varied data file formats (Avro, JSON, CSV) using PySpark for ingestion and transformation
• Experience with DevOps processes and an understanding of Terraform scripting
• Understanding of the benefits of data warehousing, data architecture, data quality processes, data warehouse design and implementation, table structures, fact and dimension tables, and logical and physical database design
• Experience designing and implementing ingestion processes for unstructured and structured data sets
• Experience designing and developing data cleansing routines using standard data operations
• Knowledge of data, master data, and metadata-related standards and processes
• Experience working with multi-terabyte data sets, troubleshooting issues, and performance tuning Spark and SQL queries
• Experience using Azure DevOps / GitHub Actions CI/CD pipelines to deploy code
• Microsoft Azure certifications are a plus
• Minimum of 7 years of hands-on experience in the design, configuration, implementation, and data migration of medium- to large-sized enterprise data platforms
Core Technologies:
• Power BI
• Azure Functions
• Python
• SQL Server
• Azure Data Factory
• Azure Databricks
• Terraform
• Azure DevOps
• GitHub / GitHub Actions
• T-SQL and SQL stored procedures
• Azure Log Analytics
• Azure Data Lake Storage
• Azure Synapse