QuaXigma was launched with a mission to make AI accessible and affordable and to deliver AI products and solutions at scale for enterprises, bringing together the power of data, AI, and engineering to drive digital transformation. We believe that without insights, businesses will continue to struggle to understand their customers and may even lose them; that without insights, businesses cannot deliver differentiated products and services; and finally, that without insights, businesses cannot achieve the new level of operational excellence that is crucial to remaining competitive, meeting rising customer expectations, expanding into new markets, and digitalizing.
We are seeking a creative, collaborative, adaptable Azure Data Engineer to join our agile team of highly skilled data scientists, data engineers, and UX developers.
About the job
This role is responsible for building data orchestration with Azure Data Factory pipelines and dataflows. The key responsibility is to understand business requirements and implement them using Azure Data Factory.
Roles & Responsibilities:
- Understand business requirements and actively provide input from a data perspective.
- Understand the underlying data and how it flows through the system.
- Build pipelines and dataflows ranging from simple to complex.
- Implement modules that incorporate security and authorization frameworks.
- Recognize and adapt to the changes in processes as the project evolves in size and function.
- Expert-level knowledge of Azure Data Factory.
- Advanced knowledge of Azure SQL Database, Synapse Analytics, Power BI, T-SQL, Logic Apps, and Function Apps.
- Ability to analyze and understand complex data.
- Monitor day-to-day Data Factory pipeline activity.
- Knowledge of Azure Data Lake is required; familiarity with Azure services such as Analysis Services, SQL databases, Azure DevOps, and CI/CD is a must.
- Knowledge of master data management, data warehousing, and business intelligence architecture.
- Experience in data modelling and database design with excellent knowledge of SQL Server best practices.
- Excellent interpersonal/communication skills (both oral/written) with the ability to communicate at various levels with clarity & precision.
- Clear understanding of the data warehouse (DW) lifecycle, with the ability to contribute to design documents, unit test plans, and code review reports.
- Experience working in an Agile environment (Scrum, Lean, Kanban) is a plus.
- Knowledge of big data technologies: the Spark framework, NoSQL, Azure Databricks, and Python.
Qualifications & Experience:
Bachelor’s or master’s degree in computer science or a related field.
3–5 years of data engineering or software development experience.