What are my responsibilities?
As a Data Engineer, you are required to:
• Work with large, complex data sets that meet both functional and non-functional business requirements.
• Build the infrastructure required to extract, transform, and load data from various sources, using technologies such as Azure and SQL.
• Implement and deploy large structured and unstructured databases on scalable cloud infrastructure.
• Migrate data across data sources and database systems to improve performance and scalability, or to facilitate analytics.
• Develop technical solutions that combine disparate information into meaningful business insights using Big Data architectures (in a supporting role alongside the Data Architect).
• Stay curious and enthusiastic about using related technologies to solve problems, and help others see their benefit in the business domain.
Qualification: Bachelor's or Master's degree in Computer Science & Engineering, or equivalent. A professional degree in Data Science or Engineering is desirable.
Experience level: At least 3-4 years of hands-on experience in Data Warehousing, ETL, and RDBMS, plus experience in Data Analytics and an inclination towards cloud technologies such as Azure/AWS.
Desired Knowledge & Experience:
• Expert knowledge of and multi-year (3+) experience in Data Warehousing, ETL, Data Modeling, and Data Migration projects.
• Well-versed in relational database design, with experience processing and managing large data sets (multi-TB scale) using technologies such as T-SQL, Microsoft SQL Server, and Snowflake.
• Exceptional knowledge of Data Warehousing solutions (Snowflake).
• Exceptional knowledge of Data Warehousing concepts such as Data Modelling, STAR Schema, and Snowflake Schema.
• Strong working knowledge of Extract, Transform & Load (ETL) concepts using Azure Data Factory.
• Good know-how and experience in DataOps (data orchestration, workflow management, and monitoring systems), with the ability to suggest improvements and inputs for CI/CD.
• Knowledge of a reporting/BI tool (e.g. Qlik, Tableau, Power BI).
• Knowledge of Azure cloud-based data storage and services (e.g. Azure Blob Storage, Azure Databricks, Azure Data Factory) is preferred.
• Sound knowledge of programming languages such as Python or R is an added advantage.
• Strong inclination towards working on Big Data projects.
• Strong written and verbal communication skills for working with partners around the globe.
Required Soft skills & Other Capabilities:
• Great attention to detail and good analytical abilities.
• Good planning and organizational skills.
• Collaborative approach to sharing ideas and finding solutions.
• Ability to work independently as well as in a global team environment.