Job Overview

Location
New York, New York
Job Type
Full Time
Date Posted
7 months ago

Additional Details

Job ID
22057

Job Description

Responsibilities

  • Design and build data pipelines with an emphasis on scale, performance, and reliability.
  • Provide technical expertise in the design and implementation of a Data Lake solution powered by Databricks on the AWS cloud.
  • Ensure data governance principles are adopted, and that data quality checks and data lineage are implemented at each hop of the data pipeline.
  • Partner with the data teams and the enterprise architecture organization to ensure the best use of standards for key data domains and use cases.
  • Be a continuous learner with an eye on emerging trends in data lake architecture and enterprise data solutions.
  • Ensure compliance through the adoption of enterprise standards and the promotion of best practices and guiding principles aligned with organizational standards.

Qualifications

  • Bachelor's or Master's degree in Computer Science or Information Technology.
  • 8+ years of experience building solutions with big data technologies.
  • Strong programming experience with more than one of Java, Scala, Python, and Spark.
  • Hands-on experience designing and building streaming data pipelines using Kafka, Confluent, etc.
  • Experience building Data Lake and Data Warehouse solutions using ETL/ELT pipelines on Databricks, Snowflake, Azure Data Lake, etc. is strongly preferred.
  • Strong understanding of database and analytical technologies in the industry, including MPP and NoSQL databases.
  • Strong understanding of cloud platforms such as AWS, Azure, or GCP and their services (e.g., EC2, S3, AKS, EKS). An AWS or other public cloud certification is a must.
