Job Overview

Location
Austin, Texas
Job Type
Full Time
Date Posted
1 month ago

Additional Details

Job ID
24869

Job Description

Your day to day:

  • Data Pipeline Development: Design, build, and maintain robust data pipelines that collect, process, and analyze data from developer tools across the SDLC, using tools such as UrbanCode Velocity to streamline and optimize pipeline development (see the Airflow sketch after this list).
  • Data Modeling: Design and implement data models that support efficient data storage, retrieval, and analysis.
  • Data Quality Assurance: Implement best practices for data quality, integrity, and security to ensure reliable analytics and reporting.
  • Data Warehousing/Data Lake: Design and manage the data warehouse or data lake, ensuring data is organized and optimized for reporting and analysis.
  • ETL Processes: Develop and maintain efficient ETL processes that transform raw data into usable formats for reporting and analysis.
  • Performance Optimization: Continuously monitor and optimize data pipelines and queries to ensure optimal performance and resource utilization.
  • Collaboration: Work closely with product managers, software engineers, and other stakeholders to understand data requirements and translate them into effective data solutions.
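For illustration only, here is a minimal sketch of such a pipeline as an Apache Airflow DAG (assuming Airflow 2.x; Airflow is named among the tooling below). The DAG id, task names, and extract/transform/load bodies are hypothetical placeholders, not this team's actual pipeline:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Hypothetical: pull raw build and deploy events from developer tools.
        return [{"service": "checkout", "status": "SUCCESS"}]

    def transform():
        # Hypothetical: clean and reshape the raw events for the warehouse.
        pass

    def load():
        # Hypothetical: write the curated rows to the warehouse or data lake.
        pass

    with DAG(
        dag_id="sdlc_metrics_pipeline",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # A simple linear extract -> transform -> load dependency chain.
        (PythonOperator(task_id="extract", python_callable=extract)
         >> PythonOperator(task_id="transform", python_callable=transform)
         >> PythonOperator(task_id="load", python_callable=load))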

What you need to bring:

Technical Expertise: Deep understanding of data engineering principles, data modeling, ETL processes, and data warehousing/data lakes. Proficiency in SQL and relevant programming languages (e.g., Python, Java).

Data Pipeline Tools: Experience with data pipeline tools such as Apache Airflow, Apache Kafka, or similar technologies; experience with IBM UrbanCode Velocity is a plus.

Cloud Platforms: Familiarity with cloud platforms such as AWS, Azure, or GCP, and their data services. 

Problem-Solving Skills: Strong analytical and problem-solving skills, with the ability to troubleshoot data issues and optimize data pipelines. 

Attention to Detail: Meticulous approach to data quality and accuracy, ensuring data integrity throughout the pipeline. 
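To make the data-quality expectation concrete, here is a minimal sketch of automated integrity checks in Python; the staging table, columns, and check definitions are hypothetical, and a real pipeline would run such queries against the warehouse rather than an in-memory SQLite database:

    import sqlite3

    # Hypothetical staging table with one deliberately bad row.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE stg_orders (order_id INTEGER, customer_id INTEGER, amount REAL);
        INSERT INTO stg_orders VALUES (1, 10, 99.5), (2, 11, 15.0), (3, NULL, 42.0);
    """)

    # Each check counts offending rows; zero means the check passes.
    checks = {
        "no_null_customer_ids":
            "SELECT count(*) FROM stg_orders WHERE customer_id IS NULL",
        "no_duplicate_order_ids":
            "SELECT count(*) - count(DISTINCT order_id) FROM stg_orders",
    }

    for name, sql in checks.items():
        failures = conn.execute(sql).fetchone()[0]
        status = "PASS" if failures == 0 else f"FAIL ({failures} rows)"
        print(f"{name}: {status}")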

Communication Skills: Ability to communicate technical concepts clearly and collaborate effectively with cross-functional teams. 

  • 5+ years of experience in data engineering, with a focus on data pipeline development and data architecture. 

  • Experience using IBM UrbanCode Velocity or similar enterprise-scale release management applications.

  • Familiarity with data visualization tools (e.g., Tableau, Looker, Power BI) and analytics frameworks.

  • Strong proficiency in SQL and experience with big data technologies (e.g., Hadoop, Spark, Kafka); a short Spark SQL sketch follows this list.

  • Proficiency with at least one programming language (e.g., Java).

  • Experience with cloud platforms (e.g., AWS, GCP, Azure) and data storage solutions (e.g., Redshift, BigQuery, Snowflake). 

  • Customer-centric mindset, with a track record of synthesizing customer insights and constructing product roadmaps, plus a demonstrated ability to empathize with developers and influence stakeholders at all levels of the organization.

  • Excellent problem-solving skills and the ability to work collaboratively in a fast-paced environment. 
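As a small, hedged illustration of the SQL-plus-Spark experience listed above, here is a Spark SQL sketch in Python (PySpark); the input path, event schema, and metric definitions are invented for the example:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("deploy_metrics").getOrCreate()

    # Hypothetical input: Parquet files of deployment events with columns
    # service, status, and deployed_at (timestamp).
    events = spark.read.parquet("s3://example-bucket/deploy_events/")
    events.createOrReplaceTempView("deploy_events")

    # Weekly deployment counts and failure rate per service, in plain SQL.
    weekly = spark.sql("""
        SELECT service,
               date_trunc('week', deployed_at) AS week,
               count(*) AS deploys,
               avg(CASE WHEN status = 'FAILED' THEN 1.0 ELSE 0.0 END) AS failure_rate
        FROM deploy_events
        GROUP BY service, date_trunc('week', deployed_at)
    """)

    weekly.write.mode("overwrite").parquet("s3://example-bucket/marts/weekly_deploys/")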


Qualification

Bachelor’s degree

Experience Requirements

5+ years of experience in data engineering (see above)
