Responsibilities

  • The ideal candidate has 5+ years of experience with strong knowledge of data storage, data structures, data processing, and data security
  • Ability to design, develop, and implement end-to-end data pipelines for efficient data extraction and transformation
  • Experience in workflow management using Airflow and in handling large volumes of data (see the pipeline sketch after this list)
  • Write complex SQL queries and optimize database performance for varied data retrieval needs (a query-tuning sketch also follows the list)
  • Collaborate with the infrastructure team to ensure the overall health and scalability of the data platform
  • Ability to tackle complex problems, learn quickly and independently, and proactively recommend solutions to improve efficiency or effectiveness
  • Good team player and quick learner, capable of working in high-demand environments and ready to take on new challenges
  • Strong verbal and written communication skills, with the ability to work in large teams and interact with technical teams
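
As an illustration of the pipeline and Airflow items above, the sketch below outlines a daily extract-transform-load job using Airflow's TaskFlow API. It assumes Airflow 2.4+; the DAG name, task logic, and stub data are invented for illustration rather than tied to any particular production setup.

    # Minimal daily ETL DAG sketch (assumes Airflow 2.4+; data and logic are placeholders).
    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def daily_sales_pipeline():
        @task
        def extract() -> list[dict]:
            # A real task would pull rows from a source system (e.g. via a database hook).
            return [{"order_id": 1, "amount": 125.0}, {"order_id": 2, "amount": 80.5}]

        @task
        def transform(rows: list[dict]) -> dict:
            # Aggregate the extracted rows into a daily summary.
            return {"order_count": len(rows), "total_amount": sum(r["amount"] for r in rows)}

        @task
        def load(summary: dict) -> None:
            # A real task would write the summary to the warehouse; here it is just logged.
            print(f"Loaded daily summary: {summary}")

        load(transform(extract()))


    daily_sales_pipeline()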
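
Similarly, for the SQL tuning item, the sketch below uses Python's standard-library sqlite3 module with an invented orders table to show how adding an index changes a query plan from a full table scan to an index search; the same idea carries over to EXPLAIN/ANALYZE against a production database.

    # Query-tuning sketch with sqlite3 (schema and data are invented for illustration).
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
    )
    conn.executemany(
        "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
        [(i % 100, i * 1.5) for i in range(10_000)],
    )

    query = "SELECT customer_id, SUM(amount) FROM orders WHERE customer_id = ? GROUP BY customer_id"

    # Plan before indexing: a full table scan.
    print(conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())

    # Add an index on the filter column and re-check the plan (now an index search).
    conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
    print(conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())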