
DevOps Engineer - Hadoop

The role:

We are looking for a DevOps Engineer – Hadoop to join our busy and dynamic team based in our India office.

Are you our next star player?

As part of our data platform team, you will drive the design and development of the core platform frameworks that support the delivery and construction processes of the Data Management, Data Discovery and Analytics group, using emerging big data technologies. You will apply your depth of knowledge and expertise across all aspects of the software development lifecycle, and partner continuously with your business stakeholders each day to stay focused on common goals.

Why we need you

  • You will efficiently translate architecture and low-level requirements into designs, perform optimizations on Big Data workloads, and investigate job bottlenecks.

  • You will be responsible for the design, development, and documentation of Hadoop applications, and will handle the installation, configuration, and support of Hadoop cluster nodes.

  • You will maintain and support backend MapReduce, Hive, Storm, and Flink applications and the Hadoop cluster. You will convert complex technical and functional requirements into detailed designs.

  • As a dedicated team member, you will propose best practices and standards and hand them over to operations. You will test software prototypes and transfer them to the operations team. You will maintain data security and privacy across the data warehouse.

Who are we looking for

  • Familiarity with the Hadoop ecosystem and its components, such as HDFS, Kafka, Flink, Hive, YARN, and HBase, with exceptional knowledge of Hadoop administration.
  • Experience with the AWS cloud.
  • Hands-on experience installing, configuring, supporting, and managing Hadoop clusters using Apache, Cloudera, or Hortonworks distributions.
  • Experience in Development Operations (DevOps), software configuration management, and build and release management.
  • Knowledge of ZooKeeper, JournalNodes, Hadoop HA, HDFS Federation, Hue, MapReduce, HBase, Hive, Apache Ranger, Apache Sentry, Kerberos, and Apache Knox.
  • Good experience setting up high-availability Hadoop clusters, with a focus on fault tolerance and auto-scaling.
  • Good knowledge of Windows or Linux systems, ideally RedHat/CentOS.
  • Proficiency in at least one scripting language such as Bash, PowerShell, or Python.
  • Knowledge of and experience with DevOps automation tools and CI using Maven, Nexus, or Jenkins.
  • Knowledge of Docker, Kubernetes, and Ansible/SaltStack to automate configuration management and application deployment.
  • Understanding of networking, firewalls, storage systems (DAS, NAS, SAN, FC, etc.), and file systems.
  • Good understanding of the SDLC and of distributed data, systems, and architectures.
  • Knowledge of job automation and monitoring tools such as Grafana, Ganglia, Kibana, and Nagios.
  • Analytical and problem-solving skills, and the ability to apply them in the Big Data domain.

What’s in it for you?

Our experience-based salaries are competitive.

Your package will include:

  • Discretionary annual performance bonus

  • 30 days paid leave

  • Annual allowance for health and dental services

  • The option to join our company pension scheme

  • A personal interest allowance to let you learn something new or pursue a hobby

  • Looking to extend your family? You will receive a cash gift of 34,000 INR for your new addition whilst working for us

  • 26 weeks' maternity leave at 100% pay and 4 weeks' secondary (paternity) leave at 100% pay

  • External learning support of up to £2,000 or the equivalent in local currency, four dedicated learning “Power Hours” every month during office time, full access to the Udemy and Mindtools platforms, an in-house leadership programme, and many other training opportunities for developing your skills and progressing your career.

What happens next?

We will aim to get back to you as soon as possible. If you meet the criteria, we’ll invite you to a phone interview, and if that goes well, we’ll meet you for a Zoom or face-to-face interview.

The Group

PokerStars is part of Flutter Entertainment Plc, a global sports betting, gaming and entertainment provider headquartered in Dublin and part of the FTSE 100 index of the London Stock Exchange. Flutter brings together exceptional brands, products and businesses and a diverse global presence in a safe, responsible and ultimately sustainable way.

We are an equal opportunity employer that values diversity. We do not discriminate on any protected characteristic as defined by applicable law.

We will look to provide reasonable accommodation for applicants with disabilities to participate in the job application or interview process. If you need assistance, please contact:  

Please note we cannot accept general applications; this inbox is just for providing support to those who need it.

  • India Office - Hyderabad

Your Rewards

Here at Flutter International, if you’re up for a challenge, the rewards are great.

Work your way

We don't tell our brands what to do; we empower and support them to create the best results possible.

It's the same for our people too. We'll work with you to find the arrangement that brings out your best and make it a reality.


Be part of our talent community

Join our talent community to be the first to hear the latest opportunities across Flutter International.
