Director, Data and Dev Operations

Job Description

Title: Director, Data and Dev Operations

Type: Full Time Permanent

Location: Stamford, CT

Must Have: Experience in AWS, DevOps, Data, CI/CD, and optimization/automation of dev timelines.

US citizens, Green Card holders, and those otherwise authorized to work in the US are encouraged to apply. We are unable to sponsor H-1B candidates at this time.

The Director, Data & Dev Operations is responsible for orchestrating WWE’s multi-cloud big data pipeline and integration platform. Reporting to the Head of Data Technology, they will manage a team of production support engineers and oversee operations for our big data management platforms. This role will be key to the rollout and continuous optimization of WWE’s data platform and will partner with the Development, Cloud Operations, and Privacy teams to manage and scale operations. Our environment is dynamic, fast-paced, and lots of fun.

  • Key Responsibilities:

    • Develop a Data CI/CD framework setting up best practices across the data management platform in AWS and Google Cloud Platform
    • Manage the technical flow of data from various collection sources through final delivery of product
    • Build technology operations standard operating procedures across change management, release management, and data pipeline operations, incorporating industry best practices (SOX and other compliance requirements)
    • Implement Cloud Security Administration working with Cloud Ops
    • Create and maintain inbound and outbound data delivery SLAs
    • Implement monitoring tools, measure metrics and improve performance for WWE’s data assets and analytics services in the cloud
    • Manage data operations across the company’s AWS enterprise data warehouse (EDW) and big data management platform (DMP)
    • Continuously reassess cloud compute capabilities to optimize the environment and infrastructure and get the best value and performance out of database assets
    • Set up processes for user access to data and compute at scale while maintaining visibility into infrastructure uptime, performance, and cost implications
    • Provide 24/7 support for operations
    • Plan and transition new products from Product Development into the Data Operations team
    • Define, implement, and improve workflow processes and tooling, leveraging economies of scale in the cloud across Informatica, Talend, PySpark, and AWS Glue
    • Investigate and troubleshoot data pipeline issues, triage with technical teams and data partners as needed.
    • Represent data technology department considerations in broader WWE cloud operations discussions and ensure alignment with the overall enterprise cloud strategy
    • Stay up to date with improvements in technology across AWS, Google Cloud, and Microsoft Azure, as well as open-source tools that help with systems operations
    • Manage analytics and data consumption environments leveraging serverless computing (e.g., Qubole, Athena, or similar)
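As a rough illustration of the SLA and monitoring responsibilities above, the following minimal Python sketch checks whether a dataset's latest delivery falls within its SLA window. The dataset names, thresholds, and `check_sla` helper are illustrative assumptions, not WWE's actual tooling.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical SLA windows (hours) per dataset; real values would come from the SLA catalog.
SLA_HOURS = {"inbound_events": 4, "outbound_partner_feed": 24}

def check_sla(dataset: str, last_delivery: datetime,
              now: Optional[datetime] = None) -> bool:
    """Return True if the dataset's most recent delivery is within its SLA window."""
    now = now or datetime.now(timezone.utc)
    return now - last_delivery <= timedelta(hours=SLA_HOURS[dataset])
```

In practice a check like this would feed a monitoring or alerting system rather than return a bare boolean.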


  • Requirements:

    • 7+ years of experience in data operations and DevOps
    • 7+ years of technology management experience with onshore and offshore development teams
    • 4+ years of experience with the Hadoop ecosystem, including Spark, Storm, HDFS, Hive, HBase, and other NoSQL databases
    • Experience developing Spark Streaming applications and analyzing data with Spark (conducting ETL processes and connecting to various SQL and Redshift databases)
    • 5+ years of solid experience with programming languages such as Python, Java, Ruby, C/C++, Node.js, and Bash, and with database technologies such as MySQL and SQL Server
    • 3+ years of experience designing and developing packages using Puppet and/or Chef
    • Experience writing, maintaining, and deploying RESTful services and queue management systems such as AWS SQS
    • Experience building and automating CI/CD pipelines
    • Experience implementing TeamCity
    • Experience writing queries to move data from HDFS into Hive and to analyze that data
    • Solid understanding of partitions, Hive query optimization, bucketing, etc.
    • Experience using Sqoop to move data between RDBMS and HDFS
    • Strong understanding of programming paradigms such as distributed architectures and multi-threaded program design
    • AWS DevOps and SysOps certifications are a plus
    • BS in Computer Science or similar technical degree
    • Media industry experience is a plus
    • AWS Solutions Architect certification is a plus
    • Experience implementing user access controls, security, and user activity logging across data platforms
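As a rough illustration of the bucketing concept called out in the requirements: Hive assigns rows to a fixed number of buckets by hashing the bucketing column and taking it modulo the bucket count. This stdlib-only Python sketch imitates that idea; the hash function, keys, and bucket count are made up, and Hive's actual hash differs.

```python
def bucket_for(key: str, num_buckets: int) -> int:
    """Assign a key to one of num_buckets buckets: hash(key) mod num_buckets.
    Uses a simple deterministic string hash; Hive's own hash function differs."""
    h = 0
    for ch in key:
        h = (h * 31 + ord(ch)) & 0x7FFFFFFF  # keep the hash a non-negative 31-bit int
    return h % num_buckets

# Hypothetical keys: equal keys always land in the same bucket, which is what
# makes bucketed joins and sampling efficient.
rows = ["user_1", "user_2", "user_3", "user_4"]
buckets = {k: bucket_for(k, 4) for k in rows}
```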

Apply Now