PySpark Developer

Stamford, CT

Job Description


6-Month Contract


  • 8+ years of experience in data warehousing
  • 3+ years of experience leading onshore and offshore reporting development teams
  • 3+ years of Agile development experience
  • 3+ years of experience with the Hadoop ecosystem, including Spark, Storm, HDFS, Hive, HBase, and other NoSQL databases
  • BS in Computer Science or a similar technical degree
  • Experience developing Spark Streaming applications and analyzing data with Spark, including building ETL processes and connecting to various SQL and Redshift databases
  • Experience writing queries to move data from HDFS to Hive and to analyze the data
  • Understanding of partitioning, bucketing, Hive query optimization, etc.
  • Experience using Sqoop to move data between RDBMS and HDFS
  • Strong understanding of programming paradigms such as distributed architectures and multi-threaded program design
  • Very strong in algorithms and collection frameworks, with the ability to build high-performance engines that handle large volumes of data
  • Experience working in a data warehouse environment and with ETL is a huge plus

 Please email your resume to
