Role: Hadoop Tech Lead
Location: Charlotte, NC
Duration: 12-18 months
Pay Rate: $65 to $68 per hour on W2
• Expertise in Big Data technologies and tools such as Spark, Kafka, Hive, HBase, Sqoop, Pig, Impala, Flume, Oozie, and MapReduce.
• Solid knowledge of state-of-the-art programming languages such as Java, Python, and Scala, and of object-oriented approaches to designing, coding, testing, and debugging programs.
• Design, build, and maintain Big Data workflows/pipelines that process billions of records in large-scale data environments, with experience in the end-to-end design and build of near-real-time and batch data pipelines.
• Lead code review sessions to validate adherence to development standards and benchmark application performance through capacity testing.
• Experience with software testing frameworks.
• Leverage DevOps techniques, with experience using DevOps tools (GitHub, Jira, Jenkins, Crucible) for continuous integration, continuous deployment, and build automation.
• Develop, implement, and optimize streaming, data lake, and big data analytics solutions.
• Support reusable frameworks and data governance processes by partnering with LOBs on any code/requirements remediation.
• Engage in application design and data modeling discussions, and participate in developing and enforcing data security policies.