Hadoop Data Engineer

San Jose, CA 94088

Job Category: Information Technology

Job Number: 121348

Job Title           Hadoop Data Engineer

Location          San Jose, CA - 94088

Duration          6 months

About Our Client: Part of the Group, the client is a US$4.6 billion company with over 115,000 employees across 90 countries. It provides services to customers that include Fortune 500 companies, and it is one of the Fab 50 companies in Asia, a list compiled by Forbes. The client was ranked #5 among India's software services (IT) firms and #111 overall on the Fortune India 500 list for 2012. On 25 June 2013, the client announced the completion of a merger with another company.

Job Description:

  • The Data and Analytics Platform team resides within Adobe's Information and Data Services team, and we are looking for a Hadoop Platform Engineer who will be responsible for the implementation and ongoing administration of the Hadoop infrastructure, including designing, deploying, monitoring, tuning and troubleshooting.
  • The challenge:
  • The Cloud Technology organization builds platform and client services that are foundational building blocks for many other Adobe products and services. Areas of focus include: identity, security, cloud storage, e-commerce, workflow management, synchronization, customer facing web apps, scalability, infrastructure management and search, just to name a few. Our mission is to build highly scalable, highly available and highly resilient services that fulfill the business objectives of Adobe.
  • Responsibilities:
  • Own the platform architecture and drive it to the next level of effectiveness to support current and future requirements.
  • Design, implement and maintain cluster security, and own data capacity planning and node forecasting.
  • Provide hardware architectural guidance, plan and estimate cluster capacity, and create roadmaps for the Hadoop cluster deployment.
  • Work closely with the Hadoop development, infrastructure, network, database, and business intelligence teams.
  • Participate in a 12x7 rotation for production issue escalations.
  • Communicate effectively with people at all levels of the organization.
  • Qualifications:
  • Experience deploying and configuring Cloudera, HDP, or core Apache Hadoop and related infrastructure.
  • Experience with tools integration, automation, and configuration management systems.
  • Experience with networking and systems administration.
  • 3+ years of hands-on experience with the Hadoop infrastructure stack (e.g., HDFS, MapReduce, HBase, Flume, Spark, Pig, Hive, Oozie, YARN, ZooKeeper, Presto); a brief illustrative monitoring sketch follows this list.
  • 2+ years of experience in Python or Perl.
  • BA/BS degree in Computer Science or a related technical discipline, or equivalent practical experience.
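
For illustration only (not part of the client's posting): the following minimal Python sketch shows the kind of scripted health check that the monitoring and automation duties above might cover. The NameNode hostname (namenode.example.com), the Hadoop 3 default HTTP port 9870, the alert threshold, and the use of the `requests` package are all assumptions, and JMX bean/attribute names can vary between Hadoop versions.

#!/usr/bin/env python3
"""Minimal HDFS health probe -- an illustrative sketch only.

Assumptions (not from the posting): the NameNode exposes its JMX servlet
at http://namenode.example.com:9870/jmx (Hadoop 3 default port; Hadoop 2
commonly uses 50070), and the `requests` package is installed.
"""
import sys
import requests

NAMENODE_JMX = "http://namenode.example.com:9870/jmx"   # hypothetical host
BEAN_QUERY = "Hadoop:service=NameNode,name=FSNamesystemState"
CAPACITY_ALERT_PCT = 80.0   # arbitrary threshold chosen for this sketch


def fetch_fsnamesystem_state(jmx_url: str) -> dict:
    """Return the FSNamesystemState JMX bean as a dict."""
    resp = requests.get(jmx_url, params={"qry": BEAN_QUERY}, timeout=10)
    resp.raise_for_status()
    beans = resp.json().get("beans", [])
    if not beans:
        raise RuntimeError(f"JMX bean not found: {BEAN_QUERY}")
    return beans[0]


def main() -> int:
    state = fetch_fsnamesystem_state(NAMENODE_JMX)
    live = state.get("NumLiveDataNodes", 0)
    dead = state.get("NumDeadDataNodes", 0)
    total = state.get("CapacityTotal", 0)
    used = state.get("CapacityUsed", 0)
    used_pct = 100.0 * used / total if total else 0.0

    print(f"live datanodes: {live}, dead datanodes: {dead}")
    print(f"capacity used: {used_pct:.1f}%")

    # A non-zero exit code lets a cron job or alerting wrapper page the on-call.
    if dead > 0 or used_pct > CAPACITY_ALERT_PCT:
        return 1
    return 0


if __name__ == "__main__":
    sys.exit(main())

Run ad hoc or from cron; a non-zero exit status signals dead DataNodes or capacity above the assumed 80% threshold.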

About ASK: ASK Staffing is an award-winning technology and professional services recruiting firm serving Fortune 500 organizations nationally. With 5 nationwide offices, two global delivery centers, and employees in 42 states, ASK Staffing connects people with amazing opportunities.

#ASK123 

Contact:

Yamini:       yaminis@askstaffing.com       -     678-203-2377

                    techmhyd@askstaffing.com

Yamini Seelam
Sr Client Service Manager
