Big Data Administrator


Cognizant is hiring a Big Data Administrator who can design and implement Big Data and Cloud-based data processing and access solutions.

Summary of this job

Database Development & Administration

Company: Cognizant Technology Solutions Australia Pty Ltd

Location: Sydney

Work type: Full Time

Salary: n/a

Phone: +61-8-9435-3477

Fax: +61-3-4746-8645

E-mail: n/a


Detailed information about the Big Data Administrator vacancy, including terms and conditions

  • Work with Cognizant - one of the world's largest and fastest-growing IT companies
  • Join a dynamic, diverse and global team
  • Permanent role based in Sydney

About Cognizant

Cognizant (NASDAQ: CTSH) is a leading provider of information technology, consulting, and business process services, dedicated to helping the world's leading companies build stronger businesses. Headquartered in Teaneck, New Jersey (U.S.), Cognizant combines a passion for client satisfaction, technology innovation, deep industry and business process expertise, and a global, collaborative workforce that embodies the future of work. Cognizant is a member of the NASDAQ-100, the S&P 500, the Forbes Global 2000, and the Fortune 500 and is ranked among the top performing and fastest growing companies in the world.

Our Culture

Your passion, integrity and experience are integral to Cognizant's success. You will be welcomed into a dynamic and expanding global leader in IT and business consultancy, where you will be valued for who you are. We take pride in our partnership with our clients, so your ability to add value and provide exceptional service to them is fundamental to your success. In return, you will be empowered with opportunities to develop your career and collaborate with talented colleagues in a supportive, diverse environment.


Job Responsibilities: 

  • Deploy and maintain Hadoop clusters: add and remove nodes using cluster monitoring tools such as Ganglia, Nagios, or Cloudera Manager; configure NameNode high availability; and keep track of all running Hadoop jobs.
  • Implement, manage, and administer the overall Hadoop infrastructure.
  • Oversee the day-to-day running of Hadoop clusters
  • Participate in architectural discussions and perform system analysis, including a review of existing systems and operating methodologies; evaluate the latest technologies and recommend the solutions best suited to current requirements that will also simplify future modifications
  • Design appropriate data models for use in transactional and big data environments as input into Machine Learning processing
  • Design and build the infrastructure needed for optimal ETL from a variety of data sources for use with GCP services
  • Develop data and semantic interoperability specifications
  • Collaborate with the business to scope requirements. 
  • Collaborate with several external vendors to support data acquisition
  • Analyse existing systems to identify appropriate data sources
  • Identify, implement and continuously enhance the data automation process
  • Support continuous improvement in DevOps Automation
  • Provide design expertise in Master Data Management, Data Quality and Metadata Management.

Mandatory Skills: 

  • 8+ years of Data Engineering experience working with distributed architectures, ETL, EDW and Big Data technologies
  • Extensive experience working with SQL across a variety of databases
  • Experience working with both structured and unstructured data sources using cloud analytics tools (Cloudera, Hadoop, Google BigQuery, Bigtable, TensorFlow, etc.)
  • Experience with Data Mapping and Modelling
  • Experience with Data Analytics tools
  • Demonstrated ability in one or more of the following programming or scripting languages: Python, JavaScript, Java, R, UNIX shell, PHP, Ruby.
  • Experience with NoSQL databases, such as HBase, Cassandra, MongoDB or similar
  • Experience with Big Data tools such as Pig, Hive, Impala, Sqoop, Kafka, Flume and Jupyter
  • Experience with Hadoop, HDFS
  • Experience with Google Cloud services such as streaming + batch processing, Cloud Storage, Cloud Dataflow, Dataproc, DFunc, BigQuery and Bigtable.
  • Knowledge and demonstrated use of contemporary data mining, cloud computing and data management tools, including but not limited to Microsoft Azure, AWS, Google Cloud, Hadoop, HDFS, MapR and Spark.

Next Steps

If you would like to express interest in this role, please click the APPLY button now. Due to the high number of applicants, only shortlisted candidates will be contacted for further discussion within 3-5 business days. Thank you for your interest in this opportunity. For a complete list of opportunities with Cognizant, visit http://www.cognizant.com/careers
 

Cognizant is committed to providing Equal Employment Opportunities. The successful candidate will be required to undergo a background check.
