Software Engineering Specialist – Big Data, Dev Specialist (Pune+128890)

Company Name: Amdocs Inc

Location: Pune, MH, IN

Job Duration: 2021-10-14 to 2021-11-13


Job ID: 128890 
Required Travel: Minimal
Managerial: No
Location: India- Pune (Amdocs Site) 

Who are we?

If you’re a smartphone user, then you are part of an ever more connected and digital world. At Amdocs, we are leading the digital revolution into the future. From virtualized telecommunications networks, Big Data and the Internet of Things to mobile financial services, billing and operational support systems, we are continually evolving our business to help you become more connected. We make sure that when you watch a video on YouTube, message friends on Snapchat or send your images on Instagram, you get great service anytime, anywhere, and on any device. We are at the heart of the telecommunications industry, working with giants such as AT&T, Vodafone, Telstra and Telefonica, helping them create an amazing new world for you where technology is used in new ways every single day.

In one sentence

Responsible for the design, development, modification, debugging and/or maintenance of software systems. Works on specific modules, applications or technologies, and handles complex assignments during the software development process.

What will your job look like?

  • Serve as a senior technical lead on the Data Platform, covering both Data Engineering and Data Platform Administration.
  • Design, create, enhance and support Hadoop data pipelines for different domains using Big Data technologies.
  • Develop big data infrastructure platform enhancements.
  • Perform application analysis and propose technical solutions for application enhancements, optimization of existing ETL processes, etc.
  • Handle and resolve production issues (Tier 2 and weekend support) and ensure SLAs are met.

All you need is…

  • 4+ years of experience with Hadoop architecture and other relevant Big Data tools such as HBase, HDFS, Hive, MapReduce, etc.
  • Superb communication and collaboration skills
  • Independent and self-learning attitude
  • Strong experience with:
    • Hadoop platform (Ambari etc.)
    • Kafka message queuing technologies
    • Apache NiFi stream data integration
    • RESTful APIs and open systems
    • Object-oriented/functional scripting using languages such as R, Python, Scala, or similar
    • Informatica BDM or another Big Data ETL tool to implement complex data transformations
  • Strong ability to design, build and manage data pipelines in Python and related technologies.
  • Hands-on experience with SQL, Unix and advanced Unix shell scripting.
  • Knowledge of handling XML, JSON, structured, fixed-width and unstructured files using custom Pig/Hive.
  • Knowledge of any cloud technology (AWS/Azure/GCP)

Good to have skills:

  • Experience working with Data Discovery, Analytics and BI software tools such as Tableau or Power BI.
  • Understanding of / experience with Data Virtualization tools such as TIBCO DV, Denodo, etc.
  • Understanding of / experience with Data Governance tools such as Informatica Data Catalog.
  • Knowledge of any cloud technology (AWS/Azure/GCP) is a plus.

Why you will love this job:

  • The chance to serve as a specialist in software and technology.
  • You will take an active role in technical mentoring within the team.
  • We provide stellar benefits from health to dental to paid time off and parental leave!

Amdocs is an equal opportunity employer. We welcome applicants from all backgrounds and are committed to fostering a diverse and inclusive workforce.