KEY RESPONSIBILITIES AND REQUIRED SKILLS
- Knowledge of ReactJS/Angular is a plus.
- Big Data Engineer with a solid background in the broader Hadoop ecosystem and real-time analytics tools, including PySpark/Scala-Spark/Hive/Hadoop CLI/MapReduce/Storm/Kafka/Lambda Architecture.
- Comfortable working across the broader Hadoop ecosystem.
- Familiar with job scheduling challenges in Hadoop.
- Experienced in creating and submitting Spark jobs.
- Experienced with Kafka/Storm and real-time analytics.
- Strong background in Core Java and Python/Scala, along with their related libraries and frameworks.
- Experienced with Spring Framework and Spring Boot.
- Unix/Linux expertise; comfortable with the Linux operating system and shell scripting.
- Background in PL/SQL and RDBMS, with Oracle/MySQL.
- Familiarity with ORMs a plus.
- Design, development, configuration, and unit/integration testing of web applications to meet business-process and application requirements.
- Familiar with configuration management/automation tools such as Ansible/Chef/Puppet.
- Comfortable with microservices, CI/CD, Docker, and Kubernetes.
- Familiarity with AT&T’s ECO platform is a plus.
- Comfortable using and configuring Jenkins for deployment orchestration.
- Experienced in creating/modifying Docker images and deploying them via Kubernetes.
EXPERIENCE – 8 to 10 years
LOCATION – Bangalore
REGISTRATION LINK – https://forms.gle/JVZkhqmaSfs3wbuj7