Job Details
Hiring for Spark, Hadoop with Java - Singapore
Hadoop
Singapore
2020-09-09 09:49:25

Job Duties & responsibilities (List the principal duties. Use concise statements that provide a clear understanding of the level of responsibility, complexity, creativity and analysis performed in this position.)

  1. Design and develop data pipelines from ingestion to analytics/compute using Spark and other tools. Understand the data problem at hand and implement a solution under the architect's direction.
  2. Serve as a point of contact for customers, management, and the technical team; ensure transparency in product status and strive for successful delivery.
  3. Estimate effort, lead the team, and help resolve issues.
  4. Establish best practices and ensure the team follows all standards and engineering principles.
  5. Python/Java/Spark experience and RDBMS understanding to build the business logic.
  6. Python/Spark engineering for DAG development.
  7. Expose data as a service (DaaS) – Python/Java/REST/SQL experience.
  8. Adaptability to Agile methodology and Scrum rituals.
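Duties 1 and 6 above concern building data pipelines as DAGs. As a minimal, hypothetical sketch (the stage names are illustrative, not from this posting), the ordering of pipeline stages can be modeled with Python's standard-library `graphlib`:

```python
from graphlib import TopologicalSorter

# Hypothetical ingestion-to-analytics pipeline: each stage maps to the
# set of stages it depends on. Stage names are illustrative only.
pipeline = {
    "ingest": set(),
    "clean": {"ingest"},
    "enrich": {"clean"},
    "aggregate": {"enrich"},
    "publish": {"aggregate"},
}

# static_order() yields the stages in a dependency-respecting order;
# for this linear chain there is exactly one valid ordering.
order = list(TopologicalSorter(pipeline).static_order())
print(order)  # → ['ingest', 'clean', 'enrich', 'aggregate', 'publish']
```

In a real Spark deployment the scheduling would typically be handled by an orchestrator rather than hand-rolled, but the dependency-graph idea is the same.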

Required Experience (Indicate nature and extent of work experience including minimum number of years required.)

  1. Overall 10+ years of experience with data initiatives
  2. 6+ years of hands-on experience with distributed data architectures is a must
  3. 6+ years of Core Java 8, Scala, or Python is a must
  4. Working knowledge of Spark is a must
  5. Working knowledge of Hadoop is a must
  6. Strong SQL skills
  7. Exposure to data visualization tools