Job description
We are seeking a dynamic team player to support multiple service offerings within the High Performance Computing (HPC) service, collaborating closely with colleagues in both North America and Asia-Pacific. The role provides day-to-day support and guidance to software engineers, data analysts, and data scientists on using GCP Dataproc and Dataflow services.
Required skills
1. Python/PySpark
2. Apache Beam Coding Skills
3. Apache Airflow
4. Terraform Coding Skills
5. Tekton Pipeline Experience
6. SQL, Java, Shell scripting
7. Docker/Kubernetes experience
Responsibilities
1. Provide day-to-day support and guidance to software engineers, data analysts, and data scientists on using GCP Dataproc and Dataflow services, including job and cluster creation and error handling.
2. Help troubleshoot issues on GCP and Hadoop platforms including but not limited to Dataproc, Dataflow, Hive, Spark, Oozie, Kafka, NiFi, etc.
3. Perform testing and proofs of concept (PoCs) for different Dataproc and Dataflow use cases
4. Participate in the monthly on-call rotation
Industry Type: Auto Components
Department: Data Science & Analytics
Employment Type: Full Time, Permanent
Role Category: Business Intelligence & Analytics
Education
UG: Any Graduate
PG: Any Postgraduate