Role: Big Data Engineer with GCP
Location: Phoenix, AZ (Day 1 onsite)
Client: Confidential
Job Description:
We are seeking a highly skilled Big Data Engineer with strong experience in Google Cloud Platform (GCP) and Apache Spark technologies. The ideal candidate has a solid background in Scala and Spark Streaming, along with hands-on experience developing big data processing pipelines.
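For illustration only, the sketch below shows the kind of work this role involves: a minimal Spark Structured Streaming job written in Scala. The broker address, topic name, event schema, and console sink are hypothetical placeholders, not details taken from this posting; a production pipeline on GCP would typically read from Pub/Sub or Kafka and write to BigQuery or Cloud Storage via the corresponding connectors.

```scala
// Minimal illustrative sketch of a Spark Structured Streaming job in Scala.
// All names below (broker, topic, schema fields) are hypothetical placeholders.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object StreamingPipelineSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("streaming-pipeline-sketch")
      .getOrCreate()

    // Hypothetical schema for incoming JSON events.
    val eventSchema = new StructType()
      .add("eventId", StringType)
      .add("eventType", StringType)
      .add("eventTime", TimestampType)

    // Read a stream of JSON events from Kafka (broker and topic are placeholders).
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()
      .select(from_json(col("value").cast("string"), eventSchema).as("event"))
      .select("event.*")

    // Count events per type over 5-minute windows, tolerating late data
    // up to 10 minutes via a watermark.
    val counts = events
      .withWatermark("eventTime", "10 minutes")
      .groupBy(window(col("eventTime"), "5 minutes"), col("eventType"))
      .count()

    // Write windowed counts to the console; a real pipeline would sink to
    // BigQuery or Cloud Storage instead.
    val query = counts.writeStream
      .outputMode("update")
      .format("console")
      .start()

    query.awaitTermination()
  }
}
```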
Required Skills:
- 8+ years of overall IT experience in data engineering and big data technologies.
- Strong hands-on experience with Apache Spark and Spark Streaming.
- Proficiency in Scala is a must.
- Solid experience working with Google Cloud Platform (GCP) services (e.g., Dataflow, BigQuery, Pub/Sub, Cloud Storage).
- Familiarity with Java and Python for data processing tasks (good to have).
- Proven ability to design and implement scalable, distributed big data systems.
- Experience with real-time/streaming data processing.
Additional Information:
- Excellent problem-solving and communication skills.
- Must be available to work onsite from Day 1 in Phoenix, AZ.
- Immediate joiners or candidates with a short notice period are preferred.
Job Type: Contract
Pay: $50.00 - $55.00 per hour
Expected hours: 50 per week
Benefits:
- 401(k)
- Dental insurance
- Health insurance
Work Location: In person