
Sibitalent Corp

Big Data Developer

Company: Sibitalent Corp

Location: Plano, TX

Posted on: November 21

#W2Position

Job Posting: Big Data Engineer

Client: Bank of America

Location: Plano, Texas (open to relocation to Jackson or Charlotte)

Job Type: 6-month contract

Visa: USC/GC/H4EAD/GCEAD

Work Hours: 8:00 AM to 5:00 PM

Interview: Two technical rounds


Job Summary:

We are seeking an experienced Big Data Engineer to join the team at Bank of America. This role focuses on developing and implementing solutions for data warehousing, data marts, and master data management within a Hadoop ecosystem. The ideal candidate will have expertise in designing scalable platforms using modern big data technologies.


Responsibilities:

  • Analyze and redesign the current Hadoop-based Master Data Management platform, workflows, and transformations to create scalable solutions using streaming data pipelines.
  • Reengineer traditional database systems and stored procedures using Big Data services.
  • Develop and implement solutions using Confluent Kafka, including connectors, Kafka Streams, and KSQL.
  • Design and optimize PySpark and Spark SQL solutions.
  • Evaluate and performance-tune Hive managed tables and implementations.
  • Design, implement, and tune Apache Phoenix and HBase solutions.
  • Create and maintain automation scripts (e.g., Bash, Python).
  • Manage multiple priorities and ensure timely delivery of solutions.
  • Package, promote, and maintain code across development, test, and production environments.


Required Skills:

  • 10+ years of IT experience, with at least 5 years in Data Warehousing, Data Marts, or Master Data Management.
  • In-depth knowledge of the Hadoop ecosystem, including HDFS, Spark, Sqoop, Oozie, Kafka, Hive (Managed & Iceberg formats), Phoenix, and HBase.
  • Expertise in Python, PySpark, Spark SQL, and object-oriented programming concepts.
  • Advanced SQL skills for query optimization and database management.
  • Experience with RDBMS (Oracle, DB2, SQL Server).
  • Proficiency in SDLC and development best practices.
  • Ability to translate mid-level design documentation into low-level design and deliver effective solutions.


Must-Have Skills:

  • Experience with Trino.
  • Experience with Apache Flink.


Qualifications:

  • Minimum of 10 years of relevant experience in IT and Big Data Engineering.
  • Bachelor's Degree in Computer Science, Information Technology, or a related field.




Thanks & Regards

Akash Pandey

Technical Recruiter

Mob: +1 936-289-3006 EXT: 006

E-Mail: [email protected]

Website: www.sibitalent.com
