HCL Global Systems Inc

Now Hiring: Senior Data Engineer / Data Analyst / Informatica Developer

Company: HCL Global Systems Inc

Location: Hybrid

Posted on: September 16

Role Summary

We are actively seeking experienced Data Engineers / Data Analysts / Informatica Developers / Reporting Analysts with strong hands-on skills across the data ecosystem. You'll design, optimize, and maintain data pipelines, ETL/ELT workflows, reporting frameworks, and dashboards in a fast-paced, enterprise-level environment.


Due to client requirements, applicants must be willing and able to work on a W2 basis. For our W2 consultants, we offer a comprehensive benefits package including Medical, Dental, and Vision coverage.

Requirements Recap:

  • Employment Type: W2 only (consultants must be willing to work under our W2)
  • Visa: Open to all work authorizations
  • No C2C / No Third Parties
  • Location: Westlake, TX; Smithfield, RI; Durham, NC; Merrimack, NH; Boston, MA; Jersey City, NJ.
  • Work Model: Hybrid (alternating two weeks onsite, two weeks remote)


Key Responsibilities

  • Design, build, and maintain end-to-end data pipelines (ETL/ELT).
  • Develop & optimize Informatica mappings and workflows.
  • Perform data extraction, transformation, and loading using SQL, PL/SQL, and Informatica.
  • Build and manage reporting dashboards using Power BI, Tableau, or QlikView.
  • Gather & translate business requirements into technical solutions.
  • Automate data ingestion & validation via scripts.
  • Enforce data governance, quality, and security best practices.
  • Participate in code reviews, documentation, and process optimization.


Required Skills

  • Strong expertise in SQL & PL/SQL (Oracle or equivalent RDBMS).
  • Informatica PowerCenter / Informatica Cloud hands-on experience.
  • Proficient in data warehousing, data modeling, and ETL/ELT processes.
  • Skilled in reporting/BI tools: Power BI / Tableau / QlikView / OBIEE.
  • Python or Shell scripting for data processing (a plus).
  • Experience with large-scale structured & unstructured datasets.
  • Familiarity with cloud platforms (AWS, Azure, Snowflake, Databricks preferred).
  • Knowledge of version control (Git) and CI/CD (Jenkins).
  • Experience in Agile (JIRA, Confluence).


Nice-to-Have

  • Exposure to data lake/data mesh architecture.
  • Experience with Databricks / Snowflake / Azure Data Factory.
  • Financial domain background (asset management, trading, investments).