
Kellton

Data Engineer

Company: Kellton

Location: Charlotte, NC

Posted on: January 10

Hi,

Greetings!

We are looking for a Sr. Data Engineer/Analytics Engineer for our direct client; this is a 100% remote opportunity.

More details are below.

Please let me know if you or your friends would be interested and available.

Thank you


Job Title: Sr. Data Engineer/Analytics Engineer

Assignment Type: 6-month contract-to-hire

Location: Remote, must work EST hours

Job Summary

We are looking for a hands-on Senior Data Engineer/Analytics Engineer with expertise in developing data pipelines and transforming data for downstream consumption. This role is central to designing, building, and maintaining our data infrastructure, with a focus on creating scalable pipelines, ensuring data integrity, and optimizing performance.

Key skills include strong Snowflake expertise, advanced SQL proficiency, data extraction from APIs using Python and AWS Lambda, and experience with ETL/ELT processes. Workflow automation using AWS Airflow is essential, and experience with Fivetran, dbt, or similar tools is also a must-have.
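
For illustration only, here is a minimal sketch of the kind of API-to-S3 extraction this role involves, written as an AWS Lambda handler; the endpoint, bucket, and key names are hypothetical assumptions, not part of the client's actual stack.

import json
import urllib.request
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")

API_URL = "https://api.example.com/v1/orders"  # hypothetical source API
BUCKET = "raw-landing-bucket"                  # hypothetical landing bucket


def lambda_handler(event, context):
    # Pull a batch of records from the source API.
    with urllib.request.urlopen(API_URL) as resp:
        records = json.loads(resp.read())

    # Land the raw payload in S3, partitioned by extraction time,
    # for downstream loading and transformation in Snowflake.
    key = f"orders/raw/{datetime.now(timezone.utc):%Y/%m/%d/%H%M%S}.json"
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(records).encode("utf-8"))

    return {"status": "ok", "records": len(records), "s3_key": key}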

Job Responsibilities

Design, build, test, and implement scalable data pipelines using Python and SQL (a brief illustrative sketch follows this list).

Maintain and optimize the performance of our Snowflake data warehouse, including data ingestion and query optimization.

Design and implement analytical data models using SQL in dbt and Snowflake, focusing on accuracy, performance, and scalability.

Own and maintain the semantic layer of our data modeling, defining and managing metrics, dimensions, and joins to ensure consistent and accurate reporting across the organization.

Collaborate with internal stakeholders to understand their data needs and translate them into effective data models and metrics.

Perform analysis and critical thinking to troubleshoot data-related issues and implement checks/scripts to enhance data quality.

Collaborate with other data engineers and architects to develop new pipelines and/or optimize existing ones.

Maintain code via CI/CD processes as defined in our Azure DevOps platform.
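
As a rough sketch of the orchestration pattern behind these responsibilities, the following Airflow DAG runs a Python extraction step and then a dbt transformation. It assumes Airflow 2.x; the DAG id, callable, and dbt selector are hypothetical placeholders rather than the client's actual code.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_orders_from_api(**context):
    # Hypothetical placeholder: pull records from a source API and stage them in S3.
    ...


with DAG(
    dag_id="orders_elt",              # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # "schedule" requires Airflow 2.4+
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_orders",
        python_callable=extract_orders_from_api,
    )

    # dbt transforms the staged data inside Snowflake (ELT pattern).
    transform = BashOperator(
        task_id="dbt_run_orders",
        bash_command="dbt run --select orders",  # hypothetical dbt selector
    )

    extract >> transform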


Job Qualifications

Highly self-motivated and detail-oriented with strong communication skills.

5+ years of experience in Data Engineering roles, with a focus on building and implementing scalable data pipelines for data ingestion and data transformation.

Expertise in Snowflake, including data ingestion and performance optimization.

Strong experience using ETL software (Fivetran, dbt, Airflow, etc.).

Strong SQL skills for writing efficient queries and optimizing existing ones.

Proficiency in Python for data extraction from APIs using AWS Lambda, Glue, etc.

Experience with AWS services such as Lambda, Airflow, Glue, S3, SNS, etc.
