Keller Postman LLC

Data Engineer

Full-time

Location: 🇺🇸 United States

Salary

💰 $160,000 - $170,000 per year

Job Level

Mid-Level / Senior

Tech Stack

Azure · Cloud · ETL · Kafka · Python · SQL · Terraform

About the role

  • Develop, construct, test, and maintain data architectures, including databases and large-scale processing systems.
  • Design, build, and optimize data pipelines and ETL/ELT processes leveraging Snowflake and Azure Services.
  • Develop and maintain Snowflake data warehouses, ensuring efficient data modeling, partitioning, and performance tuning.
  • Implement data flow processes that automate and streamline data collection, processing, and analysis.
  • Ensure data governance, quality, and security best practices across all data platforms.
  • Collaborate with analytics and business teams to improve data models that feed business intelligence tools.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability.
  • Provide operational support for existing data infrastructure and develop new solutions as needed.
  • Monitor, troubleshoot, and optimize system performance in Azure and Snowflake environments.
  • Support CI/CD pipelines and automation for data workflows and deployments.
  • Stay current with industry trends and innovations in data engineering and propose improvements to the existing landscape.

Requirements

  • Proficient in Snowflake, Databricks, or similar platforms, with data warehousing experience
  • Proficiency in SQL (complex queries, stored procedures, query optimization), ETL design, and data modeling, plus familiarity with Python for data engineering tasks
  • Experience with Salesforce data integration is a plus
  • Strong analytical skills with attention to detail and accuracy
  • Strong knowledge of ETL/ELT patterns, orchestration, and workflow automation
  • Familiarity with Sigma Computing for reporting, data visualization, and business user self-service analytics
  • Understanding of data governance, security, and compliance frameworks (e.g., GDPR, HIPAA)
  • Experience with streaming data technologies (Kafka, Event Hubs, or similar)
  • Exposure to DevOps practices and Infrastructure as Code (e.g., Terraform, ARM templates)
  • Adept at queries, report writing, and presenting findings
  • Excellent problem-solving and troubleshooting skills
  • Ability to work in a fast-paced environment and manage multiple projects simultaneously
  • Strong communication skills, capable of conveying complex data issues to non-technical team members
  • Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field
  • Minimum of 5 years of experience in a data engineering role
  • Experience working with Azure cloud services and data warehousing technologies
  • Relevant certifications in Azure or other cloud technologies are beneficial
  • Must be able to read, write, and speak fluent English