Penn Interactive

Software Engineer, Data

full-time

Location Type: Remote

Location: Remote • 🇺🇸 United States

Salary

💰 $115,000 - $160,000 per year

Job Level

Junior, Mid-Level

Tech Stack

AWS, Azure, BigQuery, Cloud, Distributed Systems, Google Cloud Platform, Jest, Kubernetes, Python, RSpec, SQL

About the role

  • As part of the Data Engineering team, you will work with smart, friendly, and dedicated Data Engineers, Analysts, and Data Scientists to develop high-quality, sustainable data-driven solutions that improve profitability, growth, and the user experience.
  • Collaborate with Data Science, Reporting, Analytics, and other engineering teams to build data pipelines, infrastructure, and tooling to support business initiatives
  • Oversee the design and maintenance of data pipelines and contribute to the continual enhancement of the data engineering architecture
  • Collaborate with the team to meet performance, scalability, and reliability goals
  • Write tests and thorough documentation for processes and tooling
  • Adapt to working with new technologies and frameworks, sometimes leading the investigation into their usefulness to the team
  • Maintain and expand existing systems, tooling, and infrastructure
  • Maintain the data warehouse, ensuring data integrity and performance
  • Take ownership of projects, planning and collaborating with members of the Analytics and Reporting teams and others across the company
  • Other duties as required.

Requirements

  • A solid foundation in computer science, with strong competencies in data structures, distributed systems, algorithms, and software design
  • 2+ years of experience in data engineering
  • Strong knowledge of Python
  • Strong knowledge of relational databases and SQL
  • Experience building out a scalable infrastructure to fit the needs of a growing company
  • Experience in data ingestion, processing, and orchestration, including third-party APIs, change data capture, and streaming data
  • Experience with BigQuery, Snowflake, or other cloud-based SQL data warehouses
  • Experience with at least one major cloud platform (AWS, GCP, Azure)
  • Experience with Kubernetes
  • Experience with testing frameworks such as RSpec, Jest, pytest, or equivalent
  • Strong organization and collaboration skills
  • Excellent written and oral communication skills

Benefits

  • Competitive compensation package
  • Fun, relaxed work environment
  • Education and conference reimbursements
  • Opportunities for career progression and mentoring others

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
data structures, distributed systems, algorithms, software design, Python, relational databases, SQL, data ingestion, data processing, data orchestration
Soft skills
organization, collaboration, written communication, oral communication