PowerSecure, Inc.

Data Solutions Engineer

Full-time

Location Type: Remote

Location: California, United States

About the role

  • Design, implement, and support end‑to‑end ELT pipelines (ingest → transform → publish) in Databricks/ADF
  • Implement data quality checks (DLT expectations, unit tests) with alerting and remediation runbooks
  • Build curated, analytics‑ready Delta tables using dimensional modeling for consumption by BI Developers
  • Implement CDC and deletion‑flag patterns; manage schema drift and partitioning/Z‑Ordering strategies
  • Operationalize jobs with monitoring, logging, and alerting; participate in an on‑call rotation as needed
  • Partner with the Data Architect to align designs with standards for governance, security, and cost efficiency
  • Document pipelines, data contracts, and SLAs; continuously improve performance and reliability
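For illustration only, the CDC and deletion‑flag pattern mentioned above can be sketched in plain Python, using a dict in place of a Delta table. The `id` key and `is_deleted` flag are hypothetical column names, not PowerSecure's actual schema:

```python
# Minimal sketch of applying a CDC batch with deletion flags.
# "current" stands in for the target table; each change row carries
# an upstream "is_deleted" flag marking soft deletes.

def apply_cdc_batch(current, changes):
    """Merge a batch of CDC records into the current state.

    current: dict mapping primary key -> row dict
    changes: list of row dicts with "id", payload fields, and "is_deleted"
    """
    for row in changes:
        key = row["id"]
        if row.get("is_deleted"):
            # Deletion flag set upstream: drop the row downstream.
            current.pop(key, None)
        else:
            # Upsert the latest version of the row (flag column excluded).
            current[key] = {k: v for k, v in row.items() if k != "is_deleted"}
    return current

state = {1: {"id": 1, "name": "alpha"}}
batch = [
    {"id": 1, "name": "alpha-v2", "is_deleted": False},  # update
    {"id": 2, "name": "beta", "is_deleted": False},      # insert
    {"id": 1, "name": "alpha-v2", "is_deleted": True},   # late delete of id 1
]
state = apply_cdc_batch(state, batch)
```

In a Databricks pipeline this logic would normally be expressed as a Delta `MERGE` or DLT apply-changes step rather than in-memory Python.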

Requirements

  • 2+ years of hands‑on data engineering (or comparable software engineering with significant data work)
  • 2+ years building pipelines on Azure and Databricks (or equivalent cloud + Spark)
  • Strong SQL (analytical queries, window functions), PySpark/Spark SQL, and data modeling fundamentals, including performance tuning on large datasets
  • Bachelor’s degree in MIS, Computer Science, Engineering, or equivalent experience
  • Experience with Azure Databricks, Delta Lake, Delta Live Tables (DLT), Azure Data Factory (or Fabric Data Pipelines), ADLS Gen2, and Azure DevOps/Git for CI/CD
  • Working knowledge of Unity Catalog and/or Microsoft Purview for governance, lineage, and security
  • Familiarity with data ingestion patterns (files, APIs, JDBC), schema evolution, CDC, and deletion detection patterns
  • Understanding of dimensional modeling to produce analytics‑ready datasets for Power BI
  • Exposure to orchestration/monitoring, cost optimization, alerting, and runbook‑driven operations
  • Data pipeline design (batch & streaming), DLT expectations for data quality, and robust error handling
  • Source control, branching strategies, and CI/CD for data assets (notebooks, jobs, workflows)
  • Practical understanding of privacy, security, and RBAC in cloud data platforms
  • Excellent communication, documentation, and cross‑functional collaboration skills
  • Analytical mindset; bias toward automation and measurable reliability
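As one concrete illustration of the window-function fundamentals listed above, the common dedup pattern `ROW_NUMBER() OVER (PARTITION BY key ORDER BY updated_at DESC) = 1` can be sketched in plain Python (column names here are hypothetical):

```python
# Keep only the most recent version of each row per key — the dedup
# that ROW_NUMBER() OVER (PARTITION BY id ORDER BY updated_at DESC) = 1
# expresses in SQL.

def latest_per_key(rows, key="id", order="updated_at"):
    best = {}
    for r in rows:
        k = r[key]
        # Replace the stored row when a newer version arrives.
        if k not in best or r[order] > best[k][order]:
            best[k] = r
    return list(best.values())

rows = [
    {"id": 1, "updated_at": 1, "v": "old"},
    {"id": 1, "updated_at": 2, "v": "new"},
    {"id": 2, "updated_at": 1, "v": "only"},
]
deduped = latest_per_key(rows)
```

On Databricks the same result would typically come from a PySpark `Window` with `row_number()` over a Delta table rather than an in-memory loop.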

Benefits
  • Medical, dental, vision and life insurance coverage
  • Competitive pay and a matching 401(k) plan
  • Vacation, company holidays, and paid time off (PTO: personal and sick days)
  • Flexible spending accounts/Health savings account
  • Wellness Incentive Programs
  • Employee Referral Program
  • Tuition Reimbursement

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data engineering, SQL, PySpark, data modeling, performance tuning, data pipeline design, dimensional modeling, data quality checks, CDC, alerting
Soft Skills
communication, documentation, cross-functional collaboration, analytical mindset, automation, measurable reliability
Certifications
Bachelor's degree in MIS, Computer Science, or Engineering