
Data Engineer, AWS, Databricks
NISC
full-time
Location Type: Hybrid
Location: Iowa, Montana, North Dakota • 🇺🇸 United States
Job Level: Mid-Level, Senior
Tech Stack
Airflow, Apache, AWS, Cassandra, Cloud, DynamoDB, EC2, ETL, Hadoop, Java, JavaScript, Kafka, NoSQL, Oracle, Postgres, Python, Scala, Spark, Spring, SQL
About the role
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Apply an understanding of Data Warehouse and Data Lakehouse paradigms.
- Design and build optimal data pipelines from a wide variety of data sources using AWS technologies.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing a unified data stream.
- Work with other data engineering experts to strive for greater functionality while making data more discoverable, addressable, trustworthy, and secure.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Create and maintain a culture of engagement conducive to NISC’s Statement of Shared Values.
Requirements
- Experience building and optimizing data pipelines, architectures, and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build ETL processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Working knowledge of message queuing, stream processing, and highly scalable data stores.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- A BS or MS degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field, plus experience in a Data Engineer role using the following software/tools:
- Experience with AWS: Lambda, S3, SQS, SNS, CloudWatch, etc.
- Experience with Databricks and Delta Lake.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Oracle, Postgres, Cassandra, and DynamoDB.
- Experience with data pipeline and workflow management tools: Hevo Data, Airflow, etc.
- Experience with AWS cloud services: EC2, Databricks, EMR.
- Experience with stream-processing systems: Apache Spark, Kafka Streams, Spring Cloud, etc.
- Experience with object-oriented languages: Java, Scala.
- Nice-to-have
- Experience with scripting languages: Python, JavaScript, Bash, etc.
- Strong verbal and written communication skills.
- Ability to demonstrate composure and think analytically in high pressure situations.
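To give a concrete sense of the ETL and data-transformation work described above, here is a minimal sketch in plain Python. The schema (meter readings with `meter_id`, `kwh`, `read_at`) is hypothetical, chosen only for illustration; a production pipeline at this scale would typically run on Spark or Databricks as listed in the tech stack.

```python
import csv
import io

def transform(record):
    """Normalize one raw record (hypothetical meter-reading schema)."""
    return {
        "meter_id": record["meter_id"].strip(),  # trim stray whitespace
        "kwh": float(record["kwh"]),             # cast string to numeric
        "read_at": record["read_at"],            # keep timestamp as-is
    }

def run_etl(raw_csv):
    """Extract rows from CSV text, transform each, return the loaded list."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    return [transform(row) for row in reader]

# Example input with a messy field to show the cleanup step.
raw = "meter_id,kwh,read_at\n A1 ,12.5,2024-01-01T00:00:00Z\n"
rows = run_etl(raw)
```

The same extract-transform-load shape applies whether the source is a CSV drop in S3, a Kafka topic, or an Oracle table; only the extract and load layers change.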
Benefits
- Medical, Dental and Vision Insurance.
- Health Savings Account (HSA) with $100 monthly contributions from NISC.
- Like to walk? Want to improve your overall wellness knowledge? Earn up to $800 in additional HSA contributions each year through our Wellness Rewards program.
- Dependent Care Flexible Spending Account (FSA) through Paylocity.
- Fully covered life insurance up to 3x annual base salary.
- Fully covered short- and long-term disability.
- 401(k), traditional or Roth, with employer matching of employee contributions up to 6% plus a 4% base salary contribution.
- PTO accrual levels dependent on years of service, 120 Life Leave Event hours, 9 paid holidays and an annual holiday week.
- $2,500 Interest-FREE technology loan program.
- $25,000 employee educational assistance program.
- Volunteer, Wellness, Family Events and other employee fun supplied by our committees.
- Employee Assistance Program, assisting employees and dependents with virtually any life event.
- Benevolence Committee to support employees facing financial hardships such as unexpected medical bills and funerals.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data pipeline optimization, ETL processes, data architecture, root cause analysis, unstructured data analysis, message queuing, stream processing, relational databases, NoSQL databases, object-oriented programming
Soft skills
strong analytic skills, verbal communication, written communication, composure under pressure, cross-functional collaboration