Data Architect

GE Vernova

Full-time

Location Type: Remote

Location: Remote • 🇺🇸 United States

Job Level

Senior / Lead

Tech Stack

Airflow, Amazon Redshift, Apache, AWS, Azure, BigQuery, Cloud, Docker, Google Cloud Platform, Java, Kafka, Kubernetes, Python, Spark, SQL

About the role

  • Serve as a technical leader or mentor on complex, integrated customer implementations.
  • Write project documentation in accordance with deliverables, including detailed design specifications.
  • Provide training and mentorship to develop teams and customers.
  • Collaborate with Project Managers, Delivery Managers, Solution Leads, DevOps, and Services Consultants throughout the implementation of a project.
  • Drive excellence in execution through continuous improvement.
  • Lead discussions with business users, subject matter experts, and stakeholders to gather data needs and translate them into functional architecture components.
  • Define domain-driven data models aligned with electric utility processes.
  • Collaborate with solution architects to align logical data models with physical implementations.
  • Support data cataloging, lineage tracing, and business glossary implementation.
  • Implement data validation and monitoring processes to ensure data accuracy, consistency, and reliability.

Requirements

  • Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related technical field.
  • 10+ years of experience in an industrial systems-related field.
  • Advanced proficiency in programming languages commonly used in data engineering, such as Python or Java.
  • Strong knowledge of database systems and advanced SQL for data querying and optimization.
  • Experience with big data frameworks like Apache Spark or Kafka.
  • Familiarity with cloud platforms and their data services, such as AWS (e.g., S3, Glue, Redshift), Azure (e.g., Data Factory, Synapse), or Google Cloud Platform (e.g., BigQuery, Dataflow).
  • Knowledge of containerization technologies like Docker and Kubernetes.
  • Experience with workflow orchestration tools like Apache Airflow or Camel.
  • Experience with data modeling techniques and best practices.
  • Excellent analytical and problem-solving abilities.
  • Strong verbal and written communication skills with the ability to explain technical concepts to non-technical stakeholders.

Benefits

  • Relocation Assistance provided: No
  • Remote position

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
Python, Java, SQL, Apache Spark, Kafka, AWS, Azure, Google Cloud Platform, Docker, Kubernetes
Soft skills
analytical skills, problem-solving, verbal communication, written communication, mentorship, collaboration, leadership, continuous improvement