Tech Stack
BigQuery, Cloud, ETL, Google Cloud Platform, Python, Realm, SQL
About the role
- Assess existing data components, perform POCs, and consult with stakeholders
- Propose end-to-end solutions to an enterprise's data-specific business problems, covering data collection, extraction, integration, cleansing, enrichment, and visualization
- Design large data platforms that enable data engineers, analysts, and scientists
- Strong exposure to different data architectures, including data lakes and data warehouses
- Define tools and technologies to develop automated data pipelines, write ETL processes, build dashboards and reports, and create insights (see the sketch after this list)
- Continually reassess the current state for alignment with architecture goals, best practices, and business needs
- Database modeling, selecting the most suitable data storage, creating data flow diagrams, and maintaining related documentation
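As a rough illustration of the pipeline work described above, the sketch below loads a cleaned CSV extract into BigQuery with the google-cloud-bigquery Python client. The project, dataset, table, and file names are placeholders, not details from the posting.

```python
from google.cloud import bigquery

# Placeholder identifiers; substitute real project, dataset, and file names.
client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,          # skip the header row
    autodetect=True,              # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

with open("cleaned_orders.csv", "rb") as source_file:
    load_job = client.load_table_from_file(
        source_file, "analytics.orders", job_config=job_config
    )

load_job.result()  # block until the load job finishes
table = client.get_table("analytics.orders")
print(f"Loaded {table.num_rows} rows into analytics.orders")
```

In practice a step like this would be wrapped in an orchestrated DAG (e.g. Composer) rather than run by hand, but the load call itself stays the same.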
Requirements
- Data Architect with recent, heavy hands-on expertise in SQL, Python, ERD, GCP (all services, especially BigQuery, GCS, Cloud Functions, Composer), and dbt.
- Must have worked with big data (100 TB+)
- Knowledge of recently released modern data technologies.
- Design and optimize conceptual and logical database models
- Extensive experience in data modeling.
- Analyze system requirements, implement data strategies, and ensure efficiency and security
- Improve system performance by conducting tests, troubleshooting, and integrating new elements
- In-depth understanding of database structure principles
- Expertise in implementing and maintaining data pipelines
- Deep knowledge of data mining and segmentation techniques
- Familiarity with data visualization tools
- Candidates will be required to take a coding assessment in SQL and Python; a sketch of the kind of task involved follows.
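The posting does not specify the assessment exercises, but a representative task might combine BigQuery SQL with the Python client, for example a simple aggregation. The dataset, table, and column names below are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical exercise: top 10 customers by spend in the last 30 days.
query = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `analytics.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 10
"""

for row in client.query(query).result():
    print(row.customer_id, row.total_spend)
```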