Tech Stack
AWS, Azure, BigQuery, Cassandra, Cloud, ETL, NoSQL, Oracle, Python, RDBMS, SQL
About the role
- Design, develop, and maintain conceptual, logical, and physical data models
- Translate business requirements into data requirements and structures
- Profile source data from internal and external systems to identify trends, patterns, and quality issues (see the illustrative profiling sketch after this list)
- Analyze existing data structures and identify gaps for new requirements
- Collaborate with data architects, data engineers, report developers, AI/ML engineers, DBAs, and business stakeholders
- Create ERDs, data flow diagrams, and visualizations to represent data models
- Implement and maintain data models in databases, data warehouses, and data lakes
- Build and operationalize complex data solutions, apply transformations, and recommend data cleansing/quality solutions
- Conduct performance tuning and optimization of data models
- Incorporate data governance, data security, and data quality best practices
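The profiling responsibility above typically reduces to a repeatable pass over each source column. Below is a minimal, illustrative Python sketch (using pandas, with invented column names and data) of the kind of profiling summary this role might produce; it is an assumption about workflow, not a prescribed tool or process.

```python
import pandas as pd

# Illustrative source extract; in practice this would come from an
# internal or external system (file, API, or database query).
source = pd.DataFrame({
    "customer_id": [101, 102, 102, None, 105],
    "signup_date": ["2024-01-03", "2024-01-05", "2024-01-05", "bad-date", None],
    "country":     ["US", "US", "CA", "CA", None],
})

# Basic per-column profile: null rate, distinct count, and a sample value.
profile = pd.DataFrame({
    "null_rate":      source.isna().mean(),
    "distinct_count": source.nunique(dropna=True),
    "example_value":  source.apply(
        lambda col: col.dropna().iloc[0] if col.notna().any() else None
    ),
})
print(profile)

# Simple quality check: flag values that fail to parse as dates --
# the kind of issue that feeds data cleansing/quality recommendations.
parsed = pd.to_datetime(source["signup_date"], errors="coerce")
bad_dates = int(parsed.isna().sum() - source["signup_date"].isna().sum())
print("unparseable signup_date rows:", bad_dates)
```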
Requirements
- Bachelor's degree in computer science, engineering, or a similar field
- Hands-on experience in data gathering, data profiling, data modeling, and data analysis
- Expert knowledge of data modeling concepts, methodologies, and best practices, along with proficiency in data modeling tools
- Familiarity with dimensional modeling and data warehousing concepts
- Expert-level proficiency in writing SQL queries and Python scripts (see the sketch following this list)
- Hands-on experience with RDBMS platforms (SAP, Oracle, SQL Server) and NoSQL databases (Cassandra, MongoDB)
- Experience with ETL and big data/cloud platforms (Microsoft Azure, Google BigQuery, Databricks, AWS, Cloudera)
- Experience with unstructured, semi-structured, and structured data
- Experience with business intelligence, data visualization, and reporting tools such as Power BI
- Expert knowledge of statistical analysis techniques for descriptive and inferential statistics
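To make the dimensional modeling and SQL/Python expectations above concrete, here is a small, hypothetical star-schema sketch in Python using the standard-library sqlite3 module. The table and column names are invented for illustration only and do not reflect the company's actual models or platforms.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Hypothetical star schema: one fact table keyed to two dimensions.
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, calendar_date TEXT, month TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT, category TEXT);
CREATE TABLE fact_sales  (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    amount      REAL
);

INSERT INTO dim_date    VALUES (20240101, '2024-01-01', '2024-01'), (20240102, '2024-01-02', '2024-01');
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
INSERT INTO fact_sales  VALUES (20240101, 1, 3, 30.0), (20240102, 1, 1, 10.0), (20240102, 2, 2, 50.0);
""")

# Typical dimensional query: aggregate the fact table by dimension attributes.
rows = conn.execute("""
    SELECT d.month, p.category, SUM(f.amount) AS total_amount
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month, p.category
""").fetchall()

for month, category, total in rows:
    print(month, category, total)
```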