Tech Stack
Azure, Cassandra, ERP, ETL, Hadoop, MySQL, .NET, Oracle, Perl, Python, Shell Scripting, SQL
About the role
- Supports the creation of data models in a structured format to enable analysis of the underlying data.
- Assists with the design and development of scalable extract, transform and load (ETL) packages from the business source systems, and with the development of ETL routines to populate target structures from those sources (a minimal sketch of such a routine appears after this list).
- Participates in the transformation of object and data models into appropriate database schemas within design constraints.
- Interprets installation standards to meet project needs and produces database components as required.
- Takes direction from various stakeholders to create test scenarios and participates in thorough testing and validation to support the accuracy of data transformations.
- Assists with running data migrations across different databases and applications, for example, MS Dynamics, Oracle, SAP and other ERP systems.
- Supports the definition and implementation of data table structures and data models based on requirements.
- Takes part in the analysis and development of ETL and migration documentation.
- Receives detailed instructions from various stakeholders to evaluate potential data requirements.
- Assists with the definition and management of scoping, requirements definition and prioritization activities for small-scale changes, and assists with more complex change initiatives.
- Takes direction from various stakeholders and contributes recommendations for improvements to both automated and non-automated components of data tables, data queries and data models.
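To give candidates a flavour of the ETL work described above, here is a minimal, illustrative sketch in Python. The CSV export, column names and SQLite target are assumptions made for the example only; real pipelines would run against the source systems and databases named in this posting.

```python
# Minimal ETL sketch: extract rows from a CSV export of a source system,
# normalize them, and load them into a target table. The file name, column
# names and SQLite target are illustrative assumptions, not project specifics.
import csv
import sqlite3

def extract(path):
    # Extract: stream raw rows from the source export.
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    # Transform: skip incomplete rows, trim whitespace, normalize casing.
    for row in rows:
        if not row.get("customer_id"):
            continue
        yield (row["customer_id"].strip(),
               row["name"].strip().title(),
               row["country"].strip().upper())

def load(rows, db_path="target.db"):
    # Load: populate the target table, replacing rows that already exist.
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS dim_customer (
                       customer_id TEXT PRIMARY KEY,
                       name        TEXT,
                       country     TEXT)""")
    con.executemany(
        "INSERT OR REPLACE INTO dim_customer VALUES (?, ?, ?)", rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("customers_export.csv")))
```

The generator-based structure keeps the extract, transform and load steps separate and testable, which mirrors how larger ETL packages are typically decomposed.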
Requirements
- Bachelor's degree or equivalent in computer science, software engineering, information technology, or a related field
- Relevant certifications preferred, such as SAP, Microsoft Azure, Certified Data Engineer or Certified Professional
- Basic experience in data engineering or data mining within a fast-paced environment
- Familiarity with building modern data analytics solutions that deliver insights from large, complex, multi-terabyte data sets
- Basic experience with architecture and design of secure, highly available and scalable systems
- Familiarity with automation and scripting, with proven examples of successful implementation
- Familiarity with writing scripts in a scripting language (Perl, Bash, shell scripting, Python, etc.)
- Basic experience with big data tools such as Hadoop, Cassandra and Storm
- Basic experience in any applicable programming language, preferably .NET
- Familiarity with working with SQL across databases such as SAP, MySQL and Microsoft SQL Server
- Basic experience working with data sets and ordering data through MS Excel functions, e.g. macros and pivot tables (a pivot-style sketch in Python appears after this list)
- Knowledge of Microsoft Azure Data Factory, SQL Server Analysis Services, SAP Data Services and SAP BTP
- Understanding of database concepts, object and data modelling techniques and design principles, together with conceptual knowledge of building and maintaining physical and logical data models
- Knowledge of the definition and management of scoping, requirements definition and prioritization activities
- Analytical mindset with good business acumen
- Problem-solving aptitude and the ability to communicate effectively, both in writing and verbally
- Ability to build effective relationships at all levels within the organization
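As an illustration of the pivot-style data ordering mentioned in the requirements, here is a small, hypothetical Python sketch that aggregates records the way an Excel pivot table would. The record layout and field names are invented for the example.

```python
# Pivot-style aggregation in plain Python, analogous to an Excel pivot table:
# total amount per (region, product). The data below is made up for illustration.
from collections import defaultdict

sales = [
    {"region": "EMEA", "product": "A", "amount": 120.0},
    {"region": "EMEA", "product": "B", "amount": 80.0},
    {"region": "APAC", "product": "A", "amount": 200.0},
    {"region": "EMEA", "product": "A", "amount": 50.0},
]

# Group by the (region, product) key and sum the amounts.
pivot = defaultdict(float)
for row in sales:
    pivot[(row["region"], row["product"])] += row["amount"]

for (region, product), total in sorted(pivot.items()):
    print(f"{region:5} {product:3} {total:8.2f}")
```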