Focus on designing and building data infrastructure and systems to enable efficient data processing and analysis.
Responsible for developing and implementing data pipelines, data integration, and data transformation solutions.
Leverage your influence, expertise, and network to deliver quality results.
Motivate and coach others, coming together to solve complex problems.
Navigate complexity, ask thoughtful questions, and clearly communicate how the pieces fit together.
Develop and sustain high-performing, diverse, and inclusive teams, contributing to the success of the Firm.
Requirements
Strong proficiency in Python and experience with structured and unstructured data.
Strong proficiency in SQL and experience with relational databases.
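As a sketch of the kind of relational-database work involved, here is a minimal, stdlib-only Python example; the table, column names, and data are hypothetical.

```python
import sqlite3

# In-memory database for illustration; schema and data are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [("acme", 120.0), ("acme", 80.0), ("globex", 50.0)],
)
# Aggregate spend per customer, highest first.
rows = conn.execute(
    "SELECT customer, SUM(total) AS spend FROM orders GROUP BY customer ORDER BY spend DESC"
).fetchall()
print(rows)  # [('acme', 200.0), ('globex', 50.0)]
conn.close()
```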
Experience writing and maintaining FastAPI endpoints for scalable applications.
Strong understanding of AI techniques that enhance LLMs, such as AI Agents, Retrieval-Augmented Generation (RAG), etc.
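To show the shape of Retrieval-Augmented Generation, here is a toy sketch that ranks documents by token overlap and builds an augmented prompt. A real system would use embeddings, a vector store, and an LLM call; every name and document below is a placeholder.

```python
# Minimal RAG sketch: retrieve relevant context, then augment the prompt.
# Token overlap stands in for vector similarity; this only shows the pattern.

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by shared-token count with the query."""
    q_tokens = set(query.lower().split())
    scored = sorted(docs, key=lambda d: len(q_tokens & set(d.lower().split())), reverse=True)
    return scored[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Prepend retrieved context to the user question before calling an LLM."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

corpus = [
    "Redshift is a managed data warehouse on AWS.",
    "Dockerfiles describe container images.",
    "S3 stores objects in buckets.",
]
prompt = build_prompt("What is Redshift on AWS?", corpus)
print(prompt)
```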
Experience in prompt engineering for optimizing LLM outputs.
Experience with AI and GenAI solutions; experience with machine learning and data science workflows is a plus.
Track record of delivering high software quality through developer-led testing, validation, and best practices.
Understanding of developer-led quality assurance, including automated testing, performance tuning, and debugging.
Knowledge of software development workflows and CI/CD pipelines.
Work with Docker, including writing Dockerfiles and managing containerized deployments.
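A minimal Dockerfile for a containerized Python service might look like the following; the base image tag, module path, and port are assumptions for illustration only.

```dockerfile
# Illustrative only: image tag, app module, and port are placeholders.
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
```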
Develop and deploy scalable data storage solutions using AWS, Azure, and GCP services such as S3, Redshift, PostgreSQL on RDS, DynamoDB, Azure Data Lake Storage, Azure Cosmos DB, Azure SQL DB, GCP Cloud Storage, etc.
Knowledge of data integration solutions using AWS Glue, AWS Lambda, Azure Data Factory, Azure Functions, GCP Functions, GCP Dataproc, Dataflow, and other relevant services.
Design and manage data warehouses and data lakes, ensuring data is organized and accessible.
Design and implement comprehensive data architecture strategies that meet the current and future business needs.
Develop and document data or system models, flow diagrams, and architecture guidelines.
Ensure data architecture is compliant with data governance and data security policies.
Collaborate with business stakeholders to understand their data requirements and translate them into technical solutions.
Evaluate and recommend new data technologies and tools to enhance data architecture.
Implement IAM roles and policies to manage access and permissions within AWS, Azure, and GCP.
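As one concrete shape this takes on AWS, here is a least-privilege IAM policy granting read-only access to a bucket; the Sid, bucket name, and action list are illustrative assumptions.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadOnlyDataLake",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-data-lake-bucket",
        "arn:aws:s3:::example-data-lake-bucket/*"
      ]
    }
  ]
}
```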
Use AWS CloudFormation, Azure Resource Manager templates, or Terraform for infrastructure-as-code (IaC) deployments.
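To illustrate the IaC approach, here is a minimal Terraform sketch that provisions a versioned S3 bucket; the region and bucket name are placeholders, not a prescribed configuration.

```hcl
# Illustrative Terraform only; region and bucket name are placeholders.
provider "aws" {
  region = "us-east-1"
}

resource "aws_s3_bucket" "data_lake" {
  bucket = "example-data-lake-bucket"
}

resource "aws_s3_bucket_versioning" "data_lake" {
  bucket = aws_s3_bucket.data_lake.id
  versioning_configuration {
    status = "Enabled"
  }
}
```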
Use native DevOps services on AWS, Azure, and GCP to build and deploy CI/CD pipelines.
Optimize Cloud resources for cost, performance, and scalability.
Knowledge of data governance and data security best practices.
Strong analytical, problem-solving, and communication skills.
Ability to work independently and as part of a team in a fast-paced environment.
Benefits
medical
dental
vision
401k
holiday pay
vacation
personal and family sick leave
annual discretionary bonus