Architect and implement serverless AI workflows leveraging AWS Lambda, Step Functions, and Bedrock APIs.
Design prompt and context engineering pipelines to generate recommendations dynamically from partial or tentative form responses.
Implement data ingestion, validation, and contextual augmentation using AWS services (S3, DynamoDB, API Gateway, Lambda, Step Functions, EventBridge, etc.).
Develop and deploy infrastructure as code using AWS CDK (TypeScript).
Integrate LLM-based inference into end-user apps and internal tools.
Optimize inference cost and latency through intelligent caching, grounding, and response evaluation mechanisms.
Collaborate with UX and product teams to embed generative capabilities seamlessly into existing workflows.
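To illustrate the second responsibility above, here is a minimal sketch of assembling a prompt dynamically from partial or tentative form responses. All names (`FormResponse`, `buildPrompt`) are hypothetical, not part of any specific codebase; a real pipeline would add validation and contextual augmentation before invoking a Bedrock model.

```typescript
// Hypothetical types/names for illustration only.
type FormResponse = {
  field: string;
  value?: string;     // absent when the user has not answered yet
  tentative?: boolean; // true when the answer may still change
};

// Build a recommendation prompt from whatever answers exist so far.
function buildPrompt(responses: FormResponse[]): string {
  const answered = responses
    .filter((r) => r.value !== undefined)
    .map((r) => `- ${r.field}: ${r.value}${r.tentative ? " (tentative)" : ""}`);
  return [
    "Recommend next steps based on the answers provided so far.",
    "Answers:",
    ...answered,
    "Treat answers marked (tentative) as provisional.",
  ].join("\n");
}
```

Unanswered fields are simply omitted, so the same function works at any stage of form completion.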
Requirements
5+ years of experience in cloud-native development on AWS.
Expertise in serverless architecture, TypeScript, AWS CDK, and Bedrock (Claude, Titan, or third-party models).
Proven experience in prompt engineering, context orchestration, or agentic system design.
Experience with vector databases, retrieval-augmented generation (RAG), and event-driven architectures.
Strong understanding of AI safety, model evaluation, and governance patterns.