Document Question Answering with Generative AI and Elasticsearch
This project is a hands-on demonstration of a DevOps workflow that automates and tests local development environments and integrates them with cloud-based DevOps processes. The focus is on functional and integration testing against Elasticsearch and Ollama, rather than on isolated unit testing.
The `llmdoc` Python application implements a Retrieval-Augmented Generation (RAG) approach: it queries Elasticsearch for relevant documents and uses a Large Language Model (LLM) to summarize the retrieved search results.
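The RAG flow described above can be sketched in a few lines of Python. The index name, field names, and model name below are illustrative assumptions, not the actual `llmdoc` configuration.

```python
# Minimal sketch of a RAG pipeline: build an Elasticsearch query,
# then assemble the retrieved passages into an LLM prompt.
# Index/field/model names are illustrative assumptions.

def build_search_query(question: str, field: str = "text", size: int = 3) -> dict:
    """Build a simple Elasticsearch match query for the user's question."""
    return {"query": {"match": {field: question}}, "size": size}

def build_prompt(question: str, passages: list[str]) -> str:
    """Combine retrieved passages and the question into one LLM prompt."""
    context = "\n---\n".join(passages)
    return (
        "Summarize the context below to answer the question.\n"
        f"Context:\n{context}\n"
        f"Question: {question}\n"
    )

# Wiring it together requires running Elasticsearch and Ollama instances, e.g.:
#
#   from elasticsearch import Elasticsearch
#   import ollama
#
#   es = Elasticsearch("http://localhost:9200")
#   hits = es.search(index="docs", query={"match": {"text": question}})
#   passages = [h["_source"]["text"] for h in hits["hits"]["hits"]]
#   answer = ollama.generate(model="llama3", prompt=build_prompt(question, passages))
```

Keeping query construction and prompt assembly as pure functions makes them easy to cover with the functional tests the project emphasizes, independently of the live services.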
Key Components:
Google Cloud Infrastructure for Document Question Answering
A document question answering system built with generative AI and Elasticsearch, running on Google Cloud.
HashiCorp Vault deployment into Google Kubernetes Engine (GKE)
The project implements Continuous Delivery (CD) of HashiCorp Vault into a private GCP Kubernetes cluster (GKE), driven by a GCP Cloud Build pipeline.
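The deployment step that such a pipeline ultimately executes can be sketched as follows. The chart (`hashicorp/vault`) is the official Vault Helm chart; the release name, namespace, and values are illustrative assumptions.

```python
# Sketch of the Vault deployment step a CD pipeline would run:
# install/upgrade the official HashiCorp Vault Helm chart into GKE.
# Release name, namespace, and chart values are illustrative assumptions.
import subprocess

def vault_helm_command(release: str = "vault", namespace: str = "vault") -> list[str]:
    """Render the helm command that installs or upgrades Vault."""
    return [
        "helm", "upgrade", "--install", release, "hashicorp/vault",
        "--namespace", namespace, "--create-namespace",
        # Enable the HA backend; a real pipeline would pass a values file instead.
        "--set", "server.ha.enabled=true",
    ]

def deploy_vault(dry_run: bool = True) -> list[str]:
    """Run (or, for dry runs, just return) the deployment command."""
    cmd = vault_helm_command()
    if not dry_run:
        # Assumes credentials for the private GKE cluster were configured
        # in an earlier pipeline step (e.g. gcloud container clusters get-credentials).
        subprocess.run(cmd, check=True)
    return cmd
```

In the actual pipeline this would typically be a Cloud Build step running the `helm` builder image rather than a Python wrapper; the sketch only shows the command being assembled.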
Developer Workstation implementation in Azure cloud VMs
The project addresses the classic "It works on my workstation..." problem by provisioning identical development environments in Azure cloud VMs for all developers working on the application.
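One way to keep those environments identical is to derive every developer's VM from a single shared specification. The sketch below assumes this approach; the resource group, image alias, and VM size are illustrative placeholders, not the project's actual values.

```python
# Sketch: generate each developer's `az vm create` invocation from one
# shared spec, so workstation definitions cannot drift apart.
# Resource group, image, and size are illustrative placeholders.
WORKSTATION_SPEC = {
    "resource_group": "dev-workstations",
    "image": "Ubuntu2204",
    "size": "Standard_D4s_v5",
}

def az_vm_create_args(developer: str, spec: dict = WORKSTATION_SPEC) -> list[str]:
    """Build az CLI arguments that differ only in the VM name and admin user."""
    return [
        "az", "vm", "create",
        "--resource-group", spec["resource_group"],
        "--name", f"devbox-{developer}",
        "--image", spec["image"],
        "--size", spec["size"],
        "--admin-username", developer,
    ]
```

Because the spec is data rather than per-developer shell history, any change to it propagates to every workstation on the next provisioning run.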
DevOps is a set of practices that applies software development methods (Dev) to IT operations (Ops), focusing on automation and short feedback cycles.