Module backend.ollama_connect
This module handles generating cover letters and providing resume suggestions using Ollama. The system prompts for these two tasks are defined at the top of the file in the system_prompt_cv and system_prompt_suggest variables.
The model Ollama uses can be changed by adjusting the model argument in the ChatOllama instantiation of the llm variable at the top of the file.
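As a rough sketch of what that request looks like under the hood (the endpoint is Ollama's default local /api/chat; the prompt text and model name here are placeholders, not the module's actual values), switching models amounts to changing one string:

```python
import json

# Placeholder values -- the real prompts and model name live in
# backend/ollama_connect.py; these are illustrative only.
OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"  # Ollama's default endpoint
MODEL = "llama3"  # change this string to switch the model Ollama serves

def build_chat_payload(system_prompt: str, user_prompt: str) -> dict:
    """Assemble a chat request in the shape Ollama's /api/chat expects."""
    return {
        "model": MODEL,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt},
        ],
        "stream": False,  # ask for one complete response instead of chunks
    }
```

In the module itself this wiring is handled by ChatOllama, so only the model argument needs to change.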
Functions
def generate_cv(resume, job_desc, context='')
Generates a cover letter from a resume and a job description. The resume text is extracted from a PDF uploaded via the file field of the request; in testing, the resume field is used to pass in a sample resume directly. The optional context field carries additional user input to guide the generation.
Request:  { resume: string, job_desc: string, file: string, context: string }
Response: { status: boolean, data: message (Success / Error message as per status) }
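A minimal sketch of how generate_cv might assemble its prompt and response envelope (the prompt wording and error message are assumptions, not the module's actual code; the real function sends the prompt to the llm instead of echoing it):

```python
system_prompt_cv = "You write tailored cover letters."  # placeholder prompt

def generate_cv(resume: str, job_desc: str, context: str = "") -> dict:
    """Sketch: build the user prompt and wrap the result in the
    { status, data } envelope the endpoint returns."""
    if not resume or not job_desc:
        # Error path: status is False and data carries the error message.
        return {"status": False, "data": "Missing resume or job description"}
    user_prompt = (
        f"Resume:\n{resume}\n\nJob description:\n{job_desc}\n"
        + (f"\nAdditional context:\n{context}" if context else "")
    )
    # The real module passes system_prompt_cv and user_prompt to ChatOllama;
    # here the prompt is returned directly to keep the sketch self-contained.
    return {"status": True, "data": user_prompt}
```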
def resume_suggest(resume, job_desc)
Reviews a resume and provides suggestions to tailor it for a job description.
Request:  { resume: string, job_desc: string, file: string }
Response: { status: boolean, data: message (Success / Error message as per status) }
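For illustration, a client of either endpoint can model the documented request and response bodies as plain JSON (the helper names here are hypothetical; adapt them to the actual routes):

```python
import json

def make_suggest_request(resume: str, job_desc: str, file_name: str = "") -> str:
    """Serialize a resume_suggest request body matching the documented shape."""
    return json.dumps({"resume": resume, "job_desc": job_desc, "file": file_name})

def parse_response(raw: str) -> tuple:
    """Unpack the { status, data } envelope both endpoints return."""
    body = json.loads(raw)
    # status is True on success; data is then the generated text,
    # otherwise an error message describing the failure.
    return body["status"], body["data"]
```

Because both endpoints share the same envelope, one parse_response helper suffices for generate_cv and resume_suggest alike.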