llm_topic_modelling / requirements_cpu.txt

Commit History

Added model compatibility for OpenAI and Azure endpoints. Added some Bedrock models; now compatible with thinking models.
3085585

seanpedrickcase committed on

Reverted requirements files to previous direct wheel links
f4c57ae

seanpedrickcase committed on

Added deduplication with LLM functionality. Minor package updates. Updated installation documentation.
6f3d42c

seanpedrickcase committed on

Optimised prompts. Updated Gradio. Added an example for zero-shot topics. Added support for the Granite 4 local model.
9e8c029

seanpedrickcase committed on

Updated output logging files. Updated examples and readme. Minor prompt adjustments and package updates.
fff212b

seanpedrickcase committed on

Added framework of support for Azure models (although untested)
138286c

seanpedrickcase committed on

Enhanced app functionality by adding new UI elements for summary file management, added a Bedrock model toggle, and refined logging messages. Updated Dockerfile and requirements for better compatibility and added an install guide to the readme. Removed deprecated code and unnecessary comments.
ba1a951

seanpedrickcase committed on

Changed default requirements to the CPU version of llama-cpp. Added Gemini Flash 2.0 to the model list. Output files should contain only final files.
b0e08c8

seanpedrickcase committed on

Topic deduplication/merging is now separated from summarisation. Gradio upgrade.
854a758

seanpedrickcase committed on

Added spaces to requirements for the CPU run and AWS run.
a6d1841

seanpedrickcase committed on

Corrected references to extra-index-url in requirements/Dockerfile.
3db2499

seanpedrickcase committed on

Added support for using local models (specifically Gemma 2b) for topic extraction and summary. Generally improved output format safeguards.
b7f4700

seanpedrickcase committed on